For those of you who may not be familiar with it, you can set up a federated identity relationship between your local Active Directory and your Office 365 authentication. In this way your people, simply by logging in with their local domain accounts, are automatically authenticated against Office 365. Because Office 365 uses Windows Azure Active Directory, you can set up an ADFS relationship between the authentication in Office 365 and your company’s Active Directory domain. So, you manage one set of user accounts locally, just like you always have, and Office 365 can grant access based on the “claim” that the user account is known and valid. Your client (laptop, tablet, or other mobile device) gets the claim from your Active Directory (preferably by accessing an ADFS Proxy in your company’s perimeter network), and then passes that acquired claim up to Office 365.

In short – Your users are either already authenticated, or just have to set up the authentication parameters one time for their use of cloud-based services such as Office 365, Windows Intune, or other such services.

So this is great. No matter where I am, or where my people are in the world, they can use their domain account and local profile and just open up Outlook or access the cloud-based SharePoint or their SkyDrive Pro storage, and they’re authenticated. And even if they’re using a non-domain machine or a mobile device, they’ll use the same company credentials they’re already familiar with to connect to their company e-mail or other resources.

The Problem: I’m outside the office, and the connection to my ADFS Proxy is unavailable. What happens then?

“Yeah.. what happens then?!”

I’ll tell you what happens then. It’s a problem, because your device needs to get to the ADFS (STS) proxy to verify that you are who you say you are, and to give you the claim token that is passed up to Office 365. If that proxy is unavailable, then your users can’t be trusted by their cloud-based resources. Outlook won’t be able to connect to the Office 365 Exchange server. Yeah.. a big problem. That’s why so much documentation (and even the promise of Microsoft support) is devoted to the configuration of a load-balanced farm of servers to keep that proxy service high-performing and highly available.

Granted, it’s an even bigger problem for the people who are sitting in that office. Presumably they can’t access the Internet at all. So assuming that your company, like most others, is becoming more and more dependent upon that Internet connection being live in order to get their work done, you’ve probably already addressed alternatives. And many people nowadays have multiple personal paths to the Internet that would restore some amount of personal access. But that doesn’t fix their problem of not being able to get Outlook to connect.

The Solution: Put a copy of your domain in “the cloud”!

Think about it: If I have a replicated copy of my domain up on a virtual machine running in Windows Azure, then that domain controller can also serve as the trusted location where Office 365 and the ADFS trust can be connected!

“Sounds like an interesting idea. But what if I don’t want a copy of my domain up in the cloud?”

Then another option would be to use Windows Azure virtual machines as your ADFS Proxies. Basically, think of Windows Azure as an alternative to (or an extension of) your perimeter network (DMZ). Of course, in this case, if your home datacenter goes down, you’re still going to have authentication issues.

Here’s a thought: Do both! Have an AD site up in Windows Azure, with a secured/authenticated/encrypted connection back to the corporate network. And then build an externally available, load-balanced set of machines in a separate “perimeter” network in Windows Azure as well. In this way, even if your connection back to your main office and the local AD DCs goes down, you still have AD authentication available “locally” within your Windows Azure subscription.

Those of you who are familiar with System Center 2012, and in particular the Configuration Manager component, are already familiar with the concept of Distribution Points. But for those of you who are new to it, here is a very brief definition that will make it all clear: Ahem… : A Distribution Point is a point from which things are distributed.

“Oh yeah, crystal-clear, Kevin.”

You’re welcome.

It’s really not complicated (or at least, the idea isn’t complicated). In a large organization, with centralized IT management, and perhaps with many locations around the globe, it’s important to be able to define the locations from which those far-flung users get their software or updates. So System Center 2012 Configuration Manager lets you define Distribution Points at or near those locations.

But consider this: What if I were able to use Windows Azure – a cloud-based, highly available and globally scalable service - to act as my distribution points?

Cloud-based distribution points are further documented at TechNet here: Install Cloud-Based Distribution Points in Windows Azure. NOTE: The cloud-based distribution point is going to be used for deployments other than Microsoft updates. Updates are already available “in the cloud” through Microsoft Update, and it’s just as easy to configure your company’s devices to use Microsoft Update for operating system and application updates.

For the rest of this article, I’ll break the task of installing and testing this into these steps:

Install System Center 2012 SP1 Configuration Manager

Certificates

Create the Distribution Point

Considerations for Client Access

and we’ll wrap things up with a Summary

Install System Center 2012 SP1 Configuration Manager

To test creating a cloud-based distribution point, I installed the evaluation of System Center 2012 SP1 Configuration Manager on a local virtual machine in my test domain. My installation was a new Configuration Manager standalone primary site:

(Prior to this installation I had installed the evaluation of SQL Server 2012 on the same machine, but I could have used the “typical installation” option to also install SQL Express to use as the local database. For a good write-up on installing a test machine like this as a Windows Azure Virtual Machine, read THIS EXCELLENT ARTICLE by Keith Mayer.)

After installing and configuring the prerequisites, I also just took the defaults from that point on.

Certificates

Of course, to make an authenticated, secured (SSL) connection between your Configuration Manager installation and your Windows Azure subscription, you’re going to need to generate and use a management certificate. And like most situations where we’re just trying out new capabilities that require certificates, there is a simple way, and there is a recommended-for-production way. The recommended-for-production way is to use a PKI, and use the templates and certificate types for Server and Client authentication as described in this document: PKI Certificate Requirements for Configuration Manager

---UPDATE: There was at least one company that used my instructions below, pulling a cert from their site server, in their production installation, and they had problems that required a call to Microsoft support. So I want to stress again that my instructions that follow are ONLY for TESTING and proof-of-concept, and should NOT BE USED IN A PRODUCTION ENVIRONMENT. ---

For my purposes, just to get the distribution point created and the trust established between my local Configuration Manager site server and the Azure subscription, I exported both a .CER and a .PFX file from the local machine certificate that was created for my SCCM server and its relationship with SQL Server. It was already of the proper type (from the proper template), so it worked fine for my test. Here’s how I did that…

Open MMC (On the start screen, type MMC and run MMC.EXE).

On the File Menu, choose Add/Remove Snap-in… then in the left-hand list, select Certificates, and click Add.

When prompted for what you want to manage certificates for, select Computer Account, click Next, and then click Finish. Click OK to close the Add/Remove Snap-ins form.

Now, in the MMC, navigate to Certificates (Local Computer) –> Personal –> Certificates. You should find a Server Authentication certificate there with the name of your server in the Issued To column.

We’re going to do two export operations on this certificate; one to get a .cer file that we’ll upload to Windows Azure, and the other to create a password-protected .pfx file that we’ll use to configure the connection from our local Configuration Manager to create the cloud-based distribution point.

Right-click the certificate, select All Tasks –> Export…, and in the Certificate Export Wizard choose No, do not export the private key and the DER encoded binary X.509 (.CER) format. On the File to Export page, browse to and select a file system location that you can easily remember and navigate to later; either your desktop or documents folder, and give your file a name. Make sure it’s saving as a *.cer file. Click Save, then click Next.

On the Completing the File Export Wizard page, click Finish. Click OK on the resulting “The export was successful.” message.

Run the export on the same certificate a second time, this time choosing Yes, export the private key. On the Security page, check the check-box next to Password, and then enter a password in the Password and Confirm password fields. Click Next.

On the File to Export page, browse to and select a file system location that you can easily remember and navigate to later; either your desktop or documents folder, and give your file a name. Make sure it’s saving as a *.pfx file. Click Save, then click Next.

On the Completing the File Export Wizard page, click Finish. Click OK on the resulting “The export was successful.” message.

You can now close the MMC. We’re done with it. We have the exports we need.
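(By the way, if you’d rather script those two exports than click through the wizard, the PKI cmdlets on Windows 8 / Windows Server 2012 can do the same thing. Here’s a rough sketch – the subject-name filter and file paths are placeholders, and this is just as test-only as the wizard steps above:)

```powershell
# Find the server authentication certificate in the local computer's Personal store
# ("MYSCCMSERVER" is a placeholder for your site server's name)
$cert = Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*MYSCCMSERVER*" }

# Export the public portion as a .cer (this is what gets uploaded to Windows Azure)
Export-Certificate -Cert $cert -FilePath "$env:USERPROFILE\Desktop\azuremgmt.cer"

# Export the certificate plus private key as a password-protected .pfx
$pfxPwd = Read-Host -Prompt "PFX password" -AsSecureString
Export-PfxCertificate -Cert $cert -FilePath "$env:USERPROFILE\Desktop\azuremgmt.pfx" -Password $pfxPwd
```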

Login to your Windows Azure subscription, and at the bottom of the list on the left, select Settings.

At the bottom of the browser window, click the UPLOAD icon.

In the Upload a management certificate form, click Browse for a file, browse for and select the .cer file that you exported earlier, and then click the check-box at the bottom right.

You will now see a job running message that says “Uploading…” followed shortly by a “Successfully uploaded..” message, and your certificate now shows up in the Management Certificates list.

Before we move over to Configuration Manager, this is a good opportunity to copy and then paste (maybe in Notepad) the value in the SUBSCRIPTION ID column for your certificate. It is a very long value that we’ll need later when we’re configuring Configuration Manager.

And there you go. The certificate for our test is in place. Now we’re ready to create and connect Configuration Manager to a new cloud-based distribution point.

Create the Distribution Point

Open up Configuration Manager.

On the lower-left, click Administration, and then in the section above under Overview, expand Hierarchy Configuration and select Cloud. (Yes, Cloud!)

Right-Click on Cloud and then click on Create Cloud Distribution Point.

On the Specify details for this cloud service page, this is where we’ll use the copied Subscription ID we saved, as well as the .pfx file that we exported earlier. In the Subscription ID: field, paste the subscription ID you saved.

Next to the Management Certificate field, click Browse. Navigate to and select the .pfx file that you saved earlier. After you select it and click Open, you'll be prompted for the password you used to protect it. Enter the password and click OK.

Click Next.

On the Specify additional details for this distribution point form, note the various regions of the world where you could put your distribution point. For your Certificate file, click Browse and again navigate to and select your .pfx file, entering the password. Notice that this also fills in the Service FQDN value that was found in the certificate. Click Next.

On the Configure alerts for this distribution point page, make note of the different alert thresholds that can be set. We’ll leave the defaults and click Next.

On the Summary page, review the Details, and then click Next.

If all goes as it should, you should quickly see a successful completion. Click Close.

And now you’ll see your new Cloud Distribution Point listed in the main part of the page, with a status of Provisioning. Eventually that status will change to Ready.

Go back to your browser and to your Windows Azure administration page. Navigate to the Cloud Services section on the left. It will take several minutes but eventually you will see a new cloud service with a long-and-ugly name show up.

Note toward the right that you have a value in the URL column. That value (which is essentially <your service name>.cloudapp.net) is the DNS name that your clients will use for connecting to the distribution point and getting their software.

Below Cloud Services, find and click on Storage. Here you’ll see that a new storage account has been created with the same ugly name that the new cloud service has.

As I’m sure you’ve guessed, this is the storage account that will hold all software and other items that you’ve deployed to your distribution point.

And now you’re ready to distribute some software to your new distribution point in the clouds. Try it out by distributing the Configuration Manager Client Package up to your distribution point.

In the Software Library, right-click the Configuration Manager Client Package and choose Distribute Content. In the Add Distribution Points list of available distribution points, check the box next to your cloud-based distribution point. Click OK, and then click Next.

On the Summary page, click Next. The distribution should complete successfully, so click Close.

Now let’s see if that package is being distributed.

In Configuration Manager, on the bottom left, click and open the Monitoring section. In the section above, under Overview –> Distribution Status click Content Status.

In the details pane, select your Configuration Manager Client Package, and note below that the completion statistics show that the distribution is In Progress. Eventually that yellow circle will turn to green when the distribution is complete.

Another way to show that you’ve succeeded is to go back to your Windows Azure administration page, click on Storage, click on your storage account, and select the Containers tab. You’ll see new containers being created that you can drill-down into and actually see the files and their URLs.

Summary

System Center 2012 SP1 Configuration Manager adds the ability to configure and use a Windows Azure-based service to host a Distribution Point, now known as a “Cloud-Based Distribution Point”. Once certificates are in place, the actual creation of the distribution point in your Windows Azure subscription is fairly straightforward, and for distributing content it becomes just another option when choosing where to distribute your deployed applications and packages.

---

What do you think? Are the wheels turning as you’re now envisioning all of the flexibility that this new capability will give you? If not, you’d better read this article again.

One of the first questions that IT Pros have when considering using Windows Azure to host some (or all) of their company’s computing infrastructure is, “How do I move an existing virtual machine to the cloud?”

“I was just about to ask that!”

I bet you were.

So, how does one take a virtual machine and duplicate it up “in the cloud”? In this article – a part of our 20 Key Scenarios with Windows Azure Infrastructure Services series - I am going to outline the process for you, and hopefully leave you with some useful resources and other things to consider.

The steps are pretty basic:

Create a place to store your hard disk in Windows Azure

Prepare your virtual hard disk

Upload your virtual hard disk

Create your machine in Windows Azure

This makes a pretty good outline for what we need to cover here, so let’s go with it. But before we do, I need to encourage you to set up your local machine’s PowerShell to authenticate with, connect to, and manage your Windows Azure subscription and objects. To do this, read and follow the steps in my article: Configuring PowerShell for Windows Azure.

NOTE: The name of your storage account must be 3-24 characters long, made up of lowercase letters and numbers only. It must also be globally unique, because it will be used in URLs such as “https://krtempstorageaccount.blob.core.windows.net/”
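If you’d rather script the storage account creation than use the portal, the Windows Azure PowerShell module can do it. A sketch – the account name and location here are placeholders, not recommendations:

```powershell
# Storage account names are global, so check availability first
# (Test-AzureName returns $true if the name is already taken)
Test-AzureName -Storage "krtempstorageaccount"

# Create the storage account in your region of choice
New-AzureStorageAccount -StorageAccountName "krtempstorageaccount" -Location "East US"
```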

Prepare your virtual hard disk

If you are going to upload a machine to the cloud, your hard disk file has to be configured a certain way. It must be a .VHD file (not a .VHDX file), it must be between 20MB and 2TB in size, and it must be a fixed-size virtual disk.
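If what you have today is a .VHDX or a dynamically expanding disk, the Hyper-V module on Windows 8 / Windows Server 2012 can convert it into the required format. A quick sketch, with placeholder paths:

```powershell
# Produce the fixed-size .VHD that Windows Azure requires
# (source and destination paths are placeholders)
Convert-VHD -Path "D:\VHDs\myserver.vhdx" `
            -DestinationPath "D:\VHDs\myserver.vhd" `
            -VHDType Fixed
```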

Also, if you’re going to use this hard disk as an image from which multiple machines will be created, then you’ll want to make sure to either sysprep the machine before shutting down and uploading the hard disk, or create the machine in Windows Azure, do the sysprep there, and then “capture” the stopped machine as an image.
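For reference, the generalize step itself is the standard one – run this inside the virtual machine, and it will shut the machine down when it finishes:

```powershell
# Generalize the installation and shut down, leaving a VHD suitable for imaging
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown
```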

Upload your virtual hard disk

Now that you have your hard disk ready to upload, you have a couple of ways of getting it up to the cloud:
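One of those ways is the Add-AzureVhd cmdlet from the Windows Azure PowerShell module. A minimal sketch, assuming your storage account already exists – the URLs, paths, and disk name here are placeholders:

```powershell
# Upload the fixed-size VHD into blob storage
Add-AzureVhd -LocalFilePath "D:\VHDs\myserver.vhd" `
    -Destination "https://krtempstorageaccount.blob.core.windows.net/vhds/myserver.vhd"

# Register the uploaded blob as an OS disk, so a virtual machine can be created from it
Add-AzureDisk -DiskName "MyServerOSDisk" -OS Windows `
    -MediaLocation "https://krtempstorageaccount.blob.core.windows.net/vhds/myserver.vhd"
```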

Note: the service name must be unique. You can verify that a name is unique by using the Test-AzureName cmdlet prior to actually creating new cloud services or cloud storage.
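A quick sketch of that check – the service name is a placeholder:

```powershell
# Returns $true if the cloud service name is already taken, $false if it's available
Test-AzureName -Service "kevsmigratedvm"
```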

Other Considerations

“Hey Kevin – Can I upload a new-style .VHDX file into Windows Azure?”

Not at this time, no. Currently Windows Azure only supports fixed-size .VHD files as the basis for storage disks or virtual machines running in Windows Azure.

“You mention ‘storage disks’. Does this mean I can just create .VHDs and populate them with stuff, and then use them as attached disks to Windows Azure virtual machines?”

Absolutely. Once you have that disk in a Windows Azure storage account, you’ll see it as a disk you can either use as a machine’s OS disk, or attach to an existing VM as an attached disk. (But it can’t be both at the same time!)

“Hey Kevin… I would like to use Windows Azure as a location where I set up a SQL Server database mirror. Is this something I can do?”

Absolutely, yes! In fact, it doesn’t even require anything fancy as far as setting up special connectors or site-to-site VPN in order to try this out (though in production you should consider a more secured connection).

This tutorial shows you how to implement SQL Server database mirroring for disaster recovery end-to-end in a hybrid IT environment. In this configuration, the principal database server runs on-premise and the mirror database server runs in Windows Azure. You can implement this scenario without a VPN connection between Windows Azure and your on-premise network if you use server certificates. Furthermore, it is possible for your principal database server to run behind a NAT device on-premise if you forward the appropriate ports on your NAT device to the server.

To try this out, I created a new Windows Server 2012 (using the free evaluation) server as a Hyper-V virtual machine named SQLOnPrem, and had a downloaded .ISO of the SQL Server 2012 evaluation installation connected to it (in the virtual DVD drive). I didn’t even have to manually install SQL on that box, since the tutorial actually gives you the scripts to do all the heavy lifting. And as long as you can redirect port 5022 through your NAT/firewall to your local VM, it’s really easy to set up.

The tutorial provides PowerShell code to build the Windows Azure based virtual machine, which is an evaluation SQL Server 2012 running on Windows Server 2008 R2. Note however that, like the SQL Server AlwaysOn High Availability tutorial I discussed and used last week, the PowerShell code in the tutorial is a little out-of-date and won’t work as provided. In particular the PowerShell command for the creation of the Azure-based “SQLinCloud” server should be:
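To give you the shape of it, here’s a rough sketch of that kind of New-AzureVM pipeline. Every name, size, and port below is a placeholder rather than the tutorial’s actual values, and newer versions of the module also require an -AdminUsername on Add-AzureProvisioningConfig:

```powershell
# One pipeline: define the VM, provision Windows, place it on the subnet,
# open the two TCP endpoints, and create the VM (and its cloud service)
New-AzureVMConfig -Name "SQLinCloud" -InstanceSize "Medium" -ImageName $sqlImageName |
    Add-AzureProvisioningConfig -Windows -Password $adminPassword |
    Set-AzureSubnet -SubnetNames "BackEndSubnet" |
    Add-AzureEndpoint -Name "SQL" -Protocol tcp -LocalPort 1433 -PublicPort 1433 |
    Add-AzureEndpoint -Name "Mirroring" -Protocol tcp -LocalPort 5022 -PublicPort 5022 |
    New-AzureVM -ServiceName "SQLinCloudSvc" -AffinityGroup "SQLMirrorAG" -VNetName "MirrorVNet"
```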

Notice how this is essentially one line of PowerShell that creates a VM in a specific Affinity Group, creates the Azure Service, connects the machine to your Azure Network and subnet, and sets up two TCP endpoints! Amazing!

Another fix needed to be made in the script that configures the firewall on the SQLOnPrem machine (your local SQL Server). It should read:
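As a rough sketch of that kind of rule on Windows Server 2012 (the rule name here is a placeholder):

```powershell
# Allow inbound TCP 5022, the database mirroring endpoint, on SQLOnPrem
New-NetFirewallRule -DisplayName "SQL Mirroring Endpoint (TCP 5022)" `
    -Direction Inbound -Protocol TCP -LocalPort 5022 -Action Allow
```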

I’ve put my copy of the corrected PowerShell that I used to create the Windows Azure VM up on SkyDrive for you. GET IT HERE.

Other than that, it works great, and is a very useful tutorial. In the end you’ll have a database installed locally, and a mirror up in your server running under Windows Azure.

“Oh yeah? Prove it!”

(sigh) Okay… here’s my local SQL Server…

…and here’s my SQL Server in the cloud.

For those of you not familiar with SQL Server Database Mirroring, I recommend you look at this page: Database Mirroring (SQL Server). Important note: Database Mirroring is a feature that is being phased out in favor of the newer SQL Server 2012 AlwaysOn Availability Groups for the purpose of High Availability (See my blog post HERE). But.. if you want to mirror, you can. And now you can mirror to the cloud.

---

What do you think? Are you considering using the cloud and Windows Azure as part of your storage and disaster recovery solutions? Give us your comments or questions!

As many of you may already know, this month my Microsoft US DPE Central Region teammates (Matt Hester, Brian Lewis, and Keith Mayer) and I are blogging about “20 Key Scenarios with Windows Azure Infrastructure Services”. Throughout this month I’ll be adding links and making any updates or changes. So keep checking back for more, or keep watching my blog. As before, I plan on adding my own blog post every day, either introducing the day’s topic, or providing the article myself.

UPDATE: It’s done! Below is the complete list of series articles!

HINTS:

You might want to get your FREE TRIAL of Windows Azure, if you haven’t already.

“But… I want my SQL Servers running in Windows Azure. And I don’t have shared storage in “the cloud”. But I still need the benefits of clustered high availability.”

Wow.. you sure want a lot. Thankfully, I have a great solution. SQL Server 2012 has a new high availability feature called SQL Server 2012 AlwaysOn. With AlwaysOn configured, you can have two or more (up to 5) complete copies of a database that are maintained by each of the participating SQL Servers. Each SQL Server could be hosting the database on its local storage, or at some other location (SMB 3.0 file share? You bet!), but from the perspective of SQL Server 2012 AlwaysOn, it’s a copy of the database served up by each of the SQL Servers. The SQL Servers in turn are nodes in a Windows Server Failover Clustering cluster.

You’re in luck! There are two very detailed step-by-step walkthroughs on how to build an example up in Windows Azure. One describes how to build the machines manually, and the other actually uses PowerShell and other command-line scripting to configure everything. In the end you have 4 servers:

ContosoDC (Domain Controller)

ContosoQuorum (Cluster Quorum Server)

ContosoSQL1 (SQL Server), and

ContosoSQL2 (SQL Server)

The examples also walk you through how to create the required Azure networks, the Availability Group for your machines, the configuration of the cluster, and the creation and configuration of a database that is highly available.

For my own learning, I used the scripted option. Unfortunately, some of the commands listed needed to be tweaked a little bit to work with the most recent version of the Windows Azure PowerShell module, and there were several typos in the text. But once I worked out the kinks, I was able to consistently create the machines and configure the cluster and the SQL Servers.

“Prove it!”

Okay…

Here are my servers in Windows Azure…

And here’s looking at my Availability Group “AG1” on one of the SQL Servers…

And looking in the Failover Cluster Manager I can see the clustered “AG1” service, currently being served by “CONTOSOSQL1”.

And I can manually move that clustered service…

…so that it’s failed-over and now being served by “contososql2”.

TRY IT: I’ve created and uploaded a .txt file containing my modified snippets of PowerShell used to configure my Windows Azure subscription and to create the storage, the networking and the four Virtual Machines. YOU CAN FIND IT HERE. I strongly suggest that you change the file extension to .PS1, and open it with the PowerShell Integrated Scripting Environment. Read the notes at the top, and only run the specific segments at the time they’re asked for in the tutorial.

So, you’ve got your Windows Azure subscription all set up (and if you don’t you can set up a FREE TRIAL HERE), and now you want to use PowerShell to work with your Windows Azure-based resources. In case you weren’t aware, Microsoft provides a Windows Azure PowerShell module for scripted management of Windows Azure services.

“Yes! That’s what I want, Kevin!”

Okay then… here’s how you do it:

If you’re running Windows 7 w/SP1, Windows Server 2008 R2 w/SP1, or Windows Server 2008 w/SP2, you’ll need the most recent version of the Windows Management Framework installed. This includes updates to Windows Remote Management (WinRM), Windows Management Instrumentation (WMI), and, importantly, Windows PowerShell 3.0. If you’re running Windows 8 or Windows Server 2012 or newer, then you’re all set with the newest version of PowerShell.

Right-click on Windows PowerShell in your Start Menu or Start Screen and choose Run As Administrator.

Set the PowerShell Execution Policy for scripts by running the following command at the PowerShell command prompt:

PS C:\> Set-ExecutionPolicy RemoteSigned

Import the Windows Azure PowerShell module and supporting cmdlets by running the following command at the PowerShell command prompt:

PS C:\> Import-Module Azure

Now run the following command to connect your system to your Azure subscription:

PS C:\> Add-AzureAccount

When prompted to Sign in, sign in with the same Microsoft account (LiveID) or organization account credentials that you use for your Windows Azure subscription.

Confirm that your Windows Azure subscription has been properly connected via PowerShell by running the following cmdlet:

PS C:\> Get-AzureSubscription

This should list all of the subscriptions that you currently are able to administer with your current login.

You can set the default Windows Azure subscription for your session by running the cmdlets below. Be sure to substitute your subscription name that was listed in the Get-AzureSubscription cmdlet output.
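A sketch of those cmdlets – substitute your own subscription name for the placeholder:

```powershell
# Make this subscription the default for the current session
Select-AzureSubscription -SubscriptionName "My Azure Subscription"

# Confirm which subscription is now current
Get-AzureSubscription -Current
```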

And there you have it! You’ve automatically installed the proper certificate locally so that your PowerShell session will be authenticated and have a secured interaction with your Windows Azure subscription.

Perhaps you’re not aware of this, but Windows Azure supports web sites. In fact, it makes it very easy to create and run any web site – from the most simple to the most high-end, complex web application scaled for global reach.

If you are a developer, or if you are a member of an IT organization which supports a software development organization, then you know that sometimes you need more hardware than you can realistically afford. And if you do buy the hardware, it becomes obsolete all-too-soon, or just sits there idle because the need for it was short-lived.

“One of the roadblocks to building a Windows 8 new interface application is that you need Windows 8 or Server 2012 to develop on. It just so happens that Microsoft has this great virtual server environment, called Windows Azure, where we can remote into a 2012 Server and build Windows 8 applications.”

So now that we’ve been talking about building virtual machines in Windows Azure, and extending your datacenter into “the cloud”, we need to actually make the connection. But how does one actually build the Windows Azure Network, and then connect that subnet to our local subnets?

If you can extend your datacenter into the cloud using a secured network path and Windows Azure, it sure should follow that you can take advantage of cloud-based virtual machines to perform some of your more common infrastructure tasks. For example: File Servers.

“Yeah, I get it, Kevin. You have a secured network connection, and you can just treat it like any other server.”

Ah.. but what if you want to use something like WebDAV for storage on a server instead? (WebDAV = Web-based Distributed Authoring and Versioning.) Perhaps you want to map a drive to a WebDAV connection to a cloud-based server? That doesn’t require anything other than HTTP to access Windows Server-based storage.
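As a tiny illustration of where that ends up, mapping a drive to a WebDAV share from a client is a one-liner at a command prompt – the cloudapp.net URL here is hypothetical:

```powershell
# Map drive W: to a WebDAV folder on a cloud-based server
# (the server URL is a placeholder)
net use W: https://myazureserver.cloudapp.net/files /persistent:yes
```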

In this episode of TechNet Radio: Cloud Innovators, I welcome Ramesh Panuganty, Managing Director at Cognizant Technology Solutions, to the show. We discuss their product Cloud360, an enterprise-class cloud management platform that helps you quickly and cost-effectively deploy, manage and operate applications on private, public and hybrid clouds. Tune in as we discuss its features and benefits as well as how you can get started with this powerful cloud management service. Click here to learn more about Cloud360!

If you want to catch-up on the complete series, or save the list as a browser favorite for future reference, you will find the full series here: http://aka.ms/2013may.

And if you’re interested in catching up on ALL of the series’ that our team did over the past year, you’ll find the list here: http://aka.ms/FY13Series

“Are you guys doing any more series anytime soon?”

As a matter of fact, yes! During the month of June, my Central Region counterparts and I will be blogging about our impressions of TechEd 2013. We’ll kick it off this Sunday, June 2nd – the day I leave for TechEd. (I can’t wait!)

You read that title correctly.. We know that many of you are still using VMware. And just about as many of you are already migrating or considering the migration of some (or all) of your virtualization to Microsoft’s Hyper-V. And if you’re already going that route, perhaps you should also investigate moving some (or all) of those virtual machines up to Windows Azure. (FREE TRIAL HERE).

The demand has been so great for more information about using Windows Azure Infrastructure Services as an extension of your datacenter and IT Operations, that we’ve decided to schedule a few more FREE events to close out the end of our fiscal year here in Central Region Microsoft US DPE. The four of us (Brian Lewis, Keith Mayer, Matt Hester and I) are holding events at four locations at the end of this month (and one in mid-June):

Southfield, MI (Detroit), May 21, 2013, and

Irving, TX (Dallas), May 30, 2013

Edina, MN (Minneapolis), May 31, 2013, and

Downers Grove, IL (Chicago), June 13, 2013

Each of these days will be made up of two half-day events on two different topics, giving you three different registration options.

“Huh?”

You can register for the morning session. You can register for the afternoon session. OR you can register for the full-day.

“Cool. What are the topics?”

I’m glad you asked…

Morning Topic: Using Windows Azure as a server and datacenter backup solution – Windows Azure Backup. We’ll talk briefly about, and then walk through a hands-on example (you will follow along and do this on your own computers) of enabling, configuring, and leveraging Windows Azure Backup.

Afternoon Topic: Building a Microsoft SharePoint 2013 lab entirely in Windows Azure. Again, at the end of this you will have a Microsoft SharePoint 2013 lab configured in your own cloud based lab in Windows Azure.

As I said, you can register for either one, or register for the full day. PLEASE just register one time, so that we can get an accurate estimate of attendance.

Space is limited, so register early. And make sure you heed the requirements prior to coming. You’ll need at least some Internet and Remote-Desktop capable hardware, and a Windows Azure subscription. (Get a free 90-Day Trial here)

What is the plural of Series? (And how is it pronounced?) Well.. that’s not so important. What’s important is that we – the Technology Evangelists of Microsoft DPE in the U.S. – have been busy, busy bloggers these past several months. You’ve seen many articles, and many links to articles and resources, and we sincerely hope that you’ve found them useful.

In the interest of summarizing our efforts through June (which not coincidentally, is through the end of Microsoft’s “fiscal year 2013”), I’ve created this post as a place to send people who are interested in drilling down to all of our various monthly topics.

In part 6 of our Windows 8 Tips and Tricks series, Principal Technical Account Manager Lex Thomas and I show off some of the new kinds of customization options found in Windows 8.

There’s never been a better time to build for Windows! Join the App Builder Program and learn about the Windows ecosystem opportunity, design and monetization tips and partner development frameworks. Resources:

For today’s article in our “20 Key Scenarios with Windows Azure Infrastructure Services”, my friend Matt Hester gives us a detailed lab assignment which, in the end, will result in you having a load-balanced web application supported by a SQL Server database; all without having to use any of your local computer’s resources. Yep, we’re doing it all in “the cloud”.

Make Windows Azure Your Datacenter! This Jump Start will help you understand how you can use Windows Azure Infrastructure Services, such as Virtual Machines and Virtual Networks, to migrate, extend, run, manage and monitor common workloads in the cloud. The Jump Start will be led by Microsoft Lead Azure Technical Evangelist David Tesar and Azure Group Technical Product Manager David Aiken. Get ready for a live online interactive experience highlighting the latest and greatest via numerous scenarios and demos, all while answering questions from the audience.