Now anyone using Windows Azure Storage can rest easy, knowing that their data in Windows Azure Storage (blobs, tables, and queues) has another safe copy replicated within the same region. I'm awake at 10:20 PM for other reasons, but I'm breathing a sigh of relief knowing that a copy of my data is replicated.

The Service Bus managed API leverages the bi-directional TCP protocol for improved performance over REST/HTTP.

Management

| | Windows Azure Queues | Service Bus Queues |
|---|---|---|
| Protocol | REST over HTTP | REST over HTTP |

Messaging Fundamentals

| Feature | Windows Azure Queues | Service Bus Queues |
|---|---|---|
| Ordering guarantees | No | First-In-First-Out (FIFO) |
| Message processing guarantees | At-Least-Once (ALO) | At-Least-Once (ALO), Exactly-Once (EO) |
| Peek lock | Yes (visibility timeout: default = 30s; max = 2h) | Yes (lock timeout: default = 30s; max = 5m) |
| Duplicate detection | No | Yes (send-side duplicate detection) |
| Transactions | No | Partial |

Notes:

- Guaranteed FIFO requires the use of sessions.
- The Service Bus generally supports the ALO guarantee; however, EO can be supported by using SessionState to store application state and using transactions to atomically receive messages and update the SessionState. The AppFabric workflow uses this technique to provide EO processing guarantees.
- Windows Azure queues allow a visibility timeout to be set on each receive operation, while Service Bus lock timeouts are set per entity.
- The Service Bus will remove duplicate messages sent to a queue/topic (based on MessageId).
- The Service Bus supports local transactions involving a single entity (and its children). Transactions can also include updates to SessionState.
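The send-side duplicate detection described above can be sketched conceptually (a rough illustration in Python, not the Service Bus implementation; the count-based window here is a simplification of the service's time-based detection window, and the `DuplicateDetector` class is hypothetical):

```python
from collections import OrderedDict

class DuplicateDetector:
    """Conceptual sketch of send-side duplicate detection keyed on MessageId.

    Service Bus tracks MessageIds seen within a configurable time window;
    this sketch uses a bounded count-based window instead, for simplicity.
    """
    def __init__(self, window_size=1000):
        self.window_size = window_size
        self._seen = OrderedDict()  # MessageId -> None, in arrival order

    def accept(self, message_id):
        """Return True if the message is new; False if it is a duplicate."""
        if message_id in self._seen:
            return False  # duplicate: the broker drops it
        self._seen[message_id] = None
        if len(self._seen) > self.window_size:
            self._seen.popitem(last=False)  # evict the oldest entry
        return True

detector = DuplicateDetector(window_size=100)
print(detector.accept("order-42"))  # True: first time seen
print(detector.accept("order-42"))  # False: duplicate removed
```

The point of the sketch is that deduplication is a bounded-memory, send-side concern: the receiver never sees the second copy.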

The profound effects of the Consumerization of IT (CoIT) are blurring the lines between consumers and the enterprise. The fact that virtually every type of mobile device is now a candidate to make employees productive means that cross-platform, enabling technologies are a must. Luckily, Microsoft has brought the power to synchronize data with either SQL Server on-premises or SQL Azure in the cloud to the world of mobility. If you’ve ever synched the music on your iPhone with iTunes, the calendar on your Android device with Gmail, or the Outlook email on your Windows Phone with Exchange, then you understand the importance of sync. In my experience architecting and building enterprise mobile apps for the world’s largest organizations over the last decade, data sync has always been a critical ingredient.

The new Sync Framework Toolkit found on MSDN builds on the existing Sync Framework 2.1's ability to create disconnected applications, making it easier to expose data for synchronization to apps running on any client platform. Where Sync Framework 2.1 required clients to be based on Windows, this free toolkit allows other Microsoft platforms to be used for offline clients such as Silverlight, Windows Phone 7, Windows Mobile, Windows Embedded Handheld, and new Windows Slates. Additionally, non-Microsoft platforms such as iPhones, iPads, Android phones and tablets, Blackberries and browsers supporting HTML5 are all first-class sync citizens. The secret is that we no longer require the installation of the Sync Framework runtime on client devices. When coupled with use of an open protocol like OData for data transport, no platform or programming language is prevented from synchronizing data with our on-premises and cloud databases. When the data arrives on your device, you can serialize it as JSON, or insert it into SQL Server Compact or SQLite depending on your platform preferences.

The Sync Framework Toolkit provides all the features enabled by the Sync Framework 4.0 October 2010 CTP. We are releasing the toolkit as source code samples on MSDN with the source code utilizing Sync Framework 2.1. Source code provides the flexibility to customize or extend the capabilities we have provided to suit your specific requirements. The client-side source code in the package is released under the Apache 2.0 license and the server-side source code under the MS-LPL license. The Sync Framework 2.1 is fully supported by Microsoft and the mobile-enabling source code is yours to use, build upon, and support for the apps you create.

Now some of you might be wondering why you would use a sync technology to move data rather than SOAP or REST web services. The reason has to do with performance and bandwidth efficiency. With a traditional SOA approach, you would retrieve all the needed data to the device just to see what has changed in SQL Server; the same goes for uploading data. Using the Sync Framework Toolkit, only the changes, or deltas, are transmitted over the air. This boosts performance and reduces bandwidth usage, which saves time and money in a world of congested mobile data networks with capped mobile data plans. You also get a feature called batching, which breaks up the data sent over wireless networks into manageable pieces. This not only prevents you from blowing out your limited bandwidth, but it also keeps you from using too much RAM on both the server and your memory-constrained mobile device. When combined with conflict resolution and advanced filtering, I’m sold!
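The delta-and-batching idea can be sketched in a few lines (an illustrative sketch only, not the Sync Framework protocol; Python is used for brevity, and the anchor/version scheme and both function names are simplifications I've made up for the example):

```python
def changes_since(server_rows, anchor):
    """Return only rows modified after the client's last-received anchor
    (a per-row change version), plus the new anchor to store client-side."""
    deltas = [row for row in server_rows if row["version"] > anchor]
    new_anchor = max((row["version"] for row in server_rows), default=anchor)
    return deltas, new_anchor

def batched(rows, batch_size):
    """Split a change set into manageable pieces for constrained networks."""
    for i in range(0, len(rows), batch_size):
        yield rows[i:i + batch_size]

server = [
    {"id": 1, "version": 5, "name": "widget"},
    {"id": 2, "version": 9, "name": "gadget"},
    {"id": 3, "version": 12, "name": "gizmo"},
]

# A full fetch (the plain SOAP/REST approach) would transmit all 3 rows;
# delta sync after anchor 8 transmits only the 2 changed rows.
deltas, anchor = changes_since(server, anchor=8)
print(len(deltas), anchor)  # 2 12
print([b for b in batched(deltas, batch_size=1)])
```

The client stores the returned anchor and presents it on the next sync, so each round trip carries only what changed since the last one.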

I think you’ll find the Sync Framework Toolkit to be an immensely valuable component of your MEAP solutions for the enterprise as well as the ones you build for consumers.

So far, I’ve written several Windows Phone 7 apps, including the OData and Windows Phone quickstart, a Northwind-based app for my MVVM walkthrough, and a couple of others—mostly consuming (as you might guess) OData. Since I created these apps to supplement documentation, they have never been much to look at—I never intended to publish them to the Marketplace. However, now that I have my Samsung Focus unlocked for development, I figured it was time to create a “real-world” app that I can a) get certified for the Marketplace and b) be proud of having on my mom’s phone.

Fortunately, I’ve been working on an Azure-based OData content service project that can integrate very nicely with a Windows Phone app, so for the past few weeks I have been coding and learning what makes these apps look so cool. I’ll try to share some of what I’ve learned in this post.

Coolest Windows Phone Apps (IMHO)

Just to give you some idea of where I’m coming from, here’s a partial list of some of the cooler-looking apps that I’ve seen on the Windows Phone platform (feel free to add your favs in comments to this post):

Pictures – this built-in app features a Panorama control that displays layers of pictures, which makes even the most mundane snaps seem cooler.

IMDB – OK, I admit that I must use IMDB when I watch movies; I’m weird that way. Plus, this app is slick, with its use of multiple levels of Panorama controls and tons of excellent graphics (image is everything in Hollywood, right?). If anything, there may be too much bling.

Ferry Master – A nifty little app written by someone on the phone team for folks who take Washington State ferries, including background images of Puget Sound scenes.

Mix11 Explorer – this app, written for the Mix 2011 conference, uses a simple layering of basic shapes that mirrors the design of the Mix 2011 web site.

Kindle – Amazon’s classy eReader on the phone, with obviously high production value and graphics.

I would love to show screenshots of these, but most screens aren’t even available outside of the phone (just search the Marketplace for these app names and check out their screenshots).

Find Classy Graphics

I know that the general design principles from the phone folks are that “modern design in a touch application is undecorated, free of chrome elements, and minimally designed.” However, graphics is the kind of content that really pops on the phone. Make sure that the graphics that you use are clean and have impact, and that you have the rights to use them or you could get blocked in certification (and you don’t want to get hassled by the owner after you publish the app). Many professional apps leverage some of the brand images and design themes of their corporate web sites.

Use Expression Blend

Up to this point, I’ve used Visual Studio Express exclusively to write my Windows Phone apps. As I mentioned, my apps have mostly involved programming the OData Client Library for Windows Phone. While much better for writing code (IMHO) and essential for debugging, the design facilities in Visual Studio are, shall we say, limited. Unless you are an expert in the powerful-but-labyrinthine XAML markup language, you are going to need some extra help. Fortunately, Microsoft’s Expression Blend is designed specifically for XAML (WPF, Silverlight, Windows Phone), and it even supports animations. Here’s a rundown of Expression Blend versus Visual Studio Express:

Expression Blend

While I’m still not completely comfortable with the UI, Expression Blend has proven very adept at these design aspects:

Applying styles – Expression is much more intuitive and visual than VS, even making it easy to create gradients

Working with graphics – very easy to add background images to elements

Animations – I haven’t even tried any of these yet, but good luck creating a storyboard in VS

Entering text – XAML is weird with multi-line text in text-blocks, and Expression will generate this code automagically

Preview in both orientations and in both light and dark themes – this is akin to previewing web pages in multiple browsers—make sure you take advantage of this functionality

Visual Studio Express

Visual Studio is hands-down best with these programming aspects:

Solution/project management – I prefer to set up the project and add resources in VS

Build/debug – both will launch the emulator, but in VS you can actually debug in the emulator—or the device, which is much better

Add Service Reference – I don’t think that Expression has anything like this tool, which is very important for OData (especially in Mango when it actually works)

Because each IDE has its own strengths, I’ve found myself keeping my project open in both Expression and VS, and flipping back and forth during development.

Use Layering

In XAML, you can control the layer (z-order) of elements in the display as well as the opacity of elements. This control enables you to create a nice, modern layered look (think of today’s Windows 7 versus XP), with background elements partly visible through the elements in front, giving the screen a more “composed” feel and a sense of depth.
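As a rough sketch of the idea (illustrative only; `Canvas.ZIndex` is the attached property that controls z-order in Silverlight and Windows Phone, while the element layout and image name are placeholders):

```xml
<Grid>
    <!-- Background layer: sits behind, partly visible through the panel in front -->
    <Image Source="Background.png" Canvas.ZIndex="0" />
    <!-- Foreground panel: higher z-order, semi-transparent so the image shows through -->
    <Border Canvas.ZIndex="1" Background="Black" Opacity="0.6">
        <TextBlock Text="Layered content" Margin="12" />
    </Border>
</Grid>
```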

User Interaction

Smartphones are interactive devices, and you can make your app respond to orientation changes, motion, location, and even shaking. Plus, navigating a page by swiping with your finger is the best part. In fact, swiping is so cool that Windows 8 features it heavily in the newly announced Windows Metro (which looks a lot like Windows Phone 7). Make sure that you include some of this in your apps; at the very least, handle the orientation change from portrait to landscape, which is easier than you think if you structure your XAML correctly.
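For the orientation piece, a single attribute on the page opts in to both orientations (a minimal sketch; the `OnOrientationChanged` handler name is a placeholder, and the usual page attributes are omitted):

```xml
<phone:PhoneApplicationPage
    SupportedOrientations="PortraitOrLandscape"
    OrientationChanged="OnOrientationChanged">
    <!-- If the layout uses Grid rows/columns rather than fixed pixel positions,
         most of the reflow happens automatically when the device rotates. -->
</phone:PhoneApplicationPage>
```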

Panorama Control is Cool

As a key component of nearly all of my favorite apps, the Panorama control features all of the aspects we just discussed: leveraging graphics, using layers, and user interaction. It’s basically a long horizontal control that, unlike the Pivot control, displays a single, contiguous background image across multiple screen-sized items. As the user flips between items, there is a nice layered motion effect (like in those 1930s Popeye cartoons) where the individual items, and their graphics, move faster than the background image, giving an illusion of depth to the app.
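A minimal Panorama skeleton looks like this (an illustrative sketch; the image file name and item content are placeholders, and the `controls:` prefix maps to the Microsoft.Phone.Controls namespace):

```xml
<controls:Panorama Title="my application">
    <!-- One continuous background image spans all of the items -->
    <controls:Panorama.Background>
        <ImageBrush ImageSource="PanoramaBackground.png" />
    </controls:Panorama.Background>
    <controls:PanoramaItem Header="first item">
        <TextBlock Text="content" />
    </controls:PanoramaItem>
    <controls:PanoramaItem Header="second item">
        <TextBlock Text="more content" />
    </controls:PanoramaItem>
</controls:Panorama>
```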

Follow the Guidelines

To support the general “coolness” of the platform, the Windows Phone team has published an extensive set of design guidelines. Of course, they recommend that you stick as closely as possible to the phone-driven themes (which the IMDB app doesn’t do too well), but most of the guidance is meant to promote easy-to-use apps and a more uniform platform. (I’m not sure how this compares to, say, iPhone apps, since I’ve never had an iPhone.)

At any rate, since I am planning to go through the entire Marketplace process, I will probably post another blog with the results of my adventure.

This session explains how to secure Service Bus using the Access Control Service. This is also an extension session for my session at BUILD, but watching the BUILD session is not a strict prerequisite.

The steps below are to download and run a simple application that showcases the publish-subscribe capabilities of Service Bus. You will need to have Visual Studio and NuGet installed. We will start by creating a new console application in VS:

Open the Project –> Properties and change the Target framework to .NET Framework 4

From the References node in Solution Explorer click on the context menu item for Add Library Package Reference. This will show only if you have NuGet Extension installed. (To learn more about NuGet see this TechEd video).

Search for “AppFabric” and select the Windows Azure AppFabric Service Bus Samples – PublishSubscribe item. Then complete the Install and close this dialog.

Note that the required client assemblies are now referenced and some new code files are added.

Add the following line to the Main method in Program.cs and hit F5:

Microsoft.Samples.PublishSubscribe.Program.Start(null);

At this point you will be prompted to provide a ServiceNamespace and Issuer Name and Key. You can create your own namespace from https://windows.azure.com/

Once a new namespace has been created you can retrieve the key from the Properties section by clicking View under Default Key:

These values can now be used in the Console application:

At this point you can run through the different scenarios showcased by the sample. Following are some additional resources:

Today at the Build conference in Anaheim, California, Satya Nadella, President of Microsoft's Server and Tools Business, announced general availability of the production release of AppFabric Queues and Topics, otherwise known as Brokered Messaging.

I covered Brokered Messaging following the May CTP release of Queues and followed up shortly with an overview and exploration of Topics (please see some other great resources at the end of this post).

Since then, there was a June CTP release, which included the new AppFabric Application but no visible changes to Brokered Messaging. Since that release, however, the AppFabric Messaging team has been hard at work refining the API and behaviors based on feedback from Advisors, MVPs and the community at large.

Since I’ve already covered Queues and Topics in the aforementioned posts, I’ll dive right into some terse examples which demonstrate the API changes. Though not an exhaustive review of all of the changes, I’ve covered the types that you’re most likely to come across, and I will cover Queues, Topics and Subscriptions extensively in my upcoming article in CODE Magazine, which will also include more in-depth walk-throughs of the .NET Client API, REST API and WCF scenarios.

Those of you who have worked with the CTPs will find some subtle and not-so-subtle changes, but all in all the refinements are for the best, and I think you’ll appreciate them as I have. For those new to Azure AppFabric Service Bus Brokered Messaging, you’ll benefit most from reading my first two posts based on the May CTP (or any of the resources at the end of this post) to get an idea of the why behind queues and topics and then come back here to explore the what and how.

A Quick Note on Versioning

In the CTPs that preceded the release of the new Azure AppFabric Service Bus features, a temporary assembly called “Microsoft.ServiceBus.Messaging.dll” was added to serve as a container for new features and deltas that were introduced during the development cycle. The final release includes a single assembly called “Microsoft.ServiceBus.dll” which contains all of the existing relay capabilities that you’re already familiar with as well as the addition of support for queues and topics. If you are upgrading from the CTPs, you’ll want to get ahold of the new Microsoft.ServiceBus.dll version 1.5 which includes everything plus the new queue and topic features.

The new 1.5 version of the Microsoft.ServiceBus.dll assembly targets the .NET 4.0 framework. Customers using .NET 3.5 can continue using the existing Microsoft.ServiceBus.dll assembly (version 1.0.1123.2) for leveraging the relay capabilities, but must upgrade to .NET 4.0 to take advantage of the latest features presented here.

.NET Client API

Queues

Below is a representative sample for creating, configuring, sending and receiving a message on a queue:

When running this sample, you’ll see that I have received Order 42 on my Inventory, Credit and North America Fulfillment Service subscriptions:

WCF

One of the great things about the WCF programming model is that it abstracts much of the underlying communication details; as such, other than dropping in a new assembly and refactoring the binding and configuration, it is not greatly affected by the API changes from the May/June CTP to GA.

As I mentioned, one thing that has changed is that the ServiceBusMessagingBinding has been renamed to NetMessagingBinding. I’ll be covering an end-to-end example of using the NetMessagingBinding in my upcoming article in CODE Magazine.

REST API

The REST API is key to delivering these new capabilities across a variety of client platforms and remains largely unchanged; however, one key change is how message properties are handled. Instead of individual headers for each property, there is now one header with all the properties JSON-encoded. Please refer to the updated REST API Reference doc for details. I’ll also be covering an end-to-end example of using the REST API to write to and read from a queue in my upcoming article in CODE Magazine.
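To make the property-handling change concrete, here is a sketch (Python standard library only; the namespace, queue name, token, and helper name are placeholders I've chosen) of how a send request to a queue packs the brokered message properties into the single JSON-encoded BrokerProperties header:

```python
import json
import urllib.request

def build_send_request(namespace, queue, token, body, message_id=None, label=None):
    """Build (but do not send) a Service Bus REST 'send message' request.
    All brokered properties travel in one JSON-encoded BrokerProperties
    header, rather than one HTTP header per property as in the CTPs."""
    url = f"https://{namespace}.servicebus.windows.net/{queue}/messages"
    broker_properties = {}
    if message_id:
        broker_properties["MessageId"] = message_id
    if label:
        broker_properties["Label"] = label
    headers = {
        "Authorization": token,  # ACS token, e.g. 'WRAP access_token="..."'
        "BrokerProperties": json.dumps(broker_properties),
        "Content-Type": "text/plain",
    }
    return urllib.request.Request(url, data=body.encode("utf-8"),
                                  headers=headers, method="POST")

req = build_send_request("mynamespace", "orders", 'WRAP access_token="..."',
                         "Order 42", message_id="order-42", label="NewOrder")
print(req.full_url)
print(req.get_header("Brokerproperties"))
```

Consult the REST API Reference for the authoritative list of properties the header may carry.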

More Coming Soon

As I mentioned, in my upcoming article in CODE Magazine, I’ll cover the Why, What, and How behind Azure AppFabric Service Bus Brokered Messaging, including end-to-end walk-throughs with the .NET Client API, REST API and WCF Binding. The November/December issue should be on newsstands (including Barnes and Noble) or in your mailbox towards the end of October. You can also find the article online at http://code-magazine.com

Resources

You can learn more about this exciting release as well as download the GA SDK version 1.5 by visiting the following resources:

In this post, I’ll look at why getting diagnostic data for applications in the cloud is important for all applications (not just PHP), provide a short overview of what Windows Azure Diagnostics is, and show how to get diagnostics data for PHP applications (although much of what I look at is applicable regardless of language). In this post I’ll focus on how to use a configuration file (the diagnostics.wadcfg file) to get diagnostics, while in Part 2 I’ll look at how to do this programmatically.

Why get diagnostic data?

There are lots of reasons to collect and analyze data about applications running in the cloud (or any application, for that matter). One obvious reason is to help with troubleshooting. Without data about your application, it’s very difficult to figure out why something is broken when it breaks. However, gathering diagnostic data for applications running in the cloud takes on added importance. Without diagnostic data, it becomes difficult (perhaps impossible) to take advantage of scalability, a basic value proposition of the cloud. In order to know when to add or subtract instances from a deployment, it is essential to know how your application is performing. This is what Windows Azure Diagnostics allows you to do. (Look for more posts soon about how to scale an application based on diagnostics.)

What is Windows Azure Diagnostics?

In understanding how to get diagnostic data for my PHP applications running in Azure, it first helped me to understand what this thing called “Windows Azure Diagnostics” is. So, here’s the 30,000-foot view that, hopefully, will provide context for the how-to section that follows…

Windows Azure Diagnostics is essentially an Azure module that monitors the performance (in the broad sense) of role instances. When a role imports the Diagnostics module (which is specified in the ServiceDefinition.csdef file for a role), the module does two things:

The Diagnostics module creates a configuration file for each role instance and stores these files in your Blob storage (in a container called wad-control-container). Each file contains information (for example) about what diagnostic information should be gathered, how often it should be gathered, and how often it should be transferred to your Table storage. You can specify these settings (and others) in a diagnostics.wadcfg file that is part of your deployment (the Diagnostics module will look for this file when it starts). Regardless of whether you include the diagnostics.wadcfg file with your deployment, you can create or make changes programmatically after deployment (the Diagnostics module will create and/or update the configuration files that are in Blob storage).

According to settings in the configuration files in Blob storage, the Diagnostics module begins writing data to local storage and transferring it to Table storage (to enable the transfer you have to supply storage connection string information in your ServiceConfiguration.cscfg file).

So that’s the high-level view (import a module, configure via a file or programmatically, diagnostics info is written to your Azure storage account). Now for the details…

How to get diagnostic data using a configuration file

There are two basic ways you can get diagnostics data for a Windows Azure application: using a configuration file and programmatically. In this post, I’ll examine how to use a configuration file (I’ll look at how to get diagnostics programmatically in Part 2). I’ll assume you have followed this tutorial, Build and Deploy a Windows Azure PHP Application, and have a ready-to-package PHP application.

Note: Although I will walk through this configuration in the context of a PHP application, the configuration steps are the same for any Azure deployment, regardless of language.

Step 1: Import the Diagnostics Module

To specify that a role should import the Diagnostics module, you need to edit your ServiceDefinition.csdef file. After you run the default scaffolder in the Windows Azure SDK for PHP (details in the tutorial link above), you will have a skeleton Azure PHP application with a directory structure like this:

Open the ServiceDefinition.csdef file in your favorite XML editor and make sure the <Imports> element has this child element:

<Import moduleName="Diagnostics"/>

That’s all that’s necessary to import the Diagnostics module.

Step 2: Enable transfer to Azure storage

To allow the Diagnostics module to transfer data from local storage to your Azure storage account, you need to provide your storage account name and key in the ServiceConfiguration.cscfg file. Again in your favorite XML editor, open this file and make sure the <ConfigurationSettings> element has the following child element:
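The element in question is the Diagnostics plugin's connection-string setting (the account name and key values shown are placeholders for your own):

```xml
<ConfigurationSettings>
  <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString"
           value="DefaultEndpointsProtocol=https;AccountName=YOUR_ACCOUNT_NAME;AccountKey=YOUR_ACCOUNT_KEY" />
</ConfigurationSettings>
```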

Step 3: Edit the configuration file (diagnostics.wadcfg)

Now you can specify what diagnostic information you’d like to collect. Notice that included in the directory structure shown above is the diagnostics.wadcfg file. Open this file in an XML editor and you’ll begin to see exactly what you can configure. I’ll point out some of the important pieces and provide one example.

In the root element (<DiagnosticMonitorConfiguration>), you can configure two settings with the configurationChangePollInterval and overallQuotaInMB attributes:

The frequency with which the Diagnostic Monitor looks at the configuration files in your blob storage for updates (see the What is Windows Azure Diagnostics section above for more information) is controlled by the configurationChangePollInterval.

The amount of local storage set aside for storing diagnostic information is controlled by the overallQuotaInMB attribute. When this quota is reached, old entries are deleted to make room for new ones.

In the example here, these settings are 1 minute and 4GB respectively:
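Reconstructed for illustration (the durations use the ISO 8601 format the wadcfg schema expects, so PT1M means one minute; child elements for the individual data sources are omitted here):

```xml
<DiagnosticMonitorConfiguration
    xmlns="http://schemas.microsoft.com/ServiceHosting/2010/10/DiagnosticsConfiguration"
    configurationChangePollInterval="PT1M"
    overallQuotaInMB="4096">
  <!-- child elements for individual data sources go here -->
</DiagnosticMonitorConfiguration>
```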

Other child elements configure the individual data sources. The <WindowsEventLog> element, for example, collects events that are typically used for troubleshooting application and driver software.

The bufferQuotaInMB attribute on these elements controls how much local storage is set aside for each diagnostic (the sum of which cannot exceed the value of the overallQuotaInMB attribute). If the value is set to zero, no cap is set for the particular diagnostic, but the total collected at any one time will not exceed the overall quota.

The scheduledTransferPeriod attribute controls the interval at which information is transferred to your Azure storage account.

As an example, this simple configuration file collects information about specified performance counters (the sampleRate attribute specifies how often the counter should be collected):
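A sketch of such a configuration (the counter paths are standard Windows performance counter specifiers; the particular counters and intervals chosen here are just examples):

```xml
<PerformanceCounters bufferQuotaInMB="0" scheduledTransferPeriod="PT1M">
  <PerformanceCounterConfiguration
      counterSpecifier="\Processor(_Total)\% Processor Time"
      sampleRate="PT5S" />
  <PerformanceCounterConfiguration
      counterSpecifier="\Memory\Available MBytes"
      sampleRate="PT5S" />
</PerformanceCounters>
```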

Step 4: Package and deploy your project

Step 5: Collect diagnostic information

After you deploy your application to Windows Azure, the Diagnostics module will begin writing data to your storage account (it may take several minutes before you see the first entry). In the case of performance counters (as shown in the example configuration file above), diagnostic information is written to a table called WADPerformanceCountersTable in Table storage. To get this information, you can query the table using the Windows Azure SDK for PHP like this:
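As an illustrative sketch (in Python rather than PHP; the helper names are made up, and the exact zero-padding convention is an assumption), the query can filter WADPerformanceCountersTable on its time-derived PartitionKey, which is built from the event time in .NET ticks (100-nanosecond intervals since 0001-01-01):

```python
from datetime import datetime

def dotnet_ticks(dt):
    """Convert a UTC datetime to .NET ticks (100 ns units since 0001-01-01)."""
    delta = dt - datetime(1, 1, 1)
    return (delta.days * 86400 + delta.seconds) * 10**7 + delta.microseconds * 10

def wad_partition_key(dt):
    """PartitionKey format used by the WAD* tables: '0' + zero-padded ticks,
    so that lexicographic order on the key matches chronological order."""
    return "0" + str(dotnet_ticks(dt)).zfill(19)

def perf_counter_filter(since):
    """Build an OData-style filter for entities written after 'since'."""
    return f"PartitionKey ge '{wad_partition_key(since)}'"

print(perf_counter_filter(datetime(2011, 10, 1)))
```

Whatever client library you use, the same filter string selects only the diagnostic entities newer than a given timestamp instead of scanning the whole table.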

Of course, getting this data can be somewhat trickier if you have several roles writing data to your storage account. And, this begs the question, what do I do with this data now that I’ve got it? So, it looks like I have plenty to cover in future posts.

I have written another helper function that creates a new PerformanceCounterConfiguration object. You will need an array of these to pass to the Set-PerformanceCounter cmdlet. The function takes the performance counter specifier (more on this shortly) and the sample rate in seconds for each counter.

From there it is up to you to pass in the performance counters you are interested in. I’ve written another function that wraps up the functionality of creating a PowerShell array, populating it with the counters I am interested in, and setting them for the diagnostic-aware role I pass in from GetDiagRoles. Note: I’m also comparing the current role name to the name of one of my web roles so that I only add web-related counters on instances that I care about. I retrieved the counter names by typing typeperf.exe /qx while logged into my Windows Azure roles via RDP. This way I can get counters specific to the machines in the service, including processes such as RulesEngineService.exe (my sample NT service). Finally, it makes a call to the Set-PerformanceCounter cmdlet with the array of performance counters and configures diagnostics to transfer these to Windows Azure Storage every 15 minutes.

Now linking all of this together to configure performance logging is pretty simple. In the snippet below I call GetDiagRoles, which returns a collection of all of the diagnostic-aware roles in my service, and pipe each one to the SetPerfmonCounters function created above.

Set Counters for each Role

GetDiagRoles | foreach {
$_ | SetPerfmonCounters
}

Now I am successfully logging perfmon data and transferring it to storage. What’s next? Analysis and cleanup, of course! To analyze the data we have added another cmdlet called Get-PerfmonLogs that downloads the data and optionally converts it to .blg for analysis. The snippet below creates a perfmon log for each role in your service using the BLG format. Note that the Get-PerfmonLog cmdlet also supports -From and -To (or -FromUTC and -ToUTC) parameters so you can selectively download performance counter data.

Once the data is downloaded, you can use Clear-PerfmonLogs to clean up the previously downloaded perfmon counters from Windows Azure Storage. Note that all of the Clear-* diagnostic data cmdlets support -From and -To parameters to let you manage what data to delete.

TORONTO, September 19 -- Today at SIBOS 2011, Microsoft Corp. announced that a growing number of financial services customers are benefiting from the significant gains in agility, operational efficiency and cost savings achieved by moving to the high-performance Windows Server operating system and Microsoft SQL Server database.

These customers have not only reduced the cost of running their core processes, they have also realized substantial benefits by making these core processes part of a dynamic IT infrastructure that enables them to understand and serve customers better, bring new products to market more quickly, continually improve operations, and collaborate with an evolving set of partners in their global value chains.

"Microsoft is making a long-term commitment to supporting the mission-critical business applications of our financial services customers," said Karen Cone, general manager, Worldwide Financial Services, Microsoft. "Our customers are testament to this commitment to delivering a solid foundation for mission-critical workloads, with the dependability, performance and flexibility required to achieve sustainable competitive advantage in today's financial services industry."

Legacy System Modernization

Skandinavisk Data Center (SDC), which services the banking businesses of more than 150 financial institutions in Denmark, Sweden, Norway and the Faroe Islands, is expecting to save $20 million annually by moving its core banking system from its mainframe platform to Windows Server and SQL Server. With client growth and cost reduction at the forefront of SDC's business imperatives, it required a solution to help minimize spending while retaining and gaining new member banks. By migrating from the mainframe to a Windows platform, it is estimated that SDC will reduce operational costs for its core banking system by 30 percent, giving it a competitive edge. The migration of online transactions in the core banking system was completed this fall, reducing operational costs by more than 20 percent annually. As the next and final step, the system's database will be migrated from DB2 to SQL Server.

In addition, with partner Temenos Group AG, Microsoft is supporting banks across the globe with the TEMENOS T24 (T24) core banking system optimized for SQL Server. Microsoft and Temenos recently completed a high-performance benchmark that measured the high-end scalability of T24 on SQL Server and Windows Server 2008 Datacenter. The model bank environment, created to reflect tier-one retail banking workloads, consisted of 25 million accounts and 15 million customers across 2,000 branches. At peak performance, the system processed more than 3,400 transactions per second in online testing and averaged more than 5,200 interest accrual and capitalizations per second during close of business processing. The testing demonstrated near-linear scalability (95 percent) in building up toward the final hardware configuration. Banks such as Sinopac in Taiwan and Rabobank Australia and New Zealand are among the first to benefit from expertise and capabilities developed at the Microsoft and Temenos competency center. Furthermore, Microsoft and Temenos recently announced that Mexican financial institutions are live on T24 on Microsoft's cloud platform, Windows Azure. Banks that select T24 on a Microsoft platform for on-premises deployment today can therefore be confident of a road map to the cloud. [Emphasis added.]

In the past, only mainframes could run and maintain mission-critical trading solutions that require a lot from the database infrastructure: high-availability, redundancy, transaction and data integrity, consistency, predictability, and the balancing of proactive prevention with effective recovery. One such solution, the SunGard Front Arena, is a global capital markets solution that delivers electronic trading and position control across multiple asset classes and business lines. Integrating sales and distribution, trading and risk management, and settlement and accounting, Front Arena helps capital markets and businesses around the world improve performance, transparency and automation. Front Arena was designed to handle very large data flows and, in high-volume environments such as equities exchange trading, Front Arena customers routinely enter as many as 130,000 trades per day. As a result, in March and April 2011, engineers from SunGard and Microsoft worked together to confirm the performance and scalability of Front Arena on Microsoft SQL Server 2008 R2 at the Microsoft Platform Adoption Center in Redmond, Wash. The team designed a benchmark test to emulate a real-world, enterprise-class financial workload, running the software on industry-standard hardware typically found in datacenters today. Front Arena running on SQL Server 2008 R2 exceeded the goals set by the team, confirming that SQL Server 2008 R2 delivers the performance, scalability and value companies demand from their trading platform.

Transforming the Reconciliation Process
Saxo Bank, a specialist in trading and investment, has determined that its transaction volumes have increased with the help of SunGard's Ambit Reconciliation solution, deployed on Windows Server and SQL Server. The solution provides a real-time matching and reconciliation platform on which Saxo Bank has consolidated its reconciliation and exception management operations across all trading platforms at the bank. According to Saxo Bank, it is now able to process more transactions per day compared with four years ago, with fewer staff to manage the increase in transaction volumes.

Improving Payments Efficiency and Reliability
In a recent report based on its Uptime Meter, a six-month availability aggregate, Stratus demonstrated that its fault-tolerant servers running Windows Server and S1 Corp.'s payments solutions are achieving "six nines" (99.9999 percent) of availability. Both hardware- and software-related incidents are included in the measurement. The combination of mission-critical technologies from S1, Stratus and Microsoft provides a compelling alternative to more costly mainframe solutions.

The success of Microsoft's mission-critical strategy depends not only on the technology and guidance provided by Microsoft, but also on the products and services provided by its large ecosystem of partners. A new generation of solutions from independent software vendors -- across banking, capital markets and insurance -- combined with Microsoft's development tools and technologies, is now delivering the dynamic environment needed to realize the benefits of next-generation, mission-critical platforms from on-premises to the cloud.

In my BUILD session Monitoring and Troubleshooting Windows Azure Applications I mention several code samples that make implementing diagnostics with the Windows Azure PowerShell Cmdlets 2.0 much easier. This is the first post in a series that walks you through configuring end-to-end diagnostics for your Windows Azure application with PowerShell.

GetDiagRoles performs the following steps:
1) Retrieves your hosted service
2) Retrieves the correct deployment (staging or production)
3) Returns a collection of all of the roles that have diagnostics enabled.
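The steps above suggest an implementation roughly like the following. This is a hypothetical sketch: the cmdlet names (Get-HostedService, Get-Deployment, Get-DiagnosticAwareRoles), their parameters, and the placeholder values are assumptions to check against the cmdlet release you have installed.

```powershell
# Hypothetical sketch of a GetDiagRoles helper -- verify cmdlet and
# parameter names against the installed Windows Azure PowerShell Cmdlets.
# (Authentication certificate parameters are omitted for brevity.)
function GetDiagRoles {
    $subscriptionId = "<your-subscription-id>"   # placeholder
    $serviceName    = "<your-hosted-service>"    # placeholder
    $slot           = "Production"               # or "Staging"

    # 1) Retrieve the hosted service
    # 2) Retrieve the correct deployment (staging or production)
    # 3) Return the roles that have diagnostics enabled
    Get-HostedService -ServiceName $serviceName -SubscriptionId $subscriptionId |
        Get-Deployment -Slot $slot |
        Get-DiagnosticAwareRoles
}
```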

For example, the following code will print all of the role names that are returned:

GetDiagRoles | foreach {
write-host $_.RoleName
}

Note: The $_ variable refers to the “current” object in the collection.

Of course we want to configure diagnostics and not just print out role names.
The following example configures the diagnostics system to transfer any trace logs with a log level of Error to storage every 15 minutes.
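The example code itself is not preserved in this copy of the post; a sketch of what it might look like follows. The Set-WindowsAzureLog cmdlet name and its parameters are assumptions for illustration only; check the diagnostics cmdlets that actually ship with the release.

```powershell
# Hypothetical: for each diagnostics-enabled role, transfer trace logs
# at the Error level to storage every 15 minutes.
GetDiagRoles | foreach {
    $_ | Set-WindowsAzureLog -LogLevelFilter Error `
                             -TransferPeriod ([TimeSpan]::FromMinutes(15))
}
```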

Of course there are -From and -To arguments for all of the Get-* diagnostic cmdlets, so you can filter by time range, and so on.
A few other goodies we have added are the ability to clear your diagnostic settings (resetting them to none) and the ability to clean out your diagnostic storage.

Even while we are here at BUILD we are working hard to make deploying and managing your Windows Azure applications simpler. We have made some significant improvements to the Windows Azure Platform PowerShell Cmdlets and we are proud to announce we are releasing them today on CodePlex: http://wappowershell.codeplex.com/releases.

In this release we focused on the following scenarios:

Automation of Deployment Scenarios

Windows Azure Diagnostics Management

Consistency and Simpler Deployment

As part of making the cmdlets more consistent and easier to deploy we have renamed a few cmdlets and enhanced others with the intent of following PowerShell cmdlet design guidelines much closer. From a deployment perspective we have merged the Access Control Service PowerShell Cmdlets with the existing Windows Azure PowerShell Cmdlets to have both in a single installation.

In addition to those changes we have added quite a few new and powerful cmdlets in this release:

If you have played a bit more with the sites configuration in Windows Azure, you may have discovered some inconsistent behavior between what Visual Studio does and what the cspack.exe command line does with respect to the physicalDirectory attribute. I certainly did! Here is the problem I encountered while trying to deploy PHP on Windows Azure.

Project Folder Structure

I was following the instructions in Installing PHP on Windows Azure leveraging Full IIS Support but decided to leverage the help of Visual Studio instead of building the package by hand. Not a good idea for this particular scenario :( After creating my cloud solution in Visual Studio I ended up with the following folder structure:

PHPRole was my VS project folder containing the code for my web role.

The PHPRole folder contained the WebPI command-line tool needed to install PHP in the cloud, stored in the WebPI-cmd subfolder; the PHP extensions for Azure in the PHP-Azure subfolder; the installation scripts in the bin subfolder; and, most importantly, my PHP pages in the Sites\PHP subfolder (in this case a simple index.php page containing phpinfo()).
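The original screenshot of the folder structure is not reproduced here; based on the description, it would look roughly like this (the cloud service project folder that presumably holds the CSDEF and CSCFG files is not shown):

```
PHPonAzureSol\                  (solution folder)
└── PHPRole\                    (web role project)
    ├── WebPI-cmd\              (WebPI command-line tool for installing PHP)
    ├── PHP-Azure\              (PHP extensions for Azure)
    ├── bin\                    (installation scripts)
    └── Sites\
        └── PHP\
            └── index.php       (simple page calling phpinfo())
```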

Configuring the Site Entry in the CSDEF File

Of course my goal was to configure the site to point to the folder where my PHP files were stored. In this particular case that was the PHPonAzureSol\PHPRole\Sites\PHP folder, following the structure above. This is done simply by adding the physicalDirectory attribute to the Site tag in the CSDEF. Here is what my Site tag looked like:
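The Site tag markup is not preserved in this copy; a representative reconstruction follows. The physicalDirectory value matches the folder described above (assuming the CSDEF sits in the solution folder, consistent with calling cspack.exe from there), the site name Web is the Visual Studio default, and the binding and endpoint names are illustrative assumptions:

```xml
<Sites>
  <Site name="Web" physicalDirectory=".\PHPRole\Sites\PHP">
    <Bindings>
      <Binding name="HttpIn" endpointName="HttpIn" />
    </Bindings>
  </Site>
</Sites>
```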

My expectation was that, with this setting in the CSDEF, IIS would be configured to serve the content from the physicalDirectory folder. Hence, if I typed the URL of my Windows Azure hosted service, I should see the index.php page (i.e. http://[my-hosted-service].cloudapp.net should point to my PHP code).

Visual Studio handling of physicalDirectory attribute

Of course, when I used Visual Studio to pack and deploy my web role, I was unpleasantly surprised. It seems Visual Studio ignores the physicalDirectory attribute in your CSDEF file and points the site to your web role’s approot folder (the content of the PHPRole folder, following the structure above). Thus, if I wanted to access my PHP page, I had to type the following URL:
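The URL is not preserved in this copy; given that the site root becomes the role’s approot (the PHPRole folder), it would presumably have been along the lines of:

```
http://[my-hosted-service].cloudapp.net/Sites/PHP/index.php
```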

The reason for this is that Visual Studio calls cspack.exe with additional options (either /sitePhysicalDirectories or /sites) that override the physicalDirectory attribute from the CSDEF. As of now I am not aware of a way to change this behavior in VS.

Update (9-12-2011): It seems VS ignores the physicalDirectory attribute ONLY if your web site is named Web (i.e. name="Web", as in the example above). If you rename the site to something else (name="PHPWeb", for example) you get the expected behavior described below. Unfortunately name="Web" is the default setting, and this may result in unexpected behavior for your application.

cspack.exe handling of physicalDirectory attribute

The solution to the problem is to call cspack.exe from the command line (without the above-mentioned options, of course :)).

There are a few gotchas about calling cspack.exe with the folder structure that Visual Studio creates. After a few trials and errors, during which I received several errors like this:

Error: Could not find a part of the path '[some-path-here]'.

I figured out that you should call cspack.exe from the solution folder (PHPonAzureSol in the structure above). Once I did this, everything worked fine and I was able to access my index.php by simply typing my hosted service’s URL.
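A sketch of the invocation, run from the solution folder, might look like the following. The CSDEF file name, the role-to-folder mapping, and the output package name here are assumptions based on the structure above; run cspack.exe /? to confirm the switches in your SDK version.

```
C:\PHPonAzureSol> cspack ServiceDefinition.csdef /role:PHPRole;PHPRole /out:PHPonAzure.cspkg
```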

How does the physicalDirectory attribute work?

For those of you interested in how the physicalDirectory attribute works, here is a simple explanation.

The MSDN documentation How to Configure a Web Role for Multiple Web Sites points out that the physicalDirectory attribute value is relative to the location of the Service Configuration (CSCFG) file. This is true in the majority of cases; however, I think the following two clarifications are necessary:

Because the attribute is present in the Service Definition (CSDEF) file, the correct statement is that the physicalDirectory attribute value is relative to the location of the Service Definition (CSDEF) file. If you use Visual Studio to build your folder structure, you can always assume that the Service Configuration (CSCFG) and Service Definition (CSDEF) files are placed in the same folder. If you build your project manually, you should be careful how you set the physicalDirectory attribute. This is important, of course, if you want to use relative paths in the attribute.

This one, I think, is much more important than the first: you can use absolute paths in the physicalDirectory attribute. The attribute can contain any valid absolute path on the machine where you build the package, which means you can point cspack.exe to include any folder on your local machine as your site’s root.

Here is how this works.

What cspack.exe does is take the content of the folder configured in the physicalDirectory attribute and copy it under the [role]/sitesroot/[num] folder in the package. Here is what my package structure looked like (follow the path in the address line):
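The package screenshot is not reproduced here; conceptually, the resulting layout inside the package is along these lines (the [num] index is 0 for the first site defined for the role):

```
PHPRole\
├── approot\            (the web role project content: WebPI-cmd, PHP-Azure, bin, ...)
└── sitesroot\
    └── 0\
        └── index.php   (copied from the physicalDirectory folder)
```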

During deployment, IIS on the cloud VM is configured to point the site to the sitesroot\[num] folder and serve the content from there. Here is how it is deployed in the cloud:

…please forgive me for the shameless header image of me at the presentation. I am usually quite shy.

Slide 1

So, as mentioned, my name is Paul Patterson.

I work for an organization named Quercus Solutions Inc. We are a technology based company specializing in Microsoft technologies such as .Net, SharePoint, Office 365, and Microsoft Azure.

First off, by show of hands, how many of you have had an opportunity to play with LightSwitch at all?

Cool. So most, if not all, of the information I have is probably already familiar to you – which is good, because if I fail on something, I will call on you to bail me out. :)

So, what I am going to present to you is a summary of what Microsoft Visual Studio LightSwitch is, and what it may mean to you as a professional software developer, as well as what it may mean to you as a “non-programmer”. Probably more so for the non-programmer.

With this presentation, I will be talking from the perspective of the non-professional software developer – those departmental people who may look at LightSwitch as an option for solving a business problem, for example.

Slide 2

So, just so you know what to expect, here is a quick look at an agenda.

I’ll first give you a quick introduction to LightSwitch, which will include a peek at the technologies that make LightSwitch tick.

The introduction will include a quick demo where I’ll put some of what is presented into practice.

Next I’ll touch on some extensibility points about LightSwitch.

And then, with time permitting, I’ll skim over a few deployment points, and possibly show a quick deployment scenario for you.

As far as questions go: at the risk of not having enough time to get through the presentation and demos, if we could hold the questions until the end, we might be able to get through the entire presentation.

Having said that, I have to caveat that LightSwitch is a huge topic. We could easily spend an entire day going through all the fine details of LightSwitch. Given the one-hour time box here, I have to stick to the higher-level points, so I totally expect some questions; if we can just hold off until the end, that would serve us best – thanks.

Slide 3

So, according to Microsoft… in using Microsoft Visual Studio LightSwitch 2011, we, “…will be able to build professional quality business applications quickly and easily, for the desktop and the cloud.”

Okay! First off, I am not a designated LightSwitch “Champion”. I am just a curious fella who happened onto something that piqued my interest a few years back.

Before you start laying into me with some subjectivity, you should first understand where I’m coming from, and how I think.

I generally like to find things that help me take care of a task in as short a time as possible.

This probably came from my various roles as the technical go-to guy within the non-technical departments I’ve worked in.

A lot of my early experience in solving business problems involved the use of Excel and Access.

When I first read about this tool that Microsoft was working on – one that had the potential to do something fast and easy – I was intrigued.

I started watching the development of LightSwitch very early on, even before the first beta was released.

I believe it was sometime in 2008, maybe 2007, that Microsoft made people aware it was working on this new tool, codenamed KittyHawk.

Rumour had it that the KittyHawk team included some former FoxPro people – which would make sense because of the timing of FoxPro’s retirement.

Anyways it was sometime early last year that I read about this LightSwitch tool that Microsoft was readying for beta testing.

After culling through some forums and interweb rumour mills, I took it upon myself to keep a diligent eye on this thing – hence the start of my blog, PaulSPatterson.com.

So, since early last year I have been keeping my ear to the ground, listening and watching how this product has evolved into what it is today.

Slide 4

So why did I tell you all that?

I am going on my intuition and gut instincts that LightSwitch is going to have a relatively large impact on the industry. Maybe not tomorrow, or even within the next year, but something is telling me to keep an eye on this tool.

Software developers tend to keep some technologies close to their chest.

All I am saying is to keep an open mind about LightSwitch, and don’t discount the obvious – such as the value proposition the product has.

Slide 5

Back to the agenda, let’s talk about the technology behind LightSwitch…

Slide 6

LightSwitch is a part of the Visual Studio family of products.

It essentially sits as a SKU between Visual Studio Pro and the free Express products.

I believe the current retail price for LightSwitch is about $200.00.

When you install LightSwitch, if you already have Visual Studio 2010 (Professional or better), it automagically integrates with Visual Studio, making its templates available for selection from the new project templates dialogs.

If you don’t have Visual Studio installed, LightSwitch installs as a stand-alone tool; using the same familiar Visual Studio IDE.

Using LightSwitch, it is possible to create and deploy an application without writing a single line of code.

As such, you can already begin to imagine the value proposition that this will have with the non-developer types.

Like I said before, LightSwitch is data centric, and all that someone has to do is provide some data, select to add some screens for the data, and presto, you have an application ready to show off to all your work buddies.

It really is just that easy! (That is Shell Busey, home improvement guy!). Yes, I just dated myself

Slide 7

LightSwitch uses “best practices” in how it creates applications.

For example, LightSwitch applications are built on a classic three-tier architecture where each tier runs independently of the others and performs a specific role in the application.

Here is an example three-tier architecture model: the presentation tier (or “UI”); the logic tier, which is the liaison between the presentation tier and the data storage tier; and the data storage tier, which is responsible for the application data.

Slide 8

We can map specific technologies used in LightSwitch to this architecture.

There are also opportunities to consume other data sources, typically exposed via WCF RIA Services – OData is a good example.

The idea is that LightSwitch removes the complexity of building three-tier applications by making specific technology choices for you so that you can concentrate on the business logic and not the plumbing.

Silverlight!! LightSwitch builds out applications using Silverlight.

When you create an application using LightSwitch, you are essentially creating an application that uses Silverlight technologies.

Slide 9

Back to the agenda…

So, next I want to talk a little about what is meant by Screens over Data.

Slide 10

It all starts with the data.

Data is the heart and foundation of developing with LightSwitch.

Most everything we do in LightSwitch revolves around the data.

In a nutshell, you tell LightSwitch what to use, and then you create the “screens” that fit over the data. More to come on that later…

A LightSwitch application can connect to two types of data: local or internal data and external data.

With local data, SQL Server Express is used behind the scenes.

When you start designing entities in LightSwitch, which I’ll show you an example of in a second, you are using SQL Express.

External data can be consumed from SQL Server databases, SharePoint lists or any WCF RIA Service exposed data.

With external data sources, LightSwitch can perform data management, such as CRUD operations; however, it cannot make any schema changes to the data source.

Note that WCF RIA Services can expose a lot of different types of data sources.

If you can wrap a data source in a WCF RIA Service, chances are you can consume it in LightSwitch – OData, for example, which I have done, as shown on my blog where I consumed some City of Edmonton data to view bus stop information.

LightSwitch can also connect to more than one data source at a time and, internally, define relationships between external data sources and internal data entities, if any.

Slide 11

So where are we at with the agenda?


Slide 12

DEMO TIME!!

Launching LightSwitch

The IDE – Just like Visual Studio (because it IS Visual Studio)

New Project Dialog

LightSwitch Start Screen – shows how data is the center of attention

Create new table and Attach to external Data Source items

Create a new table.

Example Customer Table…

Explore the Table Designer

Field Name

Different Data Types

Required Checkbox

Explore the field properties panel

General

Choice List

Appearance

Custom Validation

Create custom validation on date field…

Private Sub DateAdded_Validate(results As EntityValidationResultsBuilder)
    If DateAdded.HasValue Then
        If DateAdded > Date.Today Then
            results.AddPropertyError("The date added must be today or in the past.")
        End If
    End If
End Sub

Add a screen for the Customer table

Explore the add new screen dialog

Select List and Details Screen template

Select Customers data.

Note how the data is “pluralized”

Explore the Screen Designer

Not your familiar GUI designer.

Screen Members List

Screen Content Tree

Run the application

Show Customers screen

Add a customer

Show phone number formatting

Show phone number drop down selection

Show date validation

Show save data feature

Show debug mode designer features…

Demonstrate real-time customization

Change labels of detail fields

Add Description to field to show field tooltip.

Explore the Solution Explorer.

Folder structure

Data Sources

Screens

Add a new “Address” Table

Update the AddressType field to use a ChoiceList

Add a relationship to the Customer table.

Demonstrate the add relationship dialog

Show resulting screen designer with the relationship.

Delete the existing screen

Add a new List and Details screen

Select the Customers data for the screen.

Show how the Customer addresses is available for selection.

Select the addresses to show on the screen.

Review the screen designer showing the additional entity collection for the addresses.

Run the application and review the new address collection on the screen.

Create a new table named City, using just the CityName as a field.

Edit the Address table by removing the City field, and then add the relationship to the new City table.

Open the CustomerListDetail screen and show how the screen has removed the City field.

Drag and drop the field from the Addresses collection to the Addresses data grid.

Also, move the Address Type item to the top of the list of items.

Create a new Editable Grid screen that will be used to maintain the list of cities.

Run the application.

Ask if anyone notices anything about what gets displayed…

Two things, the address type field, and

The address record that was added earlier was deleted.

This is because of the edit to the address entity. There was a field removed, and a relationship created, which basically refreshed the model.

Show the available navigation on the left Task menu.

Select the editable Cities grid and then add some cities.

Go back to the Customer List Detail and add some addresses for the customer.

Close the app

Review the project properties

General Properties

Shell: The placement and behaviour of elements on a screen

Theme: The look and feel (colors and things like that). CSS for the most part.

The Windows Azure Platform Pricing Calculator is designed to help application developers get a rough initial estimate of their Windows Azure usage costs. But where to start? What sort of numbers do you plug in?

A series of white papers from MSDEV gives you a starting point for figuring it all out. Start with Getting a Start on Windows Azure Pricing and pick a scenario that is similar to your app. Then use the simple sliders in the pricing calculator to plug in information about your application: how many compute instances it will use, the size of your database, the amount of data transferred to and from the application, and how much storage it will need. Set the sliders and the calculator kicks out the estimated monthly cost to run the application on a pay-as-you-go model or through special offers.

The series is based on several scenarios that will give you a start on figuring out pricing for your application:

Windows Azure Pricing Scenario: E-Commerce Web Site. Estimating the monthly cost of running your e-commerce application on Windows Azure just got easier. The Windows Azure team recently launched a new pricing calculator designed to help developers estimate the cost of running their applications on Azure. You use simple sliders to plug information about your application into the calculator: how many compute instances it will use, the size of your database, the amount of data transferred to and from the application, and how much storage it will need.

Windows Azure Pricing Scenario: Sales Training Application. If you’re developing a sales training application rich in video and other media, how do you estimate its monthly cost to run on Windows Azure? The same calculator sliders apply: compute instances, database size, data transfer, and storage.

Windows Azure Pricing Scenario: Social Media Application. Estimating the monthly cost of running your social media applications on Windows Azure just got easier, using the same calculator inputs.

Something funny happened to me down at Microsoft’s Build conference, held this week in Anaheim. Something rare. Something unusual.

I wanted what I saw on the keynote stage, and I wanted it bad.

I’m talking about the new look-and-feel of Windows 8. The Metro user interface. The seamless transition that it encourages between devices in many different form factors: desktops, servers, tablets and phones. The user experience looks fresh and compelling, and frankly is the most innovative update that I’ve seen to a Microsoft desktop operating system since Windows 95.

As mentioned above, it’s rare for me to have that type of reaction. I didn’t have it upon seeing the first iPhone, for example. In fact, Apple has only done that to me twice, with the MacBook Air and the iPad. (Both of which I purchased promptly when they appeared in stores.)

In fact, I can only think of a few other times I had that reaction. Upon seeing the launch of a particular version of Mathematica (I forget which version). The launch of the Cobalt Qube, an innovative small-business server that Sun Microsystems acquired and killed. Steve Jobs demonstrating the second-generation NeXT pizza-box workstation. Not many others.

Downloading and installing the Windows Developer Preview, including tools, onto one of my lab machines is on my to-do list. (Microsoft gave every paid attendee at Build a Samsung tablet with the Win8 beta and tools preinstalled, but those were not offered to press attendees like yours truly.)

What about the developer angle? Microsoft appears to be making it easy to retrofit existing Windows applications to behave nicely within the new Metro user experience; in fact, the company claims that every app that runs under Windows 7 will run under Windows 8. (Presumably, that’s for Intel x86/x64 apps and not for ARM applications.) The Metro experience is driven by JavaScript with HTML, but can also be implemented using C#, C++ or Visual Basic with XAML. No rocket science there. ...

In a way, Azure was the star of Build 2011 and folks here in Anaheim didn't even really know it. Whatever form the Metro apps delivery system takes in the final shipping version of Windows 8 (with a likely timeframe now of Q1 2013), its most impressive and maybe the most important aspect is the inclusion of apps that learn what functions they can provide to the user from the cloud in real-time, and then manage those functions locally on the user's behalf. Put more simply: adaptive apps. [Emphasis added.]

Chris Jones, Microsoft's Senior Vice President for Windows Live, may become the company's newest star if he can pull this off. Windows Live has had trouble scratching out an identity for itself; but as Jones perceives it, Windows 8 could give Live new life, as a kind of cloud-based servant for the operating system. Jones began his Day 1 keynote demo on Tuesday by showing off services such as mail and scheduling.

If you weren't paying much attention to that point, you might have thought how ordinary it seems to have a mobile platform run mail and scheduling. If so, you would miss the underlying meanings here:

1. Microsoft is at least experimenting with the idea of folding Outlook from Office into Windows. As Jones said repeatedly, and showed directly, this mail app has Exchange ActiveSync built in. So do Windows Phones, of course, but making that feature meaningful only to folks who use Outlook and Exchange, as opposed to folks who use Windows, is thinking too narrowly.

2. Windows Live is experimenting with the notion of providing services directly to Windows without thinking it has to shove its brand into everyone's face. The brand distinctions (Windows 7, Windows Phone, Windows Live) aren't working for Microsoft as well as its services. Perhaps the way to get people to use Windows Live is to fold it into Windows 8 along with Outlook.

Metro-style mail, Jones told attendees, is entirely HTML and JavaScript-based. What he could have said is, the same expertise used to make Windows Live Mail into a Web page has been put to better use making an app.

Here's one of those revelations from Chris Jones that folks may have missed: "All my mail accounts [are] in one place, and because they're all stored in the cloud, I just type my Live ID into this PC and they all just come down into the system. I don't have to worry about setting things up any more, because all of the settings are done through Live."

Jones took this connection one big step further with his demo of the Photos app. Again, there's no "Windows Live" branding here; the brand is you. As the photo at the top of this article shows, the services with which a Windows 8 user shares photos are branded with one of those photos - Jones uses his own family (lovely, by the way) as an example. Those connections with Facebook, Flickr and whatever else are all done in the background because the user logged in with the Live ID first, and because ACS handles all the rest of the authentication process in the background. It's single sign-on, but this time only once.

So when Jones happens to share photos from his phone with Facebook, those photos appear on the Windows 8 PC - even as the "Facebook" category itself. No manual syncing; a zero-click process. You've shared your photos once, and there they are.

Then Jones extended the notion of sharing photos to sharing entire folders - access to remote PCs via SkyDrive without having to go through SkyDrive.

"Every Windows 8 user's got a SkyDrive," said Jones. "Every Windows Phone user's got a SkyDrive. In fact, if you've got a Live ID, you've got a SkyDrive and it's there for you to put your personal files and the things you want to share. It's also accessible to developers, and that's an important thing because it lets you as a developer access SkyDrive the way you might have accessed the local file system." Photos that happen to be on a user's SkyDrive simply appear in the Photos app, again without manually syncing.

After years of wondering what Windows Live services should eventually become, this may finally be it: the background service that rises to the foreground.

Three of our local user groups are joining forces this week to host an FOTC (Friend of the Community) and one of my Evangelist predecessors – Thom Robbins – as he presents “A Case Study in Building for Today’s Web – Kentico CMS” on Wednesday, September 21st, at the Microsoft Office on Jones Road in Waltham.

Building software is a set of smart choices to meet the needs of your customers and the possibilities of technology. Today’s Web demands that customers have a choice in how they deploy their applications. With over 7,000 websites in 84 countries, Kentico CMS for ASP.NET is delivered as a single code base for use as a cloud, hosted, or on-premise solution. With over 34 out-of-the-box modules and everything built on a SQL Server back end, how did we do it? What tradeoffs did we make? In this session we will answer those questions and look at how to build a rich and compelling website using the Windows Azure cloud.

Thom Robbins is the Chief Evangelist for Kentico Software. He is responsible for evangelizing Kentico CMS for ASP.NET with Web developers, Web designers and interactive agencies. Prior to joining Kentico, Thom joined Microsoft Corporation in 2000 and served in a number of executive positions. Most recently, he led the Developer Audience Marketing group that was responsible for increasing developer satisfaction with the Microsoft platform. Thom also led the .NET Platform Product Management group responsible for customer adoption and implementation of the .NET Framework and Visual Studio. Thom was also a Principal Developer Evangelist working with developers across New England implementing .NET based solutions. A regular speaker and writer, he currently resides in Seattle with his wife and son.

Special thanks to Thom for taking time out of his schedule to present on his old turf, and also to Bill Wilder, Naziq Huq, Dean Serrentino, Teresa DeLuca, and Robert Hurlbut for coordinating their groups to make this happen. I’m looking forward to both the talk and the new connections that attendees of the various groups will undoubtedly make at the meeting – hope to see you there!

Join the Windows Azure team at the Seattle Interactive Conference (Nov 2-3, 2011) for two days of technical content and one-on-one advice and assistance from product experts. The Cloud Experience track is for experienced developers who want to learn how to leverage the cloud for mobile, social and web app scenarios. No matter what platform or technology you choose to develop for, these sessions will provide a deeper understanding of cloud architecture, back-end services and business models so you can scale for user demand and grow your business.

SIC is developing a world-class speaker roster comprised of online technology's most successful and respected personalities, alongside earlier-stage entrepreneurs who are establishing themselves as the leaders of tomorrow. SIC isn't just about telling a story; it's about truly sharing a story in ways that provide all attendees with a thought-provoking experience and actionable lessons from the front lines.

Our confirmed speakers include:

Wade Wegner, Microsoft

Wade Wegner is a Technical Evangelist at Microsoft, responsible for influencing and driving Microsoft’s technical strategy for the Windows Azure Platform.

Rob Tiffany, Microsoft

Rob Tiffany is an Architect at Microsoft focused on combining wireless data technologies, device hardware, mobile software, and optimized server and cloud infrastructures together to form compelling solutions.

Steve Marx, Microsoft

Steve Marx is a Technical Product Manager for Windows Azure.

Nick Harris, Microsoft

Nick Harris is a Technical Evangelist at Microsoft specializing in Windows Azure.

Scott Densmore, Microsoft

Scott Densmore works as a Senior Software Engineer at Microsoft.

Nathan Totten, Microsoft

Nathan Totten is a Technical Evangelist at Microsoft specializing in Windows Azure and web development.

OpenStack, the free, open source cloud computing platform, is building a lot of momentum in the industry. It seems as though small service providers, multinational data center service providers and technology giants such as Intel alike are finding a lot of value in an open cloud standard. Unsurprisingly, a cloud developer community has sprung up around the platform with an eye on filling the gaps.

Take Zenoss, for example, which just launched a monitoring solution for OpenStack servers. Zenoss specializes in virtual, physical and cloud infrastructure monitoring and management, and this release extends its reach to OpenStack. Administrators using the Zenoss ZenPack for OpenStack can see server health, performance and inventory, ensuring application stability.

Oh, and did I mention ZenPack is free? And according to the press release, it gives the ability to monitor servers across providers, infrastructures and deployment types (physical, virtual, etc.).

According to Zenoss, it’s just meeting the demand of the rising number of OpenStack users. But my question is this: Right now, OpenStack has the attention and affections of the FOSS community. But will there be an opportunity for cloud ISVs to develop a business around building out the OpenStack experience and feature set?

These instances provide you with plenty of RAM, cycles, and network performance for heavy-duty workloads. With this release, you can now run Microsoft Windows on every one of the eleven EC2 instance types, from the Micro on up.

You can select the Windows Server 2008 R2 AMI from the AWS Management Console:
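If you prefer scripting to clicking through the console, the same launch can be driven programmatically through the EC2 API. The sketch below is a minimal, hypothetical helper that assembles the parameters for a run-instances request; the AMI ID is a placeholder, not a real Windows Server 2008 R2 image ID, and with a library such as boto the resulting dictionary would be passed to a connection's `run_instances` call.

```python
# Hypothetical helper that builds the parameters for an EC2
# run-instances request. The AMI ID used below is a placeholder,
# not a real Windows Server 2008 R2 image ID.
def windows_launch_params(ami_id, instance_type="t1.micro", count=1):
    """Return keyword arguments for launching `count` instances of `ami_id`."""
    if count < 1:
        raise ValueError("count must be at least 1")
    return {
        "image_id": ami_id,            # e.g. the Windows Server 2008 R2 AMI
        "instance_type": instance_type,  # any of the EC2 instance types
        "min_count": count,
        "max_count": count,
    }

# Windows is now supported on every instance type, from Micro on up,
# so the same helper works for t1.micro through the largest sizes.
params = windows_launch_params("ami-00000000", instance_type="m1.large")
print(params["instance_type"])  # prints "m1.large"
```

In a real script these parameters would be sent after authenticating to a region (for example, `boto.ec2.connect_to_region(...).run_instances(**params)`); the helper itself only packages the request.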

Our compliance team has been working non-stop to make sure that AWS qualifies for and receives a number of important certifications and accreditations. In the last year or so I have blogged about SAS 70 Type II, FISMA Low, and ISO 27001.

After receiving our FISMA Low level certification and accreditation, we took the next step and started to pursue the far more stringent FISMA Moderate level. This work has been completed and the door is now open for a much wider range of US Government agencies to use AWS as their cloud provider. Based on detailed security baselines established by the National Institute of Standards and Technology (NIST), FISMA Moderate certification and accreditation required us to address an extensive set of security configuration and controls.

We receive requests for many different types of reports and certifications and we are doing our best to prioritize and to respond to as many of them as possible. Please let me know (comments are fine) which certifications would let you make even better use of AWS.

You can read about our security certifications and practices at the AWS Security Center. To learn more about how our team works with agencies of the federal government, visit our Federal Government page.

A new integration from Facebook and Heroku gives Facebook developers direct access to Heroku’s cloud Platform-as-a-Service offering for hosting their applications. The goal is to make life as easy as possible for developers by eliminating hassles associated with actually running an application once it’s written. And it’s likely just a first step for Heroku when it comes to integrating with popular specialized development platforms.

Writing Facebook applications is actually easy enough, Heroku co-founder Adam Wiggins told me during an interview, but Facebook has been working to ease the burden of running them. When Heroku launched its Facebook App Package last year, I asked if it could become the official cloud of Facebook. Maybe it has done just that.

Now there’s an option within the Facebook development platform to launch an app on Heroku. In fact, with the push of a button, apps are up and running on Heroku without ever taking developers off the Facebook site. It’s only afterward that developers have to log in to Heroku to set a password and add additional tools or services.

As with all things cloud computing, the benefit for developers is not just automating the hosting process but also the promise of being able to handle unexpected traffic spikes. For individual developers, Wiggins said, the integration will let their apps keep up should they suddenly get popular. Businesses get the same benefit in terms of elasticity, he added, which lets them make social media inroads without buying and provisioning pools of servers in advance.

Facebook, of course, has a large developer community and represents a potentially significant source of new customers for Heroku, but Wiggins said it’s probably just a stepping stone for more-specialized integrations. He noted mobile apps as an area with particular promise — a possibility I highlighted in a recent GigaOM Pro report (subscription required) — but added that specialized offerings, in general, are easier to sell to specific developer bases than are general-purpose offerings.

Facebook developers can start using the Heroku integration on Thursday, and Heroku personnel will be at next week’s F8 conference to address questions or issues that arise in the meantime.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.