Because SQL Azure is a remote service, it is beneficial to have some tricks in your coding toolbox for dealing with large binary objects, stored in SQL Azure's varbinary(max) data type. One of these is the ability to stream large binary objects (BLOBs), reading or writing a piece of the data at a time.

This article provides a SqlStream class written in C#. The class implements the abstract Stream class for the varbinary(max) data type on SQL Azure; Stream is an abstract class defined in the .NET CLR that is well supported and very versatile. Used with SQL Azure, the SqlStream class lets you manipulate a single BLOB a chunk at a time.

Using the SqlStream class provided you can:

Create a console application to copy a file of BLOB data to SQL Azure from the command line without having to load the whole BLOB into memory.

Create a Windows Azure Role that would read a BLOB from SQL Azure and store it in Windows Azure Storage.

Stream the BLOB from SQL Azure to a WinForms application one chunk at a time - with the added benefit of being able to provide a good status dialog with a progress bar.

Use the BinaryWriter, BinaryReader, FileStream, and MemoryStream classes in the CLR to read and write the varbinary(max) data type without having to load all of the data into memory.

This blog post is the start of a series; in the coming days we will build some of the applications listed above. For now, here are some samples for using the SqlStream class. Download the SqlStream class as its own .cs file at the bottom of the post.

Samples

The first sample uses the SqlStream class to read an image from the Adventure Works database deployed on SQL Azure and save it to a file. You can download the Adventure Works database from SQL Server Database Samples.
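Under the hood, this is the standard chunked copy loop over two Streams; a minimal sketch (the helper name and buffer size are illustrative, and in the real sample the source would be a SqlStream opened on the varbinary(max) column rather than a MemoryStream):

```csharp
using System;
using System.IO;

static class StreamCopier
{
    // Copy from source to destination one fixed-size chunk at a time,
    // so the whole BLOB never has to fit in memory at once.
    public static long CopyInChunks(Stream source, Stream destination, int chunkSize)
    {
        var buffer = new byte[chunkSize];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

With the SqlStream class from the post, `source` would be a SqlStream over the varbinary(max) column and `destination` a FileStream, or vice versa for uploads.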

Matt Steele walks us through, on a whiteboard, all of the steps required to federate your identity to Windows Azure using ADFS 2.0 for single sign-on. This video is a great way to learn how ADFS works and to help you get started deploying this scenario before you dig into deeper whitepapers. We will help you answer questions like:

What kind of SSL certificate should we get, and when should we get it?

Should we open up the firewall to the ADFS server or just manually copy over the certificates to establish the initial trust relationship?

Recently the Azure and security community published the “Security Best Practices for Developing Windows Azure Applications” paper on download.microsoft.com (download available here), outlining the security considerations developers should keep in mind when building a service on Windows Azure.

As I have worked on Azure before and have an interest in security, I found this a very interesting read, with links to some excellent MSDN articles.

Identity Management

The document starts by outlining best practices within Identity Management and Access Control and how they apply in general and to the cloud. One reference in particular is the use of Windows Identity Foundation (WIF) along with ADFS and AppFabric. I have worked with WIF in the past outside of the cloud; while it is quite complicated at first, once you understand the basics (or play around in Visual Studio with a Security Token Service for a while) it becomes very powerful quite quickly.

Service Layer

The paper then looks into specific development considerations for the service layer.

One consideration, similar to most cross-site scripting issues in web technologies, is to use the Anti-Cross-Site Scripting Library, and to use the ASP.NET Page.ViewStateUserKey property to mitigate Cross-Site Request Forgery attacks. This article is excellent at explaining how the attack works and what is needed to mitigate it:


void Page_Init(object sender, EventArgs e)
{
    // Tie the view state to the current session so a forged request
    // from outside the session fails validation
    ViewStateUserKey = Session.SessionID;
    // ...
}

Now, specific to Azure, is the issue of the hosting namespace and scopes. As all Azure services are hosted on *ServiceName*.cloudapp.net, it is worth pointing out that the best practice is to use a custom domain name which you fully control, using a CNAME record to do the redirection: blog article

This way, no other application running under the cloudapp.net domain can script against your service through your custom domain, closing off a cross-site scripting avenue.

Secure Data

As Azure storage uses a Shared Access Signature to allow your service to read and write blobs/queues/tables, the paper gives guidelines for minimizing the risk of using a Shared Access Signature:

Generate Shared Access Signatures with the most restrictive set of ACLs possible that still grant the access required by the trusted party.

Use the shortest lifetime possible.

Use HTTPS in the request URL so the token cannot be snooped on the wire.

Remember that these tokens are only used for temporary access to non-public blob storage – as with passwords, it’s a bad idea to use the same ones over and over.
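A Shared Access Signature is ultimately just an HMAC-signed query string, which is why the lifetime and HTTPS guidance matters: whoever holds the token holds the access. The sketch below is illustrative only; the real string-to-sign format is defined by the storage service version, and the helper name is made up:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class SasSketch
{
    // Illustrative only: builds a short-lived, read-only token by HMAC-signing
    // the permission, expiry, and resource path with the account key.
    public static string CreateToken(string resourcePath, byte[] accountKey, TimeSpan lifetime)
    {
        string expiry = DateTime.UtcNow.Add(lifetime).ToString("yyyy-MM-ddTHH:mm:ssZ");
        string stringToSign = "r\n" + expiry + "\n" + resourcePath;   // "r" = read-only
        using (var hmac = new HMACSHA256(accountKey))
        {
            string sig = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return "se=" + Uri.EscapeDataString(expiry) +
                   "&sp=r&sig=" + Uri.EscapeDataString(sig);
        }
    }
}
```

Because the signature covers the expiry and the permission set, the guidelines above boil down to signing over the narrowest grant you can.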

For data protection it is also suggested to encrypt data with certificates, as DPAPI is not available, and to deploy those certificates using the Windows Azure certificate store rather than placing them in Azure Storage such as blobs.
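A sketch of what that might look like: in a real role the RSA key pair would come from an X509Certificate2 deployed through the Azure certificate store, but here a freshly generated RSACryptoServiceProvider stands in for the certificate's key, so the key source and names are illustrative:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class CertCryptoSketch
{
    // Asymmetric encryption in place of DPAPI. In a deployed role, obtain the
    // provider from a certificate's PublicKey/PrivateKey rather than creating
    // a new one as the test below does.
    public static byte[] Encrypt(RSACryptoServiceProvider rsa, byte[] plaintext)
    {
        return rsa.Encrypt(plaintext, true);   // true = OAEP padding
    }

    public static byte[] Decrypt(RSACryptoServiceProvider rsa, byte[] ciphertext)
    {
        return rsa.Decrypt(ciphertext, true);
    }
}
```

In practice you would encrypt only a symmetric data key this way and use that key for the bulk data, since RSA limits the plaintext size.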

Infrastructure Layer

As Windows Azure deals with a lot of the infrastructure, items here mostly concern the configuration and definition of the service you write. The open ports, for example, are explicitly defined within your Azure project; here is an example for a Web Role:
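The Web Role example presumably declares its endpoints in ServiceDefinition.csdef; a hedged sketch, with illustrative service and role names:

```xml
<ServiceDefinition name="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <InputEndpoints>
      <!-- Only the ports declared here are opened by the load balancer -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
      <InputEndpoint name="HttpsIn" protocol="https" port="443" />
    </InputEndpoints>
  </WebRole>
</ServiceDefinition>
```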

Due to the nature of the load balancer within the Azure environment, Denial of Service attacks are partially mitigated, and the Azure team is also reviewing mitigations for Distributed Denial of Service (DDoS) attacks.

The paper goes into quite deep technical detail around spoofing, eavesdropping, and information disclosure at the network level, and explains the Hypervisor's role and the network structure of the hosted solution.

Trust

Azure has both a custom, restricted-privilege trust model called “Windows Azure Partial Trust” and full trust with native code execution. Partial trust is the recommended default unless your scenario specifically requires full trust.

The “Gatekeeper” Design Pattern

A recommended design pattern is to use the concept of a Gatekeeper, running under partial trust as a web role, to accept requests and messages, and a KeyMaster worker role which acts as the data provider.

The Gatekeeper can talk to the KeyMaster via an internal endpoint over HTTPS if the requests are immediate, or via an Azure Storage queue. This also allows the sanitizing of requests going to the KeyMaster, and therefore the final storage, to happen at the Gatekeeper level, where the potential attack surface is small. These two roles are deployed on separate VMs, so if the Gatekeeper is compromised, no storage information is exposed.

Sanitizing and processing the requests are still subject to secure design and coding considerations, though this provides additional levels of security to protect the data at the heart of the application.
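A minimal in-process sketch of the pattern, with both roles and the channel between them simulated in memory (the class names and the sanitization rule are illustrative; real roles would communicate over an internal HTTPS endpoint or an Azure Storage queue):

```csharp
using System;
using System.Collections.Generic;

// The Gatekeeper runs with reduced privilege and holds no storage
// credentials; it only validates requests and forwards the clean ones.
class Gatekeeper
{
    private readonly Queue<string> _toKeyMaster;
    public Gatekeeper(Queue<string> toKeyMaster) { _toKeyMaster = toKeyMaster; }

    public bool Accept(string request)
    {
        // Illustrative sanitization: reject empty requests and markup.
        if (string.IsNullOrEmpty(request) || request.Contains("<"))
            return false;
        _toKeyMaster.Enqueue(request);
        return true;
    }
}

// The KeyMaster is the only role holding the storage credentials.
class KeyMaster
{
    private readonly Queue<string> _inbox;
    public readonly List<string> Storage = new List<string>();
    public KeyMaster(Queue<string> inbox) { _inbox = inbox; }

    public void ProcessPending()
    {
        while (_inbox.Count > 0)
            Storage.Add(_inbox.Dequeue());
    }
}
```

The point of the split is that compromising the outward-facing Gatekeeper yields no credentials; only validated messages ever reach the role that can touch storage.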

As part of the Real World Windows Azure series, we talked to Martin Svensson, CEO at Sagastream, about using the Windows Azure platform to deliver its online video platform. Here's what he had to say:

MSDN: Tell us about Sagastream and the services you offer.

Svensson: Sagastream is a new online video startup company based in Gothenburg, Sweden. We've developed a flexible online video platform, ensity, that includes an easy-to-use tool set that companies can use to successfully upload, manage, and publish interactive online video. With ensity, users can create interactive videos for branding, selling, marketing, and product demonstration, to name just a few examples. Every aspect of the platform, including streaming, encoding, and service hosting, is cloud-based, which helps make our services globally accessible and scalable. The solution is still in closed beta release, but we're doing a phased rollout during the rest of 2010.

MSDN: Previously, you used Amazon EC2. Can you tell us about your experiences with Amazon EC2 and why you decided to switch to the Windows Azure platform?

Svensson: When we originally used Amazon EC2, we were looking to address a few issues: scalability, manageability, and reducing heavy investments in server infrastructure. With Amazon EC2, we reduced our infrastructural costs, but there was still a lot to manage and we had to implement the scaling logic ourselves, including setting up servers for load balance. Whereas Amazon EC2 offers Infrastructure-as-a-Service (IaaS), the Windows Azure platform offers a Platform-as-a-Service (PaaS) that is better suited for our needs. With Windows Azure, we don't have to worry about managing the infrastructure or setting up virtual machines for scalability.

MSDN: Can you describe the solution you built with the Windows Azure platform?

Svensson: Instead of building every component of our platform from scratch, we rely on industry leaders in fields such as encoding and streaming, and we pull everything together with our service; we focus on building the "brain" that controls everything. With video management, server loads are very high, so we need that brain to be smart. That's where the Windows Azure platform comes in. We use Windows Azure for our computational processing needs, either directly or indirectly on client computers through the application programming interfaces (APIs) hosted on Windows Azure. We use Windows Azure Blob storage for files, Windows Azure Table storage for log files, and Microsoft SQL Azure for our relational data needs.

MSDN: What makes your solution unique?

Svensson: The APIs allow third-party add-ons to easily integrate with ensity, giving customers a single user interface that works with all their add-ons. This feature is unique to ensity and key for accomplishing the otherwise impossible task of making an online video platform that is both easy to use and flexible. …

Zuora, the company that defined and continues to lead the subscription billing industry, today announced that it will join the Microsoft Windows Azure Technology Adoption Program (TAP) and help in the advancement of Microsoft's cloud computing platform. Zuora is the first on-demand billing and subscription management provider to be chosen by Microsoft from among 300 ISV partners.

As part of the Windows Azure TAP program, Zuora is making immediately available its Zuora Toolkit for Windows Azure.

As the world quickly moves to cloud computing, developers, ISV partners and enterprises need the right tools to build, run, and scale their solutions in the cloud, reduce time-to-market, and support the changing subscription economy. Microsoft created the Windows Azure TAP to ensure the world's most innovative companies not only adopt the Windows Azure platform, but also help drive its success.

Concurrently, the shift to cloud computing is also driving substantial changes to the software market including how ISVs sell and charge for access. Customers are demanding software solutions on a subscription basis, using on-demand, pay-for-what-you-use models that mirror the elastic nature of the cloud itself. This is the new face of cloud commerce.

Unless vendors of cloud solutions -- IaaS, PaaS, or SaaS -- also have the right tools needed to meter, price, and bill for their offerings, cloud computing will not achieve its full potential.

Introducing the Zuora Toolkit for Windows Azure

To drive success and adoption for the Windows Azure ecosystem, Zuora has delivered the Zuora Toolkit for Windows Azure in conjunction with Microsoft to enable developers and ISVs to easily automate commerce from within their Windows Azure application and/or website in a matter of minutes. With the Zuora Toolkit for Windows Azure, Windows Azure developers and ISVs can:

"Cloud computing is poised to change the way our customers do business, and we're working to ensure that Windows Azure will enable these companies to quickly develop and deploy cloud-based applications," said Michael Maggs, senior director, Windows Azure partner strategy at Microsoft. "The importance of Zuora's solution is that it helps give developers and ISVs the flexibility to monetize their applications based on any pricing variables."

"The single most important service for ISVs in the cloud is the ability to monetize, and Zuora removes the friction of building a payment ecosystem and manages the constant changes that come with subscription management," said Shawn Price, president and general manager at Zuora. "We are excited to be chosen by Microsoft and to offer our expertise in cloud-based billing and commerce so the 10,000 Azure developers and customers will be able to launch and monetize their applications quickly."

Many thanks to Ashish Mundra [of GlobalLogic] for putting together a handy overview that highlights considerations and requirements for moving an app over to the Windows Azure platform.

Here’s Ashish’s introduction:

Migration of an existing ASP.Net web application to Windows Azure involves manual work, as there is no automated tool available. It also requires you to look at several aspects of your application. However, if you already have a scalable, configurable application capable of running on a Web farm, then migrating the application to the cloud is not a complex undertaking. This blog covers vital considerations for moving an ASP.Net web application from on-premises to the cloud.

The blog assumes that you have a basic understanding of the Windows Azure platform and ASP.Net. The section below provides an overview of the design/architecture changes involved for different components of a web application.

Important Note: This blog contains content from different materials available on the Internet from several sites. The purpose of this document is to gather that content in a concise format in one place and provide inputs from our experience working with ASP.Net applications and the Windows Azure CTP wherever required.

In April, we introduced Visual Studio 2010 to the world. One of the breakthrough features we delivered in VS 2010 is IntelliTrace - a tool that enables you to do historical debugging and is a key part of addressing the 'no repro' scenarios that we always encounter. Customer feedback on this tool has been very positive.

Today, we are announcing the availability of the June 2010 release of Windows Azure Tools for Microsoft Visual Studio. This brings the power of IntelliTrace to cloud services running in Windows Azure for Visual Studio 2010 Ultimate customers.

Yesterday: Limited Visibility; Today: Clear Skies

One of the challenges of developing for Windows Azure is being able to "see into the cloud", and new debugging tools let you do exactly that. In particular, the integration of IntelliTrace with the Windows Azure Tools allows you to historically debug issues that occur in the cloud right from your desktop.

Show Me How

To show you how the IntelliTrace integration with the Windows Azure Tools works, let's create a new Windows Azure Cloud Service. Click on File | New Project | Windows Azure Cloud Service. Click to add an ASP.NET MVC Web Role, and click OK.

This solution will work just fine in the cloud, so let's introduce an error that we can debug later using IntelliTrace.

In MvcWebRole1, click to open the References node, right click on System.Web.Mvc and select "Properties".

Change the "Copy Local" property to False, which will cause the application to be deployed without its System.Web.Mvc dependency, causing a load time error in the application. This load time error is the error we will find and trace using IntelliTrace.

Now we can deploy our project to the cloud. Right click on the Cloud Service project and select "Publish":

This will bring up the deploy dialog. Follow the steps to set up your credentials and pick a Hosted Service to deploy to. If you are using Visual Studio 2010 Ultimate and .NET 4, you can click the checkbox option to "enable IntelliTrace for .NET 4 roles".

This will deploy the cloud service to Windows Azure, packaging in the necessary IntelliTrace files along with an agent that Visual Studio will communicate with to retrieve the IntelliTrace data. You can monitor the progress of the deployment from the Windows Azure Activity log and the status of the Hosted Service from the Windows Azure Compute node in the Server Explorer.

Because we added a load time error into this cloud service by changing the Copy Local property of one of our referenced assemblies to False, the web role never gets to the running state. Instead, our web role becomes unresponsive. You can see the activity log showing the web role as unresponsive above, and below, the Server Explorer shows the web role instance as unresponsive as well.

Now we can use IntelliTrace to debug the issue. Right click on the instance that is unresponsive and select "View IntelliTrace logs".

This will communicate with the debugging agent in the cloud and create an IntelliTrace log that Visual Studio will display to you. Once the file is open, navigate to the Exception Data and you'll see the error "Could not load file or assembly System.Web.Mvc". Now you can change the Copy Local property of the assembly back to True in your project, rebuild, and redeploy your web role to ensure you've fixed the problem. While this is a simple issue with a quick fix, without IntelliTrace this error could be very difficult to diagnose, because it won't reproduce in your local development environment, where you may have added the required assembly to your path or global assembly cache.

As with previous versions, Windows Azure Tools are freely available for Visual Studio customers and integrate into Visual Studio directly. Download the June 2010 release of the Windows Azure Tools and let us know what you think. To learn more about today's Windows Azure Tools release, visit Cloudy In Seattle.

Whoever you support in the World Cup, follow their progress through this great web site: http://www.theworldcupmap.com/. The “World Cup Map” site shows the new stadiums that South Africa has built for the World Cup, providing details on which games are being played in which stadium and the ability to download the schedule into your calendar. Even better, it's a great demonstration of Microsoft's latest technologies and how they can be used to create highly engaging visual experiences.

Microsoft is mounting an all-out cloud sales offensive against rival Google that includes a move to add 300 to 500 direct salespeople to work with partners and customers to sell cloud solutions.

"We are incrementing our sales force to go after the cloud," said Vince Menzione, Microsoft general manager, partner strategy for US Public Sector. "We are changing up our message to customers."

"All of our salespeople will be leading with cloud," said Menzione. "The message from (Microsoft CEO) Steve Ballmer is that we are all in with the cloud. Cloud is the way we lead our discussions with our customers."

Menzione detailed the big cloud push and the opportunities for partners in a keynote address on Monday before several hundred public sector solution providers at the Everything Channel XChange Public Sector conference at the Sawgrass Marriott in Jacksonville, Fla.

Menzione said the new cloud sales assignments take hold at the start of Microsoft's new fiscal year July 1. What's more, he said Microsoft's all-out cloud offensive will be on full display at the Microsoft Worldwide Partner Conference on July 11-15 in Washington, DC.

Menzione said the partner conference will include new partner and pricing models around cloud services. "We are breaking glass within Microsoft," he said. "It (The Cloud) is changing our business models, processes, and product portfolio."

Microsoft insiders said the cloud sales push represents the same kind of all-out attack that Microsoft mounted in the Internet game in the mid-90s, when Netscape's Navigator browser early on beat Microsoft to the punch. Microsoft's aggressive push with its own Internet Explorer ultimately ended up decimating Netscape's Navigator in the browser wars.

"There is a realization that we weren't first to market, but now it is time to take all of our solutions and our rich experience in software that everyone is familiar with utilizing and focus it on the cloud," Menzione said. "There is an opportunity to get out and be a market leader."

Menzione said there is no contest between the opportunities that partners have going to market with Microsoft versus Google. "Google is an ad model," he said. " We are an enterprise model. The class of services we are offering are enterprise class services. It is not consumer e-mail we are just talking about here. This is enterprise class scale with financial SLAs (Service Level Agreements)."

"The difference is in our approach," he continued. "Everything we do is around partners. We value the partner ecosystem in everything we do."

Steve continues with page 2: New Microsoft Partner Opportunities Around Cloud. The sales team probably will devote more energy to Microsoft's Business Productivity Online Suite (BPOS) and Office Live Web apps than Azure, but 300 to 500 more folks hawking the cloud can’t hurt.

Microsoft is betting on the cloud to provide the next wave of innovation and opportunities for technologists, businesses and consumers. CEO Steve Ballmer has said that the vendor is "all in" for the cloud, which potentially represents a $3.3 trillion market. But where does its cloud computing platform stand today? To gauge Microsoft's cloud momentum, check out our latest news stories, product reports and user adoption stories.

Executive commitments

Microsoft emphasizes hybrid cloud at TechEd: As the technology industry moves toward the cloud, users can ease the transition by adopting a hybrid computing model, said Bob Muglia, Microsoft's president of servers and tools. "We're creating the precursors for the cloud. Today there is a lot of work you're doing inside your environment that could be delivered as a service."

Microsoft exec: We and users win with cloud: Microsoft is firmly on the cloud-computing bandwagon and with good reason -- it can make more money by doing so, even as it helps customers cut costs, said business division head Stephen Elop. Microsoft is not only selling applications via the cloud, but raw computing power and a development platform with its Azure service. "We're going after more of the pot."

Microsoft's Ballmer: 'For the cloud, we're all in': Microsoft has 40,000 people employed building software around the globe, and about 70% of those folks are doing something for the cloud, Steve Ballmer said during a March address at the University of Washington.

Microsoft 'all in' the cloud, customers not as much: While the benefits of cloud computing are demonstrable -- lower costs, greater flexibility, scalability and the like -- not all software applications are suitable for being delivered in the cloud and it will take a while for cloud computing to become mainstream, said Tim O'Brien, senior director of the Platform Strategy Group at Microsoft.

Microsoft's Muglia: Cloud revenue to hit in a couple years: Microsoft plans to invest heavily in its cloud platform, but expects to see little revenue for two to three years as businesses resume spending on client and server software. "[The cloud] is not what will drive financial growth in server and tools. It is essentially zero percent of our current operating revenue."

Product progress

Microsoft: Features still missing in Azure: Due to an early emphasis on getting the right architecture for its Azure cloud platform, which went live in February, Microsoft's cloud service is still missing key features that are available in the company's standalone products, said Microsoft executives at the company's 2010 Tech Ed conference.

Microsoft's cloud-based Exchange, SharePoint still stuck in 2007: Microsoft has begun upgrading cloud-based Exchange and SharePoint services to its 2010 offerings, but the migration is expected to last all year and many customers may see only a "preview" version of the technology in 2010. Exchange 2010 shipped last November, and SharePoint 2010 was released in May of this year, but the cloud-based versions of Exchange and SharePoint are still running on the 2007 versions.

Microsoft weaves management technology into cloud vision: Microsoft's plans for cloud computing don't stop with infrastructure and applications. Company executives say Microsoft will also provide the heterogeneous management layer that customers will need to optimize application performance on-premises or in hosted environments.

Microsoft's 2010 task: Make the cloud clear: For Microsoft, 2010 is a platform building and marketing year with no less than the future success of its cloud strategy hanging in the balance, according to observers. Experts say Microsoft's charge is not only to begin developing and delivering technology that will define its external, internal and hybrid cloud environments, but also to clearly articulate to an overwhelming majority of corporate IT pros just how and why they want to live in a cloud.

Microsoft rolls out cloud for U.S. federal users: Microsoft announced a suite of hosted cloud services that will be delivered from a facility dedicated to U.S. federal government users. The services available include Exchange, SharePoint, Office Live Meeting and Office Communications.

Microsoft creates new server and cloud division: Microsoft in December created a new division designed to bring its cloud and on-premises software development together and provide a consistent platform for corporate customers. The Server and Cloud Division (SCD) will be part of the Server and Tools business unit and is a combination of the Windows Server and solutions group and the Windows Azure group.

Microsoft cloud service deployed by Kentucky schools: The Kentucky Department of Education is replacing its e-mail servers with a free cloud-based offering from Microsoft, one that will supply 700,000 students, faculty and staff with e-mail and other information-sharing tools. By going with a free, cloud-based offering, the state expects to save $6.3 million in IT-related costs over the next four years.

City of Carlsbad connects to the cloud: The city of Carlsbad, Calif., recently signed on for Microsoft's Business Productivity Online Suite, a cloud-based service in which Microsoft hosts the city's e-mail and collaboration services, including SharePoint, Live Meeting and instant messaging.

A daily newspaper running a story this long on a topic as nebulous as cloud computing means it must be here to stay, no?

• Bob Evans asserts “Analyzing the cloud's impact on everything from security to the CIO to corporate culture, a new book on the cloud revolution by my outstanding colleague Charles Babcock is an absolute must-read” in his Global CIO: 10 Indispensable Insights On Cloud Computing column of 6/17/2010 for InformationWeek:

While InformationWeek editor-at-large Charles Babcock has forgotten more about enterprise software than most of us will ever know, he has crafted a highly engaging new book--Management Strategies for the Cloud Revolution--about the power and the future of this emerging technology that's far more of an adventure story than a technical treatise on hypervisors and provisioning.

A gifted and graceful writer, Babcock weaves a tale of the cloud's warts as well as its promise, and takes readers along for a, uh, "virtual" tour of data centers and IT organizations and startups and Amazon and Google and much much more, escalating gradually the power and sweep of his ideas as he deftly describes such arcana as virtual appliances and the behavior patterns of hackers.

Management Strategies for the Cloud Revolution (available in bookstores and on Amazon) is easy and engaging to read not because Babcock takes either the subject matter or the intelligence of his readers lightly—he's far too intelligent and aware to even feint in either of those directions—but because he knows his subject so well and because, as the title promises, he has the fervor and the passion of a revolutionary himself.

For anyone whose job comes even close to forming strategy around or making decisions on cloud computing, I recommend this book unconditionally and promise that it will enrich your strategies and inform your decision-making. And even beyond that, it will likely transform your thinking about why the cloud doesn't represent just another dry technology progression but rather, as Babcock says, "a new way of doing things and a whole new set of opportunities" and also a "break from the shackles of the past."

(For more analyses of cloud strategies and related columns, be sure to check out our "Recommended Reading" list at the end of this column.)

Now I'll shut up and let the expert speak: with Charlie's permission, here's my own set of excerpts from his book delivered in a list of 10 Indispensable Perspectives On Cloud Computing (the headings are mine, the excerpts are from Charlie's book).

Please join us at Computerworld magazine's seminar, Leveraging the Cloud to Optimize Enterprise Application and Website Delivery, sponsored by Akamai Technologies in partnership with Verizon Business. Interact with industry leaders and your peers in sessions and case studies that will help you understand the latest techniques for cost-effectively optimizing Web content and application delivery around the globe.

How do technologies dependent on the Internet and the cloud deal with mission-critical challenges like application performance, scale, availability and security?

With customers, partners and employees distributed over vast geographies, how can cloud-based solutions deliver timely, reliable service from their source?

And what strategies and tactics are companies using to optimize their Internet-dependent computing platforms?

*Our audience consistently rates peer-to-peer networking opportunities as one of the top reasons they attend a Computerworld conference. To that end, you must be a senior IT executive or a senior IT director involved in the purchase of IT products and services to qualify for attendance. Additional criteria for qualification are available upon request. As such, analysts, venture capitalists, sales, marketing and consulting positions from non-sponsoring vendor companies do not qualify for attendance. Computerworld reserves the right to deny enrollment to anyone who does not meet the qualifying criteria. Thank you for understanding.

Major announcements by Hewlett-Packard, General Electric and Verizon provide more evidence that cloud computing options for businesses are expanding rapidly. Such announcements are raising the confidence of businesses of all sizes that cloud computing systems are now big enough and efficient enough to reliably safeguard their IT systems even at a lower cost than they can do it themselves.

When Hewlett-Packard launched its Software Universe event June 15 with a presentation on cloud computing, it was just the start of a day about clouds. HP kicked things off with its announcement at a massive event held at the Gaylord National conference center at Washington's National Harbor, and featured Bill Veghte, who recently arrived at HP after a long stint at Microsoft. Veghte said that about three-quarters of businesses are pursuing cloud computing in one way or another.

Meanwhile, General Electric was holding an event in downtown Washington at which it announced its electronic medical records software-as-a-service product. This platform is aimed at giving small medical practices a way to offer electronic medical records affordably.

Next, Verizon announced June 15 that it will be providing its own branded cloud storage product aimed at enterprises. Verizon Cloud Storage is designed to work either on its own or in conjunction with existing SAN or NAS storage systems. Verizon's product is building on a capability that the company has had for some time, but is only now being offered under its own brand.

It's significant that in a single day, the IT community has seen three major, if unrelated, announcements of cloud computing and storage products. But what's significant may not be what you think. After all, the whole cloud thing has been part of virtually every IT discussion for the last year. Even individual consumers are being offered cloud services for offsite backup. What's significant is that the whole range of cloud services is starting to show signs of offering more complete systems.

GE's example may be the best one. While much of the world of cloud computing is theoretical at best, GE Healthcare is offering an actual service that's badly needed by a community of professionals that often doesn't have ready access to up-to-date IT systems. In many ways, doctors' offices and small medical practices have a great need for good IT because it helps their patients, and it can make their operations more efficient. But the sea of regulations that surrounds medical computing is a powerful disincentive.

GE, in its new product launch, is offering a solution to a community that badly needs a good, secure, reliable and affordable cloud service, and has no way to get it. While this service is just for the medical community, it opens the door for the hundreds of thousands of small businesses that may not deal with medical records but that need a path to secure, reliable and affordable IT systems.

Verizon and HP, meanwhile, also open new paths to the cloud for companies that haven't been using it effectively in the past. While there are a number of cloud storage companies already serving the enterprise, Verizon has the credibility—and the clout—to be attractive to large enterprises. It's a little less clear what HP has in mind, except that the company is indicating it will offer cloud solutions and cloud-capable applications that let companies take advantage of virtualization and cloud services.

What's critical, however, aren't these specific announcements. What's important is the trend. Piece by piece, companies are beginning to offer cloud-based solutions that can be used by actual companies relatively easily. Until recently, the discussion about computing or storage in the cloud has mostly been about what could be done, someday. Now the discussion is moving to what is being done and is available either now or in the near future.

Cloud computing and cloud storage aren't the answer to all needs for all companies, but the technology and the services that come with it are very important. There's no reason, for example, that a small company can't put its point-of-sale operation or its inventory control software in the cloud and spend a lot less money than it does maintaining its own data center. There's also no reason why a smaller company can't use cloud services when it previously had no way to automate any of its operations.

While the industry has a way to go before it can offer affordable cloud services to every mom-and-pop grocery store or landscaping company, the trend is in that direction. After all, most small medical offices only consist of a few people, a limited number of services and a lot of records. How long will it be before those records become inventory records, and those appointments are for yard work instead of summer camp physical exams?

The leap into cloud-based software isn't that big, but the number of companies that need to make that leap is immense. While it's unlikely that there will be a cloud service, cloud software or cloud storage that's appropriate for every business, the broad availability of affordable cloud services will be a significant benefit to business as a whole because it will bring enterprise-class operations to businesses of any size, at the same time lowering the cost of doing business. The good news is that the process has already started, as this one day's worth of announcements indicates.

David Linthicum asserts “VMware could raise its cloud profile -- assuming Amazon.com doesn't have other plans for the Ruby on Rails cloud platform” in his What VMware sees from a possible buy of EngineYard post of 6/17/2010 to InfoWorld’s Cloud Computing blog:

It's just a rumor for now, but EMC's VMware unit could be seeking new cloud technology as the company looks to further raise its profile -- in the cloud computing market generally and in the platform services space specifically. The rumored target, according to the GigaOm site: EngineYard.

The quick description of EngineYard is that it is the Ruby on Rails platform service provider. The company raised $37 million from the likes of Amazon.com, Benchmark, DAG Ventures, and Bay Partners, and those VCs could be in for a quick exit if the rumors are correct. Of course, they are not commenting.

So what would this acquisition get VMware? A pretty good cloud platform provider, if you ask me. As a cloud provider, EngineYard hasn't gotten much ink, but it is every bit as good as -- if not better than -- both Google and Microsoft. Ruby on Rails is a very productive way to build apps, and many developers who've opted for Ruby are now developing on EngineYard.

One reason this deal may not go through could be the fact that VMware, at the end of the day, is really a software company, while EngineYard is a service provider. VMware could still incorporate EngineYard's intellectual property into its own products, but I figure that's a lot tougher than it sounds.

Also, major investor Amazon.com could have other plans for EngineYard and perhaps provide a better fit than VMware, given that Amazon.com's infrastructure service model is complementary to EngineYard's platform service model.

That said, VMware has a lot of momentum in the market right now, and EngineYard would certainly accelerate that trend. As VMware looks to become a larger part of the cloud, the addition of EngineYard would provide more credibility.

We'll see if the rumor is true. Regardless, I figure somebody will buy EngineYard pretty soon. There's just too much value in that technology right now.

Because of the remoteness of SQL Azure, it is beneficial to have some tricks in your coding toolbox for dealing with large binary objects (BLOBs), the varbinary(max) data type in SQL Azure. One of these is being able to stream large binary objects -- reading or writing a piece of the data at a time.

This article provides a SqlStream class written in C#. The class implements the abstract Stream class for the varbinary(max) data type on SQL Azure; Stream is an abstract class defined in the .NET CLR that is well supported and very versatile. When used with SQL Azure, the SqlStream class allows you to manipulate a single BLOB one chunk at a time.

Using the SqlStream class provided you can:

Create a console application to copy a file of BLOB data to SQL Azure from the command line without having to load the whole BLOB into memory.

Create a Windows Azure Role that would read a BLOB from SQL Azure and store it in Windows Azure Storage.

Stream a BLOB from SQL Azure to a WinForms application one chunk at a time -- with the added benefit of being able to provide a good status dialog with a progress bar.

Use the BinaryWriter, BinaryReader, FileStream, and MemoryStream classes in the CLR to read and write the varbinary(max) data type without having to load all the data into memory.

This blog post is the start of a series; in the coming days we will build some of the applications listed above. For now, here are some samples for using the SqlStream class. Download the SqlStream class as its own .cs file at the bottom of the post.

Samples

The first sample uses the SqlStream class to read an image from the Adventure Works database deployed on SQL Azure and save it to a file. You can download the Adventure Works database from SQL Server Database Samples.
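The sample itself relies on the downloadable SqlStream class. As a rough equivalent using only plain ADO.NET, the same chunked-read idea can be sketched with CommandBehavior.SequentialAccess, which streams the column rather than buffering it whole; the connection string, ProductPhotoID value, and chunk size below are illustrative assumptions, not taken from the article:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ChunkedBlobRead
{
    static void Main()
    {
        // Hypothetical SQL Azure connection string -- substitute your own.
        const string connString =
            "Server=tcp:yourserver.database.windows.net;Database=AdventureWorks;" +
            "User ID=user@yourserver;Password=pass;Encrypt=True;";
        const int chunkSize = 8040; // arbitrary chunk size; tune to taste

        using (var conn = new SqlConnection(connString))
        using (var cmd = new SqlCommand(
            "SELECT LargePhoto FROM Production.ProductPhoto WHERE ProductPhotoID = 70",
            conn))
        {
            conn.Open();
            // SequentialAccess makes the reader stream the BLOB column
            // instead of loading the whole value into memory.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            using (var file = File.Create("photo.gif"))
            {
                if (reader.Read())
                {
                    var buffer = new byte[chunkSize];
                    long offset = 0;
                    long read;
                    // GetBytes returns 0 when the end of the column is reached.
                    while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        file.Write(buffer, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }
}
```

Because the loop only ever holds one chunk in the buffer, memory use stays flat no matter how large the varbinary(max) value is.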

Matt Steele walks us through, on a whiteboard, all of the steps required to federate your identity to Windows Azure using ADFS 2.0 for single sign-on. This video is a great way to learn how ADFS works and to help you get started deploying this scenario before you dig into deeper whitepapers. We will help you answer questions like:

What kind of SSL certificate should we get, and when should we get it?

Should we open up the firewall to the ADFS server or just manually copy over the certificates to establish the initial trust relationship?

Recently the Azure and security community published the “Security Best Practices for Developing Windows Azure Applications” paper on download.microsoft.com (download available here), outlining the security considerations developers should take into account when building a service on Windows Azure.

As I have worked on Azure before and have an interest in security, I found this a very interesting read, with links to some excellent MSDN articles.

Identity Management

The document starts by outlining the best practices for identity management and access control, and how they apply both in general and to the cloud. One reference in particular is the use of Windows Identity Foundation (WIF) along with ADFS and AppFabric. I have worked with WIF in the past outside of the cloud, and while it is quite complicated to understand at first, once you grasp the basics (or play around in Visual Studio with a Security Token Service for a while) it becomes very powerful very quickly.

Service Layer

The paper then looks into specific development considerations for the service layer.

One consideration, familiar from cross-site scripting issues in other web technologies, is to use the Anti-Cross-Site Scripting Library and the ASP.NET Page.ViewStateUserKey property to mitigate cross-site request forgery (CSRF) attacks. This article is excellent at explaining how the attack works and what is needed to mitigate it:

void Page_Init(object sender, EventArgs e)
{
    ViewStateUserKey = Session.SessionID;
    // ...
}

Now, specific to Azure, is the question of the hosting namespace and its scope. As all Azure services are hosted on *ServiceName*.cloudapp.net, it is worth pointing out that the best practice is to use a custom domain name that you fully control, with a CNAME record for the redirection: blog article

This way, no other application running under the cloudapp.net domain can script against your service under your custom domain.

Secure Data

As Azure storage uses a Shared Access Signature to allow your service to read and write blobs, queues, and tables, the paper offers guidelines for minimizing the risk of using a Shared Access Signature:

Generate Shared Access Signatures with the most restrictive set of ACLs possible that still grant the access required by the trusted party.

Use the shortest lifetime possible.

Use HTTPS in the request URL so the token cannot be snooped on the wire.

Remember that these tokens are only used for temporary access to non-public blob storage – as with passwords, it’s a bad idea to use the same ones over and over.
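Those guidelines can be sketched in C# against the Windows Azure StorageClient library of that era; this is a hedged illustration, not code from the paper, and the container name, blob name, and lifetime are assumptions you would adjust for your service:

```csharp
using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

class SasExample
{
    static string MakeReadOnlySasUrl(string connectionString)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var blob = account.CreateCloudBlobClient()
                          .GetContainerReference("private-reports") // hypothetical container
                          .GetBlobReference("report.pdf");          // hypothetical blob

        // Most restrictive ACL that still grants the needed access:
        // read-only, with the shortest practical lifetime.
        string sas = blob.GetSharedAccessSignature(new SharedAccessPolicy
        {
            Permissions = SharedAccessPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10)
        });

        // Hand out the HTTPS form of the URL so the token
        // cannot be snooped on the wire.
        var httpsUri = new UriBuilder(blob.Uri) { Scheme = "https", Port = 443 }.Uri;
        return httpsUri + sas;
    }
}
```

A caller would pass this short-lived URL to the trusted party and generate a fresh token for each request rather than reusing one, in keeping with the guidelines above.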

For data protection, it is also suggested to encrypt data with certificates, since DPAPI is not available, and to deploy those certificates using the Windows Azure certificate store rather than Azure Storage such as blobs.

Infrastructure Layer

As Windows Azure deals with much of the infrastructure, the items here mostly concern the configuration and definition of the service you write. The open ports, for example, are explicitly defined within your Azure project; here is an example for the Web Role:
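The original Web Role example was an image; as an illustrative, hedged sketch of what such a ServiceDefinition.csdef fragment looks like (service and role names are placeholders), the endpoints declared here are the only ports the fabric opens to the outside:

```xml
<ServiceDefinition name="MyService"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="WebRole1">
    <InputEndpoints>
      <!-- Only ports declared here are reachable from outside the fabric. -->
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
      <InputEndpoint name="HttpsIn" protocol="https" port="443" />
    </InputEndpoints>
  </WebRole>
</ServiceDefinition>
```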

Due to the nature of the load balancer within the Azure environment, denial-of-service attacks are partially mitigated, and the Azure team is also reviewing mitigations for Distributed Denial of Service (DDoS) attacks.

The paper goes into quite deep technical detail around spoofing, eavesdropping, and information disclosure at the network level, and explains the Hypervisor's role and the network structure of the hosted solution.

Trust

Azure has both a custom, restricted-privilege trust model called “Windows Azure Partial Trust” and full trust with native code execution. Partial trust is the sensible default unless you have specific scenarios that require full trust.

The “Gatekeeper” Design Pattern

A recommended design pattern is to use a Gatekeeper, running under partial trust as a web role, to accept requests and messages, and a KeyMaster worker role which acts as the data provider.

The Gatekeeper can talk to the KeyMaster via an internal endpoint over HTTPS if the requests are immediate, or via an Azure Storage queue. This also separates the sanitizing of requests going to the KeyMaster, and therefore to the final storage, at the Gatekeeper level, where the potential attack surface is small. The two roles are deployed on separate VMs, so if the Gatekeeper is compromised, no storage information is exposed.

Sanitizing and processing the requests is still subject to secure design and coding considerations, but this provides additional levels of security to protect the data at the heart of the application.

As part of the Real World Windows Azure series, we talked to Martin Svensson, CEO at Sagastream, about using the Windows Azure platform to deliver its online video platform. Here's what he had to say:

MSDN: Tell us about Sagastream and the services you offer.

Svensson: Sagastream is a new online video startup company based in Gothenburg, Sweden. We've developed a flexible online video platform, ensity, that includes an easy-to-use tool set that companies can use to successfully upload, manage, and publish interactive online video. With ensity, users can create interactive videos for branding, selling, marketing, and product demonstration, to name just a few examples. Every aspect of the platform, including streaming, encoding, and service hosting, is cloud-based, which helps make our services globally accessible and scalable. The solution is still in closed beta release, but we're doing a phased rollout during the rest of 2010.

MSDN: Previously, you used Amazon EC2. Can you tell us about your experiences with Amazon EC2 and why you decided to switch to the Windows Azure platform?

Svensson: When we originally used Amazon EC2, we were looking to address a few issues: scalability, manageability, and reducing heavy investments in server infrastructure. With Amazon EC2, we reduced our infrastructural costs, but there was still a lot to manage and we had to implement the scaling logic ourselves, including setting up servers for load balance. Whereas Amazon EC2 offers Infrastructure-as-a-Service (IaaS), the Windows Azure platform offers a Platform-as-a-Service (PaaS) that is better suited for our needs. With Windows Azure, we don't have to worry about managing the infrastructure or setting up virtual machines for scalability.

MSDN: Can you describe the solution you built with the Windows Azure platform?

Svensson: Instead of building every component of our platform from scratch, we rely on industry leaders in fields such as encoding and streaming, and we pull everything together with our service; we focus on building the "brain" that controls everything. With video management, server loads are very high, so we need that brain to be smart. That's where the Windows Azure platform comes in. We use Windows Azure for our computational processing needs, either directly or indirectly on client computers through the application programming interfaces (APIs) hosted on Windows Azure. We use Windows Azure Blob storage for files, Windows Azure Table storage for log files, and Microsoft SQL Azure for our relational data needs.

MSDN: What makes your solution unique?

Svensson: The APIs allow third-party add-ons to easily integrate with ensity, giving customers a single user interface that works with all their add-ons. This feature is unique to ensity and key for accomplishing the otherwise impossible task of making an online video platform that is both easy to use and flexible. …

Zuora, the company that defined and continues to lead the subscription billing industry, today announced that it will join the Microsoft Windows Azure Technology Adoption Program (TAP) and help in the advancement of Microsoft's cloud computing platform. Zuora is the first on-demand billing and subscription management provider to be chosen by Microsoft from among 300 ISV partners.

As part of the Windows Azure TAP program, Zuora is making immediately available its Zuora Toolkit for Windows Azure.

As the world quickly moves to cloud computing, developers, ISV partners and enterprises need the right tools to build, run, and scale their solutions in the cloud, reduce time-to-market, and support the changing subscription economy. Microsoft created the Windows Azure TAP to ensure the world's most innovative companies not only adopt the Windows Azure platform, but also help drive its success.

Concurrently, the shift to cloud computing is also driving substantial changes to the software market including how ISVs sell and charge for access. Customers are demanding software solutions on a subscription basis, using on-demand, pay-for-what-you-use models that mirror the elastic nature of the cloud itself. This is the new face of cloud commerce.

Unless vendors of cloud solutions -- IaaS, PaaS, or SaaS -- also have the right tools needed to meter, price, and bill for their offerings, cloud computing will not achieve its full potential.

Introducing the Zuora Toolkit for Windows Azure

To drive success and adoption for the Windows Azure ecosystem, Zuora has delivered the Zuora Toolkit for Windows Azure in conjunction with Microsoft, enabling developers and ISVs to easily automate commerce from within their Windows Azure application and/or website in a matter of minutes. With the Zuora Toolkit for Windows Azure, Windows Azure developers and ISVs can:

"Cloud computing is poised to change the way our customers do business, and we're working to ensure that Windows Azure will enable these companies to quickly develop and deploy cloud-based applications," said Michael Maggs, senior director, Windows Azure partner strategy at Microsoft. "The importance of Zuora's solution is that it helps give developers and ISVs the flexibility to monetize their applications based on any pricing variables."

"The single most important service for ISVs in the cloud is the ability to monetize, and Zuora removes the friction of building a payment ecosystem and manages the constant changes that come with subscription management," said Shawn Price, president and general manager at Zuora. "We are excited to be chosen by Microsoft and to offer our expertise in cloud-based billing and commerce so the 10,000 Azure developers and customers will be able to launch and monetize their applications quickly."

Many thanks to Ashish Mundra [of GlobalLogic] for putting together a handy overview that highlights considerations and requirements to moving an app over to the Windows Azure platform.

Here’s Ashish’s introduction:

Migration of an existing ASP.Net web application to Windows Azure involves manual work, as there is no automated tool available. It also requires you to look at several aspects of your application. However, if you already have a scalable, configurable application capable of running on a Web farm, then migrating the application to the cloud is not a complex undertaking. This blog covers vital considerations for moving an ASP.Net web application from on-premises to the cloud.

The blog assumes that you have a basic understanding of the Windows Azure platform and ASP.Net. The sections below provide an overview of the design and architecture changes involved for different components of a web application.

Important Note: This blog contains content from different materials available on the Internet from several sites. The purpose of this document is to gather that content in a concise format in one place and to provide inputs from our experience working with ASP.Net applications and the Windows Azure CTP wherever required.

In April, we introduced Visual Studio 2010 to the world. One of the breakthrough features we delivered in VS 2010 is IntelliTrace - a tool that enables you to do historical debugging and is a key part of addressing the 'no repro' scenarios that we always encounter. Customer feedback on this tool has been very positive.

Today, we are announcing the availability of the June 2010 release of Windows Azure Tools for Microsoft Visual Studio. This brings the power of IntelliTrace to cloud services running in Windows Azure for Visual Studio 2010 Ultimate customers.

Yesterday: Limited Visibility; Today: Clear Skies

One of the challenges of developing for Windows Azure is being able to "see into the cloud", and new debugging tools let you do exactly that. In particular, the integration of IntelliTrace with the Windows Azure Tools allows you to historically debug issues that occur in the cloud right from your desktop.

Show Me How

To show you how the IntelliTrace integration with the Windows Azure Tools works, let's create a new Windows Azure Cloud Service. Click on File | New Project | Windows Azure Cloud Service. Click to add an ASP.NET MVC Web Role, and click OK.

This solution will work just fine in the cloud, so let's introduce an error that we can debug later using IntelliTrace.

In MvcWebRole1, click to open the References node, right click on System.Web.Mvc and select "Properties".

Change the "Copy Local" property to False, which will cause the application to be deployed without its System.Web.Mvc dependency, causing a load time error in the application. This load time error is the error we will find and trace using IntelliTrace.

Now we can deploy our project to the cloud. Right click on the Cloud Service project and select "Publish":

This will bring up the deploy dialog. Follow the steps to setup your credentials and pick a Hosted Service to deploy to. If you are using Visual Studio 2010 Ultimate and .NET 4, you can click the checkbox option to "enable IntelliTrace for .NET 4 roles".

This will deploy the cloud service to Windows Azure, packaging in the necessary IntelliTrace files along with an agent that Visual Studio will communicate with to retrieve the IntelliTrace data. You can monitor the progress of the deployment from the Windows Azure Activity log and the status of the Hosted Service from the Windows Azure Compute node in the Server Explorer.

Because we added a load time error into this cloud service by changing the Copy Local property of one of our referenced assemblies to False, the web role never gets to the running state. Instead, our web role becomes unresponsive. You can see the activity log showing the web role as unresponsive above, and below, the Server Explorer shows the web role instance as unresponsive as well.

Now we can use IntelliTrace to debug the issue. Right click on the instance that is unresponsive and select "View IntelliTrace logs".

This will communicate with the debugging agent in the cloud and create an IntelliTrace log that Visual Studio will display to you. Once the file is open, navigate to the Exception Data and you'll see the error "Could not load file or assembly System.Web.Mvc". Now you can change the Copy Local property of the assembly back to True in your project, rebuild, and redeploy your web role to ensure you've fixed the problem. While this is a simple issue with a quick fix, without IntelliTrace this error could be very difficult to diagnose, because it won't reproduce in your local development environment, where you may have added the required assembly to your path or global assembly cache.

As with previous versions, Windows Azure Tools are freely available for Visual Studio customers and integrate into Visual Studio directly. Download the June 2010 release of the Windows Azure Tools and let us know what you think. To learn more about today's Windows Azure Tools release, visit Cloudy In Seattle.

Whoever you support in the World Cup, follow their progress through this great web site: http://www.theworldcupmap.com/. The “World Cup Map” site shows the new stadiums that South Africa has built for the World Cup, providing details on which games are being played in which stadium and the ability to download the schedule into your calendar. Even better, it's a great demonstration of Microsoft's latest technologies and how they can be used to create highly engaging visual experiences.

Microsoft is mounting an all-out cloud sales offensive against rival Google that includes a move to add 300 to 500 direct salespeople to work with partners and customers to sell cloud solutions.

"We are incrementing our sales force to go after the cloud," said Vince Menzione, Microsoft general manager, partner strategy for US Public Sector. "We are changing up our message to customers."

"All of our salespeople will be leading with cloud," said Menzione. "The message from (Microsoft CEO) Steve Ballmer is that we are all in with the cloud. Cloud is the way we lead our discussions with our customers."

Menzione detailed the big cloud push and the opportunities for partners in a keynote address on Monday before several hundred public sector solution providers at the Everything Channel XChange Public Sector conference at the Sawgrass Marriott in Jacksonville, Fla.

Menzione said the new cloud sales assignments take hold at the start of Microsoft's new fiscal year on July 1. What's more, he said Microsoft's all-out cloud offensive will be on full display at the Microsoft Worldwide Partner Conference on July 11-15 in Washington, DC.

Menzione said the partner conference will include new partner and pricing models around cloud services. "We are breaking glass within Microsoft," he said. "It (The Cloud) is changing our business models, processes, and product portfolio."

Microsoft insiders said the cloud sales push represents the same kind of all-out attack Microsoft mounted in the Internet game in the mid-90s, when Netscape's Navigator browser initially beat Microsoft to the punch. Microsoft's aggressive push with its own Internet Explorer ultimately ended up decimating Netscape's Navigator in the browser wars.

"There is a realization that we weren't first to market, but now it is time to take all of our solutions and our rich experience in software that everyone is familiar with utilizing and focus it on the cloud," Menzione said. "There is an opportunity to get out and be a market leader."

Menzione said there is no contest between the opportunities that partners have going to market with Microsoft versus Google. "Google is an ad model," he said. " We are an enterprise model. The class of services we are offering are enterprise class services. It is not consumer e-mail we are just talking about here. This is enterprise class scale with financial SLAs (Service Level Agreements)."

"The difference is in our approach," he continued."Everything we do is around partners. We value the partner ecosystem in everything we do."

Steve continues with page 2: New Microsoft Partner Opportunities Around Cloud. The sales team probably will devote more energy to Microsoft's Business Productivity Online Suite (BPOS) and Office Live Web apps than Azure, but 300 to 500 more folks hawking the cloud can’t hurt.

Microsoft is betting on the cloud to provide the next wave of innovation and opportunities for technologists, businesses and consumers. CEO Steve Ballmer has said that the vendor is "all in" for the cloud, which potentially represents a $3.3 trillion market. But where does its cloud computing platform stand today? To gauge Microsoft's cloud momentum, check out our latest news stories, product reports and user adoption stories.

Executive commitments

Microsoft emphasizes hybrid cloud at TechEd: As the technology industry moves toward the cloud, users can ease the transition by adopting a hybrid computing model, said Bob Muglia, Microsoft's president of servers and tools. "We're creating the precursors for the cloud. Today there is a lot of work you're doing inside your environment that could be delivered as a service."

Microsoft exec: We and users win with cloud: Microsoft is firmly on the cloud-computing bandwagon and with good reason -- it can make more money by doing so, even as it helps customers cut costs, said business division head Stephen Elop. Microsoft is not only selling applications via the cloud, but raw computing power and a development platform with its Azure service. "We're going after more of the pot."

Microsoft's Ballmer: 'For the cloud, we're all in': Microsoft has 40,000 people employed building software around the globe, and about 70% of those folks are doing something for the cloud, Steve Ballmer said during a March address at the University of Washington.

Microsoft 'all in' the cloud, customers not as much: While the benefits of cloud computing are demonstrable -- lower costs, greater flexibility, scalability and the like -- not all software applications are suitable for being delivered in the cloud and it will take a while for cloud computing to become mainstream, said Tim O'Brien, senior director of the Platform Strategy Group at Microsoft.

Microsoft's Muglia: Cloud revenue to hit in a couple years: Microsoft plans to invest heavily in its cloud platform, but expects to see little revenue for two to three years, as businesses resume spending on client and server software. "[The cloud] is not what will drive financial growth in server and tools. It is essentially zero percent of our current operating revenue."

Product progress

Microsoft: Features still missing in Azure: Due to an early emphasis on getting the right architecture for its Azure cloud platform, which went live in February, Microsoft's cloud service is still missing key features that are available in the company's standalone products, said Microsoft executives at the company's 2010 Tech Ed conference.

Microsoft's cloud-based Exchange, SharePoint still stuck in 2007: Microsoft has begun upgrading cloud-based Exchange and SharePoint services to its 2010 offerings, but the migration is expected to last all year and many customers may see only a "preview" version of the technology in 2010. Exchange 2010 shipped last November, and SharePoint 2010 was released in May of this year, but the cloud-based versions of Exchange and SharePoint are still running on the 2007 versions.

Microsoft weaves management technology into cloud vision: Microsoft's plans for cloud computing don't stop with infrastructure and applications. Company executives say Microsoft will also provide the heterogeneous management layer that customers will need to optimize application performance on-premises or in hosted environments.

Microsoft's 2010 task: Make the cloud clear: For Microsoft, 2010 is a platform building and marketing year with no less than the future success of its cloud strategy hanging in the balance, according to observers. Experts say Microsoft's charge is not only to begin developing and delivering technology that will define its external, internal and hybrid cloud environments, but also to clearly articulate to an overwhelming majority of corporate IT pros just how and why they want to live in a cloud.

Microsoft rolls out cloud for U.S. federal users: Microsoft announced a suite of hosted cloud services that will be delivered from a facility dedicated to U.S. federal government users. The services available include Exchange, SharePoint, Office Live Meeting and Office Communications.

Microsoft creates new server and cloud division: Microsoft in December created a new division designed to bring its cloud and on-premises software development together and provide a consistent platform for corporate customers. The Server and Cloud Division (SCD) will be part of the Server and Tools business unit and is a combination of the Windows Server and solutions group and the Windows Azure group.

Microsoft cloud service deployed by Kentucky schools: The Kentucky Department of Education is replacing its e-mail servers with a free cloud-based offering from Microsoft, one that will supply 700,000 students, faculty and staff with e-mail and other information-sharing tools. By going with a free, cloud-based offering, the state expects to save $6.3 million in IT-related costs over the next four years.

City of Carlsbad connects to the cloud: The city of Carlsbad, Calif., recently signed on for Microsoft's Business Productivity Online Suite, a cloud-based service in which Microsoft hosts the city's e-mail and collaboration services, including SharePoint, Live Meeting and instant messaging.

A daily newspaper running a story this long on a topic as nebulous as cloud computing means it must be here to stay, no?

• Bob Evans asserts “Analyzing the cloud's impact on everything from security to the CIO to corporate culture, a new book on the cloud revolution by my outstanding colleague Charles Babcock is an absolute must-read” in his Global CIO: 10 Indispensable Insights On Cloud Computing column of 6/17/2010 for InformationWeek:

While InformationWeek editor-at-large Charles Babcock has forgotten more about enterprise software than most of us will ever know, he has crafted a highly engaging new book--Management Strategies for the Cloud Revolution--about the power and the future of this emerging technology that's far more of an adventure story than a technical treatise on hypervisors and provisioning.

A gifted and graceful writer, Babcock weaves a tale of the cloud's warts as well as its promise, and takes readers along for a, uh, "virtual" tour of data centers and IT organizations and startups and Amazon and Google and much much more, escalating gradually the power and sweep of his ideas as he deftly describes such arcana as virtual appliances and the behavior patterns of hackers.

Management Strategies for the Cloud Revolution (available in bookstores and on Amazon) is easy and engaging to read not because Babcock takes either the subject matter or the intelligence of his readers lightly—he's far too intelligent and aware to even feint in either of those directions—but because he knows his subject so well and because, as the title promises, he has the fervor and the passion of a revolutionary himself.

For anyone whose job comes even close to forming strategy around or making decisions on cloud computing, I recommend this book unconditionally and promise that it will enrich your strategies and inform your decision-making. And even beyond that, it will likely transform your thinking about why the cloud doesn't represent just another dry technology progression but rather, as Babcock says, "a new way of doing things and a whole new set of opportunities" and also a "break from the shackles of the past."

(For more analyses of cloud strategies and related columns, be sure to check out our "Recommended Reading" list at the end of this column.)

Now I'll shut up and let the expert speak: with Charlie's permission, here's my own set of excerpts from his book delivered in a list of 10 Indispensable Perspectives On Cloud Computing (the headings are mine, the excerpts are from Charlie's book).

Please join us at Computerworld magazine’s seminar entitled Leveraging the Cloud to Optimize Enterprise Application and Website Delivery sponsored by Akamai Technologies, in partnership with Verizon Business to interact with industry leaders and your peers in sessions and case studies that will help you understand the latest techniques for cost-effectively optimizing Web content and application delivery around the globe.

How do technologies dependent on the Internet and the cloud deal with mission-critical challenges like application performance, scale, availability and security?

With customers, partners and employees distributed over vast geographies, how can cloud-based solutions deliver timely, reliable service from their source?

And what strategies and tactics are companies using to optimize their Internet-dependent computing platforms?

*Our audience consistently rates peer-to-peer networking opportunities as one of the top reasons they attend a Computerworld conference. To that end, you must be a senior IT executive or a senior IT director involved in the purchase of IT products and services to qualify for attendance. Additional criteria for qualification are available upon request. As such, analysts, venture capitalists, sales, marketing and consulting positions from non-sponsoring vendor companies do not qualify for attendance. Computerworld reserves the right to deny enrollment to anyone who does not meet the qualifying criteria. Thank you for understanding.

Major announcements by Hewlett-Packard, General Electric and Verizon provide more evidence that cloud computing options for businesses are expanding rapidly. Such announcements are raising the confidence of businesses of all sizes that cloud computing systems are now big enough and efficient enough to reliably safeguard their IT systems even at a lower cost than they can do it themselves.

When Hewlett-Packard launched its Software Universe event June 15 with a presentation on cloud computing, it was just the start of a day about clouds. HP kicked things off with its announcement at a massive event held at the Gaylord National conference center at Washington's National Harbor, and featured Bill Veghte, who recently arrived at HP after a long stint at Microsoft. Veghte said that about three-quarters of businesses are pursuing cloud computing in one way or another.

Meanwhile, General Electric was holding an event in downtown Washington at which it announced its electronic medical records software-as-a-service product. This platform is aimed at giving small medical practices a way to offer electronic medical records affordably.

Next, Verizon announced June 15 that it will be providing its own branded cloud storage product aimed at enterprises. Verizon Cloud Storage is designed to work either on its own or in conjunction with existing SAN or NAS storage systems. Verizon's product is building on a capability that the company has had for some time, but is only now being offered under its own brand.

It's significant that in a single day, the IT community has seen three major, if unrelated, announcements of cloud computing and storage products. But what's significant may not be what you think. After all, the whole cloud thing has been part of virtually every IT discussion for the last year. Even individual consumers are being offered cloud services for offsite backup. What's significant is that the whole range of cloud services is starting to show signs of offering more complete systems.

GE's example may be the best one. While much of the world of cloud computing is theoretical at best, GE Healthcare is offering an actual service that's badly needed by a community of professionals that often doesn't have ready access to up-to-date IT systems. In many ways, doctors' offices and small medical practices have a great need for good IT because it helps their patients, and it can help them by making operations more efficient. But the sea of regulations that surrounds medical computing is a powerful disincentive.

GE, in its new product launch, is offering a solution to a community that badly needs a good, secure, reliable and affordable cloud service, and has no way to get it. While this service is just for the medical community, it opens the door for the hundreds of thousands of small businesses that may not deal with medical records but that need a path to secure, reliable and affordable IT systems.

Verizon and HP, meanwhile, also open new paths to the cloud for companies that haven't been using it effectively in the past. While there are a number of cloud storage companies already serving the enterprise, Verizon has the credibility and the clout to be attractive to large enterprises. It's a little less clear what HP has in mind, except that the company has indicated it will offer cloud solutions and cloud-capable applications that let companies take advantage of virtualization.

What's critical, however, aren't these specific announcements. What's important is the trend. Piece by piece, companies are beginning to offer cloud-based solutions that can be used by actual companies relatively easily. Until recently, the discussion about computing or storage in the cloud has mostly been about what could be done, someday. Now the discussion is moving to what is being done and is available either now or in the near future.

Cloud computing and cloud storage aren't the answer to all needs for all companies, but the technology and the services that come with it are very important. There's no reason, for example, that a small company can't put its point-of-sale operation or its inventory control software in the cloud and spend a lot less money than it does maintaining its own data center. There's also no reason why a smaller company can't use cloud services when it previously had no way to automate any of its operations.

While the industry has a way to go before it can offer affordable cloud services to every mom-and-pop grocery store or landscaping company, the trend is in that direction. After all, most small medical offices only consist of a few people, a limited number of services and a lot of records. How long will it be before those records become inventory records, and those appointments are for yard work instead of summer camp physical exams?

The leap into cloud-based software isn't that big, but the number of companies that need to make that leap is immense. While it's unlikely that there will be a cloud service, cloud software or cloud storage that's appropriate for every business, the broad availability of affordable cloud services will be a significant benefit to business as a whole because it will bring enterprise-class operations to businesses of any size, at the same time lowering the cost of doing business. The good news is that the process has already started, as this one day's worth of announcements indicates.

David Linthicum asserts “VMware could raise its cloud profile -- assuming Amazon.com doesn't have other plans for the Ruby on Rails cloud platform” in his What VMware sees from a possible buy of EngineYard post of 6/17/2010 to InfoWorld’s Cloud Computing blog:

It's just a rumor for now, but EMC's VMware unit could be seeking new cloud technology as the company looks to further raise its profile -- in the cloud computing market generally and in the platform services space specifically. The rumored target, according to the GigaOm site: EngineYard.

The quick description of EngineYard is that it is the Ruby on Rails platform service provider. The company raised $37 million from the likes of Amazon.com, Benchmark, DAG Ventures, and Bay Partners, and those VCs could be in for a quick exit if the rumors are correct. Of course, they are not commenting.

So what would this acquisition get VMware? A pretty good cloud platform provider, if you ask me. As a cloud provider, EngineYard hasn't gotten much ink, but it is every bit as good as -- if not better than -- both Google and Microsoft. Ruby on Rails is a very productive way to build apps, and many developers who've opted for Ruby are now developing on EngineYard.

One reason this deal may not go through is that VMware, at the end of the day, is really a software company, while EngineYard is a service. VMware could still incorporate EngineYard's intellectual property into its products, but I figure that's a lot tougher than it sounds.

Also, major investor Amazon.com could have other plans for EngineYard and perhaps provide a better fit than VMware, given that Amazon.com's infrastructure service model is complementary to EngineYard's platform service model.

That said, VMware has a lot of momentum in the market right now, and EngineYard would certainly accelerate that trend. As VMware looks to become a larger part of the cloud, the addition of EngineYard would provide more credibility.

We'll see if the rumor is true. Regardless, I figure somebody will buy EngineYard pretty soon. There's just too much value in that technology right now.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.