In contrast to Blobs, Azure tables offer structured storage. These “tables”, however, are not to be confused with the tables of a relational database. In fact, Azure tables are more like a typical “object database” where each table contains multiple objects, or “entities”. An entity in Azure can be up to 1 MB in size. …
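
To make the entity model concrete, here is a minimal sketch of an entity as a plain Python dict; aside from the PartitionKey/RowKey conventions and the 1 MB limit, every name and value is hypothetical.

```python
# A sketch of an Azure table entity modeled as a plain Python dict.
# PartitionKey and RowKey together form the entity's unique key (the
# service also maintains a Timestamp); the remaining properties are
# application-defined. All names and values here are hypothetical.
entity = {
    "PartitionKey": "customers-usa",  # groups related entities for scale-out
    "RowKey": "cust-00042",           # unique within its partition
    "Name": "Contoso Ltd.",
    "Balance": 1234.56,
}
# The serialized entity (keys plus values) must stay under the 1 MB limit.
```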

Last November at PDC 09 we announced the Open Data Protocol (OData), providing a way to unlock your data and free it from the silos that exist in applications today. OData makes it easy for data to be shared in a manner that follows the philosophy of Open Data and enables the creation of REST-based data services. These services allow resources, identified using Uniform Resource Identifiers (URIs) and defined in an abstract data model, to be published and edited by Web clients using simple HTTP messages. OData enables a new level of data integration across a broad range of clients, servers, services, and tools.
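
Because OData rides on plain HTTP, any stack with an HTTP client can consume a feed. Here's a minimal Python sketch against the public Northwind sample service mentioned below (assuming the service is reachable); $filter and $top are two of the standard OData URI query options.

```python
import urllib.request

# Ask the Northwind sample OData service for the first two customers in
# London. $filter and $top are standard OData URI query options; the
# response is an AtomPub feed by default.
url = ("http://services.odata.org/Northwind/Northwind.svc/Customers"
       "?$filter=City%20eq%20'London'&$top=2")
with urllib.request.urlopen(url) as response:
    feed = response.read().decode("utf-8")
print(feed[:400])  # first few hundred characters of the Atom feed
```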

This morning during the Keynote at Mix 2010, Doug Purdy announced the re-launch of OData.org and the release of the OData SDK.

The OData SDK brings together a wealth of resources to help developers participate in the OData ecosystem, including:

Sample OData online services (Northwind, etc.): open a browser and test out an OData service.

OData client libraries

Windows Phone 7 Series

iPhone

AJAX/JavaScript

PHP

Java

.NET

Silverlight

Online OData explorer (Source code also available for download from odata.org)

Data Service Provider toolkit: Whitepaper and sample WCF Data Services provider implementation to demonstrate how to create a data service over *any* data source

OData validation tool: A test harness and a few sample validation tests to make it easy to validate your OData endpoint. The harness is designed to be easily extended, allowing anyone to add new tests.

Also announced today at Mix, there are some new OData services available publicly: …

I'm at the Cloud Connect 2010 conference in Santa Clara, Calif., one of the first major gatherings of the year on cloud computing. One of the larger topics that has come up thus far is not using relational databases for data persistence. Called the "NoSQL" movement, it is about leveraging more efficient databases that are perhaps able to handle larger data sets more effectively. I've already written about the "big data" efforts that are emerging around cloud, but this is a more fundamental movement to drive data back to more primitive, but perhaps more efficient, models and physical storage approaches.

NoSQL systems typically work with data in memory, or upload chunks of data from many disks in parallel. The issue is that "traditional" relational databases don't provide the same models and, thus, the same performance. While this was fine in the days of databases with a few gigabytes of data, many cloud computing databases are blowing past a terabyte, and we'll see huge databases supporting cloud-based systems going forward. Relational databases are contraindicated for operations on large data sets because SQL queries tend to consume many CPU cycles and thrash the disk as they process data.

OData, the Open Data Protocol, is Microsoft’s alternative to Google’s GData. If you’ve heard Microsoft use the codename “Astoria” or talk about ADO.Net Data Services in the past, these two codenames are now under the larger OData umbrella. Microsoft defines OData as a protocol that builds on top of HTTP, Atom Publishing Protocol (AtomPub) and JSON “to provide access to information from a variety of applications, services, and stores.” Microsoft is building OData support into a number of its products, including SharePoint Server 2010, Excel 2010, Dallas, its Dynamics products, and its MediaRoom IPTV offerings.

The OData SDK, announced today, bundles Microsoft’s various OData clients — for Java, PHP, PalmOS, .Net, and (as of today), the iPhone — into a single package. Microsoft officials also said during today’s Mix keynote that Microsoft is open-sourcing the .Net OData client, under an Apache license.

Dallas is a new service built on top of Windows Azure and SQL Azure that will provide users with access to free and paid collections of public and commercial data sets that they can use in developing applications. The datasets are available via Microsoft’s PinPoint partner/ISV site. Microsoft is planning another Dallas CTP in the next couple of months and plans to announce Dallas pricing at the Worldwide Partner Conference in July, officials said. [Emphasis added.]

If you have a SQL Azure database account you can easily expose an OData feed through a simple configuration portal. You can select authenticated or anonymous access and expose different OData views according to permissions granted to the specified SQL Azure database user.

A preview of this upcoming SQL Azure service is available at https://www.sqlazurelabs.com/OData.aspx and it enables you to select one of your existing SQL Azure databases and, with a few clicks, turn it into an OData feed. It looks as though SQL Azure will soon be added to the stable of products that natively support OData; good news indeed.

… Today Microsoft announced the Open Data Protocol [Read more at odata.org] at the MIX Conference. The Open Data Protocol is an extension of the AtomPub format. OData information can be represented in Atom or JSON format.

One of the key features of OData is that each element carries a datatype, so the data can be consumed by a number of different platforms, including Java, JavaScript, Python, Ruby, PHP, and .NET, without running into type-safety issues or misrepresenting the data in string format. …

This week a number of the people from our team are at the MIX conference in Las Vegas. Yesterday, Mike Clark (the Group Manager for our team) presented a session to kick off the conference on the topic of building offline web applications. In this session Mike presented a look at the work we are doing to enable users to build offline Silverlight applications that support bi-directional synchronization of data between offline Silverlight applications and a central data store such as SQL Azure or SQL Server. [Emphasis added.]

The offline Silverlight support is based on a new set of capabilities we are creating called the "Asymmetric Protocol", which allows us to extend Sync Framework capabilities to virtually any device, such as Windows Mobile, Windows Phone 7 Series, iPhone, and Symbian, even if those devices do not actually have support for the Sync Framework runtime.

Mike went through a number of demonstrations based on the MIX scheduling application built using this new Asymmetric Protocol. He showed how we enable users to take the conference agenda and session information offline on their desktop, Windows Phone 7 Series, and iPhone devices, allowing people to register for sessions and then sync and share this information amongst all of their devices via the central conference database.

To understand, you really need to think about what the PC era has done to information. In effect, the PC revolution started what the Internet has supercharged: information creation. Think about it: more information is now being created in the time it takes me to write this post than was probably created between the time humans first figured out how to write and the birth of the Internet.

But the majority of the information humankind has created has not been accessible. Most of this raw data or knowledge has been sitting in various silos -- be it a library, a single desktop, a server, a database, or even a data center. But recently something changed: the most successful companies of the last decade have discovered how to tap into this raw data. These companies are better at analyzing, mining, and using this mountain of data sitting "out there" -- turning a useless raw resource into something much more useful: information.

Before you say anything: yes, I know I'm not the first to say this. In a 2006 post Michael Palmer wrote "Data is the new oil!", declaring "Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value." …

Ruv continues with further support for the “data is the new oil” theory.

Problem
We know that SQL Azure is the database offering on the Windows Azure cloud computing platform, and it goes without saying that all the technologies that plug into databases need to start exercising against, and adapting to, this flavor of database along with the regular approach to database access. In this tip we learn how to use the SQL Server Reporting Services 2008 R2 Nov CTP (SSRS 2008 R2 hereafter) to connect to SQL Azure.

Solution
Two major providers can be used in SSRS 2008 R2 to connect to SQL Azure: "Microsoft SQL Server" and "OLE DB". Using these providers, a report developer can continue to develop the report in much the same way as against any locally or network-installed database. The one thing to keep in mind is that, when SQL Azure goes RTM (official release), users will be charged for accessing SQL Azure. So initial development and prototyping can be done on a locally installed database, and when the report is developed to a considerable extent, testing and validation can be started against SQL Azure.

Again, this is not the only option; it really depends upon the pricing policy chosen. You may also encounter a scenario in which part of the data is hosted on SQL Azure and part is hosted locally. In this tip we will focus on how to connect to SQL Azure using SSRS 2008 R2; the tip assumes that the reader has some basic working knowledge of SSRS.
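
Whichever provider you pick, the connection details SQL Azure requires are the same: TCP on port 1433, an encrypted connection, and a login of the form user@server. Here is a minimal sketch of those elements from Python via pyodbc; the server, database, table, and credentials are hypothetical, and the same connection-string pieces apply in the SSRS data source dialog.

```python
import pyodbc

# Hypothetical SQL Azure connection -- substitute your own server,
# database, and login. Note the tcp: prefix, port 1433, the user@server
# login form, and Encrypt=yes, all of which SQL Azure expects.
connection_string = (
    "Driver={SQL Server Native Client 10.0};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=ReportData;"
    "Uid=reportuser@myserver;"
    "Pwd=<password>;"  # placeholder
    "Encrypt=yes;"
)
connection = pyodbc.connect(connection_string)
cursor = connection.cursor()
cursor.execute("SELECT TOP 10 * FROM Sales")  # hypothetical table
for row in cursor.fetchall():
    print(row)
```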

Last week we released a Billing Preview that provided a usage summary to help customers prepare for when billing begins in April. Usage statistics in this billing preview are consistent with our new Connection-based billing for the AppFabric Service Bus that we announced in January. A key feature of this model allows customers to purchase Connections individually on a pay-as-you-go basis, or in Connection Packs if they anticipate needing larger quantities.

To provide greater insight into the Billing Preview and the options available to customers, a member of our team created an Excel-based calculator that converts usage numbers into costs. The calculator also allows customers to see how much their usage would have cost with each of the different reserved pack options.

While we hope customers find this tool quite useful, please do note that it is being provided only as a courtesy and is not guaranteed against possible bugs. The calculator also does not guarantee the amount of future bills.
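
To illustrate the kind of comparison the Excel calculator automates, here is a minimal Python sketch; the per-Connection rate and pack prices below are placeholders, not the prices from the January announcement.

```python
# Placeholder rates -- NOT the announced AppFabric prices; substitute the
# published per-Connection and Connection Pack prices.
PAY_AS_YOU_GO_PER_CONNECTION = 4.00        # hypothetical $/Connection/month
CONNECTION_PACKS = {5: 10.00, 25: 50.00}   # hypothetical pack size -> price

def monthly_options(connections_needed):
    """List each billing option and its monthly cost for a usage level."""
    options = [("pay-as-you-go",
                connections_needed * PAY_AS_YOU_GO_PER_CONNECTION)]
    for size, price in sorted(CONNECTION_PACKS.items()):
        if size >= connections_needed:
            options.append(("%d-Connection Pack" % size, price))
    return options

for name, cost in monthly_options(7):
    print("%s: $%.2f" % (name, cost))
```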

The new release contains various fixes and improvements, but above all it contains two brand-new labs about two scenarios you asked for loud and clear: Web Services and Identity in Windows Azure and Developing Identity-Driven Silverlight Applications. …

The AppFabric Labs environment will be used to showcase some early bits and get feedback from the community. Usage for this environment will not be billed.

In this release of the LABS environment, we’re shipping two features:

Silverlight support: we’ve added the ability for Silverlight clients to make cross-domain calls to the Service Bus and Access Control Services.

Multicast with Message Buffers: we’ve added the ability for Message Buffers to attach to a multicast group. A message sent to the multicast group is delivered to every Message Buffer that is attached to it (the semantics are sketched below).
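
The multicast semantics are easy to picture: every buffer attached to a group receives its own copy of each message sent to the group. The following toy Python model sketches the behavior only; it is not the AppFabric API.

```python
from collections import defaultdict, deque

# Toy model of multicast Message Buffer semantics -- generic code, not
# the AppFabric API. Every buffer attached to a group receives its own
# copy of each message sent to that group.
groups = defaultdict(list)  # multicast group name -> attached buffers

def attach(group, buffer):
    groups[group].append(buffer)

def send(group, message):
    for buffer in groups[group]:
        buffer.append(message)  # each attached buffer gets its own copy

buffer_a, buffer_b = deque(), deque()
attach("orders", buffer_a)
attach("orders", buffer_b)
send("orders", "order #42 placed")
print(buffer_a.popleft(), "|", buffer_b.popleft())  # both buffers received it
```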

Download Labs samples from here to learn more about these new features.

To provide feedback and to learn what the community is building using the Labs technology, please visit the Windows Azure platform AppFabric forum and connect with the AppFabric team.

Running AppFabric V1 Samples Against the Labs Environment

To run the AppFabric V1 SDK samples against the Labs environment, you'll have to rename the ServiceBus.config.txt file found at the AppFabric Labs SDK page to ServiceBus.config and place it in your .NET Framework CONFIG directory. The CONFIG directory is located at:

The AppFabric V1 SDK’s Windows Azure sample will not work against the Labs environment. This is because running V1 samples against Labs requires placing the ServiceBus.config file in your .NET Framework CONFIG directory, and Windows Azure doesn’t allow that.

Known Issues

When uploading CrossDomain.XML to the Service Bus root, please leave out the <!DOCTYPE> schema declaration.
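
For reference, a minimal crossdomain.xml without the DOCTYPE declaration might look like the following; the wide-open domain="*" is for illustration only and should be narrowed in practice.

```xml
<?xml version="1.0"?>
<!-- Illustrative only: note the absence of the DOCTYPE declaration,
     per the known issue above. -->
<cross-domain-policy>
  <allow-access-from domain="*" />
</cross-domain-policy>
```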

U-Prove is an innovative cryptographic technology that enables the issuance of claims in a manner that provides multi-party security: issuing organizations, users, and relying parties can protect themselves not just against outsider attacks but also against attacks originating from each other. At the same time, the U-Prove technology enables any desired degree of privacy (including authenticated anonymity and pseudonymity) without contravening multi-party security.

Given these user-centric aspects, it comes as no surprise that we have integrated the technology into the identity metasystem, and in particular, using information cards. Users can now obtain information cards protected by U-Prove and present them 1) with higher privacy guarantees, and 2) without online connectivity to the identity provider when interacting with relying parties. The U-Prove technology helps realize the vision set forth by the laws of identity.

To encourage experimentation and gather feedback on the technology, the following software components are made available as part of the U-Prove CTP:

· Active Directory Federation Services 2.0 (U-Prove CTP): a U-Prove-enabled version of AD FS 2.0 that can issue an information card supporting U-Prove, and that can act both as a U-Prove identity provider (IP-STS) and as a relying party (RP-STS).

· Windows CardSpace 2.0 (U-Prove CTP): a U-Prove-enabled version of Windows CardSpace 2.0 that can obtain, store, and present U-Prove tokens associated with an information card.

Saurabh Pant has a great blog post about deploying RIA Services to your server. This post specifically targets .NET 4, Silverlight 4, and Visual Studio 2010, and even announces some hosting companies that now provide RIA Services RC support.

While I won't rehash all of the deployment details, I wanted to draw attention to Azure deployments. Saurabh points out that RIA Services RC only supports .NET 4. Currently, Azure only supports .NET 3.5. This means the “server” side of your RIA Services app cannot yet be used in Azure (although you can develop it locally and run with the Development Fabric).

I haven’t heard or seen any official statement about .NET 4 support on Azure, but my gut feeling (read: educated guess) is that we’ll see an Azure Virtual Machine upgrade at the same time .NET 4 is RTM, currently slated for April 12. Hopefully this will all be cleared up in the next month.

If you need to deploy a RIA Services application to Azure today, continue working with the RIA Services Beta which was announced at PDC in November. The Beta works with both .NET 3.5 and .NET 4. [Emphasis added.]

We spent some time in our first few weeks of the project getting Enterprise Library 5 (Beta 2) working on Azure. The first thing we did was take the reference implementation (MusicStore) from the Web Client Developer Guidance and add components of Enterprise Library to it. We also wrote up a short Technical Note on our findings. The good news is that almost everything just works. There are some gotchas that have to do with the Azure Platform itself (nothing bad). You can read the paper on our CodePlex site here. We should have the code posted in the next couple of days.

Well... at least be entered into a contest to win one of three Dell Minis.

I really like it when I find interesting projects like this: a contest for Azure apps run by an independent organization not affiliated with Microsoft. Just a great developer community engaging its developers in an interesting way.

The folks at CodeProject have very clear instructions on setting up accounts, developing applications, deploying applications, and then de-provisioning them (hey! with a link to my blog... which is actually how I found out about the contest... I suddenly got a spike in views in Feb that came from CodeProject). So there's another thing I like: unexpected links that drive a bunch of traffic to my blog :)

As I've mentioned before, cloud platforms like Windows Azure are a fantastic choice for Open Government solutions, so for those of you in the Public Sector space, consider this an opportunity to kill two birds with one app.

Pervasive Business Exchange is a hosted service for trading business documents with trading partners. While initially seen as a horizontal solution, Larry and Mike talk about how this could target vertical markets such as healthcare. In addition Mike explains some of the issues which led Pervasive to work with the Microsoft Azure Platform.

Los Angeles-based CitySourced, the developer of a mobile tool for involving citizens in local government, headed by Jason Kiesel and Kurt Daradics, has linked up with Microsoft and its Windows Azure platform, the firms said Tuesday. CitySourced is an angel-funded firm that develops a smart phone application allowing citizens to report potholes, graffiti, and other issues directly to local governments. The firm is using Microsoft Windows Azure for its application infrastructure. [Emphasis added.]

In unrelated news, CitySourced also said it has been nominated for the World Economic Forum Davos Pioneer 2011 awards, an effort to spotlight companies developing technology with long-term impact on business and society. Daradics is well known for his organization of the MOTM and Digital Family Reunion events in the Los Angeles area. CitySourced has received angel funding from Dale Okuno, most recently the founder of E-Z Data.

1. Moving to the Cloud
2. Integrating with the Cloud
3. Leveraging the Cloud

First up is a document which explains the capabilities and limitations of Enterprise Library 5.0 Beta 2 in terms of use within .NET applications designed to run with the Windows Azure platform. You can download it here.

For my MIX10 presentation this year, I cooked up a Windows Azure application that creates zoomable presentations (a bit like prezi or pptPlex) using Silverlight. I’ll blog more when I get a chance about how the application works, but for those of you who are curious about the slides, you can take a look at them now at http://www.onebigslide.com/slides/play/smarx-mix10. (The full recording of my talk should be somewhere on visitmix.com within the next day or so.)

The first build of our samples is now available on CodePlex. This initial version is the “before the cloud” baseline application, so you won’t find anything related to Windows Azure here.

This week we will take this simple baseline and start moving it to the cloud.

Goals for this next iteration are to:

Claims-enable the application to keep the SSO experience for users. We will use WIF for this.

Remove the dependency on AD for querying the user’s manager and cost center. This will be done by sending this information as claims, as opposed to having the application query AD. We want to avoid having to call back into Adatum from a-Expense.

Deploy an identity provider at Adatum (e.g., ADFS). This is the issuer of security tokens for a-Expense. It will be configured to issue the claim set a-Expense needs (e.g., employee, cost center, employee manager).

Move the database to SQL Azure. This is straightforward. We may add some connection retry logic to the data access layer to increase resiliency (see the sketch after this list), but it should “just work”.

Move the Profile storage to Azure Table storage. This database is fairly small and has a simple data model. There’s really no need for full relational support.
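
The connection retry logic mentioned for the SQL Azure move usually amounts to retrying transient failures with a short, growing delay. Here is a minimal sketch, assuming a hypothetical run_query helper rather than the actual a-Expense data access layer.

```python
import time

def with_retries(operation, attempts=3, base_delay=2.0):
    """Run `operation`, retrying transient failures with a growing delay.
    SQL Azure can drop connections under throttling, so data access calls
    get wrapped in something like this."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except ConnectionError:  # stand-in for a transient database error
            if attempt == attempts:
                raise
            time.sleep(base_delay * attempt)

# Hypothetical usage:
#   rows = with_retries(lambda: run_query("SELECT * FROM Expenses"))
```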

Eugenio continues with a detailed “How it works” section:

In a nutshell:

We are trying to keep things as simple as possible.

Jody Gilbert claims “You don’t have to know everything about cloud computing, but a familiarity with the terminology will help you follow the trends and industry developments. This glossary offers a rundown of the terms you’re likely to come across” as an introduction to her Mini-glossary: Cloud computing terms you should know post of 3/16/2010 to TechRepublic’s Servers and Storage blog:

Cloud computing is one of the hottest topics in IT these days, with Microsoft, Google, Amazon, and other big players joining in the fray. However, the technology brings with it new terminology that can be confusing. Here are some common cloud-related terms and their meanings.

Thanks to the Azure Services Platform, each and every architect has an almost infinite amount of storage and compute power at his disposal without any large upfront investments. Together with these major advantages, however, come a lot of design challenges that will change the way we design software.

In this article, I will guide you through this new environment and point out some of these design challenges that the cloud presents to us. I will also propose an architectural style, and some additional guidance, that can be used to overcome many of these challenges. Furthermore I'll give you an overview of the tools offered by the Azure cloud platform that can be used to implement such a system. …

I was surprised to find no mention of the new Reactive Extensions (Rx) for .NET in the discussion.

An honorable mention goes out to the Load balancer – which does the obvious.

Honorable mention? It’s an afterthought that one of the key enabling technologies of cloud computing certainly does not deserve. Shortly after reading the post and debating this point with Paul Richards (the author), I came to the realization that he was looking at cloud computing from the viewpoint of the consumer, i.e., the organization, the customer, an administrator/developer looking for a cloud in which to deploy applications. That made his statement make a lot more sense. If you’re looking at the cloud services offered and trying to decide which one to jump on, then perhaps a load balancer isn’t your primary concern at all (although that makes me want to say, “Inconceivable!”). But from the perspective of the definition of cloud computing and the folks who are implementing (internal/external, public/private) such environments, a load balancer is certainly a lot more than just window dressing.

So I will say that as far as cloud services go, load balancing may be – based solely on consumer need, or perception of need – worthy of only honorable mention. But as far as implementing a cloud computing environment goes, it’s a requirement. …

Lori continues with a discussion titled “LOAD BALANCING is in the CLOUD DNA: FROM CPU to NETWORK to APPLICATION to DATA CENTER.”

As technical director, I’m obviously concerned with technological progress and making sure we invest money only where the ROI justifies it. I’ve been using EC2 since its inception, mostly for making sure I can scale easily without incurring actual expenses on projects that might realize a profit. Basic equation: nothing to write mom about. But after attending Chris Auld’s presentation, I realize I did not get the Cloud right. Coding for high volume and high availability is not enough on the cloud. Partitioning and denormalizing for performance as one does in a non-cloud environment is not only not enough but will bite you big time when your next invoice comes.

Therefore, while the economic aspects of cloud do relate to scaling and managing upfront costs, they also relate to managing the architecture so as to leverage the possibilities of cloud computing and work with the “pay as you go” nature of that beast. Cloud changes the way one thinks of scaling as well as the way one thinks of one's data. For example, denormalization and duplication need to be considered at least as much as the parallel and asynchronous dimensions of the cloud application. Batch jobs, queues, blobs, and CSS sprites are all tools one needs to consider in one's architecture when thinking of cloud. What you store and where you store it is also a consideration. For example, storing binaries in SQL Server can make sense, but it doesn’t in SQL Azure, as we pay by the GB of space.
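
To put rough numbers on the “pay by the GB” point, here is a back-of-the-envelope sketch; both rates are placeholders, not quoted prices.

```python
# Placeholder rates -- substitute the current price list before drawing
# any conclusions.
SQL_AZURE_PER_GB = 10.00    # hypothetical relational storage, $/GB/month
BLOB_STORAGE_PER_GB = 0.15  # hypothetical blob storage, $/GB/month

binaries_gb = 40  # e.g., images and attachments stored as varbinary today
print("In SQL Azure:    $%.2f/month" % (binaries_gb * SQL_AZURE_PER_GB))
print("In blob storage: $%.2f/month" % (binaries_gb * BLOB_STORAGE_PER_GB))
# The gap is why binaries that are acceptable inside an on-premises SQL
# Server usually move out to blob storage in the cloud.
```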

So what I take from yesterday’s workshop is that cloud is about maths: performance will not be an issue as long as you apply sound development methods. Success, though, will come from figuring out the economic aspects of the beast and adjusting the application architecture accordingly. In the end, I think one could feel justified in thinking of Cloud as a game changer. It will not be so much a divide between those who are on the cloud and those who are not, but between those who can leverage the cloud and those who get bitten by it.

McAfee Tuesday announced [at the Cloud Connect conference, #ccevent] a vulnerability-assessment scanning service that's aimed at giving cloud-computing service providers a way to provide security assurances to their customers.

Called the McAfee Cloud Secure Program, the daily scanning service is directed from the Internet into the cloud service provider to probe for any weaknesses in the network infrastructure, perimeter and applications, says Marc Olesen, senior vice president and general manager for McAfee's software-as-a-service business unit.

The McAfee Cloud Secure Program, which is likely to be expanded to include other security services as well, "is intended to give customers more confidence in their cloud providers," Olesen says. SuccessFactors.com is among the first to participate in the McAfee program.

The charges for McAfee Cloud Secure are based on the number of IP addresses, among other factors, and start at less than $5,000 per month, according to Olesen.

Cloud providers are entitled to post the McAfee Cloud Secure mark when they've successfully completed daily vulnerability-assessment scans. The program is modeled to some extent after the McAfee Secure Trustmark program for e-commerce Web sites that undergo security scans.

… Security, and VP WW Engineering, reached out to me, and after a rousing game of calendar alignment, we spoke about the Cloud Security Alliance, its goals, and how it plans to go about achieving those goals. Novell’s products, by the way, have been powerful tools to help organizations achieve high levels of security, and its expertise will certainly help the alliance. …

The task of implementing this security model now falls under the jurisdiction of a professional security administrator, not the developers of each separate application. In fact, no code or configuration needs to change on foo, bar, or any of my services. The security model is decoupled from the application, taken out of the hands of each developer and centralized. This is the basic value proposition of intermediaries in SOA, and this value is never realized effectively if you allow direct connections between clients and servers. This is why architectural patterns are sometimes necessary to allow us to be consistent with our principles—or our anti-principles, as the case may be.
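
That decoupling is easy to sketch: policy enforcement lives in a single component in front of the services, so the services themselves never change. Below is a toy Python sketch with hypothetical service names and a hypothetical token check; it is generic code, not any particular SOA product.

```python
# Toy sketch of the intermediary pattern: security is enforced in one
# place, in front of the services, so "foo" and "bar" stay unchanged.
# Service names and the token check are hypothetical.
SERVICES = {
    "foo": lambda request: "foo handled " + request,
    "bar": lambda request: "bar handled " + request,
}

def intermediary(service_name, request, token):
    if token != "valid-token":  # centralized policy check
        raise PermissionError("rejected by the intermediary")
    return SERVICES[service_name](request)  # forward to the untouched service

print(intermediary("foo", "GET /expenses", "valid-token"))
```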

The February Meetup was cancelled due to the ‘winter storm that wasn’t’, so I’ll be presenting on Windows Azure this Wednesday, March 17th.

The second presentation of the evening will be by Gil Rapaport, Co-founder and President of Viewfinity, which provides a SaaS solution for managing the support and control of desktops, laptops and Windows servers, regardless of worker location.

What is a Windows Azure Boot Camp?

Windows Azure Boot Camp is a two-day deep-dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real-world Azure experience, as well as a series of labs so you can practice what you just learned. ABC is more than just a class; it is also an event in a box. If you don't see a class near you, then throw your own. We provide all of the materials and training you need to host your own class. This can be for your company, your customers, your friends, or even your family. Please let us know so we can give you all of the details.

Awesome. How much does it cost?

Thanks to all of our fantabulous sponsors, this two day training event is FREE! We will provide drinks and snacks, but you will be on your own for lunch on both days. This is a training class after all.

How do I attend one?

Just find a city and date on the schedule that works for you. Click through to see the details for that class, and then register. Keep in mind you will most likely need to bring your own laptop to do the labs.

What do I need to bring?

For most boot camps, you will need to bring your own laptop, and have it preloaded with the software listed here. An extension cord would help as well.

There isn't one near me? Now what?

Throw your own! We will help you put on your own boot camp. Contact us for more details

Extending the private cloud offerings it launched last year, IBM is the latest major player to launch a commercial hosted service.

Like Microsoft's recently released Azure services, the new IBM Cloud, released Tuesday, is targeted at developers and testers. However, like Microsoft, Amazon, and Google, IBM is clearly looking to extend its portfolio of products and services to the public cloud over time.

"You will see IBM continuing to release this set of work-based cloud computing environments," said Daniel Kloud, director of cloud computing in IBM's Rational business group.

"IBM has been talking a good cloud game for the last year or so," notes Forrester Research analyst James Staten in a blog posting. "But its public cloud efforts, outside of application hosting have been a bit of wait and see. Well, the company is clearly getting its act together in the public cloud space with today’s announcement."

While IBM does offer targeted hosted services such as Lotus Live, the company's new IBM Cloud service brings some key components of Big Blue's platform to the commercial cloud such as its WebSphere suite of application servers and its DB2 and Informix databases.

"What IBM is offering customers is not only the infrastructure to put development and test environments in place, but we also provide software images," Kloud said, in an interview.

A customer or partner is presented with a catalog of images or they can have IBM provision their own images, Kloud said. "Basically you can get your development and test teams up and running in a matter of minutes because they avoid basically acquiring the hardware and configuring the system and software," he said.

According to IBM, 50 percent of an organization's IT infrastructure is used for development and test, while 90 percent of it is idle at any given time.

"Certainly any IaaS [infrastructure as a service] can be used for test and development purposes so IBM isn’t breaking new ground here," notes Forrester's Staten. However he says IBM is launching a storing offering with support from test and development partners such as SOASTA, VMLogix, AppFirst and Trinity Software.

IBM's new commercial cloud service currently supports hosting only of Linux systems – the company did not disclose plans for offering Windows Server images other than to say it will be expanding its stack.

The public IBM Cloud infrastructure is built on the Red Hat Enterprise Virtualization (RHEV) stack, which is based on the Kernel-based Virtual Machine (KVM). Red Hat acquired the technology from Qumranet in 2008.

Red Hat called the choice of RHEV over virtualization technology from VMware a coup for its hypervisor stack. "It's a big milestone," said Scott Crenshaw, vice president and general manager of Red Hat's cloud business, in an interview. Crenshaw argued that the key advantage of its RHEV stack released in November is its support for multi-tenant data architectures. "It has a lot of advantages in areas like reliability, scalability and security," he said.

As part of its launch, IBM released Rational Software Delivery Services for Cloud Computing v1.0, which includes components of the company's Rational development and testing suite. IBM is not publishing pricing for its service.

If you don’t publish public prices for cloud services, how does a customer know if his company is receiving “most-favored purchaser” pricing?

There’s no doubt that virtualization, automation, and service-centric architectures lead to cost efficiency and more agile information technology. But there are many ways to deploy clouds: privately, atop on-premises hardware behind enterprise firewalls; publicly, through third-party service providers; or in a hybrid, blended model that leverages the best of both worlds. Which of these is right today? Why, and will this change? Join this panel for a look at the sweet spot of clouds and how utility computing will evolve in coming years.

The Amazon AWS product is all about services. While others are marketing the cloud with an exclamation point, the cloud leader is focused on the raw building blocks. This includes everything from storage to people. Amazon is learning how to find new ways to optimize connections and monetize them in increments of time.

Amazon, the Verb: Motion

When thinking of Amazon as a verb, one word stands out: motion. When Amazon was first introduced as the Internet bookstore, it immediately created a change in the landscape.

It seemed like the writing was on the wall for brick and mortar retail, and to a large degree, it was. In a mere 15 years, it has disrupted the entire book vertical with an end-to-end digital system. Amazon is now in the position to completely automate the flow of content bits from upstream to downstream. …

Enterprises of all shapes and sizes are catching on to the value in moving e-mail and other productivity apps to the cloud where they can be delivered and managed by vendors like Microsoft, Google or Cisco.

The major appeal of cloud is that it saves money, says Ron Markezich, Corporate VP of Microsoft Online Services.

But the benefits of moving apps out of your data center and into a cloud environment such as Microsoft's BPOS (business productivity online suite) go deeper than cost cutting, says Markezich. A cloud environment can speed up workflow simply by allowing workers to access e-mail from any Internet connection. It can get top brass using wikis and blogging to improve communication at a company. And it can take the burden of managing servers off the IT department and free them up to work on more business critical projects.

Over the last year I wrote quite a few posts on the business models around SaaS and cloud computing, including SaaS 2.0, disruptive early stage cloud computing start-ups, and branding on the cloud. This year people have started asking me: well, we have seen PaaS, IaaS, and SaaS, but what do you think are some of the emergent cloud computing business models that are likely to go mainstream in coming years? I spent some time thinking about it and here they are [abbreviated]:

Computing arbitrage …

Gaming-as-a-service …

App-driven and content-driven clouds …

Chirag is Technology, Design, and Innovation Strategist with the Office of the CEO[s] at SAP.

In contrast to Blobs, Azure tables offer structured storage. These “tables”, however, are not to be confused with tables that might exist in the context of a relational database. In fact Azure tables are more like a typical “object database” where each table contains multiple objects, or “entities”. An entity in Azure can be up to 1 MB in size. …

Last November at PDC 09 we announced the Open Data Protocol (OData), providing a way to unlock your data and free it from silos that exist in applications today. OData makes it easy for data to be shared in a manner that follows the philosophy of Open Data and enables the creation of REST-based data services. These services allow resources identified using Uniform Resource Identifiers (URIs) and defined in an abstract data model, to be published and edited by Web clients using simple HTTP messages. OData enables a new level of data integration across a broad range of clients, servers, services, and tools.

This morning during the Keynote at Mix 2010, Doug Purdy announced the re-launch of OData.org and the release of the OData SDK.

The OData SDK brings together a wealth of resources to help developers participate in the OData Eco-system including:

Sample OData online services (northwind, etc) - open a browser and test out an OData Service.

OData client libraries

Windows Phone 7 series

iPhone

AJAX\Javascript

PHP

Java

.NET

Silverlight

Online OData explorer (Source code also available for download from odata.org)

Data Service Provider toolkit: Whitepaper and sample WCF Data Services provider implementation to demonstrate how to create a data service over *any* data source

OData validation tool: A test harness and a few sample validation tests to make it easy to validate your OData endpoint. The harness is designed to be easily extended allowing anyone to easily add new tests.

Also announced today at Mix, there are some new OData services available publicly:

I'm at the Cloud Connect 2010 conference in Santa Clara, Calif., one of the first major gatherings of the year on cloud computing. One of the larger topics that has come up thus far is not using relational databases for data persistence. Called the "NoSQL" movement, it is about leveraging more efficient databases that are perhaps able to handle larger data sets more effectively. I've already written about the "big data" efforts that are emerging around cloud, but this is a more fundamental movement to drive data back to more primitive, but perhaps some more efficient models and physical storage approaches.

NoSQL systems work with data in memory, typically or uploading chunks of data from many disks in parallel. The issue is that "traditional" relational databases don't provide the same models and, thus, the same performance. While this was fine in the days of databases with a few gigabytes of data, many cloud computing databases are blowing past a terabyte, and we'll see huge databases supporting cloud-based systems going forward. Relational databases for operations on large data sets are contraindicated because SQL queries tend to consume many CPU cycles and thrash the disk as they process data.

OData, the Open Data Protocol, is Microsoft’s alternative to Google’s GData. If you’ve heard Microsoft use the codename “Astoria” or talk about ADO.Net Data Services in the past, these two codenames are now under the larger OData umbrella. Microsoft defines OData as a protocol that builds on top of HTTP, Atom Publishing Protocol (AtomPub) and JSON “to provide access to information from a variety of applications, services, and stores.” Microsoft is building OData support into a number of its products, including SharePoint Server 2010, Excel 2010, Dallas, its Dynamics products, and its MediaRoom IPTV offerings.

The OData SDK, announced today, bundles Microsoft’s various OData clients — for Java, PHP, PalmOS, .Net, and (as of today), the iPhone — into a single package. Microsoft officials also said during today’s Mix keynote that Microsoft is open-sourcing the .Net OData client, under an Apache license.

Dallas is a new service built on top of Windows Azure and SQL Azure that will provide users with access to free and paid collections of public and commercial data sets that they can use in developing applications. The datasets are available via Microsoft’s PinPoint partner/ISV site. Microsoft is planning another Dallas CTP in the next couple of months and plans to announce Dallas pricing at the Worldwide Partner Conference in July, officials said. [Emphasis added.]

If you have a SQL Azure database account you can easily expose an OData feed through a simple configuration portal. You can select authenticated or anonymous access and expose different OData views according to permissions granted to the specified SQL Azure database user.

A preview of this upcoming SQL Azure service is available at https://www.sqlazurelabs.com/OData.aspx and it enables you to select one of your existing SQL Azure databases and, with a few clicks, turn it into an OData feed. It looks as though SQL Azure will soon be added to the stable of products that natively support OData, good news indeed.

… Today Microsoft announced the Open Data Protocol [Read more at odata.org] at the MIX Conference. The Open Data Protocol is an extension of the ATOMPub format. The OData Information can be can be represented in ATOM or JSON Format.

One of the key features of OData is that each element contains a datatype so that the data can be consumed by a number of different platforms including Java, javascript, Python, Ruby, PHP, and .NET without running into type safety issues, or misrepresenting the data in a string format. …

This week a number of the people from our team are at the MIX conference in Las Vegas. Yesterday, Mike Clark (the Group Manager for our team) presented a session to kick off the conference on the topic of building offline web applications. In this session Mike presented a look at the work we are doing to enable users to build offline SilverLight applications that enable bi-directional synchronization of data from offline SilverLight applications and a central data store like SQL Azure and SQL Server. [Emphasis added.]

The offline SilverLight support is based on a new set of capabilities we are creating called the "Asymmetric Protocol" that allows us to extend sync framework capabilities to virtually any device such as Windows Mobile, Windows Phone 7 Series, iPhone and Symbian, even if those devices do not actually have support for the Sync Framework runtime.

Mike went through a number of demonstrations that were based on the MIX scheduling application built using this new Asymmetric protocol. He showed how we enable users to take the conference agenda and session information offline in their Desktop, Windows Phone 7 Series device and iPhone devices allowing people to register for sessions and then sync and share this information amongst all of their devices via the central conference database.

To understand you really need to think what the PC era has done to information. In effect the PC revolution started what the Internet has super charged - Information Creation. Think about it, more information is now being creating in the time it takes me to write this post than was probably created between the time humans first figured out how to write up until the birth of the Internet.

But for the most part the majority of the information humankind has created has not been accessible. Most of this raw data or knowledge has been sitting in various silo's -- be it a library, a single desktop, a server, database or even data center. But recently something changed, the most successful companies of the last decade have discover how to tap into this raw data. These companies are better at analyzing, mining and using this mountain of data sitting "out there" -- turning a useless raw resource into something much more useful, Information.

Before you say anything, Yes I know I'm not the first to say this. In a 2006 post Michael Palmer wrote "Data is the new oil!" declaring "Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value." …

Ruv continues with further support for the Data as the new oil theory.

ProblemWe know that SQL Azure is the database offering on the Windows Azure cloud computing platform, and it goes without saying that all the technologies that plug-in to databases need to start exercising and adapting to this flavor of databases along with the regular approach of database access. In this tip we learn how to use SQL Server Reporting Services (SSRS 2008 R2 hereafter) 2008 R2 Nov CTP to connect to SQL Azure.

SolutionTwo major providers can be used in SSRS 2008 R2 to connect to SQL Azure: "Microsoft SQL Server" and "OLE DB". Using these providers, a report developer can continue to develop the report in much the same way as any locally or network installed database. The only thing that one should take care of is that, when SQL Azure goes RTM (official release), users will be charged for accessing SQL Azure. So initial development and prototyping can be done on a locally installed database, and when the report is developed to a considerable extent, testing and validation can be started against SQL Azure.

Again this is not the only option, but it really depends upon the pricing policy opted. It also may fall to a scenario that part of the data is hosted on SQL Azure and part is hosted locally. In this tip we will focus on how to connect to SQL Azure using SSRS 2008 R2, and this tip assumes that the reader has some basic working knowledge of SSRS.

Last week we released a Billing Preview that provided a usage summary to help customers prepare for when billing begins in April. Usage statistics in this billing preview are consistent with our new Connection-based billing for the AppFabric Services Bus that we announced in January. A key feature of this model allows customers to purchase Connections individually on a pay-as-you-go basis, or in Connection Packs if they anticipate needing larger quantities.

To provide greater insight into the Billing Preview and the options available to customers, a member of our team created an Excel-based calculator that converts usage numbers into costs. The calculator also allows customers to see how much their usage would have cost with each of the different reserved pack options.

While we hope customers find this tool quite useful, please do note that it is being provided only as a courtesy and is not guaranteed against possible bugs. The calculator also does not guarantee the amount of future bills.

The new release contains various fixes and improvements, but above all it contains two brand-new labs about two scenarios you asked for loud and clear: Web Services and Identity in Windows Azure and Developing Identity-Driven Silverlight Applications. …

The AppFabric Labs environment will be used showcase some early bits and get feedback from the community. Usage for this environment will not be billed.

In this release of the LABS environment, we’re shipping two features:

Silverlight support: we’ve added the ability for Silverlight clients to make cross-domain calls to the Service Bus and Access Control Services.

Multicast with Message Buffers: we’ve added the ability for Message Buffers to attach to a multicast group. A message send to the multicast group is delivered to every Message Buffer that is attached to it.

Download Labs samples from here to learn more about these new features.

To provide feedback and to learn what the community is building using the Labs technology, please visit the Windows Azure platform AppFabric forum and connect with the AppFabric team.

Running AppFabric V1 Samples Against the Labs Environment

To run the AppFabric V1 SDK samples against the Labs environment, you'll have to rename the ServiceBus.config.txt file found at the AppFabric Labs SDK page to ServiceBus.config and place it in your .NET Framework CONFIG directory. The CONFIG directory is located at:

The AppFabric V1 SDKs Windows Azure sample will not work against the Labs environment. This is because to run V1 samples against Labs you need to place the ServiceBus.config file in your .NET Framework CONFIG directory and Windows Azure doesn't allow that.

Known Issues

When uploading CrossDomain.XML to the Service Bus root, please leave out the <!DOCTYPE> schema declaration.

U-Prove is an innovative cryptographic technology that enables the issuance of claims in a manner that provides multi-party security: issuing organizations, users, and relying parties can protect themselves not just against outsider attacks but also against attacks originating from each other. At the same time, the U-Prove technology enables any desired degree of privacy (including authenticated anonymity and pseudonymity) without contravening multi-party security.

Given these user-centric aspects, it comes as no surprise that we have integrated the technology into the identity metasystem, and in particular, using information cards. Users can now obtain information cards protected by U-Prove and present them 1) with higher privacy guarantees, and 2) without online connectivity to the identity provider when interacting with relying parties. The U-Prove technology helps realize the vision set forth by the laws of identity.

To encourage experimentation and gather feedback on the technology, the following software components are made available as part of the U-Prove CTP:

· Active Directory Federation Services 2.0 (U-Prove CTP): a U-Prove enabled version of AD FS 2.0 that has the ability to issue an information card that supports U-Prove; and that can act both as a U-Prove identity provider (IP-STS) and a relying party (RP-STS).

· Windows CardSpace 2.0 (U-Prove CTP): a U-Prove enabled version of Windows CardSpace 2.0 that has the ability to obtain, store, and present U-Prove tokens associated with an information card.

Saurabh Pant has a great blog post about deploying RIA Services to your server. This post specifically targets .NET 4, Silverlight 4, and Visual Studio 2010, and even announces some hosting companies that now provide RIA Services RC support.

While I won't rehash all of the deployment details, I wanted to draw attention to Azure deployments. Saurabh points out that RIA Services RC only supports .NET 4. Currently, Azure only supports .NET 3.5. This means the “server” side of your RIA Services app cannot yet be used in Azure (although you can develop it locally and run with the Development Fabric).

I haven’t heard or seen any official statement about .NET 4 support on Azure, but my gut feeling (read: educated guess) is that we’ll see an Azure Virtual Machine upgrade at the same time .NET 4 is RTM, currently slated for April 12. Hopefully this will all be cleared up in the next month.

If you need to deploy a RIA Services application to Azure today, continue working with the RIA Services Beta which was announced at PDC in November. The Beta works with both .NET 3.5 and .NET 4. [Emphasis added.]

We spent some time in our first few weeks of the project getting Enterprise Library 5 (Beta 2) working on Azure. The first thing we did is took the reference implementation (MusicStore) from the Web Client Developer Guidance and added components of Enterprise Library to it. We also wrote up a short Technical Note on our findings. The good news is that most everything just works. There are some gotchas that have to do with the Azure Platform itself (nothing bad). You can read the paper on our CodePlex site here. We should have the code to post in the next couple of days.

Well... at least be entered into a contest to win one of three Dell Minis.

I really like it when I find interesting projects like this: a contest for Azure apps that is being done by an independent organization, not affiliated with Microsoft. Just a great developer community engaging their developers in an interesting way.

The folks at CodeProject have very clear instructions on setting up accounts, developing applications, deploying applications, and then de-provisioning them (hey! with a link to my blog... which actually how I found out about the contest... suddenly got a spike in views in Feb that came from CodeProject). So there's another thing I like: unexpected links that drive a bunch of traffic to my blog :)

As I've mentioned before - Cloud platforms, like Windows Azure, are a fantastic choice for Open Government solutions, so for those of you who are in the Public Sector space, consider this an opportunity to kill two birds with one app.

Pervasive Business Exchange is a hosted service for trading business documents with trading partners. While initially seen as a horizontal solution, Larry and Mike talk about how this could target vertical markets such as healthcare. In addition Mike explains some of the issues which led Pervasive to work with the Microsoft Azure Platform.

Los Angeles-based CitySourced, the developer of a mobile tool for helping involve citizens in local government headed by Jason Kiesel and Kurt Daradics, has linked up with Microsoft and its Windows Azure platform, the firms said Tuesday. CitySourced is an angel-funded firm which develops a smart phone application which allows citizens to report potholes, graffiti, and other issues directly to local governments. The firm is using Microsoft Windows Azure for its application infrastructure. [Emphasis added.]

In unrelated news, CitySourced also said it has also been nominated to the World Economic Forum Davos Pioneer 2011 awards, an effort to spotlight companies developing technology with long term impact on business and society. Daradics is well known for his organization of the MOTM and Digital Family Reunion events in the Los Angeles area. CitySourced has received angel funding from Dale Okuno, most recently the founder of E-Z Data.

1. Moving to the Cloud 2. Integrating with the Cloud 3. Leveraging the Cloud

First up is a document which explains the capabilities and limitations of Enterprise Library 5.0 Beta 2 in terms of use within .NET applications designed to run with the Windows Azure platform. You can download it here.

For my MIX10 presentation this year, I cooked up a Windows Azure application that creates zoomable presentations (a bit like prezi or pptPlex) using Silverlight. I’ll blog more when I get a chance about how the application works, but for those of you who are curious about the slides, you can take a look at them now at http://www.onebigslide.com/slides/play/smarx-mix10. (The full recording of my talk should be somewhere on visitmix.com within the next day or so.)

First build of our samples is now available on CodePlex. This initial version is the “before the cloud” baseline application, so you won’t find anything related to Windows Azure here.

This week we will take this simple baseline and start moving it to the cloud.

Goals for this next iteration are to:

Claims-enable the application to keep SSO experience for users. We will use WIF for this.

Remove dependency with AD for querying the user Manager and Cost Center. This will be done by sending this information as claims as opposed to having the application querying AD. We want to avoid having to call back into Adatum from a-Expense.

Deploy an Identity Provider on Adatum (e.g ADFS). This is the issuer of security tokens for a-Expense. It will be configured to issue the claim set a-Expense needs (e.g. employee, Cost Center, employee manager)

Move database to SQL Azure. This is straight forward. We may add some connection retry logic to the data access layer to increase resiliency. But it should “just work”.

Move the Profile storage to Azure Table storage. This database is fairly small and has a simple data model. There’s really no need for full relational support.

Eugenio continues with a detailed “How it works” section:

In a nutshell:

We are trying to keep things as simple as possible.

Jody Gilbert claims “You don’t have to know everything about cloud computing, but a familiarity with the terminology will help you follow the trends and industry developments. This glossary offers a rundown of the terms you’re likely to come across” as an introduction to her Mini-glossary: Cloud computing terms you should know post of 3/16/2010 to TechRepublic’s Servers and Storage blog:

Cloud computing is one of the hottest topics in IT these days, with Microsoft, Google, Amazon, and other big players joining in the fray. However, the technology brings with it new terminology that can be confusing. Here are some common cloud-related terms and their meanings.

Thanks to the Azure Services Platform each and every architect has an almost infinite amount of storage and compute power at his disposal without any large upfront investments. Together with these major advantages however, also come a lot of design challenges that will change the way we design software.

In this article, I will guide you through this new environment and point out some of these design challenges that the cloud presents to us. I will also propose an architectural style, and some additional guidance, that can be used to overcome many of these challenges. Furthermore I'll give you an overview of the tools offered by the Azure cloud platform that can be used to implement such a system. …

I was surprised to find no mention of the new Reactive Extensions (Rx) for .NET in the discussion.

An honorable mention goes out to the Load balancer – which does the obvious.

Honorable mention? It’s an afterthought that certainly one of the key enabling technologies of cloud computing does not deserve. Shortly after reading the post and debating this point with Paul Richards (the author), I came to the realization that he was looking at cloud computing from the viewpoint of the consumer, i.e. the organization, the customer, an administrator/developer looking for a cloud in which to deploy applications. That made his statement make a lot more sense. If you’re looking at the cloud services offered and trying to decide which one to jump on, then perhaps a load balancer isn’t your primary concern at all (although that makes me want to say, “Inconceivable!”). But from the perspective of the definition of cloud computing and the folks who are implementing (internal/external, public/private) such environments, a load balancer is certainly a lot more than just window dressing.

So I will say that as far as cloud services go, load balancing may be – based solely on consumer need, or perception of need – worthy of only honorable mention. But as far as implementing a cloud computing environment goes, it’s a requirement. …

Lori continues with a discussion titled “LOAD BALANCING is in the CLOUD DNA: FROM CPU to NETWORK to APPLICATION to DATA CENTER.”

As technical director, I’m obviously concerned with technological progress and making sure we invest money only where the ROI justifies it. I’ve been using EC2 since its inception, mostly for making sure I can scale easily without incurring actual expenses on projects that might realize a profit. Basic equation: nothing to write mom about. But after attending Chris Auld’s presentation, I realize I did not get the Cloud right. Coding for high volume and high availability is not enough on the cloud. Partitioning and denormalizing for performance as one does in a non-cloud environment is not only not enough but will bite you big time when your next invoice comes.

Therefore, while the economic aspects of the cloud do relate to scaling and managing upfront costs, they also relate to shaping the architecture to leverage the possibilities of cloud computing and to work with the “pay as you go” nature of that beast. The cloud changes the way one thinks of scaling as well as the way one thinks of his data. For example, denormalization and duplication need to be considered at least as much as the parallel and asynchronous dimensions of the cloud application. Batch jobs, queues, blobs and CSS sprites are all tools one needs to consider in his architecture when thinking of the cloud. What you store and where you store it is also a consideration. For example, storing binaries in SQL Server can make sense, but it doesn’t in SQL Azure, where we pay by the GB of space.
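That binaries point translates directly into code: put the bytes in blob storage and keep only the URI in the database. A rough sketch with the Windows Azure StorageClient library (the container name, blob naming, and use of development storage are my assumptions):

```csharp
// Sketch only: store an attachment in blob storage and return the URI
// to persist in SQL Azure instead of the binary itself.
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class AttachmentStore
{
    public string SaveAttachment(string localPath, string blobName)
    {
        // Development storage keeps the sketch self-contained.
        var account = CloudStorageAccount.DevelopmentStorageAccount;
        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        CloudBlobContainer container = blobClient.GetContainerReference("attachments");
        container.CreateIfNotExist();

        CloudBlob blob = container.GetBlobReference(blobName);
        blob.UploadFile(localPath);

        // Persist this URI in the database; fetch the bytes from blob
        // storage on demand rather than paying SQL Azure per GB for them.
        return blob.Uri.AbsoluteUri;
    }
}
```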

So what I take from yesterday’s workshop is that the cloud is about maths: performance will not be an issue as long as you apply sound development methods. Success, though, will come from figuring out the economic aspects of the beast and adjusting the application architecture accordingly. In the end, I think one could feel justified in thinking of the cloud as a game changer. The divide will not be so much between those who are on the cloud and those who are not, but between those who can leverage the cloud and those who get bitten by it.

McAfee Tuesday announced [at the Cloud Connect conference, #ccevent] a vulnerability-assessment scanning service that's aimed at giving cloud-computing service providers a way to provide security assurances to their customers.

Called the McAfee Cloud Secure Program, the daily scanning service is directed from the Internet into the cloud service provider to probe for any weaknesses in the network infrastructure, perimeter and applications, says Marc Olesen, senior vice president and general manager for McAfee's software-as-a-service business unit.

The McAfee Cloud Secure Program, which is likely to be expanded to include other security services as well, "is intended to give customers more confidence in their cloud providers," Olesen says. SuccessFactors.com is among the first to participate in the McAfee program.

The charges for McAfee Cloud Secure are based on the number of IP addresses, among other factors, and start at less than $5,000 per month, according to Olesen.

Cloud providers are entitled to post the McAfee Cloud Secure mark when they've successfully completed daily vulnerability-assessment scans. The program is modeled to some extent after the McAfee Secure Trustmark program for e-commerce Web sites that undergo security scans.

Security, and VP WW Engineering, reached out to me and after a rousing game of calendar alignment, we spoke about the Cloud Security Alliance, its goals and how it plans to go about achieving those goals. Novell’s products, by the way, have been powerful tools to help organizations achieve the goals of high levels of security and its expertise will certainly help the alliance. …

The task of implementing this security model now falls under the jurisdiction of a professional security administrator, not the developers of each separate application. In fact, no code or configuration needs to change on foo, bar, or any of my services. The security model is decoupled from the application, taken out of the hands of each developer and centralized. This is the basic value proposition of intermediaries in SOA, and this value is never realized effectively if you allow direct connections between clients and servers. This is why architectural patterns are sometimes necessary to allow us to be consistent with our principles—or our anti-principles, as the case may be.
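To make the pattern concrete, here’s a toy sketch of such an intermediary: a tiny HTTP proxy that enforces the authorization policy in one place and forwards everything else to the back end untouched. The addresses, the bare header check, and the GET-only forwarding are all simplifying assumptions of mine, not details from the original discussion:

```csharp
// Sketch only: a policy-enforcing intermediary. Clients call the proxy;
// the "foo" and "bar" services behind it remain unchanged.
using System;
using System.IO;
using System.Net;

class PolicyEnforcingIntermediary
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8080/"); // assumed front door
        listener.Start();

        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();

            // Centralized policy: reject requests without credentials.
            // A real intermediary would validate a signed token here.
            if (string.IsNullOrEmpty(ctx.Request.Headers["Authorization"]))
            {
                ctx.Response.StatusCode = 401;
                ctx.Response.Close();
                continue;
            }

            // Forward the (GET) request untouched to the back-end service.
            var backend = (HttpWebRequest)WebRequest.Create(
                "http://localhost:9090" + ctx.Request.RawUrl); // assumed back-end address
            using (var response = (HttpWebResponse)backend.GetResponse())
            using (Stream body = response.GetResponseStream())
            {
                ctx.Response.StatusCode = (int)response.StatusCode;
                var buffer = new byte[4096];
                int read;
                while ((read = body.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ctx.Response.OutputStream.Write(buffer, 0, read);
                }
            }
            ctx.Response.Close();
        }
    }
}
```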

The February Meetup was cancelled due to the ‘winter storm that wasn’t’, so I’ll be presenting on Windows Azure this Wednesday, March 17th.

The second presentation of the evening will be by Gil Rapaport, Co-founder and President of Viewfinity, which provides a SaaS solution for managing the support and control of desktops, laptops and Windows servers, regardless of worker location.

What is a Windows Azure Boot Camp?

Windows Azure Boot Camp (ABC) is a two-day deep-dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real-world experience with Azure, as well as a series of labs so you can practice what you just learned. ABC is more than just a class; it is also an event in a box. If you don't see a class near you, then throw your own. We provide all of the materials and training you need to host your own class. This can be for your company, your customers, your friends, or even your family. Please let us know so we can give you all of the details.

Awesome. How much does it cost?

Thanks to all of our fantabulous sponsors, this two day training event is FREE! We will provide drinks and snacks, but you will be on your own for lunch on both days. This is a training class after all.

How do I attend one?

Just find a city and date on the schedule that works for you. Click through to see the details for that class, and then register. Keep in mind you will most likely need to bring your own laptop to do the labs.

What do I need to bring?

For most boot camps, you will need to bring your own laptop, and have it preloaded with the software listed here. An extension cord would help as well.

There isn't one near me? Now what?

Throw your own! We will help you put on your own boot camp. Contact us for more details.

Extending the private cloud offerings it launched last year, IBM is the latest major player to launch a commercial hosted service.

Like Microsoft's recently released Azure services, the new IBM Cloud, released Tuesday, is targeted at developers and testers. However, like Microsoft, Amazon, and Google, IBM is clearly looking to extend its portfolio of products and services to the public cloud over time.

"You will see IBM continuing to release this set of work-based cloud computing environments," said Daniel Kloud, director of cloud computing in IBM's Rational business group.

"IBM has been talking a good cloud game for the last year or so," notes Forrester Research analyst James Staten in a blog posting. "But its public cloud efforts, outside of application hosting have been a bit of wait and see. Well, the company is clearly getting its act together in the public cloud space with today’s announcement."

While IBM does offer targeted hosted services such as LotusLive, the company's new IBM Cloud service brings some key components of Big Blue's platform to the commercial cloud, such as its WebSphere suite of application servers and its DB2 and Informix databases.

"What IBM is offering customers is not only the infrastructure to put development and test environments in place, but we also provide software images," Kloud said, in an interview.

A customer or partner is presented with a catalog of images or they can have IBM provision their own images, Kloud said. "Basically you can get your development and test teams up and running in a matter of minutes because they avoid basically acquiring the hardware and configuring the system and software," he said.

According to IBM, 50 percent of an organization's IT infrastructure is used for development and test, while 90 percent of it is idle at any given time.

"Certainly any IaaS [infrastructure as a service] can be used for test and development purposes so IBM isn’t breaking new ground here," notes Forrester's Staten. However he says IBM is launching a storing offering with support from test and development partners such as SOASTA, VMLogix, AppFirst and Trinity Software.

IBM's new commercial cloud service currently supports hosting of Linux systems only; the company did not disclose plans for offering Windows Server images other than to say it will be expanding its stack.

The public IBM Cloud infrastructure is based on the Red Hat Enterprise Virtualization (RHEV) stack, which is built on the Kernel-based Virtual Machine (KVM). Red Hat acquired the technology from Qumranet in 2008.

Red Hat called the choice of RHEV over virtualization technology from VMware a coup for its hypervisor stack. "It's a big milestone," said Scott Crenshaw, vice president and general manager of Red Hat's cloud business, in an interview. Crenshaw argued that the key advantage of its RHEV stack released in November is its support for multi-tenant data architectures. "It has a lot of advantages in areas like reliability, scalability and security," he said.

As part of its launch, IBM released Rational Software Delivery Services for Cloud Computing v1.0, which includes components of the company's Rational development and testing suite. IBM is not publishing pricing for its service.

If you don’t publish public prices for cloud services, how does a customer know if his company is receiving “most-favored purchaser” pricing?

There’s no doubt that virtualization, automation, and service-centric architectures lead to cost efficiency and more agile information technology. But there are many ways to deploy clouds: Privately, atop on-premise hardware behind enterprise firewalls; publicly, through third-party service providers; or in a hybrid, blended model that leverages the best of both worlds. Which of these is right today? Why, and will this change? Join this panel for a look at the sweet spot of clouds and how utility computing will evolve in coming years.

The Amazon AWS product is all about services. While others are marketing the cloud with an exclamation point, the cloud leader is focused on the raw building blocks. This includes everything from storage to people. Amazon is learning how to find new ways to optimize connections and monetize them in increments of time.

Amazon, the Verb: Motion

When thinking of Amazon as a verb, one word stands out: motion. When Amazon was first introduced as the Internet bookstore, it immediately changed the landscape.

It seemed like the writing was on the wall for brick and mortar retail, and to a large degree, it was. In a mere 15 years, it has disrupted the entire book vertical with an end-to-end digital system. Amazon is now in the position to completely automate the flow of content bits from upstream to downstream. …

Enterprises of all shapes and sizes are catching on to the value in moving e-mail and other productivity apps to the cloud where they can be delivered and managed by vendors like Microsoft, Google or Cisco.

The major appeal of the cloud is that it saves money, says Ron Markezich, Corporate VP of Microsoft Online Services.

But the benefits of moving apps out of your data center and into a cloud environment such as Microsoft's BPOS (Business Productivity Online Suite) go deeper than cost cutting, says Markezich. A cloud environment can speed up workflow simply by allowing workers to access e-mail from any Internet connection. It can get top brass using wikis and blogging to improve communication at a company. And it can take the burden of managing servers off the IT department, freeing it up to work on more business-critical projects.

Over the last year I wrote quite a few posts on the business models around SaaS and cloud computing, including SaaS 2.0, disruptive early-stage cloud computing start-ups, and branding on the cloud. This year people have started asking me: well, we have seen PaaS, IaaS, and SaaS, but what do you think are some of the emergent cloud computing business models that are likely to go mainstream in coming years? I spent some time thinking about it, and here they are [abbreviated]:

Computing arbitrage …

Gaming-as-a-service …

App-driven and content-driven clouds …

Chirag is a Technology, Design, and Innovation Strategist with the Office of the CEO[s] at SAP.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.