So you have built yourself an Azure worker process to crunch through your work, but how do you keep an eye on things? Facing that question, I investigated and then built the world’s first Vista / Win7 Gadget to monitor your Queue size. The cool part is that this is a pure JavaScript solution! Yes, I managed to set the headers, compute the Hash-based Message Authentication Code (HMAC) using SHA256, and do the base64 encoding. I’ll put together another post to explain all that very soon.

One of the latest features introduced on SQL Azure is the ability to apply firewall settings on your database and allow only specific IP ranges to connect to it. This can be done through SQL Azure Portal or through code using stored procedures.

If you want to take a look at which rules are active on your SQL Azure database, you can use:

select * from sys.firewall_rules

That will give you a view of your firewall rules.

If you want to add a new firewall rule, you can use the "sp_set_firewall_rule" stored procedure (run against the master database). The syntax is "sp_set_firewall_rule <firewall_rule_name>, <ip_range_start>, <ip_range_end>". For example (the rule name and address range here are just placeholders):

exec sp_set_firewall_rule N'OfficeNetwork', '131.107.0.1', '131.107.0.255'

Today in the opening keynote at PDC we announced the availability of SQL Azure Data Sync – November CTP, an early preview open to the public through a demonstration with Kelley Blue Book. For those of you who have been following our blog, you may be asking yourself, what exactly does this include and how does it compare to Project “Huron” that we have been talking about for some time now? In this post I want to give some additional details.

You can think of SQL Azure Data Sync as the first part of our overall Project “Huron” vision, which is to create a Data Hub in the Cloud, or more specifically a place for you to easily consolidate and share all of your information. With SQL Azure Data Sync we have worked to simplify the task of sharing information, whether that is from on-premises SQL Server to the cloud, or from the cloud down to mobile users, retail stores or remote offices. All of this is powered by the Microsoft Sync Framework.

• Katrina Woznicki reports “Ninety-six percent of the 50 patients surveyed left out at least one drug when they were asked to list their medications, and, on average, patients omitted 6.8 medications” in her When Asked, Patients Can't Tell article of 12/10/2009 for MedPageToday:

Hospitalized patients were often clueless when asked about their medications, with almost all of them unable to name all their medications and many leaving out as many as a half-dozen drugs they had been prescribed, according to a small survey of patients in a Colorado hospital.

Moreover, 44% of the patients thought they were taking a medication that had not been prescribed.

The researchers conducted the patient survey as part of a larger project examining a potential role for patients in reducing medication errors and improving patient safety.

"This study is a first for raising the questions 'How involved should patients be in their hospital medication safety?' and 'How do you involve them?'" Cumbler told MedPage Today.

"We don't live in a perfect healthcare system and errors do occur. If you have a patient who wants to be involved in their medication safety, you have to let him or her know what they're taking and to let them be an active participant."

Among scheduled medications, patients commonly omitted several important therapeutics, including antibiotics, cardiovascular drugs, and antithrombotics. …

This story points out the importance to patient safety of easily-accessible, cloud-based personal health records (PHRs).

Dorado Corporation has released its predictions for major trends and developments likely to shape residential mortgage lending and mortgage technology in 2010. They are:

Software-as-a-service (SaaS) adoption will reach critical mass in mortgage originator use, with more than 30% of all originations in North America occurring in the cloud.

Loan volume will fall below 2009 levels, primarily as a result of reduced FHA originations and an expectation that refinancing levels will soon peak. However, the drop will be mitigated by several factors: continued low interest rates for at least the first half of the year, employment stabilization, the continued high supply of affordable homes, and proactive lender marketing driven by advances in lead capture technology and new entrants such as Google into the lead generation space.

Regulatory compliance will become a competitive advantage, as those lenders that adopt turnkey approaches to updating their systems and processes take market share away from those institutions mired in human error, penalties, operational disruptions, and customer complaints.

New integrations and interoperability capabilities will provide banks and their borrowers alike with unprecedented choice in external services, driving the cost of these services downwards.

Requirements for lenders to maintain a greater financial interest in the loans sent to the secondary markets will further drive a trend towards clean, error-free loans and increased transparency in the creation and handling of loan file information.

The demarcation between originators and servicers will continue to blur, related to the increased exposure of lenders to the pools that they feed, and an increased level of pre-close analytics and other safeguards instituted in the origination process to minimize risk.

The mid-tier group of banks from 2009 will begin to differentiate themselves, with a handful of aggressive, volume-focused lenders using technology and new ways of doing business to capture market share. The result is a breakout group of “super regionals” that will force former mid-level players further down the food chain.

Chilmark Research sees this as a very savvy acquisition that will further extend the capabilities, and thus market opportunities, for Microsoft in the healthcare sector. For example, in the hot market for Health Information Exchanges (HIE), managing security access across multiple entities within a given region is challenging; the Sentillion suite of security solutions will slot into this market need quite readily. For Sentillion, this is also a good move, as it provides them the backing, resources and distribution channel to truly take their solution suite global far faster than if they attempted to do it organically.

• Joseph Goedart adds his analysis of the Sentillion acquisition in a Microsoft to Buy Sentillion post of 12/10/2009 to the Health Data Management blog:

Microsoft Corp. will acquire Sentillion Inc., an Andover, Mass.-based vendor of context management and single-sign-on software, for an undisclosed sum. The vendors already are partners. Redmond, Washington-based Microsoft in mid-2009 signed a license to use Sentillion's software as a module with the Amalga Unified Intelligence System. Amalga is advanced data integration and aggregation software. Sentillion's applications enable users to access and simultaneously view patient data from multiple information systems during a single session.

Sentillion will continue to sell and support its products while Microsoft invests in the long-term evolution of the combined product suite, according to the companies. Sentillion will continue to operate out of its Andover headquarters. The companies expect the acquisition to close in early 2010.

Microsoft claims “The European Environment Agency's Eye on Earth site lets citizens track important environmental data. The Windows Azure-powered portal is being showcased at this week's United Nations Climate Change Conference in Copenhagen” in a Microsoft Helps Europeans Keep an 'Eye on Earth' press release of 12/9/2009:

… The new vantage point comes compliments of Eye on Earth, a joint project between Microsoft and the European Environment Agency (EEA) that is being shown at the United Nations’ 15th Climate Change Conference in Copenhagen (COP15), which kicked off Dec. 7. One of the first applications built on the Microsoft Windows Azure cloud-computing platform, the Eye on Earth portal provides real-time environmental information to the 500 million people who live in the EEA's 32 member countries. It serves up that data in a visual format via Bing Maps. [Emphasis added.]

Keeping an Eye on Earth: At the Eye on Earth home page, visitors can click on links to sensors that show water or air quality at many locations across Europe.

… As much as Chilmark would love to say that the CDE is a great idea that will be met with broad adoption and use, we just don’t see that playing out for several reasons:

1) Google Health & HealthVault. It wasn’t that long ago that both Google Health and HealthVault were nothing but rumors. At that time the PHA/PHR market was in a funk: numerous apps were being developed, but few gained much traction and the path to market was convoluted. The concept of a CDE was a welcome one and demonstrated foresight on the part of those leading Project HealthDesign. But PHA development is now coalescing around the major commercial platform plays of Google and Microsoft, and a third platform (CDE) with little market visibility will wither.

2) Marketing. RWJF is a great source of funding to push the envelope on what might be with regard to PHAs, but its promotion and marketing of the results of that funding are lackluster at best. They simply do not have the gravitas in the market, they do not get the ink, and subsequently few even know of the CDE, let alone have accessed the code (according to RWJF, since June ‘09 there have been 38 downloads altogether of either the compiled or source code for the CDE; hardly a stampede by developers of PHAs).

3) Smartphones. The advent of the iPhone with its thousands of medical, health & wellness apps, the more recent introduction of Google’s Android mobile OS, and the onslaught from virtually every other mobile OS vendor to have its own “AppStore” have completely changed the equation of what consumers will ultimately use as the input device for their online health information. From collecting ODLs to granting access to personal health information (PHI) on the fly, the smartphone will become the modality of choice within the next 5 years. This is where the market for consumer-facing healthcare apps is headed and, likewise, where developers will be focusing the majority of their attention and, of course, limited resources. …

The Microsoft Azure platform, when it makes its commercial debut in February, will be late to the cloud computing party compared to Amazon, Salesforce.com and Google. But it won't be too late, according to developers and solution providers weighing their own move into cloud computing applications.

In fact, some think that Microsoft's tardiness to market may pay off as it has in the past. The company was famously late to graphical user interfaces, Internet browsers, spreadsheets and word processors and went on to dominate all of those categories through sheer perseverance.

Even some Microsoft-centric developers and partners that had blasted the lack of commercially-hosted services from Microsoft just a few months ago, now say that Azure may hit the Web just as a critical mass of customers are finally ready to trust at least some of their information technology processes or data to a cloud of some type. …

At the Professional Developers Conference last month, Ray Ozzie and Bob Muglia made a series of announcements around Microsoft’s cloud platform that moved it from something of interest to developers to something that I think will catch the attention of CIOs too. A few things of note:

App Fabric & Project Sydney – bringing cloud and on premises together

System Center “cloud” on Muglia’s slides

VM support in Windows Azure

Dallas – putting public data sets in the cloud with a marketplace and open APIs

We’re now clearly in the IaaS, PaaS and SaaS world, as shown on this slide from Bob’s presentation.

What does this all mean? It means that Microsoft now has a full deck of cards on the table around cloud computing - though there is more to come with things like Office Web Apps - and we’re now showing customers that if they prefer to deploy IT capability in the cloud, be it infrastructure, apps they build or services we sell (like Exchange), they can choose to do that. If they’d like to continue on premises, they can choose to do that… and if they’d like a hybrid, they can choose to do that. Choice… not something other vendors are really offering. They tend to be all cloud, or all on premises. Of course this runs the risk of being confusing for customers, but the reality is, I think it’s the hybrid approach that many customers will take, putting some infrastructure in the cloud or “bursting” to the cloud when they need it during peak times whilst keeping some things like HR systems on premises for the moment. For a CIO and an IT director, that choice gives them plenty of ways to save money or spend in a different way – i.e. on demand vs. up front. It also gives them a way to do things like proofs of concept or short-lived projects in a matter of moments, as they can dial servers up and down as they need. The elasticity of the cloud is a godsend for CIOs, CFOs and IT folks.

Fortunately, it’s not just me who thinks this as there have been a number of posts on the web over the last week observing that Azure is ripe for business adoption. CIO magazine talked with Crispin Porter + Bogusky who have been using Azure in anger already and Chevron is eyeing it up.

Steve goes on to quote Gartner analyst Ray Valdes, as well as Network World and CRN writers about Azure’s appeal to businesses.

FederalNewsRadio has learned budget passback language also calls for alternative analyses for major IT projects in 2012.

The Office of Management and Budget will require agencies to develop an alternative analysis discussing how they could use cloud computing for all major technology projects for the fiscal 2012 budget.

Agencies will be expected to tell OMB why they wouldn't use cloud computing for these initiatives, according to the 2011 budget passback language obtained by FederalNewsRadio.

And in 2013, agencies must give OMB a complete alternatives analysis of how they could move to cloud computing for mixed life-cycle projects, where agencies are spending both new money (known as development, modernization and enhancement funding) and steady-state (operations and maintenance) funding, the budget instructions say.

This language is on top of the plan by OMB to require agencies to launch a series of cloud computing pilots across the government in 2010 using the E-Government Fund. In the Financial Services bill, the House's version includes $33 million for the General Services Administration, which manages the fund for OMB. The Senate's version includes $35 million.

Congress has yet to finalize the bill, but OMB will receive more money for e-government than ever before. …

I'm off to Seoul, South Korea next week, but before I leave I wanted to give you a little holiday gift: yes, the gift of my prognostication. Before I do, as anyone who routinely reads my blog will understand, pretty much all I do is attempt to predict the future. As an entrepreneur, that has always been a key part of my successes & failures. (That, and I also seem to be an eternal optimist.) Generally my view of the future is not shaped by selecting any particular point in time but instead done from what I see from my ever-changing vantage point in the present. …

Ruv’s predictions fall into these categories:

Anytime Data - Real Time, Anytime and Anywhere

Emergent Clouds

Technological Convergence

Darryl K. Taft reports “Microsoft’s "Software + Services" strategy – delivering bits to customers in a variety of ways, from on-premises software to full-blown public/private cloud services -- is nowhere more evident than in the company's approach to the federal sector” in his Microsoft Takes Windows Azure to the Feds post to eWeek’s Cloud Computing News segment:

… Perhaps it is because of federal rules, regulations, procurement policies and whatnot, but in the federal sector, Microsoft's strategy that many have criticized as blurry if not hyped, becomes as clear as a cloudless day.

At a FedScoop Cloud Computing Shoot Out here on Dec. 8, Susie Adams, Microsoft's chief technology officer for the federal sector, and Yousef Khalidi, a Microsoft distinguished engineer and member of the founding team that created the core of Microsoft's Windows Azure cloud platform, helped deliver some of that clarity.

In back-to-back conversations with eWEEK Adams and Khalidi laid out the Microsoft plan to deliver software across the "full spectrum," from on-premise IT to the cloud -- including private cloud-like environments -- and to take the lessons learned in doing so and parlay those back into the product and services lines.

"We're taking our learnings from the process of building Azure and putting that back into the Windows Server product and our other technology," Khalidi said.

Moreover, "We fundamentally don't believe that private clouds are going to go away," Adams said, noting that certain agencies with certain information and workloads will never want to see that stuff in a public cloud environment.

Although Windows Azure is a public cloud technology, "We have dedicated offerings -- dedicated clouds [if you will]," Adams said. This consists of a dedicated network pipe, compute services that Microsoft runs and the customer "manages who gets access and we run it for them." …

Analysts, bloggers and mainstream media have spent 2009 promoting cloud computing as “the next big thing” that will revolutionize the way companies buy and use computing power. But beyond the hype and the C-level interest in an exciting trend, there’s value to the cloud that appeals to the pragmatic, “show me” nature of enterprise IT.

The two main drivers for cloud computing are the same ones that have always motivated enterprise IT: save money (do more with less) and be more responsive to business needs. These goals are typically in conflict with each other, so that in tough times the first takes precedence and in boom times the second one does. …

In a move that starts the countdown to Microsoft's Jan. 1 launch of its Windows Azure cloud services platform, Microsoft has shifted the product from a development group headed by chief software architect Ray Ozzie to a commercial unit under server boss Bob Muglia.

The company also announced it will partner with NetApp on the development of some cloud technologies. …

Microsoft is betting big on so-called cloud, or hosted, computing. The company has invested billions developing Azure and opening data centers from which to deliver services. Azure provides cloud-based OS, development, and storage services that will offer enterprise customers off-premises computing.

Microsoft also plans to offer cloud systems that business customers can run in their own data centers. In keeping with that, the company on Wednesday announced a three-year partnership with storage and virtualization specialist NetApp.

Under the arrangement, the two companies will collaborate on product development, integration, and marketing of products and services for in-house cloud environments. In particular, the vendors will work to integrate NetApp's storage system with Microsoft's Windows Server 2008 R2 server OS and Hyper-V virtualization technology. …

… The move makes sense, as the company's "software plus services" strategy requires consistency in the management and execution capabilities of both Windows Server and Windows Azure. Microsoft has been working on both Azure and private cloud capabilities for some time now, though its Web site currently pitches its Dynamic Data Center Toolkit as a "foundation" for both private and partner cloud services.

It should be noted that this move means that CTO Ray Ozzie is no longer heading the Azure team, a signal that Azure has graduated from a technical project to a full-fledged Microsoft business. …

A group of companies is starting up an Enterprise Cloud Buyers Council in hopes of removing barriers to enterprise use of hosted cloud computing.

Initial members include companies that offer hosted cloud computing as well as enterprises that use such services, including Microsoft, IBM, HP, Cisco, AT&T, BT, EMC, Deutsche Bank, Alcatel-Lucent, Amdocs, CA, Nokia Siemens Networks, Telecom Italia and Telstra. Two industry organizations, Distributed Management Task Force and the IT Service Management Forum, are also involved. The TM Forum, an industry association that helps information and communications companies create profitable services, came up with the idea of the council.

One important issue that the council will try to address is the current fear among enterprises of vendor lock-in, said Gary Bruce, a principal researcher at BT. The council may decide to work on standards-based solutions around various layers of cloud computing, including the virtualization, management and control layers, so that enterprises can more easily port their projects from one cloud computing vendor to another, he said.

In addition, enterprises are often concerned about security and reliability, he said.

I find it hard to believe that Microsoft wants to avoid vendor lock-in to the Windows Azure Platform and SQL Azure.

While conducting research for the long overdue and nearly completed report on Personal Health Clouds (Dossia, Google Health and HealthVault), I came across a recently published report by the European Network and Information Security Agency (ENISA) addressing cloud computing security. Though quite long (over 120 pages), the report provides a very comprehensive overview of cloud computing, its benefits and risks, and some very good risk assessment tools to assist in evaluating a cloud solution offering, including segmentation by SaaS, IaaS and PaaS.

“In the world of SOA, simply put, governance means designing, building, testing, and implementing policies for services monitoring and their use. Governance as related to services, or service governance, is most applicable to the use of cloud computing, since we are basically defining our architecture as a set of services that are relocatable between on-premise and cloud computing-based services.”

The question is, where are the vendors on this? Most SOA governance solutions, up to this point, have focused on Web services. Now it appears some vendors are extending the concept of service governance to address cloud-based services.

For example, this week, AmberPoint, best known for its SOA management platform, and SOA Software, which has been in the governance game for a few years, both announced new governance offerings, and both point to the clouds. These offerings extend their reach to REST-based services and beyond, both vendors say.

Click here for more information on AmberPoint’s application and SOA governance offerings.

One of the most important barriers to getting population-health data is the concern that PHI privacy could be violated. After all, health information is very personal and sensitive (perhaps, one could argue, even more than personal banking information), and HIPAA Privacy Laws govern the protection, privacy and security of such information.

In order that data extracted from EHRs can be used for such public health purposes, it would need to be de-identified. But is true de-identification possible? This has been the subject of numerous blog articles, and it has been argued that with just a few pieces of data, re-identification can be achieved.

As noted in legal reviews, the HIPAA Privacy Rule permits covered entities to release data that has been de-identified without obtaining an authorization and without further restrictions upon use or disclosure, because de-identified data is not PHI, and therefore not subject to the Privacy Rule.

• Steve Riley will present two cloud-related sessions on Thursday, 12/10/2009 from 6:00 PM to 9:00 PM EST to the New York IT Security User Group (NYITSUG) at the AXA Financial Building, 1290 6th Avenue (nee Avenue of the Americas), New York, NY 10104 (map):

Fear the cloud no more: Suddenly, it seems, the simple network diagram symbol for the Internet has become a major component for providing infrastructure platforms and service offerings. Unlike the application service provider days of the late 1990s, cloud computing is here to stay. It’s already gained much traction for specialty computing purposes, yet many IT shops remain wary. Moving compute and storage out of your own data center and into someone else’s, mingled among many others, seems daunting at first. Common questions arise around security, manageability, performance, and reliability. Think about it, though: these are the same concerns you’ve always had. Nothing about the cloud requires that you jettison everything you’ve learned during your career. The cloud is a logical next step in the evolution of computing, and when integrated with corporate IT it removes much of the burden and allows a business to concentrate on its core functions.

Security and compliance in the cloud: Moving to the cloud raises lots of questions, mostly about security. Providers worthy of your business should answer them clearly and honestly. Amazon Web Services has built an infrastructure and established processes to mitigate common vulnerabilities and offer a safe compute and storage environment.

Steve is a former Microsoft Security guru and now is a Senior Technical Program Manager at Amazon.com.

Are you uncertain about EHR protection requirements introduced by ARRA and the HITECH Act? You’re not alone. In this web seminar, Forrester Senior Analyst Andrew Jaquith will help you navigate the new guidelines for data privacy and breach disclosure, and recommend strategies for protecting data and reducing risk. Also, technology experts from Intel and Lenovo will discuss how you can ensure maximum protection for mobile computers – the most common source of electronic health records (EHR) data breaches. Discussion topics will include:

Security obligations under the HITECH Act, HIPAA and state laws

Proving compliance in the exchange and storage of electronic records

Anti-theft technology for minimizing the risks associated with mobile computers

Services for multi-layered security offered by PC providers

If your organization is implementing EHR, register now for this unique chance for expert analysis of data privacy issues.

If you are managing PCs, laptops or mobile devices, cloud-based business intelligence services can make your job much easier, not to mention helping you improve data security and reduce management costs.

In this 45-minute webcast, our featured guest, Chris Silva of independent research firm Forrester Research, Inc., will discuss why PC managers and "Mobile Operations Managers" need to increase their ability to monitor what happens on endpoints. He will also present research results on how operations managers are coping with their most pressing challenges.

Jonathan Dale of MaaS360 will then present case studies showing how a cloud-based business intelligence tool helped two enterprises:

Detect unknown security vulnerabilities.

Identify why some systems were not ready for software upgrades.

Find risky software packages on systems in remote locations.

Produce compliance reports that saved weeks of data compilation.

Prove that lost or stolen laptops were fully encrypted.

After 45 minutes you will understand how the information provided by "endpoint intelligence" can simplify your job and make you more effective. You will learn how to take the next step by using cloud-based tools to enforce policies and perform remediation. Finally, you will receive information about a simple trial that can be used to assess the value of visibility into endpoints in your own environment.

Brandon Sanford of Waggener Edstrom announced Microsoft MIX10 registration now open in a 12/9/2009 e-mail:

Today Microsoft announced that registration is now open for MIX10, as well as the event’s keynote line-up. MIX10 will be held March 15 - 17, 2010 at the Mandalay Bay in Las Vegas.

Keynoters include Bill Buxton, Microsoft Principal Researcher and author of Sketching User Experiences, and Scott Guthrie, corporate vice president of Microsoft’s .NET Developer Division. The first sessions and workshops were also disclosed, covering topics including design/user experience (UX), mobile, rich Internet applications (RIAs) and web standards. Many of this year’s MIX sessions will be selected via online voting. An open call for session content is now live at http://live.visitmix.com/opencall.

The past few years have seen dramatic increases in the size and efficiency of the world’s largest data centers, hosted at providers like Amazon, Microsoft, and Google. As the industry builds out for the coming age of cloud computing, we are being forced to rethink old problems and learn new lessons. What we are learning about the unique economics of cloud computing will be impacting the industry for years to come. Come hear the latest insights about cloud computing economics and how it will impact you.

Prices are the same for Windows Server 2003 and 2008 versions. There’s no surcharge for SQL Server 2008 Express but SQL Server 2008 Standard Edition runs $1.08 per hour, which computes to $777.60 per month. If you can live with a 10GB maximum database size, SQL Azure’s Business Edition at $99.95 per month is a comparative bargain. David Robinson of Microsoft’s SQL Azure Team promised at PDC 2009 larger database size limits in future versions.
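The arithmetic behind that comparison is easy to check, assuming a 720-hour (30-day) billing month:

```javascript
// Back-of-the-envelope check of the hourly-vs-monthly comparison above.
// Assumes a 720-hour (30-day) month; rates are the quoted 2009 prices.
const HOURS_PER_MONTH = 24 * 30; // 720

function monthlyCost(hourlyRate) {
  // Return the monthly cost as a dollars-and-cents string.
  return (hourlyRate * HOURS_PER_MONTH).toFixed(2);
}

console.log('SQL Server 2008 Standard on EC2-style hourly billing: $' +
  monthlyCost(1.08) + '/month');
console.log('SQL Azure Business Edition (10GB cap): $99.95/month flat');
```

At $1.08/hour the hourly model works out to $777.60/month, which is why the flat $99.95 Business Edition looks like a bargain for databases that fit in 10GB.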

The outage occurred on Dec. 9, 2009 beginning at approximately 3:34 a.m. EST and lasted approximately 44 minutes.

During that time, access to systems in Amazon’s northern Virginia data center was unavailable to businesses.

Apparent Networks’ Cloud Performance Center, a free service that offers performance data on leading cloud computing service providers such as Amazon, Google and GoGrid, detected the outage.

The Cloud Computing Performance Center utilizes Apparent Networks’ PathView Cloud service to test the performance of cloud service providers. The service has been configured to sample path performance to a series of pre-determined targets hosted at Amazon’s data centers every 120 seconds.

For the past several years, many people have claimed that cloud computing can reduce a company's costs, improve cash flow, reduce risks, and maximize revenue opportunities. Until now, prospective customers have had to do a lot of leg work to compare the costs of a flexible solution based on cloud computing to a more traditional static model. Doing a genuine "apples to apples" comparison turns out to be complex — it is easy to neglect internal costs which are hidden away as "overhead".

We want to make sure that anyone evaluating the economics of AWS has the tools and information needed to do an accurate and thorough job. To that end, today we released a pair of white papers and an Amazon EC2 Cost Comparison Calculator spreadsheet as part of our brand new AWS Economics Center. This center will contain the resources that developers and financial decision makers need in order to make an informed choice. We have had many in-depth conversations with CIO's, IT Directors, and other IT staff, and most of them have told us that their infrastructure costs are structured in a unique way and difficult to understand. Performing a truly accurate analysis will still require deep, thoughtful analysis of an enterprise's costs, but we hope that the resources and tools below will provide a good springboard for that investigation.
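To make the “hidden overhead” point concrete, here is a toy comparison of the kind the calculator automates. Every figure below is an invented placeholder, not an actual AWS or hardware price:

```javascript
// Toy cost comparison: amortized on-premises server vs. pay-per-hour
// cloud instance. All numbers are invented placeholders.

function onPremMonthly({ serverCost, amortMonths, powerAndSpace, adminOverhead }) {
  // The last two terms are the internal "overhead" costs that a naive
  // comparison tends to leave out.
  return serverCost / amortMonths + powerAndSpace + adminOverhead;
}

function cloudMonthly({ hourlyRate, hoursUsed }) {
  // Elastic usage: pay only for the hours actually consumed.
  return hourlyRate * hoursUsed;
}

const onPrem = onPremMonthly({
  serverCost: 3600,   // purchase price, amortized over 36 months
  amortMonths: 36,
  powerAndSpace: 40,  // per month
  adminOverhead: 120, // slice of sysadmin time, per month
});
const cloud = cloudMonthly({ hourlyRate: 0.12, hoursUsed: 400 });

console.log('on-premises: $' + onPrem + '/month'); // $260/month
console.log('cloud:       $' + cloud + '/month');  // $48/month
```

The interesting part is rarely the raw server price; it is that the amortization, power, space, and admin terms are easy to forget, which is exactly the gap the white papers and calculator aim to close.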

If you hear an increasingly loud rumbling noise, you'll probably find it's either: that chicken pesto sandwich you had for lunch; or the sound of telecom operators scrambling to position themselves as cloud services pioneers.

The latest operator to make a noise about its hosted services offerings is BT Group plc (NYSE: BT; London: BTA), though this isn't the British incumbent's first foray into the world of so-called cloud services. (See BT, Microsoft Get Cloudy .)

Today, the operator is taking its next big step down the hosted applications road. In cahoots with long-time partner Cisco Systems Inc. (Nasdaq: CSCO), BT has unveiled a "global hosted IP telephony service" that "allows businesses to bring converged voice, mobile and data services to every desktop in their organisation, using BT and Cisco’s cloud computing-based technologies."

The Cisco technology in question is the Hosted Unified Communications Services (HUCS) platform. The IP giant, though, has plenty of other ideas about how it can help cloud services reign. [Ed. note: Geddit?] (See Cisco Plays in the Clouds.) …

Major users to form Enterprise Cloud Buyers Council (ECBC) as the core driver

Cloud service providers and technology suppliers to join with users to collaborate on a comprehensive program for accelerating commercial availability of managed and secure cloud services

ORLANDO, FL, USA - December 8, 2009 - TM Forum, the world's premier industry group focused on business effectiveness for the communications and media sectors, today announced the formation of an ecosystem of major industry players in the emerging cloud services sector. The centerpiece of this effort is the creation of the Enterprise Cloud Buyers Council (ECBC) whose goal is to understand the needs of the largest global cloud buyers and ensure any impediments to the uptake of cloud technology are removed. Together with key service and technology suppliers, the ecosystem will initiate a range of programs designed to remove barriers to the growth of commercial cloud services. …

TM Forum is an industry association dedicated to helping companies in the information, communications and entertainment industries reduce the costs and risks associated with creating and delivering profitable services. The Forum's initiatives focus on providing industry research, publications, technology roadmaps, best practices, software standards, certified training courses and conferences to its more than 700 member companies in 75 countries. Membership includes the world's largest service providers, cable and network operators, software suppliers, equipment suppliers and systems integrators. To learn more, please visit www.tmforum.org.

• Katrina Woznicki reports “Ninety-six percent of the 50 patients surveyed left out at least one drug when they were asked to list their medications, and, on average, patients omitted 6.8 medications” in her When Asked, Patients Can't Tell article of 12/10/2009 for MedPageToday:

Hospitalized patients were often clueless when asked about their medications, with almost all of them unable to name all their medications and many leaving out as many as a half-dozen drugs they had been prescribed, according to a small survey of patients in a Colorado hospital.

Moreover, 44% of the patients thought they were taking a medication that had not been prescribed.

The researchers conducted the patient survey as part of a larger project examining a potential role for patients in reducing medication errors and improving patient safety.

"This study is a first for raising the questions 'How involved should patients be in their hospital medication safety?' and 'How do you involve them?'" Cumbler told MedPage Today.

"We don't live in a perfect healthcare system and errors do occur. If you have a patient who wants to be involved in their medication safety, you have to let him or her know what they're taking and to let them be an active participant."

Among scheduled medications, patients commonly omitted several important therapeutics, including antibiotics, cardiovascular drugs, and antithrombotics. …

This story points out the importance to patient safety of easily accessible, cloud-based personal health records (PHRs).

Dorado Corporation has released its predictions for major trends and developments likely to shape residential mortgage lending and mortgage technology in 2010. They are:

Software-as-a-service (SaaS) adoption will reach critical mass in mortgage originator use, with more than 30% of all originations in North America occurring in the cloud.

Loan volume will fall below 2009 levels, primarily as a result of reduced FHA originations and an expectation that refinancing levels will soon peak. However, the drop will be mitigated by several factors: continued low interest rates for at least the first half of the year, employment stabilization, the continued high supply of affordable homes, and proactive lender marketing driven by advances in lead capture technology and new entrants such as Google into the lead generation space.

Regulatory compliance will become a competitive advantage, as those lenders that adopt turnkey approaches to updating their systems and processes take market share away from those institutions mired in human error, penalties, operational disruptions, and customer complaints.

New integrations and interoperability capabilities will provide banks and their borrowers alike with unprecedented choice in external services, driving the cost of these services downwards.

Requirements for lenders to maintain a greater financial interest in the loans sent to the secondary markets will further drive a trend towards clean, error-free loans and increased transparency in the creation and handling of loan file information.

The demarcation between originators and servicers will continue to blur, related to the increased exposure of lenders to the pools that they feed, and an increased level of pre-close analytics and other safeguards instituted in the origination process to minimize risk.

The mid-tier group of banks from 2009 will begin to differentiate themselves, with a handful of aggressive, volume-focused lenders using technology and new ways of doing business to capture market share. The result is a breakout group of “super regionals” that will force former mid-level players further down the food chain.

Chilmark Research sees this as a very savvy acquisition that will further extend Microsoft's capabilities, and thus its market opportunities, in the healthcare sector. For example, in the hot market for Health Information Exchanges (HIEs), managing security access across multiple entities within a given region is challenging; the Sentillion suite of security solutions will slot into this market need quite readily. For Sentillion, this is also a good move, as it provides the backing, resources and distribution channel to take their solution suite global far faster than if they had attempted to do it organically.

• Joseph Goedart adds his analysis of the Sentillion acquisition in a Microsoft to Buy Sentillion post of 12/10/2009 to the Health Data Management blog:

Microsoft Corp. will acquire Sentillion Inc., an Andover, Mass.-based vendor of context management and single-sign-on software, for an undisclosed sum. The vendors already are partners. Redmond, Washington-based Microsoft in mid-2009 signed a license to use Sentillion's software as a module with the Amalga Unified Intelligence System. Amalga is advanced data integration and aggregation software. Sentillion's applications enable users to access and simultaneously view patient data from multiple information systems during a single session.

Sentillion will continue to sell and support its products while Microsoft invests in the long-term evolution of the combined product suite, according to the companies. Sentillion will continue to operate out of its Andover headquarters. The companies expect the acquisition to close in early 2010.

Microsoft claims “The European Environment Agency's Eye on Earth site lets citizens track important environmental data. The Windows Azure-powered portal is being showcased at this week's United Nations Climate Change Conference in Copenhagen” in a Microsoft Helps Europeans Keep an 'Eye on Earth' press release of 12/9/2009:

… The new vantage point comes compliments of Eye on Earth, a joint project between Microsoft and the European Environment Agency (EEA) that is being shown at the United Nations' 15th Climate Change Conference in Copenhagen (COP15), which kicked off Dec. 7. One of the first applications built on the Microsoft Windows Azure cloud-computing platform, the Eye on Earth portal provides real-time environmental information to the 500 million people who live in the EEA's 32 member countries. It serves up that data in a visual format via Bing Maps. [Emphasis added.]

Keeping an Eye on Earth: At the Eye on Earth home page, visitors can click on links to sensors that show water or air quality at many locations across Europe.

… As much as Chilmark would love to say that the CDE is a great idea that will be met with broad adoption and use, we just don’t see that playing out for several reasons:

1) Google Health & HealthVault. It wasn't that long ago that both Google Health and HealthVault were nothing but rumors. At that time the PHA/PHR market was in a funk, with numerous apps being developed but few gaining much traction, and the path to market was convoluted. The concept of a CDE was a welcome one and demonstrated foresight on the part of those leading Project HealthDesign. But PHA development is now coalescing around the major commercial platform plays of Google and Microsoft, and a third platform (CDE) that has little market visibility will wither.

2) Marketing. RWJF is a great source of funding to push the envelope on what might be with regard to PHAs, but getting beyond the academic funding exercise to the promotion and marketing of the results of that funding is lackluster at best. They simply do not have the gravitas in the market, they do not get the ink, and consequently few even know of the CDE, let alone have accessed the code (according to RWJF, since June '09 there have been 38 downloads altogether of either the compiled or source code for the CDE – hardly a stampede by developers of PHAs).

3) Smartphones. The advent of the iPhone with its thousands of medical, health & wellness apps, the more recent introduction of Google's Android mobile OS, and the onslaught from virtually every other mobile OS vendor to have its own "AppStore" have completely changed the equation of what consumers will ultimately use as the input device for their online health information. From collecting ODLs to granting access to personal health information (PHI) on the fly, the smartphone will become the modality of choice within the next 5 years. This is where the market for consumer-facing healthcare apps is headed and, likewise, where developers will be focusing the majority of their attention and, of course, limited resources. …

The Microsoft Azure platform, when it makes its commercial debut in February, will be late to the cloud computing party compared to Amazon, Salesforce.com and Google. But it won't be too late, according to developers and solution providers weighing their own move into cloud computing applications.

In fact, some think that Microsoft's tardiness to market may pay off as it has in the past. The company was famously late to graphical user interfaces, Internet browsers, spreadsheets and word processors and went on to dominate all of those categories through sheer perseverance.

Even some Microsoft-centric developers and partners that had blasted the lack of commercially hosted services from Microsoft just a few months ago now say that Azure may hit the Web just as a critical mass of customers is finally ready to trust at least some of their information technology processes or data to a cloud of some type. …

At the Professional Developers Conference last month, Ray Ozzie and Bob Muglia made a series of announcements around Microsoft's cloud platform that moved us from something of interest to developers to something that I think will catch the attention of CIOs too. A few things of note:

App Fabric & Project Sydney – bringing cloud and on premises together

System Center “cloud” on Muglia’s slides

VM support in Windows Azure

Dallas – putting public data sets in the cloud with a marketplace and open APIs

We're now clearly in the IaaS, PaaS and SaaS world, as shown on this slide from Bob's presentation.

What does this all mean? It means that Microsoft now has a full deck of cards on the table around cloud computing - though there is more to come with things like Office Web Apps - and we're now showing customers that if they prefer to deploy IT capability in the cloud, be it infrastructure, apps they build or services we sell (like Exchange), they can choose to do that. If they'd like to continue on premises, they can choose to do that, and if they'd like a hybrid they can choose to do that. Choice: not something other vendors are really offering. They tend to be all cloud, or all on premises. Of course this runs the risk of being confusing for customers, but the reality is, I think it's the hybrid approach that many customers will take, putting some infrastructure in the cloud or "bursting" to the cloud when they need it during peak times whilst keeping some things like HR systems on premises for the moment. For a CIO and an IT director, that choice gives them plenty of ways to save money or spend in a different way - i.e. on demand vs. up front. It also gives them a way to do things like proofs of concept or short-lived projects in a matter of moments, as they can dial servers up and down as needed. The elasticity of the cloud is a godsend for CIOs, CFOs and IT folks.
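The on-demand versus up-front point is easy to make concrete. A toy model with hypothetical rates and demand, not anyone's actual pricing:

```python
def peak_provisioned_cost(peak_servers, hourly_rate, hours):
    """Fixed capacity sized for the peak, paid for around the clock."""
    return peak_servers * hourly_rate * hours

def elastic_cost(hourly_demand, hourly_rate):
    """Pay only for the servers actually needed in each hour."""
    return sum(servers * hourly_rate for servers in hourly_demand)

# Hypothetical day: 20 servers needed for 4 peak hours, 4 otherwise.
demand = [20] * 4 + [4] * 20
fixed = peak_provisioned_cost(20, 0.12, 24)
burst = elastic_cost(demand, 0.12)
```

The spikier the load, the wider the gap between the two, which is exactly the "bursting" case described above.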

Fortunately, it’s not just me who thinks this as there have been a number of posts on the web over the last week observing that Azure is ripe for business adoption. CIO magazine talked with Crispin Porter + Bogusky who have been using Azure in anger already and Chevron is eyeing it up.

Steve goes on to quote Gartner analyst Ray Valdes, as well as Network World and CRN writers about Azure’s appeal to businesses.

FederalNewsRadio has learned budget passback language also calls for alternative analyses for major IT projects in 2012.

The Office of Management and Budget will require agencies to develop an alternative analysis discussing how they could use cloud computing for all major technology projects for the fiscal 2012 budget.

Agencies will be expected to tell OMB why they wouldn't use cloud computing for these initiatives, according to the 2011 budget passback language obtained by FederalNewsRadio.

And in 2013, agencies must give OMB a complete alternatives analysis for mixed life-cycle projects - those where agencies are spending both new money (known as development, modernization and enhancement) and steady-state, or operations and maintenance, funding - covering how they could move to cloud computing, the budget instructions say.

This language is on top of the plan by OMB to require agencies to launch a series of cloud computing pilots across the government in 2010 using the E-Government Fund. In the Financial Services bill, the House's version includes $33 million for the General Services Administration, which manages the fund for OMB. The Senate's version includes $35 million.

Congress has yet to finalize the bill, but OMB will receive more money for e-government than ever before. …

I'm off to Seoul, South Korea next week, but before I leave I wanted to give you a little holiday gift: yes, the gift of my prognostication. Before I do, as anyone who routinely reads my blog will understand, pretty much all I do is attempt to predict the future. As an entrepreneur, that has always been a key part of my successes & failures. (That, and I also seem to be an eternal optimist.) Generally my view of the future is not shaped by selecting any particular point in time, but instead from what I see from my ever-changing vantage point in the present. …

Ruv’s predictions fall into these categories:

Anytime Data - Real Time, Anytime and Anywhere

Emergent Clouds

Technological Convergence

Darryl K. Taft reports “Microsoft’s "Software + Services" strategy – delivering bits to customers in a variety of ways, from on-premises software to full-blown public/private cloud services -- is nowhere more evident than in the company's approach to the federal sector” in his Microsoft Takes Windows Azure to the Feds post to eWeek’s Cloud Computing News segment:

… Perhaps it is because of federal rules, regulations, procurement policies and whatnot, but in the federal sector, Microsoft's strategy, which many have criticized as blurry if not hyped, becomes as clear as a cloudless day.

At a FedScoop Cloud Computing Shoot Out here on Dec. 8, Susie Adams, Microsoft's chief technology officer for the federal sector, and Yousef Khalidi, a Microsoft distinguished engineer and member of the founding team that created the core of Microsoft's Windows Azure cloud platform, helped deliver some of that clarity.

In back-to-back conversations with eWEEK Adams and Khalidi laid out the Microsoft plan to deliver software across the "full spectrum," from on-premise IT to the cloud -- including private cloud-like environments -- and to take the lessons learned in doing so and parlay those back into the product and services lines.

"We're taking our learnings from the process of building Azure and putting that back into the Windows Server product and our other technology," Khalidi said.

Moreover, "We fundamentally don't believe that private clouds are going to go away," Adams said, noting that certain agencies with certain information and workloads will never want to see that stuff in a public cloud environment.

Although Windows Azure is a public cloud technology, "We have dedicated offerings -- dedicated clouds [if you will]," Adams said. This consists of a dedicated network pipe, compute services that Microsoft runs and the customer "manages who gets access and we run it for them." …

Analysts, bloggers and mainstream media have spent 2009 promoting cloud computing as “the next big thing” that will revolutionize the way companies buy and use computing power. But beyond the hype and the C-level interest in an exciting trend, there’s value to the cloud that appeals to the pragmatic, “show me” nature of enterprise IT.

The two main drivers for cloud computing are the same ones that have always motivated enterprise IT: save money (do more with less) and be more responsive to business needs. These goals are typically in conflict with each other, so that in tough times the first takes precedence and in boom times the second one does. …

In a move that starts the countdown to Microsoft's Jan. 1 launch of its Windows Azure cloud services platform, Microsoft has shifted the product from a development group headed by chief software architect Ray Ozzie to a commercial unit under server boss Bob Muglia.

The company also announced it will partner with NetApp on the development of some cloud technologies. …

Microsoft is betting big on so-called cloud, or hosted, computing. The company has invested billions developing Azure and opening data centers from which to deliver services. Azure provides cloud-based OS, development, and storage services that will offer enterprise customers off-premises computing.

Microsoft also plans to offer cloud systems that business customers can run in their own data centers. In keeping with that, the company on Wednesday announced a three-year partnership with storage and virtualization specialist NetApp.

Under the arrangement, the two companies will collaborate on product development, integration, and marketing of products and services for in-house cloud environments. In particular, the vendors will work to integrate NetApp's storage system with Microsoft's Windows Server 2008 R2 server OS and Hyper-V virtualization technology. …

… The move makes sense, as the company's "software plus services" strategy requires consistency in the management and execution capabilities of both Windows Server and Windows Azure. Microsoft has been working on both Azure and private cloud capabilities for some time now, though its Web site currently pitches its Dynamic Data Center Toolkit as a "foundation" for both private and partner cloud services.

It should be noted that this move means that CTO Ray Ozzie is no longer heading the Azure team, a signal that Azure has graduated from a technical project to a full-fledged Microsoft business. …

A group of companies is starting up an Enterprise Cloud Buyers Council in hopes of removing barriers to enterprise use of hosted cloud computing.

Initial members include companies that offer hosted cloud computing as well as enterprises that use such services, including Microsoft, IBM, HP, Cisco, AT&T, BT, EMC, Deutsche Bank, Alcatel-Lucent, Amdocs, CA, Nokia Siemens Networks, Telecom Italia and Telstra. Two industry organizations, Distributed Management Task Force and the IT Service Management Forum, are also involved. The TM Forum, an industry association that helps information and communications companies create profitable services, came up with the idea of the council.

One important issue that the council will try to address is the current fear among enterprises of vendor lock-in, said Gary Bruce, a principal researcher at BT. The council may decide to work on standards-based solutions around various layers of cloud computing, including the virtualization, management and control layers, so that enterprises can more easily port their projects from one cloud computing vendor to another, he said.
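Pending such standards, a common mitigation is to code the application against a thin, vendor-neutral interface so that changing providers means swapping one adapter. A minimal sketch (the interface and class names here are invented for illustration, not an ECBC artifact):

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Vendor-neutral storage interface the application codes against."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in backend; a real deployment would wrap S3, Azure Blob, etc."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

def archive_report(store: BlobStore, name: str, body: bytes) -> None:
    # Application logic depends only on the interface, so swapping
    # cloud vendors means swapping the adapter, not the application.
    store.put(f"reports/{name}", body)
```

The virtualization, management and control layers the council wants standardized play the same role one level down the stack.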

In addition, enterprises are often concerned about security and reliability, he said.

I find it hard to believe that Microsoft wants to avoid vendor lock-in to the Windows Azure Platform and SQL Azure.

While conducting research for the long-overdue and nearly completed report on Personal Health Clouds (Dossia, Google Health and HealthVault), I came across a recently published report by the European Network and Information Security Agency (ENISA) addressing cloud computing security. Though quite long (over 120 pages), the report provides a very comprehensive overview of cloud computing, its benefits and risks, and some very good risk assessment tools to assist in evaluating a cloud solution offering, including segmentation by SaaS, IaaS and PaaS.

“In the world of SOA, simply put, governance means designing, building, testing, and implementing policies for services monitoring and their use. Governance as related to services, or service governance, is most applicable to the use of cloud computing, since we are basically defining our architecture as a set of services that are relocatable between on-premise and cloud computing-based services.”

The question is, where are the vendors on this? Most SOA governance solutions, up to this point, have focused on Web services. Now it appears some vendors are extending the concept of service governance to address cloud-based services.

For example, this week, AmberPoint, best known for its SOA management platform, and SOA Software, which has been in the governance game for a few years, both announced new governance offerings, and both point to the clouds. These offerings extend their reach to REST-based services and beyond, both vendors say.

Click here for more information on AmberPoint’s application and SOA governance offerings.

One of the most important barriers to getting population-health data is the concern that PHI privacy could be violated. After all, health information is very personal and sensitive (perhaps, one could argue, even more than personal banking information), and HIPAA Privacy Laws govern the protection, privacy and security of such information.

In order that data extracted from EHRs can be used for such public health purposes, it would need to be de-identified. But is true de-identification possible? This has been the subject of numerous blog articles, and it has been argued that with just a few pieces of data, re-identification can be achieved.
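In code, de-identification amounts to dropping direct identifiers and generalizing quasi-identifiers. The sketch below is illustrative only; the field names are hypothetical and it covers just a few of the categories the Privacy Rule's Safe Harbor method enumerates:

```python
# Direct identifiers to drop outright (a small subset of HIPAA's
# 18 Safe Harbor categories; field names are hypothetical).
DIRECT_IDENTIFIERS = {"name", "ssn", "phone", "email", "address", "mrn"}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and generalize quasi-identifiers."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "zip" in clean:   # keep only the 3-digit ZIP prefix
        clean["zip"] = str(clean["zip"])[:3] + "XX"
    if "age" in clean:   # bucket exact ages into 10-year bands
        decade = (clean["age"] // 10) * 10
        clean["age"] = f"{decade}-{decade + 9}"
    return clean
```

Even the generalized output can sometimes be re-identified by linking the remaining quasi-identifiers with outside data sets, which is the re-identification concern the paragraph above raises.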

As noted in legal reviews, the HIPAA Privacy Rule permits covered entities to release data that has been de-identified without obtaining an authorization and without further restrictions upon use or disclosure, because de-identified data is not PHI, and therefore not subject to the Privacy Rule.

• Steve Riley will present two cloud-related sessions on Thursday, 12/10/2009 from 6:00 PM to 9:00 PM EST to the New York IT Security User Group (NYITSUG) at the AXA Financial Building, 1290 6th Avenue (nee Avenue of the Americas), New York, NY 10104 (map):

Fear the cloud no more: Suddenly, it seems, the simple network diagram symbol for the Internet has become a major component for providing infrastructure platforms and service offerings. Unlike the application service provider days of the late 1990s, cloud computing is here to stay. It's already gained much traction for specialty computing purposes, yet many IT shops remain wary. Moving compute and storage out of your own data center and into someone else's, mingled among many others, seems daunting at first. Common questions arise around security, manageability, performance, and reliability. Think about it, though: these are the same concerns you've always had. Nothing about the cloud requires that you jettison everything you've learned during your career. The cloud is a logical next step in the evolution of computing and, when integrated with corporate IT, removes much of the burden and allows a business to concentrate on its core functions.

Security and compliance in the cloud: Moving to the cloud raises lots of questions, mostly about security. Providers worthy of your business should answer them clearly and honestly. Amazon Web Services has built an infrastructure and established processes to mitigate common vulnerabilities and offer a safe compute and storage environment.

Steve is a former Microsoft Security guru and now is a Senior Technical Program Manager at Amazon.com.

Are you uncertain about EHR protection requirements introduced by ARRA and the HITECH Act? You’re not alone. In this web seminar, Forrester Senior Analyst Andrew Jaquith will help you navigate the new guidelines for data privacy and breach disclosure, and recommend strategies for protecting data and reducing risk. Also, technology experts from Intel and Lenovo will discuss how you can ensure maximum protection for mobile computers – the most common source of electronic health records (EHR) data breaches. Discussion topics will include:

Security obligations under the HITECH Act, HIPAA and state laws

Proving compliance in the exchange and storage of electronic records

Anti-theft technology for minimizing the risks associated with mobile computers

Services for multi-layered security offered by PC providers

If your organization is implementing EHR, register now for this unique chance for expert analysis of data privacy issues.

If you are managing PCs, laptops or mobile devices, cloud-based business intelligence services can make your job much easier, not to mention helping you improve data security and reduce management costs.

In this 45-minute webcast, our featured guest, Chris Silva of independent research firm, Forrester Research, Inc., will discuss why PC managers and "Mobile Operations Managers" need to increase their ability to monitor what happens on endpoints. He will also present research results on how operations managers are coping with their most pressing challenges.

Jonathan Dale of MaaS360 will then present case studies showing how a cloud-based business intelligence tool helped two enterprises:

Detect unknown security vulnerabilities.

Identify why some systems were not ready for software upgrades.

Find risky software packages on systems in remote locations.

Produce compliance reports that saved weeks of data compilation.

Prove that lost or stolen laptops were fully encrypted.

After 45 minutes you will understand how the information provided by "endpoint intelligence" can simplify your job and make you more effective. You will learn how to take the next step by using cloud-based tools to enforce policies and perform remediation. Finally, you will receive information about a simple trial that can be used to assess the value of visibility into endpoints in your own environment.

Brandon Sanford of Waggener Edstrom announced Microsoft MIX10 registration now open in a 12/9/2009 e-mail:

Today Microsoft announced that registration is now open for MIX10, as well as the event’s keynote line-up. MIX10 will be held March 15 - 17, 2010 at the Mandalay Bay in Las Vegas.

Keynoters include Bill Buxton, Microsoft Principal Researcher and author of Sketching User Experiences, and Scott Guthrie, corporate vice president of Microsoft's .NET Developer Division. The first sessions and workshops were also disclosed, covering topics including design/user experience (UX), mobile, rich Internet applications (RIAs) and web standards. Many of this year's MIX sessions will be selected via online voting. An open call for session content is now live at http://live.visitmix.com/opencall.

The past few years have seen dramatic increases in the size and efficiency of the world’s largest data centers, hosted at providers like Amazon, Microsoft, and Google. As the industry builds out for the coming age of cloud computing, we are being forced to rethink old problems and learn new lessons. What we are learning about the unique economics of cloud computing will be impacting the industry for years to come. Come hear the latest insights about cloud computing economics and how it will impact you.

Prices are the same for Windows Server 2003 and 2008 versions. There's no surcharge for SQL Server 2008 Express, but SQL Server 2008 Standard Edition runs $1.08 per hour, which computes to $777.60 per month. If you can live with a 10GB maximum database size, SQL Azure's Business Edition at $99.95 per month is a comparative bargain. At PDC 2009, David Robinson of Microsoft's SQL Azure Team promised larger database size limits in future versions.
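The arithmetic behind those figures, assuming the 30-day month the quoted monthly total implies:

```python
hourly_rate = 1.08                      # SQL Server 2008 Standard, per hour
monthly = hourly_rate * 24 * 30         # 24 h/day * 30 days = $777.60/month
sql_azure_business = 99.95              # SQL Azure Business Edition (10 GB cap)
savings = monthly - sql_azure_business  # roughly $677.65/month if 10 GB suffices
```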

The outage occurred on Dec. 9, 2009 beginning at approximately 3:34 a.m. EST and lasted approximately 44 minutes.

During that time, access to systems in Amazon’s northern Virginia data center was unavailable to businesses.

Apparent Networks’ Cloud Performance Center, a free service that offers performance data on leading cloud computing service providers such as Amazon, Google and GoGrid, detected the outage.

The Cloud Performance Center utilizes Apparent Networks’ PathView Cloud service to test the performance of cloud service providers. The service has been configured to sample path performance to a series of pre-determined targets hosted at Amazon’s data centers every 120 seconds.
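PathView Cloud’s internals aren’t public, but the fixed-interval sampling described above is straightforward to sketch. The target list and `measure_latency` helper below are purely illustrative placeholders, not Apparent Networks’ API:

```python
import time

# Hypothetical sketch of fixed-interval path sampling: probe a set of
# pre-determined targets every 120 seconds, as the article describes.
SAMPLE_INTERVAL_S = 120
TARGETS = ["target-a.example.com", "target-b.example.com"]  # illustrative

def measure_latency(host: str) -> float:
    """Placeholder for a real network probe (e.g. a TCP round-trip timing)."""
    return 0.0

def sample_once() -> dict:
    """Probe every configured target once; return latencies keyed by host."""
    return {host: measure_latency(host) for host in TARGETS}

def run(samples: int) -> list:
    """Collect the given number of samples, pausing between rounds."""
    results = []
    for i in range(samples):
        results.append(sample_once())
        if i < samples - 1:            # no sleep after the final sample
            time.sleep(SAMPLE_INTERVAL_S)
    return results
```

A real monitor would detect an outage like the one reported by flagging consecutive rounds in which probes to a target time out.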

For the past several years, many people have claimed that cloud computing can reduce a company's costs, improve cash flow, reduce risks, and maximize revenue opportunities. Until now, prospective customers have had to do a lot of leg work to compare the costs of a flexible solution based on cloud computing to a more traditional static model. Doing a genuine "apples to apples" comparison turns out to be complex — it is easy to neglect internal costs which are hidden away as "overhead".

We want to make sure that anyone evaluating the economics of AWS has the tools and information needed to do an accurate and thorough job. To that end, today we released a pair of white papers and an Amazon EC2 Cost Comparison Calculator spreadsheet as part of our brand new AWS Economics Center. This center will contain the resources that developers and financial decision makers need in order to make an informed choice. We have had many in-depth conversations with CIOs, IT Directors, and other IT staff, and most of them have told us that their infrastructure costs are structured in a unique way and difficult to understand. Performing a truly accurate analysis will still require deep, thoughtful analysis of an enterprise's costs, but we hope that the resources and tools below will provide a good springboard for that investigation.

If you hear an increasingly loud rumbling noise, you'll probably find it's either: that chicken pesto sandwich you had for lunch; or the sound of telecom operators scrambling to position themselves as cloud services pioneers.

The latest operator to make a noise about its hosted services offerings is BT Group plc (NYSE: BT; London: BTA), though this isn't the British incumbent's first foray into the world of so-called cloud services. (See BT, Microsoft Get Cloudy .)

Today, the operator is taking its next big step down the hosted applications road. In cahoots with long-time partner Cisco Systems Inc. (Nasdaq: CSCO), BT has unveiled a "global hosted IP telephony service" that "allows businesses to bring converged voice, mobile and data services to every desktop in their organisation, using BT and Cisco’s cloud computing-based technologies."

The Cisco technology in question is the Hosted Unified Communications Services (HUCS) platform. The IP giant, though, has plenty of other ideas about how it can help cloud services reign. [Ed. note: Geddit?] (See Cisco Plays in the Clouds.) …

Major users to form Enterprise Cloud Buyers Council (ECBC) as the core driver

Cloud service providers and technology suppliers to join with users to collaborate on a comprehensive program for accelerating commercial availability of managed and secure cloud services

ORLANDO, FL, USA - December 8, 2009 - TM Forum, the world's premier industry group focused on business effectiveness for the communications and media sectors, today announced the formation of an ecosystem of major industry players in the emerging cloud services sector. The centerpiece of this effort is the creation of the Enterprise Cloud Buyers Council (ECBC) whose goal is to understand the needs of the largest global cloud buyers and ensure any impediments to the uptake of cloud technology are removed. Together with key service and technology suppliers, the ecosystem will initiate a range of programs designed to remove barriers to the growth of commercial cloud services. …

TM Forum is an industry association dedicated to helping companies in the information, communications and entertainment industries reduce the costs and risks associated with creating and delivering profitable services. The Forum's initiatives focus on providing industry research, publications, technology roadmaps, best practices, software standards, certified training courses and conferences to its more than 700 member companies in 75 countries. Membership includes the world's largest service providers, cable and network operators, software suppliers, equipment suppliers and systems integrators. To learn more, please visit www.tmforum.org.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.