One of the best parts about working with SQL Azure is that you can use the tools you already know and love. There are some differences, though, in the System Views, System Stored Procedures, System Tables, and SMO that break some of those tools. The current version of SSMS 2008 can connect to SQL Azure, but Object Explorer doesn’t work. There is also a trick to getting connected.

and Zach provides the workaround: Click Cancel when the Connect to Server dialog appears at SSMS startup, then click the New Query button to reopen it. Dismiss the spurious error message and connect to SQL Azure.

David Robinson’s So… What do you think? post of 8/23/2009 repeats his earlier request for feedback about SQL Azure:

One of the most important pieces of information a product team can get is an understanding of what people think about their product, and how they are using it. My question for you is…What do you think? How has your experience been thus far? We would love to know – the good and the bad.

Commenters say SQL Azure is overpriced at $100/month for a 10-GB Business Edition database, complain about the 10-GB maximum database size, and ask “when we will be able to use the Visual Studio Server Manager and/or SQL Server Management Studio with the SQL Azure database?”

You can use SSMS with SQL Azure to create and drop databases and execute T-SQL commands and scripts. What doesn’t work yet is Object Explorer, but the team says that should be fixed by PDC 09.
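For readers scripting against SQL Azure outside SSMS, here is a minimal Python sketch of the ODBC-style connection string the CTP expects; the `user@server` login form is the detail that trips most people up. The server, database, and credential names are purely hypothetical placeholders.

```python
def sql_azure_connection_string(server: str, database: str,
                                user: str, password: str) -> str:
    """Build an ODBC connection string for a SQL Azure CTP database.

    SQL Azure requires the login in user@server form, and encryption
    should be requested since traffic crosses the public Internet.
    Substitute your own *.database.windows.net server name.
    """
    return (
        f"Driver={{SQL Server Native Client 10.0}};"
        f"Server=tcp:{server}.database.windows.net;"
        f"Database={database};"
        f"Uid={user}@{server};"
        f"Pwd={password};"
        f"Encrypt=yes;"
    )

# Hypothetical values for illustration only:
conn_str = sql_azure_connection_string("myserver", "mydb", "admin_user", "s3cret")
```

A library such as pyodbc could then consume this string; the sketch stops at string construction so it stands on its own.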

After a long pause, the Id Element returns with a bang: yesterday I sneaked into Jorgen Thelin’s office, and he was so kind as to spend some time with us discussing the Microsoft Federation Gateway. The MFG is absolutely central to Microsoft’s services offerings: this interview will help you understand what scenarios it enables and how to take advantage of it. Below is the caption:

Jorgen Thelin, Senior Program Manager, looks after key identity services in Microsoft such as Windows Live ID and the Microsoft Federation Gateway (MFG).

In today's interview Jorgen describes the role of MFG and touches on the many wonders it enables: using AD accounts for single sign-on (SSO) access to Microsoft Business Online Services such as Dynamics CRM, allowing the 550 million owners of a Live ID account to gain access to your federated applications developed with Windows Identity Foundation, and much more.

Jorgen also takes the chance to tell the story of the Microsoft Services Connector (MSC), from its inception to the decision of consolidating its functionalities in Active Directory Federation Services 2.0 (see the Microsoft Online Service Federation Utility preview).

Finally, Jorgen gives us a taste of the future of MFG: non-AD directories, SAML2.0 protocol and the new scenarios that those exciting features will enable.

Want to know why I spent one hour every day of my vacation practicing touch typing? Well, apart from the fact that it’s simply scandalous that after 20+ years spent on keyboards I still hunt & peck: in the next few months I’m going to need all the typing speed I can gather… I am signed up to write (or otherwise actively participate in) three books about identity:

Well, now you know what I will do during the 37+ hours of flights to and from TechEd AU/NZ… or rather, for as long as the two tablet batteries last. Now you see why I am trying to learn how to touch-type: hunt & peck in economy class means shoving your elbows into the ribs of your neighbors, and that’s veeery bad practice ;-)

••Infragistics’ August 2009 Technical Newsletter describes Aqua, a healthcare CRM reference application for a hospital emergency department available for download from CodePlex. Infragistics says it “demonstrate[s] best practices in building WPF experiences using technologies such as Windows Azure, SQL Data Services, LiveID and Infragistics UI Controls.”

Infragistics doesn’t call Aqua an Electronic Medical Record (EMR) emulation, but it offers a demonstration of how you can combine Windows Presentation Foundation (WPF), Windows Azure, and Infragistics’ third-party controls to create the UI for an EMR or Personal Health Record (PHR) implementation. Here’s a capture of the physician’s starting page (click images for full-size captures):

The version currently available from CodePlex (under an MS-PL license) is based on the Windows Azure May 2009 CTP, and I was only able to find at CodePlex one of the three videos and one of three hands-on labs mentioned in the newsletter. Here are links to videos I found with a Web search:

A tight schedule to finish Cloud Computing with the Windows Azure Platform in time to make it available at PDC 09 prevents me from giving the source code a test-drive at present. If any reader has tested Aqua, please leave a comment with your results or link to a review.

In a measure of the continuing acceptance of cloud computing among businesses of all sizes, the question is slowly shifting from “why move to cloud computing?” to “why not?” Forrester Research offers answers to that and other good questions about cloud computing, plus specific advice on what to do first.

Covering all three elements of cloud computing, Software as a Service (SaaS), Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), Forrester analysts James Staten, Ted Schadler, John R. Rymer, and Chenxi Wang, Ph.D. (with Sharyn C. Leaver and Allison Herald) mix basic explanations with some solid, practical advice. …

Fredric Paul is publisher/editor-in-chief of bMighty.com and SmallBizResource.com.

The HealthVault 0908 release is now available in the production and pre-production environments. With this release we are removing the beta label from HealthVault. The associated .NET SDK for this release will be available shortly.

Michael J. Astrue, Commissioner of Social Security, announced today that the agency has entered into an agreement with Microsoft to test the use of Microsoft’s HealthVault application in the disability process. HealthVault is a free online service that enables people to gather, store and manage their families’ health information, and share that information with their physicians and healthcare providers. These “personal health records” contain the same types of information that Social Security generally obtains from people applying for disability benefits.

“The use of personal health records holds great promise for ensuring that the medical information we collect from someone applying for disability benefits is accurate and complete,” Commissioner Astrue said. “Combined with other advancements in health information technology, our use of HealthVault should result in faster decisions for disability applicants. I look forward to working closely with Microsoft, a world-wide leader in information technology.”

HealthVault is Microsoft’s consumer-focused health-records-management Software+Service platform, which the company unveiled officially in 2007. (The service component of HealthVault is one of a handful of Microsoft services that already is hosted on top of Azure.)

The Medicare and Medicaid incentives for adopting electronic health records will lead to a gradual build in demand for the software, rather than a surge, one investment analyst says. “That’s because some portion of the market will want to wait to see the final rules,” says Raymond Falci, managing director of Cain Brothers & Co., New York, who tracks public health care I.T. firms.

I think this kind of test should be more focused on specific applications, though. As [the] studies show, some of the potential problems found in the cloud could be solved by the developers.

Using these ideas as a starting point, I’m working on a project that should be presented as a parallel event at the CloudViews.Org Cloud Computing Conference 2010. This project and the whole CloudViews.Org Cloud Computing Conference 2010 will be presented very soon, but if anyone is interested in these topics, feel free to contact me directly.

There has been a bit of interest in an application called 'myTODO' that we built for the World Partner Conference (WPC) event back in July. It is a simple, yet useful application. The application allows you to create and share lists very easily. It integrates with Twitter, so if you decide to share your lists, it will tweet them and their updates automatically. You can also subscribe to lists using standard RSS if Twitter isn't your thing.

The funny thing is that we only built this app because we wanted something more interesting than the standard "Hello World" application. The entire purpose of the app was to show how easily you can deploy an application (in just minutes) on Windows Azure.
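myTODO’s source isn’t reproduced here, but the RSS side of list sharing is simple to illustrate. The following Python sketch renders a TODO list as a minimal RSS 2.0 feed of the kind subscribers could poll instead of following the Twitter updates; the list names and URL are made up.

```python
from xml.sax.saxutils import escape

def todo_list_rss(title: str, link: str, items: list) -> str:
    """Render a TODO list as a minimal RSS 2.0 feed.

    Each list item becomes an <item> element; escape() guards against
    markup characters in user-entered list entries.
    """
    entries = "".join(
        f"<item><title>{escape(i)}</title><link>{escape(link)}</link></item>"
        for i in items
    )
    return (
        '<?xml version="1.0"?><rss version="2.0"><channel>'
        f"<title>{escape(title)}</title><link>{escape(link)}</link>"
        f"{entries}</channel></rss>"
    )

# Hypothetical list for illustration:
feed = todo_list_rss("Conference prep", "http://example.com/lists/42",
                     ["Book flights", "Print badges"])
```

A real feed would add per-item links and publication dates, but this shows why RSS makes a natural fallback “if Twitter isn’t your thing.”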

Currently, SQL Azure is in CTP and will undergo some more development. Of course, I wanted to play with this, but… connecting to the thing using SQL Server Management Studio is not the most intuitive and straightforward task. It’s more of a workaround. Juliën Hanssens, a colleague of mine, was going crazy over this. Being a good colleague, I poured some tea into the guy and he came up with the SQL Azure Manager: a community effort to quickly enable connecting to your SQL Azure database(s) and performing basic tasks.

The main problem I find with SSMS is the lack of support for Object Explorer. However, the SQL Azure team plans to fix that problem by PDC 09. You can get more details on SQL Azure Manager and run it without installing it here.

A Sydney-based university researcher leading a study of the cloud computing platforms of Amazon, Google and Microsoft yesterday revealed some of the reasons cloud services provide unpredictable results under stress testing.

Here is a collection of 5 hours of free tutorials on Azure Services for Developers [that date from November and December 2008.] This video series is from the msdev.com website, and it is presented by Nancy Strickland and Bill Lodin.

Azure Services Platform is an application platform in the cloud that makes it possible for applications to be hosted and run at Microsoft datacenters. It consists of a cloud operating system called Windows Azure that serves as a runtime for the applications and provides a set of services that allows development, management and hosting of managed applications off-premises. All Azure Services and applications built using them run on top of Windows Azure.

The Microsoft SQL Data Services of 11/21/2008 video no longer is valid because SDS has been replaced by SQL Azure.

Still updating my Mix 09 Silverlight 3 + RIA Services talk with more fun stuff. Azure represents a general trend in the industry toward a cloud computing model. As much as I love the development phase of projects, the operations side is often the most cost-intensive part. With economies of scale, cloud computing has the potential to greatly reduce those costs. … I took advantage of Windows Azure to host the web tier of the application and SQL Azure to host the data. Those are actually independent decisions. You could decide to host one and not the other.

Neil McAllister’s The perils of becoming a cloud software developer post of 8/20/2009 to InfoWorld’s Fatal Exception blog asks: “SaaS vendors and Web-based businesses such as Twitter are opening their APIs to third-party developers. But are their platform offerings strong enough to build on?”

Software platforms can live and die on the strength of their developer communities. Just ask Microsoft. Remember Steve Ballmer's "developers, developers, developers" speech? As staunchly proprietary and hostile to open source as folks are up in Redmond, they're not dumb enough to deny the huge role ISVs have played in making Windows the dominant desktop OS.

• My Cloud Computing with the Windows Azure Platform title for Wrox/Wiley has received its final cover design, which departs from Wrox’s traditional author photographs:

I’m not sure what snowboarding and skiing have to do with cloud computing, but surfing would have had something to do with me. Adaobi Obi Tulton, my Senior Development Editor for the book, says the folks on the cover are “skysurfing” but “cloudsurfing” would be more apropos. Here’s a link to Jeremy Zawodny’s Cloud Surfing post of 3/5/2006 about soaring from the Hollister, CA airport.

••Lori MacVittie’s The End of DNS As We Know It post of 8/28/2009 has this deck: “DNS wasn’t meant to handle hybrid cloud architectures and on-demand routing” and continues:

When you start distributing services (workloads, applications) across multiple locations, a la cloud balancing, and those locations may change on a frequent basis you begin to run into problems with finding those services and scaling the rate of change effectively. DNS was designed to resolve host names, but never expected that the same host name might resolve to one of two, three, or four IP addresses all within the span of five minutes. …

Global server load balancing (GSLB), however, was designed to handle these types of scenarios. GSLB is one of the most misnamed technologies, in my opinion, because while the goal is to load balance requests globally (across multiple data centers and locations) the implementation is really via a flexible and intelligent system based on DNS. A GSLB implementation is designed with the understanding that any given request might need to be directed to some other location and does not maintain a one-to-one relationship between host/application and IP address. GSLB can assume both a high rate of change and on-demand resolution.
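The contrast MacVittie draws can be sketched in a few lines of Python: a toy GSLB-style resolver that maps one host name to a pool of data-center addresses and skips members marked unhealthy, rather than returning a fixed DNS answer. The host name and IPs are invented; real GSLB products also weigh geography, load, and persistence.

```python
import itertools

class GslbResolver:
    """Toy global-server-load-balancing resolver.

    One host name maps to a pool of data-center IPs; resolution
    round-robins through the pool and skips members marked down,
    so the same name can resolve differently minute to minute.
    """

    def __init__(self, pools):
        self._pools = pools
        self._down = set()
        self._rr = {host: itertools.cycle(ips) for host, ips in pools.items()}

    def mark_down(self, ip):
        """Record a health-check failure for one pool member."""
        self._down.add(ip)

    def resolve(self, host):
        # Try each pool member once, skipping unhealthy ones.
        for _ in range(len(self._pools[host])):
            ip = next(self._rr[host])
            if ip not in self._down:
                return ip
        raise LookupError(f"no healthy members for {host}")

r = GslbResolver({"app.example.com": ["10.0.0.1", "10.1.0.1", "10.2.0.1"]})
first = r.resolve("app.example.com")   # returns 10.0.0.1
r.mark_down("10.0.0.1")                # subsequent answers avoid it
```

Plain DNS would keep handing out 10.0.0.1 until its cached record expired; the health-aware indirection is the point of GSLB.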

••Lori MacVittie continues her Friday posts with Cloud Computing's (Not So) Best Kept Secret: “Cloud providers know the secret to a successful cloud computing implementation is integration between the infrastructure:”

While it is certainly true that the infrastructure – and specifically the application delivery infrastructure – you choose to lay the foundation for a cloud computing architecture can affect your ability to succeed and innovate, there’s no reason to keep it secret.

I recently had the opportunity to join a lively panel discussion led by Phil Wainewright to ruminate over this question, and we came to a general conclusion that cloud, indeed, is making SOA an easier sell to businesses. The consensus seemed to be that cloud is helping to boost the advantages promised by service orientation to a firmer business footing.

Phil and I were joined by David Bressler, principal architect with Progress Software, and Ed Horst, vice president of product strategy for AmberPoint. (Listen to the 45-minute interactive panel discussion here, read the full transcript here.)

••Zoli Erdos’ 1984 and Dan Morill’s 1984 or Panopticon posts of 8/28/2009 recommend removing highly important or confidential information from laptops that might be examined by U.S. Customs Agents and storing that data in the cloud. Zoli:

“Keeping Americans safe in an increasingly digital world depends on our ability to lawfully screen materials entering the United States.”

Zoli brings up an interesting point about U.S. search and seizure of electronic systems at the border: there is always a way around any restriction put in place. What makes this interesting is that in the modern cloud computing environment, all you really need is a computer to act as your interface to the cloud; store all the important stuff either on corporate-sponsored systems or in Google or Zoho docs if you think your stuff stands even a remote chance of being confiscated or intercepted for any reason.

Of course the Department of Homeland Security’s Transportation Security Agency (TSA) might use waterboarding to force you to disclose the cloud URL and your password.

Many of us have spent years explaining to customers why our various versions of Platform as a Service (PaaS) are their best alternative for customization and deployment of business software applications. Logically, there is little reason not to choose a PaaS as the core architecture for your business’s software. However, while there has been adoption, it hasn’t occurred at the pace it probably should have, given the magnitude of the value proposition. This of course is the quandary called "the adoption cycle" that receives a lot of attention from authors and analysts alike. …

It's starting to look like central IT, end users, business leaders etc. are at best middle adopters. It's starting to look like the force that will push PaaS across the chasm is Independent Software Vendors (ISVs). These are folks who have a business centered around a specific target market, and want to offer software to that market. These folks have the business experience to make it work, and may or may not have a lot of technical expertise. However, what is clear is that these folks appreciate the advantages a great PaaS brings to their business, and are highly motivated to build their own offering on top of a PaaS. (Emphasis Treff’s.)

This report includes analysis, insights and guidance developed from Saugatuck’s fourth annual SaaS research program, which comprised a web survey of 1,788 qualified user enterprise executives; interviews with 30 user enterprise executives with SaaS experience; and briefings with 25 SaaS vendors/providers.

The price of the report is $1,295. The summary includes a table of contents and a list of figures and sidebars; an Executive Summary is forthcoming (requires site registration).

Over the past few months I’ve been working with many customers and internal groups to try and determine whether there is a common set of patterns for cloud computing. As you’ve probably read, many people are claiming to do cloud computing, so my goal has been to explore the types of applications that people are running in the cloud, and whether these applications can be categorized in any way.

The result of this work is a presentation called Patterns for Cloud Computing, which I delivered at our internal conference (TechReady) last month. However, because many people have asked for it, I’ve now created and uploaded a public version on SlideShare, which you can find here.

The Open Group, a vendor- and technology-neutral consortium focused on open standards and global interoperability within and between enterprises, today announced the availability of two new industry standards: the Open Group Service Integration Maturity Model (OSIMM) and SOA Governance Framework. OSIMM will provide an industry recognized maturity model for advancing the continuing adoption of SOA and Cloud Computing within and across businesses. The SOA Governance Framework is a free guide for organizations to apply proven governance standards that will accelerate service-oriented architecture success rates.

"These are two very different standards that are both important for the deployment of SOA within large organizations," said Dr. Chris Harding, forum director for SOA and semantic interoperability at The Open Group. "Enterprises use OSIMM when considering adopting SOA, to help them determine what level of SOA is appropriate to their needs and capabilities. They use the SOA Governance Framework once they have adopted SOA, to help them determine how to organize themselves to use it to their best advantage. Together, the standards enable enterprises to accelerate SOA deployment and generate direct business value much sooner."

I believe it’s a bit early in the game to call these two documents “industry standards.”

The blogosphere and Twitterverse are full of debates about whether private clouds are actually clouds or not. I have had my own share of debates on Twitter, and I thought I would bring it over to Cloud Ave for further enlightenment. For a change, I am going to follow the Redmonk style and go with a Q&A approach. …

F5 Networks has announced the results of a survey that shows how large enterprises are implementing cloud computing. The study reveals that among large enterprises, cloud computing is gaining critical mass, with more than 80 percent of respondents at least in trial stages for public and private cloud computing deployments. Additionally, despite the maturing rate of adoption of cloud computing among enterprises, the study shows that there is considerable confusion and concern around the definition of cloud computing.

“It’s no surprise that large enterprises are attracted to cloud computing because of the promise of an agile, scalable IT infrastructure and reduced costs,” said Jason Needham, Sr. Director of Product Management at F5. “However, this survey shows that despite interest in the cloud, widespread enterprise adoption of cloud computing is contingent upon solving access, security, and performance concerns. As organizations turn to the cloud to increase IT agility, it is important for them to understand the technical components of the cloud and how the cloud will affect the network before developing an implementation strategy.”

There has been a great deal of back and forth in the blogosphere around the use of private clouds, which generally means IT infrastructure that relies on the same techniques cloud providers use for their own datacenters, such as multitenancy, virtualization, Web delivery, and highly standardized environments. Most of the discussion emerged from a blog post by Appirio, "2009 prediction: Rise and fall of the private cloud."

The following comment is what set off the firestorm: "Here's the rub: Private clouds are just an expensive datacenter with a fancy name. We predict that 2009 will represent the rise and fall of this overhyped concept. Of course, virtualization, service-oriented architectures, and open standards are all great things for every company operating a datacenter to consider. But all this talk about 'private clouds' is a distraction from the real news: The vast majority of companies shouldn't need to worry about operating any sort of datacenter anymore, cloud-like or not."

Of course there were some valid responses supporting the private cloud concept, including this post by [Chris Hoff, a.k.a.] beaker: "If we're talking about a large, heavily regulated enterprise (pick your industry/vertical) with sunk costs and the desire/need to leverage the investment they've made in the consolidation, virtualization and enterprise modernization of their global datacenter footprints and take it to the next level..."

I am guessing that Vivek Kundra, the US Government's new CIO and a strong advocate of Cloud Computing, is sending Barack Obama and Ray LaHood, the US Transportation Secretary, an email saying "I told ya so".

Why? The "Cash for Clunkers" auto stimulus program's web site clunked due to the popularity of the program.

First of all, what is Cash for Clunkers? A US Government program created to stimulate the sales of newer automobiles, which also enabled the removal of older, less efficient (lower MPG), higher-polluting cars from US roadways. The program will end in less than 24 hours and will be a feather in the President's cap. As many as 475K new vehicles were sold under this program. The results so far: the auto industry is pleased with higher sales and reduced inventories; the dealers are mostly pleased with increased sales, although many are waiting for their government paybacks; the consumers are pleased because of a subsidy on the auto sale. The only people that are not pleased are the administrators of "Cash for Clunkers" and the dealers. Why? Too much demand, which exposed inefficiencies. …

••• Eric Chabrow’s IT Sector Regulation Appears Inevitable post of 8/28/2009 interviews Richard Hunter, a fellow and vice president at the IT advisory firm Gartner, who sees “Federally Imposed Rules … by 2015:”

In 2001, it was big news when a rogue employee stole 35,000 credit card numbers. Eight years later, the Heartland Payment Systems security breach exposed 130 million credit card accounts. Such breaches along with other woes with information technology products and services will likely lead to the government regulating the IT industry by the middle of the next decade, says Richard Hunter, a fellow and vice president at the IT advisory firm Gartner.

"Markets don't seem to have done it (self regulate) on their own so far," Hunter says in an interview with GovInfoSecurity.com (transcript below). "The progression in consequences for the public of failures in IT has been climbing pretty steadily and rather steeply in the last few years. ... Indeed, as information technology becomes more and more deeply imbedded in the fabric of society, there is no reason to believe that the consequences of IT failures will lessen over time."

Like the airlines, automotive, financial services, pharmaceutical and telecommunications industries, the government will regulate the IT sector, Hunter predicts.

•• Lydia Leong asks Are multiple cloud APIs bad? in this 8/27/2009 post and suggests that it’s too soon to eliminate balkanized APIs and/or grant de facto standard status to Amazon S3 and EC2 APIs because they are “the APIs with the greatest current adoption and broadest tools support.” …

The concepts of “security” and “privacy” of medical information (Protected Health Information, or PHI) are closely intertwined. “Security,” as described in the second part of this series, has to do with breaking into medical data (either data at rest, or data in transit) and committing an act of theft. “Privacy,” on the other hand, has to do with permissions, and making sure that only the intended people can have access to PHI.

So, who actually “owns” the medical record? The legal status of medical records “ownership” is that they are the property of those who prepare them, rather than about whom they are concerned. These records are the medico-legal documentation of advice given. Such documentation, created by physicians about patients, is governed by doctor-patient confidentiality, and cannot be discovered by any outside party without consent. HIPAA Privacy Rules govern the steps needed to ensure that this level of confidentiality is protected against theft (security) and against unauthorized viewing (privacy). HIPAA-covered entities (medical professionals and hospitals) are held accountable for ensuring such confidentiality, and can be penalized for violation. …
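The privacy-as-permissions model described above can be made concrete with a short Python sketch: a record carries an explicit consent list, the preparer (who owns the record) and the patient always have access, and everyone else needs a grant. The party identifiers are invented for illustration; real HIPAA access control is far richer (roles, purposes of use, audit trails).

```python
from dataclasses import dataclass, field

@dataclass
class PhiRecord:
    """A medical record with an explicit consent list.

    Mirrors the text above: the physician who prepares the record owns
    it, doctor-patient confidentiality covers the patient, and any
    outside party needs explicit consent to view it.
    """
    patient: str
    prepared_by: str
    consented: set = field(default_factory=set)

    def grant(self, party: str) -> None:
        """Record the patient's consent for one outside party."""
        self.consented.add(party)

    def can_view(self, party: str) -> bool:
        # Preparer and patient always; others only with consent.
        return party in {self.prepared_by, self.patient} | self.consented

# Hypothetical identifiers:
rec = PhiRecord(patient="pat-001", prepared_by="dr-smith")
rec.grant("insurer-x")
```

Security, in this framing, is everything that keeps `consented` from being bypassed; privacy is the `can_view` check itself.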

The Cloud Security Alliance (CSA), a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, today made several announcements underscoring its rapid growth and broad industry support as the largest cloud security initiative on a global basis, including the news that numerous leading corporations in the fields of cloud computing and security have joined the Alliance.

The CSA now has over 200 practitioner volunteers developing version 2.0 of its "Security Guidance for Critical Areas of Focus in Cloud Computing," with plans to release the guidance in October. CSA has also announced formation of new working groups for special interests, including Healthcare, Cloud Threats and the Federal Government. …

Joseph Goedert reports FTC Breach Rule Now Official for data breaches by vendors of personal health records and online applications that interact with PHRs in this 8/25/2009 post to Health Data Management:

The Federal Trade Commission on Aug. 25 published in the Federal Register its final rule governing the reporting of data breaches by vendors of personal health records and online applications that interact with PHRs.

The rule has been available for more than a week but publication starts the clock on compliance (see healthdatamanagement.com/news/PHR-38824-1.html). The rule is effective Sept. 24, 2009, with full compliance required by Feb. 22, 2010. …

Called A6 (Audit, Assertion, Assessment and Assurance API) the proposal is still in the works, driven by two people: Chris Hoff - who came up with the idea and works for Cisco - and the author of the Iron Fog blog who identifies himself as Ben, an information security consultant in Toronto.

The usefulness of the API would be that cloud providers could offer customers a look into certain aspects of the service without compromising the security of other customers’ assets or the security of the cloud provider’s network itself.

Work on a draft of A6 is posted at http://www.scribd.com/doc/18515297/A6-API-Documentation-Draft-011. It’s incomplete, but it offers a sound framework for what is ultimately needed.
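The scoped-disclosure idea behind A6 can be sketched in Python: filter a provider-wide audit log down to one tenant’s events and strip fields that would expose the provider’s network or other customers. The field names here are illustrative, not part of the A6 draft.

```python
def tenant_audit_view(events, tenant):
    """Return one tenant's slice of a provider-wide audit log.

    Only whitelisted fields survive; internal details such as host
    names stay hidden, so one customer's audit view reveals nothing
    about the provider's network or other customers' assets.
    """
    visible_fields = ("timestamp", "action", "result")
    return [
        {k: e[k] for k in visible_fields if k in e}
        for e in events
        if e.get("tenant") == tenant
    ]

# Hypothetical provider-side log:
log = [
    {"tenant": "acme", "timestamp": 1, "action": "login",
     "result": "ok", "host": "int-db-7"},
    {"tenant": "globex", "timestamp": 2, "action": "read",
     "result": "ok", "host": "int-db-9"},
]
view = tenant_audit_view(log, "acme")
```

An audit API would of course add authentication and signed assertions; the sketch only shows the tenant-scoping and field-whitelisting step.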

Either in real terms or perceived terms, security is one of the biggest hang-ups people have, and it's a wide-open question. When we talk about the cloud and the enterprise, are we talking about something that is fundamentally different in terms of securing it, versus what people are accustomed to doing across their networks?

And continues:

Here to help us better understand the perils and promises of adopting cloud approaches securely, we welcome our panel. With us we have Glenn Brunette, distinguished engineer and chief security architect at Sun Microsystems. He is also a founding member of the Cloud Security Alliance (CSA). We're also joined by Doug Howard, chief strategy officer of Perimeter eSecurity, and president of USA.NET; Chris Hoff, a technical adviser at the Cloud Security Alliance, and also director of cloud and virtualization solutions at Cisco Systems; Dr. Richard Reiner, CEO of Enomaly; and lastly, we welcome Tim Grance, program manager for cyber and network security at the National Institute of Standards and Technology (NIST).

Lydia Leong admits she “recently contributed to a couple of [Gartner] hype cycles” in her Hype cycles post of 8/24/2009:

Gartner’s very first Hype Cycle for Cloud Computing features a whole array of cloud-related technologies and services. One of the most interesting things about this hype cycle, I think, is the sheer number of concepts that we believe will hit the plateau of productivity in just two to five years. For a nascent technology, that’s pretty significant — we’re talking about a significant fundamental shift in the way that IT is delivered, in a very short time frame. However, a lot of the concepts in this hype cycle haven’t yet hit the peak of inflated expectations — you can expect plenty more hype to be coming your way. There’s a good chance that for the IaaS elements that I focus on, the crash down into the trough of disillusionment will be fairly brief and shallow, but I don’t think it can be avoided. Indeed, I can already tell you tales of clients who got caught up in the overhype and got themselves into trouble. But the “try it and see” aspect of cloud IaaS means that expectations and reality can get a much faster re-alignment than it can if you’re, say, spending a year deploying a new technology in your data center. With the cloud, you’re never far from actually being able to try something and see if it fits your needs.

My hype cycle profile for CDNs appears on our Media Industry Content hype cycle, as well as our brand-new TV-focused (digital distribution and monetization of video) Media Broadcasting hype cycle. Due to the deep volume discounts media companies receive from CDNs, the value proposition is and will remain highly compelling, although I do hear plenty of rumblings about both the desire to use excess origin capacity as well as the possibilities that the cloud offers for both delivery and media archival.

Security concerns are one of the biggest impediments to the widespread adoption of cloud computing. Arm yourself with the facts and learn how to leverage the cloud -- safely -- by joining this free live webinar to learn about cloud computing security. In this webinar, we'll discuss:

• Anna Liu will speak at CloudCamp Sydney on 8/27/2009. Anna is an associate professor in services engineering at the school of computer science and engineering at the University of NSW (UNSW) and is a member of the team that’s conducting stress tests on Amazon, Google and Microsoft cloud computing services. (See the Live Windows Azure Apps, Tools and Test Harnesses section.)

The US federal government has set aside $19 billion in an economic stimulus package to create an electronic health record for every American by 2014. The government is not only using incentives to encourage adoption; it is also using penalties. This 'Espresso Webcast' describes the advantages of implementing standards-based infrastructure for Electronic Health Records (EHRs) and Electronic Medical Records (EMRs). It discusses the considerations you need to be aware of as you work with the infrastructure for electronic health systems.

Are you technically oriented? A hacker? Are you thinking about one day starting your own company? Or maybe you’ve already started one?

How’d you like to get free advice from some of the tech industry’s most notable startup experts? People like Mitch Kapor, founder of Lotus; Chris Anderson, Wired editor-in-chief; Mark Zuckerberg, founder of Facebook; Paul Buchheit, founder of Friendfeed; Tony Hsieh, CEO of Zappos; and others.

The event is free, but because more people may want to attend than there is room for, the organizers have asked that you fill out a brief application form if you want to attend. The application deadline is noon, Pacific time, on October 1, and acceptance notices will be sent out by October 8. Because the purpose of this event is to teach technical people about startups, they will get priority. …

I am really happy to report that I’ve been summoned again to present at TechEd Australia and TechEd New Zealand on my favourite topic! Last year it was a blast, awesome audiences and great, great places, and I really look forward to getting there and blabbering about identity, claims & company. I’ll meet customers in Melbourne & Sydney, then I’ll head to Gold Coast and from there to Auckland. I’ll have exactly 0 (zero) time to take a look around, in fact I’ll have to head back ASAP, but that’s more the rule than the exception… that’s how we roll ;-)

I am scheduled to deliver the same two sessions at both events: one (ARC204) will be a classic intro to claims-based identity; the other (SEC305) will be a drilldown into WIF.

My point here was that trying to use multitenancy as a way to distinguish between Public and Private Cloud deployments ignores the reality that many large enterprises — many of which are beginning to architect and deploy Private Clouds — think of their business constituencies as individual “tenants.” Each of these “tenants” often has different business requirements, service-level requirements, cost structures and chargeback rates, policies, etc.

Citrix is going to try to bar VMware from getting its hooks deep in the cloud by developing the open source Xen hypervisor, already used by public clouds like Amazon, into a full-blown, cheaper, non-proprietary Xen Cloud Platform (XCP).

It intends to surround the Xen hypervisor with a complete runtime virtual infrastructure platform that virtualizes storage, server and network resources. The platform is supposed to be agnostic about virtual machines and even run VMware’s, which currently run only on VMware’s own infrastructure.

The announcement will be made Monday in VMware’s own house at the kickoff of the VMworld conference in San Francisco where VMware is expected to show off its new vCloud Express. …

Amazon, purveyor of the EC2 public cloud, suddenly announced Aug. 26 that it’s a private cloud supplier. Isn't there something wrong with a multi-tenant, shared-resource provider transforming itself into a private cloud service? I'm not sure Amazon can offer a private cloud -- yet. Then again, I see no reason why it couldn't sometime in the future.

Amazon announced Wednesday that it's offering an enterprise service oriented toward private cloud use, the Virtual Private Cloud. That means it will make facilities and services available that can be accessed solely by the subscriber over a VPN. No snooping eyes or devices on the network are going to see your private data.

Werner Vogels, in his blog on the subject, says: Amazon Virtual Private Cloud customers will be able to "seamlessly extend their IT infrastructure into the cloud while maintaining the levels of isolation required for their enterprise management tools to do their work." …

••• Taylor Buley reports Citrix Goes After VMware in this 8/28/2009 article for Forbes.com: “Cloud computing company Citrix hopes to grab the buzz from its competitor with new product:”

Competition among cloud computing companies is about to get even more intense.

Forbes learned on Friday that Xen.org, a popular open-source hypervisor project shepherded by Citrix, plans to announce on Monday what amounts to an open-source version of VMware's vCloud--literally on the first day of VMware's annual developer event.

And while industry leader VMware has been building virtualization software that works best when customers buy into its entire product line, Citrix is aiming to blow a hole in those plans by making Xen work fluently with all the competition.

That can't be good news to VMware: At VMworld on Monday, Chief Executive Paul Maritz is expected to announce vCloud Express, an easy way to get up and running with vCloud service. …

At Enomaly we have also been working on enhanced VPC functionality for our cloud service provider customers around the globe. For me this move by Amazon is a great endorsement of an idea we as well as others have been pushing for quite awhile.

On a side note, before you ask: Yes, I'm just glad I bought the VirtualPrivateCloud.com/.net/.org domain names when I wrote the original post. And yes, a placeholder site and announcement are coming soon ;)

… If we take off the spin, we can easily see that Amazon clearly understands the dilemma enterprises face in embracing public clouds right away and wants to lure them with an offering that will make them relatively comfortable testing the cloudy waters. Amazon clearly saw the emergence of a strong private cloud market, and this is a direct response to the competition.

Having said that, one should not dismiss this announcement as a mere market response. There is a clear wow factor here. This has the potential to uproot some of the players in the "cloud labs" category and also to threaten some vendors in the Amazon ecosystem itself. Plus, it offers Amazon a direct line to enterprise customers without any intermediary. …

Amazon Virtual Private Cloud (Amazon VPC) lets you create your own logically isolated set of Amazon EC2 instances and connect it to your existing network using an IPsec VPN connection. This new offering lets you take advantage of the low cost and flexibility of AWS while leveraging the investment you have already made in your IT infrastructure.

This cool new service is now in a limited beta and you can apply for admission here.

You’ll need a Cisco or Juniper-class router to create the IPsec-encrypted VPN to and from the Amazon data center (U.S. East only for now). Support for software VPNs is planned “in the near future.” You pay a five-cent-per-hour surcharge over standard EC2 rates for each VPN connection. Amazon promises to open the VPC to the Internet later.
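As a back-of-envelope check on what that surcharge means in practice (assuming a 30-day month for simplicity; Amazon bills per connection-hour):

```python
# Rough monthly cost of the Amazon VPC VPN surcharge, per connection,
# assuming a 30-day month of continuous connectivity.
VPN_SURCHARGE_PER_HOUR = 0.05  # USD per VPN connection, from the announcement
HOURS_PER_MONTH = 24 * 30

monthly_surcharge = VPN_SURCHARGE_PER_HOUR * HOURS_PER_MONTH
print(f"${monthly_surcharge:.2f} per VPN connection per month")  # → $36.00 per VPN connection per month
```

That’s on top of standard EC2 instance and bandwidth charges, so the VPN itself adds roughly $36 per month per connection.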

Let’s hear from the Azure Services and SQL Azure teams how they plan to compete with this new Amazon private cloud infrastructure.

In my post about the Public vs. Private Cloud debate, I pointed out how the promoters of the "only public clouds" idea use the financial component as a requirement in the very definition of cloud computing. In their quest to drive home their point about the economics behind public clouds, these pundits, either knowingly or unknowingly, promote a myth about private clouds. They argue that a private cloud is not a cloud because it doesn't take a multi-tenant approach and the infrastructure is built exclusively for a single company/enterprise. …

The argument that private clouds serve only a single tenant is a fallacy. In this post I will show why that need not be the case. …

•• Lori MacVittie contributes to the private-versus-public-cloud controversy with her The Virtual Public-Private Cloud Connection post of 8/27/2009, subtitled “Secure, optimized tunnels to a remote site, e.g. the cloud. Haven’t we been here before?”:

In the continuing discussion around Business Intelligence in the cloud comes a more better (yes I did, in fact, say that) discussion of the reasons why you’d want to put BI in the cloud and, appropriately, some of the challenges. As previously mentioned, BI data sets are, as a rule, huge. Big. Bigger than big. Ginormous, even. One of the considerations, then, if you’re going to leverage a cloud-based business intelligence offering – or any offering in which very, very large data sets/files are required - would be how the heck are you going to transfer all that data to the cloud in a timely fashion? …

There has been a lot of good discussion lately about the semantics of private vs. public clouds. The general issue revolves around elasticity. It goes something like this: “If you have to buy your own servers and deploy them in your data center, that’s not very elastic and therefore cannot be cloud.” Whether or not you buy into the distinction, private clouds (if you want to call them that) do suffer from inelasticity. Werner Vogels, in his VPC blog post, dismisses the private cloud as not real:

“Private Cloud is not the Cloud” …

What if we were to look at the private cloud concept as an interoperability play? If someone implements a cloud-like automation, provisioning and management infrastructure in their data center to gain many of the internal business process benefits of cloud computing (perhaps without the financial benefits of opex vs. capex and elastic “up/down scaling”), it still can be a very valuable component of a cloud computing strategy. It’s not “the Cloud” as Werner points out. It’s just part of the cloud.

Earlier today, I summarized what VPC is and isn’t, but I realize, after reading the other reactions, that I should have been clearer on one thing: Amazon VPC is not a private cloud offering. It is a connectivity option for a public cloud. If you have concerns about sharing infrastructure, they’re not going to be solved here. If you have concerns about Amazon’s back-end security, this is one more item you’re going to have to trust them on — all their technology for preventing VM-to-VM and VM-to-public-Internet communication is proprietary.

Almost every other public cloud compute provider already offers connectivity options beyond public Internet. Many other providers offer multiple types of Internet VPN (IPsec, SSL, PPTP, etc.), along with options to connect virtual servers in their clouds to colocated or dedicated equipment within the same data center, and options to connect those cloud servers to private, dedicated connectivity, such as an MPLS VPN connection or other private WAN access method (leased line, etc.).

All Amazon has done here is join the club — offering a service option that nearly all their competitors already offer. It’s not exactly shocking that customers want this; in fact, customers have been getting this from competitors for a long time now, bugging Amazon to offer an option, and generally not making a secret of their desires. (Gartner clients: Connectivity options are discussed in my How to Select a Cloud Computing Infrastructure Provider note, and its accompanying toolkit worksheet.) …

… Frankly, Amazon VPC is a terrible virtual private cloud. Network control and management are rudimentary, the VPN is stone-age, users can’t expose clients to the internet and can’t assign them IP addresses. Clearly it is not ready for prime-time, and clearly it is not aimed at Amazon’s existing user base, because they’d all have to uproot their current infrastructures to use it. It is for experimenters who start with requirements that preclude public cloud. …

… OpSource has taken aim at a Public Cloud offering for the wary enterprise, hoping to capture the potentially large revenue opportunity in enterprise production workloads. The OpSource Cloud is intended as an enterprise-ready Cloud with robust, secure, manageable, and community-supported services for production workloads for both large and small enterprises. [Link added.]

In mounting this Public Cloud offering, OpSource can point to its strong hosting experience and heritage in delivering infrastructure services for SaaS providers. Interestingly, however, the OpSource Cloud leverages its newer relationship with NTT, who is both an investor in OpSource, and an important hosting / technology partner. In addition, OpSource is leveraging other key technology partnerships, including Cisco (networking and security), Dell (servers) and VMware (virtualization). …

The era of cloud computing is dawning amid great fanfare, supported by mountains of cash and reams of hype. Whether this change is positive is debatable – very real concerns plague cloud computing – but the tech industry has decided: the cloud is king.

Just as the hulking mainframes of the 1960s were replaced by client server systems in the 1980s, the in-house datacenter is now shifting toward an externally-based model. Vendors of every size are maneuvering, targeting this new market. They know their future rests on their ability to grab a piece of this emerging paradigm before it’s fully established.

Eucalyptus is an open-source system for implementing on-premise private and hybrid clouds using the hardware and software infrastructure that is in place, without modification. Eucalyptus adds capabilities such as end-user customization, self-service provisioning, and legacy application support to data center virtualization features, making IT customer service easier, more fully featured, and less expensive. …

… [A]mongst other factors such as cost reduction, operational efficiency, and the other usual IT issues, is an ever increasing need for speed. …

Not only are throughput requirements exploding, but response-time (latency) tolerance is approaching zero at an alarming rate. …

On the flip side, banks and hedge funds with their algorithmic trading systems are sending trades into these exchanges at volumes that are increasing at over 50% per year for stocks and well over 100% per year for options, so these systems need to scale.

And the reasons why we're seeing this more frequently speaks volumes to current IT thinking. …

First of all, traditional IT concerns like "stability" and "security" and "availability" and "integration with existing environment" tend to be less important here. Anything you put in front of these users is far better than what they have today, which is -- well -- nothing.

Now, it's clear that most IT organizations could do the above stuff -- given enough time and resource -- but the "need for speed" results in a preference to "get to good" as quickly as possible, and then retrofit back into the existing processes and procedures. …

Chuck is VP and Global Marketing CTO for EMC Corporation, which counts VMware (of course) and Cisco as partners.

• Maureen O’Gara asks What’s Andy Bechtolsheim Up To? at the opportune moment: “Arista claims to have the first network product to bridge physical, virtual and cloud networks:”

[Arista] says it can link physical, virtual and cloud networks using VMware, a close ally of Cisco and an early investment of Bechtolsheim, the man with the golden touch who co-founded Sun and was Google’s first investor.

If Arista’s new vEOS includes VPN capabilities, which it undoubtedly does, it sounds to me like a candidate for other private cloud purveyors to use in competing with Amazon’s VPC. Rich Miller’s Arista vEOS Targets Virtual Machine Mobility post of 8/26/2009 indicates that former Ciscoan Doug Gourlay might throw some more light on the subject.

AWS MFA uses an authentication device that continually generates random, six-digit authentication codes solely for your use. Once you enable AWS MFA, every time somebody tries to sign in to your secure pages on the AWS Portal or AWS Management Console, access will only be granted after the correct Amazon email-id and password (the first “factor”: something you know) and the precise code from your authentication device (the second “factor”: something you have) are provided. This multi-factor authentication provides even greater protection for your AWS account, including extra protection of sensitive information such as your AWS access identifiers and critical actions such as changing your AWS infrastructure service subscriptions. It also extends this protection to the AWS Management Console so that your AWS resources, such as Amazon Elastic Compute Cloud (Amazon EC2) instances or Amazon CloudFront distributions, cannot be modified without multi-factor authentication.

The second factor is a Gemalto Ezio Time Token hardware device, which is designed to be carried as a key fob. It’s based on the OATH standard for Time-based One-Time Passwords.
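The OATH algorithm behind such tokens (standardized as TOTP in RFC 6238, building on RFC 4226 HOTP) is simple enough to sketch: the device HMACs a shared secret with the current 30-second time step and dynamically truncates the result to six digits. A minimal illustration — the secret below is the RFC test key, not anything Amazon or Gemalto actually provisions:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation: low nibble picks the offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based OTP: HOTP computed over the current time step."""
    if for_time is None:
        for_time = time.time()
    return hotp(key, int(for_time // step), digits)

# RFC 6238 test vector: this secret at T=59 yields the 8-digit code 94287082,
# so the 6-digit code is its last six digits.
print(totp(b"12345678901234567890", for_time=59))  # → 287082
```

Because the code depends only on the secret and the clock, the server can verify it independently without any network link to the fob.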

The purpose of this series is to make the case for implementing a widespread, systematic approach to health information technology education in medical schools and continuing medical education programs for physicians. Subsequent posts will cover:

2. The impact of EHRs on medical education
3. Tweaking medical education to leverage the benefits of EHRs
4. Social Media in medicine, a disruptive force
5. HIT and professional education: Innovations that make a difference

Dr. Laffel is Senior VP Clinical Affairs of PracticeFusion.

• Nicholas Beaudrot offers The Flowchart (inspired by Chris Hayes) to demonstrate the scale of the pending Health Insurance Reform process and the number of potential users of cloud-based personal health records:

John Treadway’s Deep Data from InfiBase post of 8/25/2009 offers a chart depicting the number of top 500,000 sites hosted on Amazon EC2, RackSpace (SliceHost), Joyent, Google App Engine and GoGrid (from InfiBase):

The post also includes a chart of CPU vendor (AMD and Intel) by instance type.

Dana’s How the Cloud Aids Supply Chain Recalls: “Cloud computing uniquely enables product and food recall processes across supply chain” post of 8/25/2009 covers one of the topics of the preceding podcast in detail.

Rackspace, the managed hosting provider that is pushing hard to be an open alternative to Amazon, Microsoft, etc., took a step toward putting some order in its ecosystem with the release of a new portal called Cloud Tools. Rackspace jumped into the cloud game with an acquisition of a tool for its future ecosystem; it had Jungledisk from the start. With the release of its open APIs, more companies joined the ecosystem. The greatest advantage of the APIs is the ease with which they can be integrated, backed by solid documentation. One of the companies in the Amazon cloud ecosystem, Enstratus, recently joined the Rackspace ecosystem as well. After completing the integration, Enstratus CTO George Reese said the following about the Rackspace API:

“Rackspace really did a solid job designing and (especially) documenting their API. Took us no time to add Rackspace Cloud support.”
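To illustrate why integrations went quickly: the Rackspace Cloud v1.0 API authenticated with a single GET request carrying two headers, and the response returned the X-Auth-Token and service-endpoint headers used for every subsequent call. A minimal sketch of building that request — the endpoint and header names reflect the v1.0 API of the era, and the credentials are placeholders:

```python
import urllib.request

# Legacy Rackspace Cloud v1.0 authentication endpoint (historical).
AUTH_URL = "https://auth.api.rackspacecloud.com/v1.0"

def build_auth_request(username: str, api_key: str) -> urllib.request.Request:
    """Build the one-shot auth request; the server's response carries the
    X-Auth-Token and management-URL headers used for all later API calls."""
    req = urllib.request.Request(AUTH_URL)
    req.add_header("X-Auth-User", username)
    req.add_header("X-Auth-Key", api_key)
    return req

req = build_auth_request("demo-user", "demo-key")
print(req.full_url)  # → https://auth.api.rackspacecloud.com/v1.0
```

Passing the request to `urllib.request.urlopen` would perform the actual handshake, which of course requires live credentials.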

Rackspace's ecosystem is not as big as Amazon's, but Rackspace jumped into the field only recently, and with the advantage of an open API (unless Amazon changes the game by open-sourcing its API) and fanatical support, it is only a matter of time before they capture the imagination of cloud pundits. Rich Miller of Data Center Knowledge points out that there are more than 51,000 cloud computing users and nearly 20,000 managed hosting customers.

Larry Dignan offers a good overview of the Rackspace portal. It is like Apple's App Store, but without the control, and it allows app vendors to close deals directly.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.