FabrikamShipping. This is a fairly complete example of how to use Windows Identity Foundation to address common tasks in the development of web solutions: accepting identities from an external identity provider, driving the UI using claims, invoking back-end WCF services via delegated authentication, handling claims-based authorization, and so on. The sample is based on the scenario described in Kim Cameron’s PDC08 session. You can download the sample from here; a detailed description is available here.

ClaimsDrivenModifierControl. This is a sample ASP.NET control that demonstrates how you can take advantage of claims to drive the behavior of your web UX without writing any code! You can download the sample from here; a detailed description is available here; finally, if you want to see the control in action, a screencast is available here.

SecurityTokenVisualizerControl. This is a very simple ASP.NET control that can help you debug websites secured with Windows Identity Foundation by allowing you to inspect identity information in the current context, such as the claims list, the raw XML of the incoming token, signing certificates and more. You can download the sample from here; a detailed description is available here.
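To illustrate the kind of inspection such a visualizer performs, here is a minimal, hypothetical Python sketch (not the control's actual .NET code) that pulls the claim list out of a SAML 1.x-style token fragment; the sample token, names, and values are assumptions for illustration only:

```python
import xml.etree.ElementTree as ET

# Hypothetical SAML 1.1-style assertion fragment, trimmed to the parts
# a claims inspector cares about (no signature, no conditions).
TOKEN_XML = """
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:1.0:assertion">
  <saml:AttributeStatement>
    <saml:Attribute AttributeNamespace="http://schemas.xmlsoap.org/ws/2005/05/identity/claims"
                    AttributeName="name">
      <saml:AttributeValue>Alice</saml:AttributeValue>
    </saml:Attribute>
    <saml:Attribute AttributeNamespace="http://schemas.xmlsoap.org/ws/2005/05/identity/claims"
                    AttributeName="emailaddress">
      <saml:AttributeValue>alice@example.com</saml:AttributeValue>
    </saml:Attribute>
  </saml:AttributeStatement>
</saml:Assertion>
"""

NS = {"saml": "urn:oasis:names:tc:SAML:1.0:assertion"}

def list_claims(token_xml):
    """Return (claim_type, value) pairs from a SAML 1.x attribute statement."""
    root = ET.fromstring(token_xml)
    claims = []
    for attr in root.findall(".//saml:Attribute", NS):
        # Claim type = attribute namespace + attribute name, WS-* style.
        claim_type = attr.get("AttributeNamespace") + "/" + attr.get("AttributeName")
        for value in attr.findall("saml:AttributeValue", NS):
            claims.append((claim_type, value.text))
    return claims

for claim_type, value in list_claims(TOKEN_XML):
    print(claim_type, "=", value)
```

The real control does much more (signature and certificate display, raw token dump), but the core idea is the same: walk the token XML and surface the claims it carries.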

They’re included in this post because any application that demonstrates Geneva technologies aids in understanding federated identity services and the like.

This guide provides step-by-step instructions for hosting in Windows Azure a web application that accepts identities from an external identity provider by taking advantage of Windows Identity Foundation.

This is a provisional sample whose purpose is to give you a chance to experiment with federated scenarios in Windows Azure today, using publicly available bits (the Windows Azure July CTP and Geneva Framework Beta 2, the beta release of the product now named Windows Identity Foundation). The code shown here is NOT production ready and contains various temporary compromises; you can expect many of those to become unnecessary in future releases of the products.

I’d say “Absolutely.” After retrenching, reprogramming, and restarting an hour later to get rid of the trolls, I believe all participants were more than satisfied with the chat session. Looks like the next chat is scheduled for Friday, 8/21/2009 at 12:00 noon PDT:

Steve promises to post a copy of the complete chat session on Monday or later for those of us who didn’t allow popups before logging into the chat session and lost the ability to save it.

• Simon Munro’s Windows Azure Chat Nuggets post of 8/7/2009 offers a brief summary of the topics covered in the first chat session.

At Cloudscale we're preparing for the public launch, later in the year, of the first Intercloud service, a platform for the world's realtime apps. In response to a number of requests for information, here is a brief overview of what we have been developing.

The Cloudscale platform can handle all types of realtime data, and is as easy to use as a spreadsheet. Complex analytics can be run continuously, with automatic scaling and fault tolerance to ensure realtime responsiveness. The platform offers users seamless integration from standard desktop or mobile clients to realtime intercloud apps.

As a self-service platform aimed at mass market adoption, there is no database software and administration to deal with, and no IT department delays and complexity to get in the way of delivering immediate value to users. The platform will drive the democratization of data, unleashing creativity and “turning data into action” everywhere. Like the iPhone platform, Cloudscale’s AppStore will provide a commercial marketplace for the many exciting new apps developed on the platform.

Cloud Computing is one of the many things enterprise CIOs, CTOs and other engineers will master in delivering capability. I believe in the power of new Cloud Computing technologies and concepts and think we should all continue our focus there.

I have said, and still say, the same thing about design approaches like Service Oriented Architecture (SOA). The constructs, methods and models of SOA are good practices that result in good designs for enterprises. It is smart to separate data from application logic and smart to enable agility and mashups the way good SOA design does.

And then goes on to plot the declining volume of Web searches for “SOA” versus an increasing volume for “cloud computing,” and concludes:

… Your defense against this [flood of hype] will be the strength of your own position in Cloud Computing. Therefore, I strongly recommend you personally think through what Cloud Computing means to your enterprise. Also, think through your definition of cloud computing. Hopefully the NIST definition will suit your use, since the more people who form up on that the better.

What’s most interesting is that both Blumenthal and U.S. Chief Technology Officer Aneesh Chopra pushed back. The HITECH stimulus money goes to hospitals that automate records and deliver the functional requirements of a recently-passed plan on meaningful use.

The resulting Electronic Health Records (EHRs) are covered by the HIPAA law in terms of how they can be shared. What Schmidt and Mundie want is for EHRs to respond to Web standards so the records can be turned into Personal Health Records (PHRs) controlled by patients.

HealthVault is Microsoft’s consumer-focused health-records-management Software+Service platform, which the company unveiled officially in 2007. (The service component of HealthVault is one of a handful of Microsoft services already hosted on top of Azure.) Amalga UIS (one of the products formerly under the Azyxxi brand) is one of the main elements of Microsoft’s enterprise health-information-system platform.

I am flying home from the HealthVault Connected Care Conference in Seattle. I left with two big takeaways which will be addressed in two separate posts. It was a great trip, in fact refreshing in many ways, coming on the heels of a wonderful but intellectually strenuous series of meetings for the X PRIZE. In fact, I have never been to Seattle when it was so beautiful – perfectly warm and sunny days with intermittent cumulus clouds and light breezes whose temperature was nearly imperceptible. My favorite evening in town was spent watching sailboats glide effortlessly around the Sound in the fading sunlight of a perfect day. Magic.

Perhaps the setting got me in a good mood, but I walked away very clearly impressed with what Microsoft is attempting to do with their health care strategy. I have to be clear – as an ardent and passionate open source advocate (recovering zealot) – I was very ambivalent about stepping clearly into and over “enemy” lines during my sojourn in Redmond. I was quickly put at ease by the West Coast flavor of the meeting (ie, casual business dress with a young-ish crowd, high energy music, and overall good karma) and the impressive lineup of speakers and attendees. Furthermore, this was the first time I was actually able to figure out what the heck HealthVault really is and how all these various partnerships I keep reading about even begin to make sense. …

So kudos to Peter Neupert and crew for the progress to date. I was impressed.

But I was also puzzled at the same time – Where (on earth) is Google Health?

… Google Health has been nothing more than a distraction to the broader market. A distraction in that Google Health has really done very little to create a truly compelling platform, yet due to its size, market presence and media and market pundits belief that Google is the be all to end all, Google Health gets far more press and attention than it rightfully deserves …

John is managing director of Chillmark Research which describes itself as “a healthcare technology industry analyst firm focusing on personal healthcare technology that will enable citizens to take more direct responsibility for their health and the health of loved ones.”

The HealthVault Connected Care Conference was held 6/10 to 6/12/2009 at the Meydenbauer Center in Bellevue, WA. Microsoft’s site offers PDF presentations and video segments of the sessions. The US$19 billion allocated to health information technology (HIT) by the American Recovery and Reinvestment Act (ARRA) of 2009 has greatly increased interest in HIT as well as EHR and PHR implementations. Alice Lipowicz’s Is the nation's health network healthy? article of 8/7/2009 in FederalComputerWeek throws more light on the ARRA incentives and the National Health Information Network (NHIN), the Health and Human Services Department’s replacement for the Internet.

HealthVault and Practice Fusion, which was the recipient of a recent investment by Salesforce.com (see last item in this post), appear to be competitors. However, Practice Fusion offers free EHR services primarily to physicians while HealthVault provides PHR storage for patients.

• Chris Hoff (@Beaker) goes off the deep end with his Introducing the “Cloud For Clunkers Program” post of 8/8/2009:

As compelling as the offer of Cloud may be, in order to pull off incentivizing large enterprises to think differently, it requires an awful lot going on under the covers to provide this level of abstracted awesomeness; a ton of heavy lifting and the equipment and facilities to go with it. [Emphasis added.] …

To get ready for the gold rush, most of the top-tier IaaS/PaaS Cloud providers are building data processing MegaCenters around the globe in order to provide these services, investing billions of dollars to do so…all supposedly so you don’t have to. Remember, however, that service providers make money by squeezing the most out of you while providing as little as they need to in order to ensure the circle of life continues. Note, this is not an indictment of that practice, as $deity knows I’ve done enough of that myself, but just because it has the word “Cloud” in front of it does not make it any different from a business case. Live by the ARPU, die by the ARPU. …

• Datacenter Dynamics’ The American cloud’s weakest link post of 8/7/2009 has “Dark fiber developer brings the cloud back to the ground and shares his broadband stimulus experience” as its deck:

An essential element is often left out of excited industry discourse around cloud-computing. Is the physical network infrastructure in the country sufficient to support visions of the future cloud?

Allied Fiber CEO Hunter Newby offered a rather sobering view of reality at the DatacenterDynamics conference in Seattle, Wash., Thursday. The company specializes in building out dark fiber infrastructure.

“Without physical there is no virtual,” Newby said. “Without fiber there is no cloud.”

“Moving apps into the cloud is very dangerous if you don’t know your physical fiber route. You can be buying from three or four different providers but there’s only one path and everybody else is tied too on that path and you think you’re redundant and diverse, but in fact, you’re not. Those are very basic questions you need to ask before you do anything in higher layers. I believe that if you’re not aware of the basic fundamental things that are very simple to understand, your entire business that you build above it is in jeopardy.”

Fiber-rich patches, such as coastal areas, are sporadically spread around the country, enabling a healthy amount of competition in those areas but connectivity outside of those areas leaves a lot to be desired. “And no one company can afford to build out the proper infrastructure to make it all work and that’s the problem.”

Of $787 billion the U.S. government allocated to stimulating the economy, $7.2 billion was dedicated to developing the country’s broadband infrastructure. Newby feels that, while a lot can be accomplished with $7.2 billion, the amount is insufficient for satisfying the country’s broadband needs. …

I enjoyed James Urquhart's post, "In cloud computing, data is not electricity," which points out some of the sillier analogies we're seeing in the emerging cloud computing space. Specifically, Urquhart refers to Nick Carr's classic vision of cloud computing in "The Big Switch," which likens traditional on-premises computing to generating your own power and cloud computing to using the standard power grid.

"However, some have taken electricity as an analogy to cloud adoption to an extreme, and declared that there will be a massive and sudden shift from corporate datacenters to entirely external cloud computing environments -- public cloud utilities, if you will. They are wrong," Urquhart writes.

Citing an unfavorable change in tax laws, Microsoft is moving its Windows Azure cloud from a data center in Washington state to one in Texas. It's an interesting new twist in the cloud computing market—moving a cloud across state lines in response to the regulatory climate.

Of course, the problem will be that there will only be a single US data center available for some time, which means that geolocation for disaster recovery won’t be an option for early Azure adopters.

• Miko Matsumura (a.k.a. @MikoJava) claimed SOA Arrogance is Dead when he followed Anne Thomas Manes’ (@atmanes) session at the Burton Group’s Catalyst Conference on 7/29/2009 and made the following point in his 8/7/2009 post:

First and foremost, the most stupid and ignorant reading of “SOA is DEAD” is that the perspective of SOA is no longer needed in the Enterprise. This point of view is stupid, particularly when SOA is so important for mash-ups, Cloud Computing, SaaS, PaaS, BSM, IT Governance, Portfolio management and most modern IT practices.

The problem of Enterprise IT Complexity (and Entropy) *DOES* need to be solved. SOA is one of many key architectural perspectives that can make this happen.

Everything is a service (SOA) is an incredibly powerful view.

But within appropriate bounds, everything can also be appropriately viewed as a Process, an Event, an Object, a database table, or other abstraction.

The idea that an enterprise architect could become so focused on “one architecture to rule them all” is as preposterous as “one vendor to rule them all”.

In most organizations, SOA has become a bad word. Except in rare situations, SOA has failed to deliver its promised benefits. IT Groups have invested hundreds of thousands, if not millions of dollars into SOA with little return to show for it. The people holding the purse strings are fed up. Funding for these SOA initiatives has dried up.

It’s time to face reality: the term “SOA” now carries too much baggage. It’s time to declare SOA dead and move on.

So what went wrong? Was SOA really just a great failed experiment? Or did we just lose our way? Should we abandon our architectural efforts? Can we salvage any value from our past efforts?

This 2009 study has 712 pages, 211 Tables and Figures. Worldwide markets are poised to achieve significant growth as search engines use efficient automated process to drive new advertising and communications capabilities. Applications can be built without programming. …

SOA reaches into every industry and every segment of the economy via cloud computing. SOA drives innovation for the very large enterprises. Mid range size companies and very small organizations are adopting technologies similar to what the enterprise use, creating automated process to replace manual process. Cloud computing markets at $36 billion in 2008 are expected to reach $160.2 billion by 2015. …

The question is what will be the market for cloud-computing research reports in 2015?

Want to know what gets my blood pressure up? It's when there's both a huge shift in thinking around how we should do computing, namely cloud computing, and at the same time, there's a bunch of information out there that causes confusion. As cloud computing hype spikes to a frenzy, so does the number of less-than-intelligent things that I hear about it and its relationship to SOA.

We've got a herd mentality in IT. We're always chasing the next paradigm shift, which seems to come along every five years, claiming that whatever the last paradigm shift was had "failed" and that's why we're looking at something new. However, these hype-driven trends are often complementary, and so the real power is in figuring out how known approaches fit with what's new, and not look to replace, but how to build on the foundation. The best case for that scenario has been how SOA benefits cloud computing, but few understand how and why. …

To be fair, Google's Chrome OS is not the only operating system to which the cloud handle has been attached. It is merely the latest in a long line of attempts to capitalise on the growing interest and hype surrounding cloud computing.

Novell, Dell, Microsoft — in fact, anyone who is anyone with a stake in operating systems has been mentioned at least once in conjunction with a cloud operating system.

There is no such thing. It is a myth existing entirely in the minds of those who cannot seem to get enough cloud in their daily technology diets. And the problem in perpetuating that myth is that it continues to confuse an already confused market.

The state of Washington is investing $180 million to build a new data center, and not everyone is thrilled about it. Opponents wonder if cloud computing wouldn't be a cheaper alternative. Ironically, Washington is home to two of the biggest cloud service providers, Amazon.com and Microsoft.

As reported by The Olympian, a bond sale and groundbreaking for the new facility, which will also serve as the headquarters for the state's Information Services division, are imminent. Construction equipment is due to arrive on the site in Olympia within a few days.

Two state representatives, Reuven Carlyle and Hans Dunshee, tried to put a halt to the project. In a letter to Gov. Chris Gregoire, they pointed to data centers operated by Google, Microsoft, and others -- a.k.a. cloud computing centers -- as potentially cheaper alternatives. For a state that spends upwards of $1 billion annually on IT (according to Carlyle), lawmakers and taxpayers can't be blamed for balking.

Mike currently is responsible for global data center design, construction, ongoing operations and professional services for Digital Realty Trust; his past roles include similar responsibilities at Microsoft Corporation and leadership roles at Walt Disney, Rhythms NetConnections, and Nuclio Corporation (now part of Sun Microsystems).

The move to the Internet cloud will pick up steam in the next year for developers, according to a new survey from Santa Cruz, Calif.-based Evans Data Corp.

Nearly half (48-plus percent) of the 500-plus developers surveyed expect to deploy private cloud applications in the coming year. Development for the cloud is also happening now: more than 29 percent said they are currently building applications for a private cloud.

Evans Data announced some of the results on Tuesday, but the company's "Cloud Development Survey 2009" publication is expected to be released sometime next week. The survey also examines public cloud trends among developers. …

There will soon be two major paths for cloud computing providers: commodity and premium. If you read my series, Cloud Futures, you’ll know that I broke down cloud service providers into three major categories: service clouds, consumer clouds (previously ‘commodity’)[1], and focused clouds. In retrospect I realize now that there are possibly four, not three major categories. The missing category is premium enterprise clouds. Previously I had lumped these under focused clouds, but I now realize that, in fact, there are likely to be so many of these that they deserve their own category. I’ll go even further and suggest that in terms of markets targeted, there will really only be two ends of a spectrum: enterprise and non-enterprise. …

Slowly but steadily, enterprises are warming up to Cloud technologies. No, they are not queuing outside the Amazon headquarters waiting to order public cloud infrastructure like Amazon's EC2 offerings yet. But the idea of private clouds and the advantages of tapping public clouds for non-mission-critical operations such as testing are slowly making the enterprise community comfortable with Cloud Computing. In fact, a recent Gartner survey predicts that by 2012, 80 percent of Fortune 1000 enterprises will be paying for some cloud computing services and 30 percent will be paying for cloud computing infrastructure services. …

In January 2008, Waxhaw started a new Building Inspections Department. This function was moved from the county level in order to be more responsive to Waxhaw citizens while ensuring quality building construction for new developments and historic restorations.

Greg Mahar, the Director of Planning and Community Development for Waxhaw, sought software that would better enable his team to manage planning and community development in a more effective manner. After extensive research, Greg chose BasicGov web-based software because of its affordability and reliability. …

IT services providers regularly contact Saugatuck, seeking to understand the potential opportunities and limitations of introducing Cloud Computing and SaaS to their customer bases. What we see is the emergence of web-based Cloud Computing outsourcing alternatives (including SaaS) that are substantially reshaping the way technology-enabled services are purchased and used. This in turn is fundamentally shifting the IT outsourcing landscape, creating new opportunities that grow from established, traditional categories of IT outsourcing. These emergent shifts - and opportunities - include the following:

From Infrastructure Outsourcing (IO) to Infrastructure-as-a-service (IaaS) and Platform-as-a-Service (PaaS);

From Application Management Outsourcing (AMO) to PaaS and SaaS; and

From Business Process Outsourcing to Cloud Enabled Business Services and IT as a Service.

Future Saugatuck Strategic Perspectives will examine these areas of change opportunity in more detail, from both the user and provider points of view. …

It sometimes seems as if the whole world has gone cloud crazy - well at least most of the vendors, pundits and many in the media. If we listen to the evangelists, the days of the enterprise data centre are numbered and players like Google, Amazon and Microsoft will inherit the earth. Even David Cameron, the illustrious leader of the opposition to the UK government, has been talking about handing over the country's health records for storage and management to one of these big American multinationals.

In the midst of all this noise and hype, many have lost sight of the fact that getting a third party to run some of your infrastructure for you has been around for at least three decades. Indeed, those who have been taking advantage of hosted services - or on the other side of the fence, delivering them - must be wondering what all the fuss is about. Just what, exactly, is this cloud thing bringing to the party that is supposed to change the way everything works? …

Look all around and you can easily see that there is no shortage of press regarding the promises of cloud computing. Cloud evangelists have touted cloud computing as the next big thing, a game changer - a disruptive technology that will spark innovation and revolutionize the way businesses acquire and deliver IT services. The staggering volume of these sales pitches is to be expected, considering that cloud computing is at or near the peak of its hype cycle, but as with any new technology or model, reality will eventually set in and the public relations blitz will fade. As people continue to define cloud computing and debate its pros and cons, one thing is certain - one of the biggest obstacles to widespread cloud computing adoption will be security.

Articles and blog posts about security and cloud computing are a daily occurrence. When some well-publicized breach occurs in the cloud, the number of commentaries and discussions increases exponentially and then, over the following week, returns to normal frequency.

I decided to focus on security as it relates to cloud storage, to see whether something really new and different is occurring and whether overall changes need to be contemplated when it comes to classic data-security activities. When I focused in this way, I quickly discovered that not much has changed: security of data in the cloud depends on the same precautions and understandings as security of data in a private data center.

In this recent article, it was suggested that files of one owner residing on a physical device with the files of others could somehow result in unauthorized access. It could, and the answer to this and a myriad of concerns fits within traditional approaches and understandings of security.

Homeland Security Secretary Janet Napolitano isn't the federal cybersecurity czar, and has no desire to become the president's top IT security adviser. But if one of the responsibilities of the White House cybersecurity coordinator is to be the cheerleader for federal government cybersecurity initiatives, then Napolitano is filling that bill.

I’d say that Janet is angling to fill the power vacuum created by Melissa Hathaway’s resignation in advance of her anticipated appointment to the job. Melissa was the White House’s acting senior director for cyberspace.

One of the key parameters in the push to accelerate enterprise cloud adoption is the SLA (Service Level Agreement). It is an important requirement before enterprises can even think of jumping into the cloud. After a slow start, companies are coming out with SLAs for their services, but it is still a messy affair, with different companies offering varying and ambiguous terms. Recently, the US General Services Administration, part of the federal government, came up with an RFQ (Request For Quotations) that demands 99.95% uptime per month. Let us try to understand the SLA dynamics in this post and see how the government's requirement will affect the SLA game.

Microsoft requires two compute instances for a 99.95% uptime guarantee and only offers 99.9% uptime for data accessibility when Windows Azure RTMs.
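Those uptime percentages translate into concrete downtime budgets, which is worth working out before comparing SLAs. A quick sketch of the arithmetic for a 30-day month:

```python
def monthly_downtime_minutes(uptime_pct, days=30):
    """Downtime budget per month implied by an uptime percentage."""
    total_minutes = days * 24 * 60          # 43,200 minutes in a 30-day month
    return total_minutes * (1 - uptime_pct / 100)

# The two figures discussed above: GSA's 99.95% demand and 99.9% data access.
for pct in (99.9, 99.95):
    print(f"{pct}% uptime allows {monthly_downtime_minutes(pct):.1f} minutes of downtime/month")
```

So the gap between 99.9% and 99.95% is the difference between roughly 43 and 22 minutes of permitted downtime per month.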

In this white paper, Jon Oltsik, Principal Analyst at ESG, cuts through the hype and provides recommendations to protect your organization's data, with today's budget. Oltsik shows you where to start, how to focus on the real threats to data, and what actions you can take today to make a meaningful contribution to stopping data breaches.

As part of the paper's storage encryption to-do list, Oltsik details three realistic steps to provide the necessary protection for stored data based on risk.

The white paper covers:

• What are the real threats to data today
• Where do you really need to encrypt data first
• How does key management fit into your encryption plans
• What shifts in the industry and vendor developments will mean to your storage environment and strategy

While many companies are considering moving applications to the cloud, the security of the third-party services still leaves much to be desired, security experts warned attendees at last week's Black Hat Security Conference. …

"Guys at the low end are using (cloud infrastructure) to save money, but the danger is that the guys at the top end start to use it without any auditing," says Haroon Meer, technical director at security firm SensePost, who discussed his team's research into some aspects of Amazon's Elastic Compute Cloud (EC2) at the Black Hat security conference. …

It's hard to believe that it's been a year since we first created the Cloud Computing Interoperability Forum (CCIF) with the goal of defining and enabling interoperable enterprise-class cloud computing platforms through application integration and stakeholder cooperation. Over the last 12 months a lot has happened. For me the most notable change has been how the conversation has shifted from "why use the cloud" & "what is cloud computing" to how to implement it. The need for interoperability among vendors has also become a central point of discussion with the concept being included in recent US federal government cloud requirements. But like it or not the battle for an open cloud ecosystem is far from over.

Security researchers today unveiled details about a little-known but ubiquitous class of vulnerabilities that may reside in a range of Internet components, from Web applications to mobile and cloud computing platforms to documents, images and instant messaging products. [Emphasis added.]

At issue are problems with the way many hardware and software makers handle data from an open standard called XML. Short for "eXtensible Markup Language," XML has been used for many years as a fast and efficient way to transport, store and structure information across a wide range of often disparate applications.

Researchers at Codenomicon Ltd., a security testing company out of Oulu, Finland, say they found multiple critical flaws in XML "libraries," chunks of code that are typically used and re-used in software applications to process XML data. …
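To see why entity handling in XML libraries matters, consider the classic "billion laughs" pattern: a roughly kilobyte-sized document whose nested entity definitions expand to tens of gigabytes when a naive parser resolves them. A small sketch of the arithmetic (the 10-level, 10-reference, 3-byte figures are the textbook example, not Codenomicon's specific findings):

```python
def expanded_size(levels, refs_per_level, base_entity_bytes):
    """Bytes produced when nested entity definitions are fully expanded:
    each of `levels` entities references the previous level `refs_per_level` times."""
    return base_entity_bytes * refs_per_level ** levels

# A ~1 KB "billion laughs" payload: 10 nesting levels, 10 references each,
# bottoming out in a 3-byte entity ("lol").
size = expanded_size(levels=10, refs_per_level=10, base_entity_bytes=3)
print(f"{size / 10**9:.0f} GB after expansion")
```

That exponential blow-up from a tiny input is why XML parsers need expansion limits, and why a flaw in one shared XML library surfaces across so many products at once.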

[P]ublic calendar titled the Federal Technology Events Calendar. This calendar uses Google Calendar technologies so it is fast and easy to maintain, which means it should be easy to keep it up to date.

Calendar data also is available for download in XML, ICAL and HTML formats.

• The B.NET (Bangalore.net) User Group announces a series of sessions focused on Windows Azure starting on 8/8/2009:

B.NET brings you a series of sessions on the Microsoft Cloud Computing platform - Windows Azure. Spread across six sessions of 90 minutes each, this series takes you through the nuts and bolts of Windows Azure. Key Windows Azure concepts such as the Fabric, Web Roles, Worker Roles, Tables, Blobs, Queues and configuration will be covered in depth. By the end of the series, you will be able to architect cloud services for Azure or migrate your existing applications to the Azure platform.

What's more? Those members who attend all the 6 sessions and successfully complete a quiz by the end of the series stand to win cool prizes from our sponsors!

Students from all IU campuses, along with students from other universities across the US, have an opportunity to consider the implications of cloud computing for the geosciences while networking with some of the leading thinkers in the field. The Indiana University Pervasive Technology Institute Data to Insight Center (D2I) is soliciting student abstracts for an upcoming workshop titled "Cloud Computing and Collaborative Technologies in the Geosciences."

Sponsored by the National Science Foundation, the workshop will be hosted by the Pervasive Technology Institute and the Linked Environment for Atmospheric Discovery (LEAD) Project and will take place September 17-18, 2009, at the University Place Conference Center on the campus of Indiana University-Purdue University Indianapolis.

Abstracts for poster sessions will be accepted through August 20, 2009. Funding awards for travel and accommodations will be recommended to those posters targeted to: geosciences, including atmospheric, earth sciences, hydrology, environmental sciences, and climatology; collaborative technologies; and cloud computing. …

Vivek Kundra, Federal CIO, will deliver the opening keynote address on Monday, September 14, at 8:15 a.m. PT at the 2009 InformationWeek 500 Conference and Gala Awards, to be held at the St. Regis Monarch Beach Resort in Dana Point, Calif. Kundra will share his unique perspectives on getting things done within the massive federal bureaucracy, ensuring that his $75 billion annual IT budget delivers maximum value and impact. …

SalesForce have recently been heavily promoting their application development platform. The platform offers all of the benefits of cloud computing (scalability, lower costs, etc.) with the added bonus of best-in-breed CRM and sales support.

As more and more organizations look towards cloud computing to reduce their ICT spend, the model is very attractive. Add in all of the ready-to-use functionality SalesForce offers and it would appear to be the best solution, but there are some hidden costs to consider. For example, if an organization wanted to authenticate users to their Visualforce site, they would need to purchase a license for each user or pay per sign-on.

An organisation wanting to host large volumes of data would have to pay extra once they have exceeded their data allowance (typically 1GB for an enterprise license).

For SalesForce to compete with rivals such as Amazon Web Services, they will need to consider changing their price structure. That structure worked well for CRM and sales software as a service (SaaS), but they are now in the application development market, and those high prices cannot compete with rivals offering cloud storage for $0.25 per GB per month.

This is another case of miscategorizing the competition. Like Azure, the SalesForce app platform is a PaaS, not an IaaS like Amazon Web Services. Fortunately, Martin compares SalesForce’s pricing with AWS and Google App Engine in these Google Sheets. It’s unfortunate that he didn’t include Azure in the comparison.
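The storage-pricing argument can be made concrete with a quick back-of-the-envelope model. The $0.25 per GB-month commodity rate and the 1 GB enterprise allowance come from the post above; the per-GB overage rate below is a purely hypothetical placeholder, so treat this as an illustration of the cost structure rather than real Salesforce pricing.

```python
# Back-of-the-envelope storage-cost comparison (illustrative only).
# The overage rate is a hypothetical placeholder, NOT a published price.

def commodity_cost(gb, rate_per_gb=0.25):
    """Monthly cost at a flat per-GB commodity cloud rate."""
    return gb * rate_per_gb

def platform_cost(gb, included_gb=1, overage_per_gb=10.0):
    """Monthly cost with a bundled allowance plus a (hypothetical)
    per-GB overage charge once the allowance is exceeded."""
    return max(0, gb - included_gb) * overage_per_gb

for gb in (1, 10, 100):
    print(gb, commodity_cost(gb), platform_cost(gb))
```

Even with a generous assumed overage rate, a flat commodity price grows linearly while a small bundled allowance means almost all storage is billed at the premium rate, which is the heart of the pricing complaint.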

So we are working on a caching-related project on EC2, a scenario in which high performance is very important.

We are setting up a Varnish cluster on EC2 and evaluating whether it can replace an existing caching infrastructure in terms of cost and requests per second. Our benchmarks yielded some interesting results: it seemed that for our caching scenario the limiting factor is bandwidth. Varnish is very frugal with CPU/RAM consumption. We could easily deliver 500 to 600 requests per second with a small instance and have the box idle around 95% (uncompressed content).

It turns out we are limited by bandwidth and not by CPU.

In our benchmarks we were only able to push 35 MB/s on small instances, so the actual requests per second depended on the object size we were pushing. The limit was always ~35 MB/s. Our typical HTML pages were around 50 to 70 KB, so we couldn’t reach the desired requests per second, as our instance was at its bandwidth limit. …
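The bandwidth argument is easy to sanity-check: at a ~35 MB/s ceiling, the achievable request rate is just bandwidth divided by object size. A minimal Python check, using the 35 MB/s ceiling and the 50–70 KB page sizes reported in the benchmarks above:

```python
# Sanity-check the benchmark numbers: with throughput capped at ~35 MB/s,
# max requests per second = bandwidth / object size.

BANDWIDTH_KBPS = 35 * 1024  # ~35 MB/s, expressed in KB/s

def max_requests_per_second(object_kb, bandwidth_kbps=BANDWIDTH_KBPS):
    """Upper bound on request rate for a given object size, in req/s."""
    return bandwidth_kbps / object_kb

print(round(max_requests_per_second(50)))  # ~717 req/s for 50 KB pages
print(round(max_requests_per_second(70)))  # ~512 req/s for 70 KB pages
```

These bounds bracket the observed 500–600 req/s, consistent with the conclusion that the instance was bandwidth-limited rather than CPU-limited.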

Virtualization implementers found that the key bottleneck to virtual machine density is memory capacity; now there's a whole new slew of servers coming out with much larger memory footprints, removing memory as a system bottleneck. Cloud computing negates that bottleneck by removing the issue of machine density from the equation—sorting that out becomes the responsibility of the cloud provider, freeing the cloud user from worrying about it.

For cloud computing, bandwidth to and from the cloud provider is a bottleneck. We recently performed a TCO analysis for a client, evaluating whether it would make sense to migrate its application to a cloud provider. Interestingly, our analysis showed that most of the variability in the total cost was caused by assumptions about the amount of network traffic the application would use. This illustrates a key truth about computing: there's always a bottleneck, and solving one shifts the system bottleneck to another location.


Healthcare provides a good business opportunity for information technology companies, a fact reinforced by Salesforce.com (CRM - Analyst Report). The company is investing in Practice Fusion, which is involved in the business of electronic health records, health policy, health information technology and consumer medical data. Salesforce.com will invest around $10.0 million for a marginal stake in the company, which will generate around $1.0 million of revenue a year. …

FabrikamShipping. This is a fairly complete example of how to use the Windows Identity Foundation for addressing common tasks in the development of web solutions: accepting identities from an external identity provider, driving the UI using claims, invoking back-end WCF services via delegated authentication, handling claims-based authorization and so on. The sample is based on the scenario described in Kim Cameron’s PDC08 session. You can download the sample from here; a detailed description is available here.

ClaimsDrivenModifierControl. This is a sample ASP.NET control that demonstrates how you can take advantage of claims for driving the behavior of your web UX without writing any code! You can download the sample from here; a detailed description is available here; finally, if you want to see the control in action, a screencast is available here.

SecurityTokenVisualizerControl. This is a very simple ASP.NET control that can help you debug websites secured with the Windows Identity Foundation by allowing you to inspect identity information in the current context, such as the claims list, the raw XML of the incoming token, signing certificates and more. You can download the sample from here; a detailed description is available here.

They’re included in this post because any application that demonstrates Geneva technologies aids in understanding federated identity services and the like.
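Since all three samples revolve around claims-based identity, a language-neutral sketch of the pattern may help readers outside the .NET world. The snippet below is purely illustrative Python, not WIF's API; the `ClaimsIdentity` class, the claim names, and the `require_claim` helper are invented for the example. The key idea it shows is that authorization decisions consult claims asserted by a trusted issuer rather than a local user database.

```python
# Illustrative sketch of claims-based authorization (names are invented;
# WIF itself is a .NET library with a different, richer object model).

class ClaimsIdentity:
    def __init__(self, claims):
        # claims: assertions from a trusted issuer,
        # e.g. {"role": "Shipping Manager", "country": "US"}
        self.claims = claims

def require_claim(claim_type, expected):
    """Decorator: run the handler only if the identity carries the claim."""
    def wrap(handler):
        def guarded(identity, *args, **kwargs):
            if identity.claims.get(claim_type) != expected:
                raise PermissionError(
                    f"missing claim {claim_type}={expected}")
            return handler(identity, *args, **kwargs)
        return guarded
    return wrap

@require_claim("role", "Shipping Manager")
def create_shipment(identity, destination):
    return f"shipment to {destination} created"

manager = ClaimsIdentity({"role": "Shipping Manager"})
print(create_shipment(manager, "Seattle"))  # shipment to Seattle created
```

The same pattern underlies the samples above: the application never authenticates users itself; it trusts tokens from an external identity provider and drives both authorization and UI from the claims inside them.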

This post provides a step-by-step guide for hosting in Windows Azure a web application that accepts identities from an external identity provider by taking advantage of Windows Identity Foundation.

This is a provisional sample whose purpose is to give you a chance to experiment with federated scenarios in Windows Azure today, using publicly available bits (the Windows Azure July CTP and the Geneva Framework Beta 2, the codename beta release of Windows Identity Foundation). The code shown here is NOT production ready and contains various temporary compromises: you can expect many of those to become unnecessary in future releases of the products.

I’d say “Absolutely.” After retrenching, reprogramming, and restarting an hour later to get rid of the trolls, I believe all participants were more than satisfied with the chat session. Looks like the next chat is scheduled for Friday, 8/21/2009 at 12:00 noon PDT:

Steve promises to post on Monday or later a copy of the complete chat session for those of us who didn’t allow popups before logging into the chat session and lost the ability to save it.

• Simon Munro’s Windows Azure Chat Nuggets post of 8/7/2009 offers a brief summary of the topics covered in the first chat session.

At Cloudscale we're preparing for the public launch, later in the year, of the first Intercloud service, a platform for the world's realtime apps. In response to a number of requests for information, here is a brief overview of what we have been developing.

The Cloudscale platform can handle all types of realtime data, and is as easy to use as a spreadsheet. Complex analytics can be run continuously, with automatic scaling and fault tolerance to ensure realtime responsiveness. The platform offers users seamless integration from standard desktop or mobile clients to realtime intercloud apps.

As a self-service platform aimed at mass-market adoption, Cloudscale requires no database software or administration, and no IT-department delays or complexity get in the way of delivering immediate value to users. The platform will drive the democratization of data, unleashing creativity and “turning data into action” everywhere. Like the iPhone platform, Cloudscale’s AppStore will provide a commercial marketplace for the many exciting new apps developed on the platform.

Cloud Computing is one of the many things enterprise CIOs, CTOs and other engineers will master in delivering capability. I believe in the power of new Cloud Computing technologies and concepts and think we should all continue our focus there.

I have said, and still say, the same thing about design approaches like Service Oriented Architecture (SOA). The constructs, methods and models of SOA are good practices that result in good designs for enterprises. It is smart to separate data from application logic and smart to enable agility and mashups the way good SOA design does.

And then goes on to plot the declining volume of Web searches for “SOA” versus an increasing volume for “cloud computing,” and concludes:

… Your defense against this [flood of hype] will be the strength of your own position in Cloud Computing. Therefore, I strongly recommend you personally think through what Cloud Computing means to your enterprise. Also, think through your definition of cloud computing. Hopefully the NIST definition will suit your use, since the more people who form up on that the better.

What’s most interesting is that both Blumenthal and U.S. Chief Technology Officer Aneesh Chopra pushed back. The HITECH stimulus money goes to hospitals that automate records and deliver the functional requirements of a recently-passed plan on meaningful use.

The resulting Electronic Health Records (EHRs) are covered by the HIPAA law, in terms of how they can be shared. What Schmidt and Mundie want is that EHRs respond to Web standards so the records can be turned into Personal Health Records (PHRs) controlled by patients.

HealthVault is Microsoft’s consumer-focused health-records-management Software+Service platform, which the company unveiled officially in 2007. (The service component of HealthVault is one of a handful of Microsoft services that already is hosted on top of Azure.) Amalga UIS (one of the products formerly under the Azyxxi brand) is one of the main elements of Microsoft’s enterprise health-information-system platform.

I am flying home from the HealthVault Connected Care Conference in Seattle. I left with two big takeaways which will be addressed in two separate posts. It was a great trip, in fact refreshing in many ways, coming on the heels of a wonderful but intellectually strenuous series of meetings for the X PRIZE. In fact, I have never been to Seattle when it was so beautiful – perfectly warm and sunny days with intermittent cumulus clouds and light breezes whose temperature was nearly imperceptible. My favorite evening in town was spent watching sailboats glide effortlessly around the Sound in the fading sunlight of a perfect day. Magic.

Perhaps the setting got me in a good mood, but I walked away very clearly impressed with what Microsoft is attempting to do with their health care strategy. I have to be clear – as an ardent and passionate open source advocate (recovering zealot) – I was very ambivalent about stepping clearly into and over “enemy” lines during my sojourn in Redmond. I was quickly put at ease by the West Coast flavor of the meeting (i.e., casual business dress with a young-ish crowd, high energy music, and overall good karma) and the impressive lineup of speakers and attendees. Furthermore, this was the first time I was actually able to figure out what the heck HealthVault really is and how all these various partnerships I keep reading about even begin to make sense. …

So kudos to Peter Neupert and crew for the progress to date. I was impressed.

But I was also puzzled at the same time – Where (on earth) Is Google Health?

… Google Health has been nothing more than a distraction to the broader market. A distraction in that Google Health has really done very little to create a truly compelling platform, yet due to its size, market presence and media and market pundits belief that Google is the be all to end all, Google Health gets far more press and attention than it rightfully deserves …

John is managing director of Chillmark Research which describes itself as “a healthcare technology industry analyst firm focusing on personal healthcare technology that will enable citizens to take more direct responsibility for their health and the health of loved ones.”

The HealthVault Connected Care Conference was held 6/10 to 6/12/2009 at the Meydenbauer Center in Bellevue, WA. Microsoft’s site offers PDF presentations and video segments of the sessions. The US$19 billion allocated to health information technology (HIT) by the American Recovery and Reinvestment Act (ARRA) of 2009 has greatly increased interest in HIT as well as EHR and PHR implementations. Alice Lipowicz’s Is the nation's health network healthy? article of 8/7/2009 in FederalComputerWeek throws more light on the ARRA incentives and the National Health Information Network (NHIN), the Health and Human Services Department’s replacement for the Internet.

HealthVault and Practice Fusion, which was the recipient of a recent investment by Salesforce.com (see last item in this post), appear to be competitors. However, Practice Fusion offers free EHR services primarily to physicians while HealthVault provides PHR storage for patients.

• Chris Hoff (@Beaker) goes off the deep end with his Introducing the “Cloud For Clunkers Program” post of 8/8/2009:

As compelling as the offer of Cloud may be, in order to pull off incentivizing large enterprises to think differently, it requires an awful lot going on under the covers to provide this level of abstracted awesomeness; a ton of heavy lifting and the equipment and facilities to go with it. [Emphasis added.] …

To get ready for the gold rush, most of the top-tier IaaS/PaaS Cloud providers are building data processing MegaCenters around the globe in order to provide these services, investing billions of dollars to do so…all supposedly so you don’t have to. Remember, however, that service providers make money by squeezing the most out of you while providing as little as they need to in order to ensure the circle of life continues. Note, this is not an indictment of that practice, as $deity knows I’ve done enough of that myself, but just because it has the word “Cloud” in front of it does not make it any different from a business case. Live by the ARPU, die by the ARPU. …

• DatacenterDynamics’ The American cloud’s weakest link post of 8/7/2009 has “Dark fiber developer brings the cloud back to the ground and shares his broadband stimulus experience” as its deck:

An essential element is often left out of excited industry discourse around cloud-computing. Is the physical network infrastructure in the country sufficient to support visions of the future cloud?

Allied Fiber CEO Hunter Newby offered a rather sobering view of reality at the DatacenterDynamics conference in Seattle, Wash., Thursday. The company specializes in building out dark fiber infrastructure.

“Without physical there is no virtual,” Newby said. “Without fiber there is no cloud.”

“Moving apps into the cloud is very dangerous if you don’t know your physical fiber route. You can be buying from three or four different providers but there’s only one path, and everybody else is tied to that path too, and you think you’re redundant and diverse, but in fact, you’re not. Those are very basic questions you need to ask before you do anything in higher layers. I believe that if you’re not aware of the basic fundamental things that are very simple to understand, your entire business that you build above it is in jeopardy.”

Fiber-rich patches, such as coastal areas, are sporadically spread around the country, enabling a healthy amount of competition in those areas but connectivity outside of those areas leaves a lot to be desired. “And no one company can afford to build out the proper infrastructure to make it all work and that’s the problem.”

Of $787 billion the U.S. government allocated to stimulating the economy, $7.2 billion was dedicated to developing the country’s broadband infrastructure. Newby feels that, while a lot can be accomplished with $7.2 billion, the amount is insufficient for satisfying the country’s broadband needs. …

I enjoyed James Urquhart's post, "In cloud computing, data is not electricity," which points out some of the sillier analogies we're seeing in the emerging cloud computing space. Specifically, Urquhart refers to Nick Carr's classic vision of cloud computing, "The Big Switch," which compares traditional on-premise computing as generating your own power to cloud computing as using the standard power grid.

"However, some have taken electricity as an analogy to cloud adoption to an extreme, and declared that there will be a massive and sudden shift from corporate datacenters to entirely external cloud computing environments -- public cloud utilities, if you will. They are wrong," Urquhart writes.

Citing an unfavorable change in tax laws, Microsoft is moving its Windows Azure cloud from a data center in Washington state to one in Texas. It's an interesting new twist in the cloud computing market—moving a cloud across state lines in response to the regulatory climate.

Of course, the problem will be that there will only be a single US data center available for some time, which means that geolocation for disaster recovery won’t be an option for early Azure adopters.

• Miko Matsumura (a.k.a. @MikoJava) claimed SOA Arrogance is Dead when he followed Anne Thomas Manes’ (@atmanes) session at the Burton Group’s Catalyst Conference on 7/29/2009 and made the following point in his 8/7/2009 post:

First and foremost, the most stupid and ignorant reading of “SOA is DEAD” is that the perspective of SOA is no longer needed in the Enterprise. This point of view is stupid, particularly when SOA is so important for mash-ups, Cloud Computing, SaaS, PaaS, BSM, IT Governance, Portfolio management and most modern IT practices.

The problem of Enterprise IT Complexity (and Entropy) *DOES* need to be solved. SOA is one of many key architectural perspectives that can make this happen.

Everything is a service (SOA) is an incredibly powerful view.

But within appropriate bounds, everything can also be appropriately viewed as a Process, an Event, an Object, a database table, or other abstraction.

The idea that an enterprise architect could become so focused on “one architecture to rule them all” is as preposterous as “one vendor to rule them all”.

In most organizations, SOA has become a bad word. Except in rare situations, SOA has failed to deliver its promised benefits. IT Groups have invested hundreds of thousands, if not millions of dollars into SOA with little return to show for it. The people holding the purse strings are fed up. Funding for these SOA initiatives has dried up.

It’s time to face reality: the term “SOA” now carries too much baggage. It’s time to declare SOA dead and move on.

So what went wrong? Was SOA really just a great failed experiment? Or did we just lose our way? Should we abandon our architectural efforts? Can we salvage any value from our past efforts?

This 2009 study has 712 pages and 211 tables and figures. Worldwide markets are poised to achieve significant growth as search engines use efficient automated processes to drive new advertising and communications capabilities. Applications can be built without programming. …

SOA reaches into every industry and every segment of the economy via cloud computing. SOA drives innovation for the very large enterprises. Mid range size companies and very small organizations are adopting technologies similar to what the enterprise use, creating automated process to replace manual process. Cloud computing markets at $36 billion in 2008 are expected to reach $160.2 billion by 2015. …

The question is what will be the market for cloud-computing research reports in 2015?

Want to know what gets my blood pressure up? It's when there's both a huge shift in thinking around how we should do computing, namely cloud computing, and at the same time, there's a bunch of information out there that causes confusion. As cloud computing hype spikes to a frenzy, so does the number of less-than-intelligent things that I hear about it and its relationship to SOA.

We've got a herd mentality in IT. We're always chasing the next paradigm shift, which seems to come along every five years, claiming that whatever the last paradigm shift was had "failed" and that's why we're looking at something new. However, these hype-driven trends are often complementary, so the real power is in figuring out how known approaches fit with what's new: not looking to replace the old, but building on its foundation. The best case for that scenario has been how SOA benefits cloud computing, but few understand how and why. …

To be fair, Google's Chrome OS is not the only operating system to which the cloud handle has been attached. It is merely the latest in a long line of attempts to capitalise on the growing interest and hype surrounding cloud computing.

Novell, Dell, Microsoft — in fact, anyone who is anyone with a stake in operating systems has been mentioned at least once in conjunction with a cloud operating system.

There is no such thing. It is a myth existing entirely in the minds of those who cannot seem to get enough cloud in their daily technology diets. And the problem in perpetuating that myth is that it continues to confuse an already confused market.

The state of Washington is investing $180 million to build a new data center, and not everyone is thrilled about it. Opponents wonder if cloud computing wouldn't be a cheaper alternative. Ironically, Washington is home to two of the biggest cloud service providers, Amazon.com and Microsoft.

As reported by The Olympian, a bond sale and groundbreaking for the new facility, which will also serve as the headquarters for the state's Information Services division, are imminent. Construction equipment is due to arrive on the site in Olympia within a few days.

Two state representatives, Reuven Carlyle and Hans Dunshee, tried to put a halt to the project. In a letter to Gov. Chris Gregoire, they pointed to data centers operated by Google, Microsoft, and others -- a.k.a. cloud computing centers -- as potentially cheaper alternatives. For a state that spends upwards of $1 billion annually on IT (according to Carlyle), lawmakers and tax payers can't be blamed for balking.

Mike currently is responsible for the global data center design, construction, ongoing operations and professional services for Digital Realty Trust; his past roles include similar responsibilities at Microsoft Corporation and leadership roles at Walt Disney, Rhythms NetConnections, and Nuclio Corporation (now part of Sun Microsystems).

The move to the Internet cloud will pick up steam in the next year for developers, according to a new survey from Santa Cruz, Calif.-based Evans Data Corp.

Nearly half (48-plus percent) of the 500-plus developers surveyed expect to deploy private cloud applications in the coming year. Development for the cloud is also happening now: more than 29 percent said they are currently building applications for a private cloud.

Evans Data announced some of the results on Tuesday, but the company's "Cloud Development Survey 2009" publication is expected to be released sometime next week. The survey also examines public cloud trends among developers. …

There will soon be two major paths for cloud computing providers: commodity and premium. If you read my series, Cloud Futures, you’ll know that I broke down cloud service providers into three major categories: service clouds, consumer clouds (previously ‘commodity’)[1], and focused clouds. In retrospect I realize now that there are possibly four, not three major categories. The missing category is premium enterprise clouds. Previously I had lumped these under focused clouds, but I now realize that, in fact, there are likely to be so many of these that they deserve their own category. I’ll go even further and suggest that in terms of markets targeted, there will really only be two ends of a spectrum: enterprise and non-enterprise. …

Slowly but steadily, enterprises are warming up to cloud technologies. No, they are not queuing outside Amazon headquarters waiting to order public cloud infrastructure like Amazon's EC2 offerings, yet. But the idea of private clouds, and the advantages of tapping public clouds for non-mission-critical operations like testing, are slowly making the enterprise community comfortable with cloud computing. In fact, a recent Gartner survey predicts that by 2012, 80 percent of Fortune 1000 enterprises will be paying for some cloud computing services and 30 percent will be paying for cloud computing infrastructure services. …

In January 2008, Waxhaw started a new Building Inspections Department. This function was moved from the county level in order to be more responsive to Waxhaw citizens while ensuring quality building construction for new developments and historic restorations.

Greg Mahar, the Director of Planning and Community Development for Waxhaw, sought software that would better enable his team to manage planning and community development in a more effective manner. After extensive research, Greg chose BasicGov web-based software because of its affordability and reliability. …

IT services providers regularly contact Saugatuck, seeking to understand the potential opportunities and limitations of introducing Cloud Computing and SaaS to their customer bases. What we see is the emergence of web-based Cloud Computing outsourcing alternatives (including SaaS) that are substantially reshaping the way technology-enabled services are purchased and used. This in turn is fundamentally shifting the IT outsourcing landscape, creating new opportunities that grow from established, traditional categories of IT outsourcing. These emergent shifts - and opportunities - include the following:

From Infrastructure Outsourcing (IO) to Infrastructure-as-a-service (IaaS) and Platform-as-a-Service (PaaS);

From Application Management Outsourcing (AMO) to PaaS and SaaS; and

From Business Process Outsourcing to Cloud Enabled Business Services and IT as a Service.

Future Saugatuck Strategic Perspectives will examine these areas of change opportunity in more detail, from both the user and provider points of view. …

It sometimes seems as if the whole world has gone cloud crazy - well at least most of the vendors, pundits and many in the media. If we listen to the evangelists, the days of the enterprise data centre are numbered and players like Google, Amazon and Microsoft will inherit the earth. Even David Cameron, the illustrious leader of the opposition to the UK government, has been talking about handing over the country's health records for storage and management to one of these big American multinationals.

In the midst of all this noise and hype, many have lost sight of the fact that getting a third party to run some of your infrastructure for you has been around for at least three decades. Indeed, those who have been taking advantage of hosted services - or on the other side of the fence, delivering them - must be wondering what all the fuss is about. Just what, exactly, is this cloud thing bringing to the party that is supposed to change the way everything works? …

Look all around and you can easily see that there is no shortage of press regarding the promises of cloud computing. Cloud evangelists have touted cloud computing as the next big thing, a game changer - a disruptive technology that will spark innovation and revolutionize the way businesses acquire and deliver IT services. The staggering volume of these sales pitches is to be expected, considering that cloud computing is at or near the peak of its hype cycle, but as with any new technology or model, reality will eventually set in and the public relations blitz will fade. As people continue to define cloud computing and debate its pros and cons, one thing is certain - one of the biggest obstacles to widespread cloud computing adoption will be security.

Articles and blog posts on security and cloud computing are a daily occurrence. If some well-publicized breach occurs in the cloud, the number of commentaries and discussions will increase exponentially, and then, over the following week, return to normal frequency.

I decided to focus on security as it relates to cloud storage, to see if something really new and different is occurring and whether overall changes need to be contemplated when it comes to classic data security activities. When I focused in this way, I quickly discovered that not much has changed: security of data in the cloud depends on the same precautions and understandings as security of your data in a private data center.

In this recent article, it was suggested that files of one owner residing on a physical device with the files of others could somehow result in unauthorized access. It could, and the answer to this and a myriad of concerns fits within traditional approaches and understandings of security.

Homeland Security Secretary Janet Napolitano isn't the federal cybersecurity czar, and has no desire to become the president's top IT security adviser. But if one of the responsibilities of the White House cybersecurity coordinator is to be the cheerleader for federal government cybersecurity initiatives, then Napolitano is filling that bill.

I’d say that Janet is angling to fill the power vacuum created by Melissa Hathaway’s resignation in advance of her anticipated appointment to the job. Melissa was the White House’s acting senior director for cyberspace.

One of the key parameters in the push to accelerate enterprise cloud adoption is the SLA (Service Level Agreement). It is an important requirement before enterprises can even think of jumping into the cloud. After a slow start, companies are coming out with SLAs for their services, but it is still a messy affair, with different companies offering varying and ambiguous terms. Recently, the US General Services Administration, part of the federal government, came up with an RFQ (Request For Quotations) that demands 99.95% uptime per month. Let us try to understand the SLA dynamics in this post and see how the government's requirement will affect the SLA game.

Microsoft requires two compute instances for a 99.95% uptime guarantee and only offers 99.9% uptime for data accessibility when Windows Azure RTMs.
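It helps to translate those uptime percentages into concrete downtime budgets. The arithmetic below assumes a 30-day month and shows what 99.95% versus 99.9% actually allows:

```python
# Convert an SLA uptime percentage into allowed downtime per month.

def allowed_downtime_minutes(uptime_pct, days=30):
    """Minutes of permitted downtime in a `days`-day billing month."""
    total_minutes = days * 24 * 60  # 43,200 minutes in a 30-day month
    return total_minutes * (1 - uptime_pct / 100)

print(round(allowed_downtime_minutes(99.95), 1))  # 21.6 minutes/month
print(round(allowed_downtime_minutes(99.9), 1))   # 43.2 minutes/month
```

In other words, the GSA's 99.95% requirement leaves a provider roughly 22 minutes of slack per month, while a 99.9% data-accessibility guarantee allows about twice that.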

In this white paper, Jon Oltsik, Principal Analyst at ESG, cuts through the hype and provides recommendations to protect your organization's data, with today's budget. Oltsik shows you where to start, how to focus on the real threats to data, and what actions you can take today to make a meaningful contribution to stopping data breaches.

As part of the paper's storage encryption to-do list, Oltsik details three realistic steps to provide the necessary protection for stored data based on risk.

The white paper covers:

• What are the real threats to data today
• Where do you really need to encrypt data first
• How does key management fit into your encryption plans
• What shifts in the industry and vendor developments will mean to your storage environment and strategy

While many companies are considering moving applications to the cloud, the security of the third-party services still leaves much to be desired, security experts warned attendees at last week's Black Hat Security Conference. …

"Guys at the low end are using (cloud infrastructure) to save money, but the danger is that the guys at the top end start to use it without any auditing," says Haroon Meer, technical director at security firm SensePost, who discussed his team's research into some aspects of Amazon's Elastic Compute Cloud (EC2) at the Black Hat security conference. …

It's hard to believe that it's been a year since we first created the Cloud Computing Interoperability Forum (CCIF) with the goal of defining and enabling interoperable enterprise-class cloud computing platforms through application integration and stakeholder cooperation. Over the last 12 months a lot has happened. For me the most notable change has been how the conversation has shifted from "why use the cloud" & "what is cloud computing" to how to implement it. The need for interoperability among vendors has also become a central point of discussion with the concept being included in recent US federal government cloud requirements. But like it or not the battle for an open cloud ecosystem is far from over.

Security researchers today unveiled details about a little-known but ubiquitous class of vulnerabilities that may reside in a range of Internet components, from Web applications to mobile and cloud computing platforms to documents, images and instant messaging products. [Emphasis added.]

At issue are problems with the way many hardware and software makers handle data from an open standard called XML. Short for "eXtensible Markup Language," XML has been used for many years as a fast and efficient way to transport, store and structure information across a wide range of often disparate applications.

Researchers at Codenomicon Ltd., a security testing company out of Oulu, Finland, say they found multiple critical flaws in XML "libraries," chunks of code that are typically used and re-used in software applications to process XML data. …
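The article doesn't include code, but one well-known member of this vulnerability class is uncontrolled entity expansion (the "billion laughs" attack), in which a small XML document with nested entity definitions expands into gigabytes of data inside the parser. A minimal, deliberately coarse defensive sketch in Python — rejecting any DTD in untrusted input, since the DTD is where entity definitions live — assuming your application never needs to accept DTDs (the function name and the string-based check are illustrative, not from the article):

```python
import xml.etree.ElementTree as ET

def parse_untrusted_xml(data: str) -> ET.Element:
    """Parse XML from an untrusted source, refusing any document that
    declares a DTD. Entity-expansion payloads ('billion laughs') require
    a DTD, so rejecting DTDs wholesale blocks that class of attack."""
    if "<!DOCTYPE" in data:
        raise ValueError("refusing XML with a DTD from an untrusted source")
    return ET.fromstring(data)

# A plain document parses normally:
root = parse_untrusted_xml("<a><b/></a>")

# A document carrying entity definitions is rejected before parsing:
payload = '<!DOCTYPE x [<!ENTITY a "aaaa">]><x>&a;</x>'
try:
    parse_untrusted_xml(payload)
except ValueError as e:
    print(e)
```

Note the string check is intentionally blunt (it would also reject a legitimate `<!DOCTYPE` appearing inside CDATA); hardened XML libraries make this decision at the parser level rather than by inspecting raw text.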

[P]ublic calendar titled the Federal Technology Events Calendar. This calendar uses Google Calendar technologies so it is fast and easy to maintain, which means it should be easy to keep it up to date.

Calendar data also is available for download in XML, ICAL and HTML formats.

• The B.NET (Bangalore.net) User Group announces a series of sessions focused on Windows Azure starting on 8/8/2009:

B.NET brings you a series of sessions on the Microsoft cloud computing platform, Windows Azure. Spread across six sessions of 90 minutes each, this series takes you through the nuts and bolts of Windows Azure. Key Windows Azure concepts such as the Fabric, Web Role, Worker Role, Tables, Blobs, Queues and configuration will be covered in depth. By the end of the series, you will be able to architect cloud services for Azure or migrate your existing applications to the Azure platform.

What's more, members who attend all six sessions and successfully complete a quiz at the end of the series stand to win cool prizes from our sponsors!

Students from all IU campuses and other university students from across the US have an opportunity to consider the implications of cloud computing for the geosciences while networking with some of the leading thinkers in the field. The Indiana University Pervasive Technology Institute Data to Insight Center (D2I) is soliciting student abstracts for an upcoming workshop titled "Cloud Computing and Collaborative Technologies in the Geosciences."

Sponsored by the National Science Foundation, the workshop will be hosted by the Pervasive Technology Institute and the Linked Environment for Atmospheric Discovery (LEAD) Project and will take place September 17-18, 2009, at the University Place Conference Center on the campus of Indiana University-Purdue University Indianapolis.

Abstracts for poster sessions will be accepted through August 20, 2009. Funding awards for travel and accommodations will be recommended to those posters targeted to: geosciences, including atmospheric, earth sciences, hydrology, environmental sciences, and climatology; collaborative technologies; and cloud computing. …

Vivek Kundra, Federal CIO, will deliver the opening keynote address on Monday, September 14, at 8:15 a.m. PT at the 2009 InformationWeek 500 Conference and Gala Awards, to be held at the St. Regis Monarch Beach Resort in Dana Point, Calif. Kundra will share his unique perspectives on getting things done within the massive federal bureaucracy, ensuring that his $75 billion annual IT budget delivers maximum value and impact. …

SalesForce has recently been heavily promoting its application development platform. The platform offers all of the benefits of cloud computing (scalability, lower costs, etc.) with the added bonus of best-of-breed CRM and sales support.

As more and more organizations look toward cloud computing to reduce their ICT spend, it is an attractive option. Add in all of the ready-to-use functionality SalesForce offers and it would appear to be the best solution, but there are some hidden costs to consider. For example, if an organization wanted to authenticate users to its Visualforce site, it would need to purchase a license for each user or pay per sign-on.

An organization wanting to host large volumes of data would have to pay extra once it exceeds its data allowance (typically 1 GB for an enterprise license).

For SalesForce to compete with rivals such as Amazon Web Services, it will need to consider changing its price structure. That structure worked well in the past for CRM and sales software as a service (SaaS), but SalesForce is now in the application development market, and those high prices cannot compete with rivals offering cloud storage for $0.25 per GB per month.

This is another case of miscategorizing the competition. Like Azure, the SalesForce app platform is a PaaS, not an IaaS like Amazon Web Services. Fortunately, Martin compares SalesForce's pricing with AWS and Google App Engine in a Google spreadsheet. It's unfortunate that he didn't include Azure in the comparison.
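The storage-pricing gap can be made concrete with a toy calculation, using the $0.25 per GB per month commodity rate and the ~1 GB enterprise-license allowance quoted above; the 100 GB workload is a hypothetical figure chosen for illustration:

```python
# Toy arithmetic around the storage prices quoted above.
COMMODITY_RATE = 0.25   # $/GB-month, commodity cloud storage (from the post)
INCLUDED_GB = 1         # typical enterprise-license allowance (from the post)

def monthly_commodity_cost(gb: float, rate: float = COMMODITY_RATE) -> float:
    """Cost of hosting `gb` gigabytes for one month at commodity rates."""
    return gb * rate

workload_gb = 100  # hypothetical application dataset
print(f"Commodity storage: ${monthly_commodity_cost(workload_gb):.2f}/month")
print(f"GB billed as overage on the platform: {workload_gb - INCLUDED_GB}")
```

At commodity rates 100 GB costs $25/month; a platform whose license bundles only ~1 GB must price the remaining 99 GB of overage against that number to stay competitive.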

So we are working on a caching-related project on EC2. In this scenario, high performance is very important.

We are setting up a Varnish cluster on EC2 to evaluate whether it can replace an existing caching infrastructure in terms of cost and requests per second. Our benchmarks yielded some interesting results: for our caching scenario, the limiting factor is bandwidth. Varnish is very frugal with CPU/RAM consumption; we could easily deliver 500 to 600 requests per second with a small instance while the box idled around 95% (uncompressed content).

It turns out we are limited by bandwidth and not by CPU.

In our benchmarks we were only able to push 35 MB/s on small instances, so the actual requests per second depended on the size of the objects we were pushing; the limit was always ~35 MB/s. Our typical HTML pages were around 50 to 70 KB, so we couldn't reach the desired requests per second because our instance was at its bandwidth limit. …
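The numbers in the post are easy to sanity-check: a back-of-the-envelope calculation using the observed ~35 MB/s cap and the stated 50–70 KB page sizes reproduces the 500–600 requests/s ceiling:

```python
# Back-of-the-envelope check of the bandwidth ceiling described above.
# Figures (35 MB/s cap, 50-70 KB pages) are taken from the post.

BANDWIDTH_LIMIT_MBPS = 35      # observed small-instance cap, in MB/s
PAGE_SIZES_KB = [50, 60, 70]   # typical uncompressed HTML page sizes

for size_kb in PAGE_SIZES_KB:
    # Convert MB/s to KB/s, then divide by per-response size.
    max_rps = (BANDWIDTH_LIMIT_MBPS * 1024) / size_kb
    print(f"{size_kb} KB pages -> ~{max_rps:.0f} requests/s ceiling")
```

For 60 KB pages this gives roughly 597 requests/s, consistent with the 500–600 requests/s the authors measured before hitting the bandwidth wall.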

Virtualization implementers found that the key bottleneck to virtual machine density is memory capacity; now there's a whole new slew of servers coming out with much larger memory footprints, removing memory as a system bottleneck. Cloud computing negates that bottleneck by removing the issue of machine density from the equation—sorting that out becomes the responsibility of the cloud provider, freeing the cloud user from worrying about it.

For cloud computing, bandwidth to and from the cloud provider is a bottleneck. We recently performed a TCO analysis for a client, evaluating whether it would make sense to migrate its application to a cloud provider. Interestingly, our analysis showed that most of the variability in the total cost was caused by assumptions about the amount of network traffic the application would use. This illustrates a key truth about computing: there's always a bottleneck, and solving one shifts the system bottleneck to another location.

Healthcare provides a good business opportunity to information technology companies, a fact reinforced by Salesforce.com (CRM - Analyst Report). The company is investing in Practice Fusion, which is involved in the business of electronic health records, health policy, health information technology and consumer medical data topics. Salesforce.com will invest around $10.0 million for a marginal stake in the company, which will generate around $1.0 million of revenue a year for the company. …

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.