This is part 1 of 2 screencasts I recorded that show a Microsoft .NET developer how to use the new Windows Azure CloudDrive feature. In this video I go over the details of how to create a VHD file in Windows 7 and the code you need to upload it as a Page Blob in Windows Azure Storage.
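The page-blob mechanics behind that upload step can be sketched without the storage SDK: page blobs are written in 512-byte-aligned pages, and (as I recall) each Put Page request is capped at around 4 MB, so an uploader iterates over aligned byte ranges. A minimal Python sketch, with a made-up VHD size:

```python
# Sketch only: the 512-byte page alignment and the 4 MB per-request
# cap are assumptions about the Page Blob API, not verified limits.
PAGE_SIZE = 512
MAX_PUT = 4 * 1024 * 1024  # assumed Put Page size limit

def page_ranges(vhd_size, chunk=MAX_PUT):
    """Yield (start, end_inclusive) byte ranges for uploading a VHD
    as a page blob, padding the length to a 512-byte boundary."""
    padded = ((vhd_size + PAGE_SIZE - 1) // PAGE_SIZE) * PAGE_SIZE
    start = 0
    while start < padded:
        end = min(start + chunk, padded)
        yield (start, end - 1)  # HTTP Range headers are inclusive
        start = end

ranges = list(page_ranges(5 * 1024 * 1024))  # a hypothetical 5 MB VHD
```

Each range would become one Put Page request; sparse (all-zero) ranges can simply be skipped, which is part of what makes page blobs a good fit for VHDs.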

A few days ago Scott and I recorded a "[H]anselminutes" episode where we spent some time talking about OData. Scott came armed with a whole bunch of questions to help tease apart the current transition we're going through regarding product names. We also touched on various topics, ranging from the layering of the system in our .NET implementation to the relationship of OData to the REST architectural style.

This was a good warm-up exercise for next week. We'll have a lot to say about OData at Mix 2010 in Las Vegas. If you are planning on attending Mix you won't want to miss the keynotes and OData sessions that are happening on Tuesday. We'll also be hanging out in the common areas during the event so if you're into this topic and would like to chat definitely come and find us.

From Show #205’s description:

Astoria, ADO.NET Data Services and OData - what's the difference and the real story? How does OData work and when should I use it? When do I use OData and when do I use WCF? Scott gets the scoop from the architect himself, Pablo Castro.

This week Telerik released a new LINQ implementation that is simple to use and produces domain models very fast. Built on top of the enterprise-grade OpenAccess ORM, it can connect to any database that OpenAccess can connect to, such as SQL Server, MySQL, Oracle, SQL Azure, VistaDB, etc. While this is a separate LINQ implementation from traditional OpenAccess Entities, you can use the visual designer without ever interacting with OpenAccess; however, you can always hook into the advanced ORM features like caching, fetch plan optimization, etc., if needed.

Just to show off how easy our LINQ implementation is to use, I will walk you through building an OData feed using “Data Services Update for .NET Framework 3.5 SP1”. (Memo to Microsoft: P-L-E-A-S-E hire someone from Apple to name your products.) How easy is it? If you have a fast machine, are skilled with the mouse, and type fast, you can do this in about 60 seconds via three easy steps. (I promise that in about 2-3 weeks you will be able to do this in less than 30 seconds. Stay tuned for that.)

Steve continues with step-by-step instructions for creating an OData feed from the Northwind sample database.

In this blog post, we'll give an introduction to working with data in cloud solutions.

Overview

Working with data is a critical part of most solutions. In a cloud solution, we can adopt most of the guidelines we already have for on-premises solutions. However, cloud solutions also have their own unique use cases for working with data. In this post, we will discuss the following use cases:

Expose your cloud data to the rest of the world.

Expose your on-premises data to your cloud applications.

Common considerations

In either use case, there are a few common considerations that you need to settle before going on.

Choose a protocol

In an SOA world, the most important concept is contract. In a cloud world, when it comes to communication, the most important concept is also contract. When there is a common contract that is adopted by lots of cloud applications, we call it a protocol.

In the data communication scenario, if you choose Microsoft cloud solution, the recommended protocol is the Open Data Protocol (OData). Based on open standards such as HTTP and AtomPub, OData provides a consistent solution to deliver data across multiple platforms. If your cloud service exposes data using the OData protocol, the rest of the world can consume your data using the same solution as they consume other OData compatible cloud services. Likewise, OData provides the ability for your cloud applications to consume your on-premises data in a consistent manner.
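To make "consistent manner" concrete: an OData consumer on any platform just composes a URI from the entity set name and system query options, then parses the Atom (or JSON) response. A hypothetical Python sketch (the service root and entity set here are invented for illustration):

```python
from urllib.parse import quote

def odata_query(service_root, entity_set, **options):
    """Compose an OData query URI from system query options
    such as $filter and $top."""
    qs = "&".join("$%s=%s" % (k, quote(str(v)))
                  for k, v in sorted(options.items()))
    return "%s/%s?%s" % (service_root.rstrip("/"), entity_set, qs)

# Hypothetical Northwind-style service root:
url = odata_query("http://example.org/nw.svc", "Products",
                  filter="UnitPrice gt 20", top=5)
```

The point of the protocol is that this same URI shape works regardless of whether the back end is Table Storage, SharePoint, or SQL Server 2008 R2.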

A lot of products are already using OData. Just to name a few: Windows Azure Table Storage, Dallas, SharePoint 2010, SQL Server 2008 R2, and so on.

If you want to choose other protocols, it is important to investigate how scalable the protocol is, what's the adoption rate, and so on.

Colbertz continues with these topics:

Choose a technology

Expose your cloud data to the rest of the world

Expose your on-premises data to your cloud applications

You can download a sample from All-In-One Code Framework (Azure).zip that demonstrates how to expose your cloud data stored in Windows Azure Table Storage to the rest of the world using WCF Data Services. The sample name is: CSAzureTableStorageWCFDS/VBAzureTableStorageWCFDS.

Ian Robinson’s slide deck from his The Counterintuitive Web presentation about REST principles at QCon London 2010 on 3/11/2010 includes this abstract:

The Web doesn't care for your finely-honed application architecture principles - for your orthodox tell-don't-ask, information hiding dictums, separated concerns, and guaranteed and reliable delivery strategies. It's an irresponsible place, where exposing your data, polling for results and making your errors the client's problem are considered acceptable behaviour. If it wasn't so successful, it'd be dismissed as an architectural clown. But despite its disregard for polite architectural society, it consistently beats your enterprise application efforts - and all at massive scale. It's time to find out why.

Ian Robinson is a RESTful development specialist and a Principal Consultant with ThoughtWorks, where he focuses on the design and delivery of service-oriented and distributed systems.

He has written guidance for Microsoft on implementing integration patterns with Microsoft technologies, and has published articles on business-oriented development methodologies and distributed systems design - most recently in The ThoughtWorks Anthology (Pragmatic Programmers, 2008). He is currently co-authoring a book on RESTful enterprise integration.

I’m just back from QCon London, a software development conference with an agile flavour that I enjoy because it is not vendor-specific. Conferences like this are energising; they make you re-examine what you are doing and may kick you into a better place. Here’s what I noticed this year.

Robert C Martin from Object Mentor gave the opening keynote, on software craftsmanship. His point is that code should not just work; it should be good. He is delightfully opinionated. Certification, he says, provides value only to certification bodies. If you want to know whether someone has the skills you want, talk to them.

Martin also came up with a bunch of tips for how to write good code, things like not having more than two arguments to a function and never a boolean. I’ve written these up elsewhere.

Next I looked into the non-relational database track and heard Geir Magnusson explain why he needed Project Voldemort, a distributed key-value storage system, to get his ecommerce site to scale. Non-relational, or NoSQL, is a big theme these days; database managers like CouchDB and MongoDB are getting a lot of attention. I would like to have spent more time on this track, but there was too much else on, a perennial problem with QCon. …

So, this post will really serve two purposes. The first is to serve as a shameless plug for some of the sessions coming up this week in Las Vegas for SQL Azure. My friend and colleague David Robinson will be in town giving what I can only tell you will be an EXTREMELY cool demo of some upcoming features that we’ll be releasing over the next couple of weeks. Trust me… You will want to attend. I’ve linked some of the relevant SQL Azure sessions below to make them easier to locate for those of you attending.

The second purpose here is to go over some usability features that we’ve added to the service recently (mostly in our last service update, but it continues in our upcoming one). The first is that you now have the ability to easily change the SKU of any database you previously created. This was a pretty big piece of feedback we heard from you and ended up being a holiday project of mine.

As you know, you can easily determine the number of databases you have of a particular SKU using a query like the following in your master database.

select * from sys.database_usage where time = '2010-03-13'

So, let’s say that you presently have a Web SKU database and you would like to switch to a Business SKU database. You can now easily do this using the following syntax:

alter database <database_name> modify (MAXSIZE=10GB)

Pretty simple right? Hopefully you think it is.

The team is beginning to look at opening other alter capabilities that make sense. Some of the ones high on our list, and I’d like to hear feedback on this, include:

DB Rename, Read-Only DB access, DBO access only mode, etc.

Again, I can’t stress this enough: we love to hear feedback on the features we take on, and on the priorities and pain points you have with the service. As always, please feel free to drop me feedback here or, even better, here for others to comment on.

Over the last few months I have had the opportunity to ramp up significantly on SQL Azure. In fact I will be the co-author of Pro SQL Azure, published by Apress. This is going to be a book on how to best leverage SQL Azure, both from a technology and design standpoint. Talking about design, one of the things I realized is that understanding the key limitations and boundary parameters of Azure in general, and more specifically SQL Azure, will play an important role in making sound design decisions that both meet reasonable performance requirements and minimize the costs associated with running a cloud computing solution.

The book touches on many design considerations including link encryption, pricing model, design patterns, and also some important performance techniques that need to be leveraged when developing in Azure, including Caching, Lazy Properties and more.

Finally I started working with Shards and how to implement them in Azure to ensure database scalability beyond the current size limitations. Implementing shards is not simple, and the book will address how to create a shard technology within your code to provide a scale-out mechanism for your SQL Azure databases.
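To give a flavor of what shard routing in application code involves, here is a minimal Python sketch with made-up connection strings: a stable hash maps each key to one database, so the same key always lands on the same shard. (A plain modulo scheme like this makes adding shards painful, which is exactly why the book treats sharding as a design topic rather than a one-liner.)

```python
import hashlib

def shard_for(key, connection_strings):
    """Route a sharding key to one of several databases using a
    stable hash, so the same key always lands on the same shard."""
    digest = hashlib.md5(str(key).encode("utf-8")).hexdigest()
    return connection_strings[int(digest, 16) % len(connection_strings)]

# Hypothetical SQL Azure connection strings:
shards = ["Server=tcp:srv1.example;Database=app_shard0",
          "Server=tcp:srv2.example;Database=app_shard1"]
chosen = shard_for("customer-42", shards)
```

The application would then open a connection to `chosen` and run the query there; cross-shard queries and rebalancing are where the real design work lies.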

As you can see, there are many factors to consider when designing a SQL Azure database. While we can think of SQL Azure as a cloud version of SQL Server, it is best to look at it as a new platform to make sure you don’t make any assumptions on how to best leverage it.

You can download a sample from All-In-One Code Framework (Azure).zip that demonstrates how to expose your on-premises data stored in SQL Server to the cloud. The sample name is: CSAzureServiceBusWCFDS/VBAzureServiceBusWCFDS. The sample also provides an ASP.NET client that you can use to test the service.

Today the AppFabric team has released the AppFabric LABS environment. This is a new environment which the team will use to showcase some early bits and get feedback from the community. Usage for this environment will not be billed.

AppFabric Labs provide a way for customers to test out and play with experimental AppFabric technologies. These are upcoming capabilities that excite us, and we want to get your feedback on them as soon as possible. As a result, there is no support or SLA associated with the LABS environment, but in return you will be able to preview the future of AppFabric while helping us shape it. Though similar to a Community Technology Preview, LABS technologies may occasionally be even farther away from commercial availability.

In this release of the LABS environment, we’re shipping two features:

Silverlight support: we’ve added the ability for Silverlight clients to make cross-domain calls to the Service Bus and Access Control Services.

Multicast with Message Buffers: we’ve added the ability for Message Buffers to attach to a multicast group. A message sent to the multicast group is delivered to every Message Buffer that is attached to it.
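The delivery rule is simple fan-out. This toy Python model (not the actual Service Bus API) captures the semantics: every buffer attached to the group receives its own copy of each message sent to the group.

```python
from collections import deque

class MulticastGroup:
    """Toy model of the LABS feature: message buffers attach to a
    group, and a send delivers a copy to every attached buffer."""
    def __init__(self):
        self._buffers = []

    def attach(self):
        buf = deque()
        self._buffers.append(buf)
        return buf

    def send(self, message):
        for buf in self._buffers:
            buf.append(message)

group = MulticastGroup()
a, b = group.attach(), group.attach()
group.send("price-update")
```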

A few people have been getting the following error when trying to Build a project after installing Visual Studio 2010 RC:

The OutputPath property is not set for project 'CloudService7.ccproj'. Please check to make sure that you have specified a valid combination of Configuration and Platform for this project. Configuration='Debug' Platform='MCD'.

There are variations on this “Platform=” bit, as I got HCD on my machine.

After about an hour of diving in to my build properties and reading up on “Any CPU” configuration settings, it turns out that a pretty simple solution is to be had.

It turns out that HP machines come from the factory with several global variables set for HP’s own update software, including things such as “PCBRAND”. One of those variables is, you guessed it, PLATFORM.

In VS 2010 RC the build environment started respecting “Platform” as a compile-time system variable. Therefore, because no such platform exists in your deployment configuration, the build fails.

It’s a pretty simple fix, just delete the PLATFORM variable.
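Before touching anything, it's worth confirming the diagnosis: check whether a machine-wide PLATFORM variable is visible to the build environment. A small Python sketch of the check (the 'HCD' value is just the example from this post):

```python
import os

def conflicting_platform(environ=os.environ):
    """Return the value of a PLATFORM environment variable if one is
    set. MSBuild in VS 2010 RC treats it as the build Platform, so a
    vendor-set value like 'HCD' breaks cloud service builds."""
    return environ.get("PLATFORM")
```

On Windows the variable itself is removed under System Properties > Environment Variables (or cleared with `set PLATFORM=` for the current shell only); a rebuild in a fresh shell should then succeed.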

Brandon continues with the instructions to “delete the PLATFORM variable” that HP adds.

Today at South by Southwest, I stopped by the Microsoft booth to sample the goods and get some swag. Bing, WebsiteSpark, and Surface were all on display alongside Azure. Being interested in how SQL Server will be moving into the cloud, I struck up a chat. No news there that I didn’t already know (core database first, then more sophisticated offerings like Analysis Services later).

The Toolkit contains all the basic pieces needed to construct a Facebook app, with some links to How-To resources. This isn’t for beginners from the looks of things. But a lot of experienced .NET developers will surely benefit from this collection.

From a database perspective, the app would use SQL Azure along with table storage, message queues, and blob storage. This combination will make for a great learning experience. More to come on this subject in future posts.

NOTE: Many of the development labs in this curriculum can be done without having an account in Windows Azure by using the Windows Azure simulator that is part of the Windows Azure Software Development Kit. To deploy applications into the public Windows Azure cloud, you must first establish your own Windows Azure account. If you do not already have a Windows Azure account, you can find out how to get one here.

Posted this to the MPN East blog and thought I would also share it here.

Hedgehog Development is a custom web and application development shop that provides consulting and development services for a range of different industries. They're also a former Microsoft Partner of the Year (2008) and a BizSpark Network Partner. In this interview Microsoft Evangelists John McClelland and Brian Johnson talked to Hedgehog about the Microsoft Partner Program, developing a Facebook application with Azure, and about working with BizSpark startups.

The federal government is moving to the cloud. There’s no doubt about that.

Momentum for cloud computing has been building during the past year, after the new administration trumpeted the approach as a way to derive greater efficiency and cost savings from information technology investments.

At the behest of federal Chief Information Officer Vivek Kundra, the General Services Administration became the center of gravity for cloud computing at civilian agencies, with the launch of a cloud storefront, Apps.gov, that offers business, productivity and social media applications in addition to cloud IT services.

High-profile pilot programs generated more buzz about cloud computing, including the Defense Information Systems Agency’s Rapid Access Computing Environment and NASA Ames Research Center’s Nebula, a shared platform and source repository for NASA developers that also can facilitate collaboration with scientists outside the agency. …

Rutrell continues with an analysis of the obstacles to government adoption of cloud computing that are yet to be overcome.

In this post, we’ll examine the application Adatum is considering migrating to the cloud as a proof point for their assumptions.

Adatum’s a-Expense

a-Expense is one application in Adatum’s finance support systems that helps employees submit, track, and process business expenses. Everyone in Adatum is required to use this application for requesting reimbursements. a-Expense is not a mission-critical application, but it is clearly important; users could tolerate a few hours’ downtime every once in a while.

Adatum has a policy that all expenses are to be submitted for approval and processing before the end of each month. However, the vast majority of employees submit their expenses in the last two business days, leading to relatively high demand during a short period. a-Expense is sized for average use, not peak demand; therefore, during these two days, the system is slow and users complain.
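The sizing mismatch is easy to quantify. Assuming, purely for illustration, 21 business days per month and 90% of submissions arriving in those last two days (both figures are mine, not Adatum's), the peak daily load is nearly ten times the average the system was sized for:

```python
def peak_load_factor(monthly_requests, business_days=21,
                     peak_days=2, peak_share=0.9):
    """Ratio of daily load during the end-of-month rush to the
    average daily load the system was sized for."""
    average = monthly_requests / business_days
    peak = monthly_requests * peak_share / peak_days
    return peak / average

factor = peak_load_factor(10_000)  # hypothetical monthly volume
```

Note that the ratio is independent of total volume; it depends only on how concentrated the rush is, which is what makes this a natural elasticity test case for Windows Azure.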

a-Expense is currently deployed in Adatum’s data center and is available to users on the intranet. People traveling have to access it through VPN. There have been requests in the past to publish a-Expense directly on the Internet, but it has never happened.

a-Expense stores quite a bit of information, as most expense receipts need to be scanned and stored for seven years. For this reason, the data stores used by a-Expense are backed up frequently.

Adatum wants to use this application as a test case for their evaluation of Windows Azure. They consider it to be a good representation of many other applications in their portfolio, surfacing many of the same challenges they are likely to encounter further on.

Vivek Kundra, the Obama official with $79 billion to spend on technology, said the government can be more efficient by putting programs on the Web, paving the way for companies like Microsoft Corp. and Google Inc. to win business.

The government wants to put data such as health-care pricing information on Internet-based systems as they grow more secure, the U.S. chief information officer said in an interview this week. The U.S. can cut costs by outsourcing that work, said Kundra, who has overseen the federal technology budget since President Barack Obama appointed him last year.

Microsoft, Google and Amazon.com Inc. are all offering more databases and programs online, allowing customers to curb storage costs. Sharing software and data that way would shrink U.S. storage needs, helping to cut expenses after previous governments spent more than $500 billion on data centers and other technology initiatives in the past decade, Kundra said.

“It’s mind-boggling,” said Kundra, a New Delhi native who previously managed information technology for the District of Columbia. “It costs a fortune, it’s duplicative and it’s an energy hog.”

The model Kundra is looking at is known as cloud computing, where users go through the Web to access computers, applications and data instead of through their own servers. He declined to say which companies are best fit to operate government clouds. He noted that Google and Redmond, Washington-based Microsoft have introduced government-focused clouds in the past few months. …

Today, the California Health and Human Services Agency convened a summit with an expected three hundred attendees in the interest of a state HIE (Health Information Exchange). The project has been staffed by volunteers and state groups and is led by Jonah Frolich, deputy secretary of California Health and Human Services. The teams formed have already cleared a series of hurdles in preparation for the next big phase: executing the next-generation system and raising an initial seed of $38.8m to move the effort forward.

At stake is at least $3 billion for doctors and hospitals that qualify by connecting to the HIE services as built. This means that doctors using HIE services can bill for more of the Medi-Cal and Medicare payments expected to become available in coming years from American Recovery and Reinvestment Act funds. Additionally, the services being created will need to support applications that engage consumers as they come to play a role.

We see the opportunity for California's investment to touch many interesting areas of cloud computing, identity management, and mobile - right as it is getting interesting.

Last week, Governor Arnold Schwarzenegger and California Health and Human Services Agency Secretary Kim Belshé named a new nonprofit entity called Cal eConnect to oversee the development of Health Information Exchange services. One of the first tasks at hand is to finish the CA HIE Operational plan and to finalize details in budget, technical, and engagement plans to execute with the recent first grant by ONC for $38.8m. …

Each instance of a Windows Azure service role runs its own monitor to gather instance-specific diagnostic data. The problem that immediately presents itself is knowing what exactly is being collected, where the data is being saved, and how to retrieve it for inspection. The purpose of this blog post is to illuminate these areas a little better.

So let’s start at the beginning… When you create a new Windows Azure Web Role, Visual Studio will automatically add a boilerplate WebRole.cs file to your project. By default, the OnStart() method of the WebRole is overridden with an implementation that starts the Windows Azure Diagnostic Monitor. By default, Windows Azure will log its own diagnostics, IIS 7.0 logs, plus Windows diagnostics.

The argument to the static Start method of the DiagnosticMonitor class is the Windows Azure Data Storage connection string located in the ServiceConfiguration.cscfg file.

We can inspect the “wad-control-container” of Blob storage to find the collected diagnostic information. Run your favorite Windows Azure Storage exploration tool; in my example, I am using the Windows Azure Storage Explorer from the CodePlex site. You can use this tool to download the container and its contents to your local file system for further analysis. …

The development fabric is a high-fidelity simulation of the Windows Azure cloud environment on a local development machine. The development fabric is designed for development and testing, and all Windows Azure apps should be fully tested on the dev fabric before being deployed. The development fabric UI can be started in any of the following ways:

Debugging or running a cloud service project from Visual Studio.

Running CSRun.exe from the command line with valid parameters.

Directly from the Azure SDK program Start menu.

Running DFUI.exe from the Windows Azure SDK bin directory.

When the development fabric starts, it can be accessed from the development fabric system tray icon. Figure 3-25 illustrates the development fabric user interface hosting a cloud service.

The development fabric shows the Azure service deployments in the local environment and allows for altering the state of a running service. Services can be suspended, restarted or removed using the development fabric UI. …

Since February 1, Azure customers have been billed for their consumption of Azure resources. If you were an early adopter, you might have been spoiled by the free usage during the CTP over the past year, or even during January of this year, when “mock bills” were generated but no actual costs accrued.

Recently I’ve been fielding questions about the true expense of running Azure web roles and worker roles, including questions about Microsoft’s “free” account for MSDN Premium developers. Let me share a few tidbits here that will, hopefully, help you manage your Azure costs.

The post continues with the following topics:

Roles and Virtual Machines …

Virtual Machines and Instances …

Instances and Lifetimes …

Lifetimes and Clock time …

Clock time vs CPU time …

Cost-based architecture: What to do today …

Future-thinking:

The Azure team is reaching out to the community, asking for input about future ideas, where you can suggest a new idea or vote on someone else’s idea (check out the voting site here). I want to draw your attention to a few ideas that could really help reduce cost:

Provide multiple roles per instance. The idea would be to host, say, all of your worker roles in a single instance. Maybe this wouldn’t help with your web roles, since they’ll likely be much busier, but for lower-usage worker roles, this could work out nicely.
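The appeal of consolidating roles follows directly from clock-time billing: compute cost is roughly instance count times wall-clock hours times the hourly rate, busy or idle. A back-of-the-envelope sketch (the $0.12/hour rate is an assumption for illustration, not a quoted price):

```python
HOURS_PER_MONTH = 24 * 30  # approximate

def monthly_compute_cost(instances, rate_per_hour=0.12):
    """Instances accrue charges for clock time while deployed,
    whether or not they are doing work."""
    return instances * HOURS_PER_MONTH * rate_per_hour

# One web role instance plus two lightly used worker role instances:
separate = monthly_compute_cost(3)
# The same workers folded into one instance, were that allowed:
consolidated = monthly_compute_cost(2)
```

Under these assumptions, dropping a single always-on instance saves about a third of the compute bill, which is why the multiple-roles-per-instance idea matters for low-usage workers.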

We'd also like to hear from Voices for Innovation members about cloud computing. What are you hearing from customers, what are your concerns, and what are your plans? Please take the VFI Cloud Computing Flash Poll to share your views. Once the poll closes, we'll share aggregated results. (To sign up for VFI, click here.)

Also on the cloud: In January, Microsoft General Counsel Brad Smith spoke with VFI about the need for a positive business policy environment to enable cloud computing to thrive. You can view his video here. We have also set up a VFI Cloud Computing resource page with links to more information.

In this 90 minute webcast, the three coauthors of “Cloud Security and Privacy” (recently published by O’Reilly) take a deep dive into cloud security issues and focus on three specific aspects: (1) data security; (2) identity management in the cloud, and; (3) governance in the cloud (in the context of managing a cloud service provider with respect to security obligations).

I just had a very disturbing conversation with a Rackspace Cloud CSR. It went something like this:

CSR: Can I have your account user name and password?
Me: You want my password?
CSR: Yes sir.
Me: You know that's, like, security 101 that you should never reveal a password over the phone?
CSR: Yes sir, but in this case we need it to verify your account.
Me: OK, let me go change it to something I'm willing to tell you over the phone.
[Typety type type]
Me: OK, my password is now somereallylongbogusthing.
CSR (without any delay): Thank you. How can I help you?
Me: Wait, you must either be the world's fastest typist, or you can see my password on your screen.
CSR: That's right, sir, I can see your password.
Me: (The sound of my jaw hitting the floor.)

I am just stunned. I have used Rackspace for mission-critical servers in the past. They have always seemed reasonably competent, if not always 100% reliable. But this calls into question Rackspace's entire security policy. The first rule of computer security is that you do not store passwords in the clear. Never. Ever. No ifs ands or buts. You Just Don't Do That. And security is particularly critical in cloud computing, where your data ends up on hardware that can be reused by other people. If Rackspace is storing passwords in the clear, what else might they be screwing up? This really calls into question whether Rackspace can be trusted with mission-critical data.

Good grief, Rackspace, I really wanted to like you. But what were you thinking?

[UPDATE:] It really is as serious as I thought. With the account password you can change contact information and reset the root password on all your servers. So unless and until this is fixed you should not use RSC for anything mission-critical. I hope they do fix this because, other than that, I really like RSC. Their UI is very well designed, and setting up a server was amazingly fast and painless.

This post has gathered several tweets (#rackspace) and I see no response from @Rackspace among them.

MIX 2010 kicks off next week at the Mandalay Bay Hotel & Casino. It’s not too late to register (if you can find a hotel room), but for those of you unable to attend the conference in person, make sure you tune into Channel 9 Live for non-stop live coverage. Full details below:

What: Channel 9 Live is a live and interactive broadcast of interviews with executives, VIPs, industry luminaries, technology experts, session presenters, mad scientists and everyone in between. If you have a question for folks like Scott Guthrie, Joe Belfiore, Bill Buxton and Dean Hachamovitch, send it through to us and we'll ask on your behalf, live on the air. It's the next best thing to being there in person.

When: Immediately following the keynotes, Channel 9 Live will be broadcasting from:

Hey there! CloudConnect is next week (already?) and while some of us are already on a plane heading to the Bay Area to kick things off (Shlomo Swidler is already on his way, according to his tweets at 36,000 feet), some of us will be lounging around, preparing for our various workshops and panels until early next week.

That being the case, if you’re not going to be attending and thus missing the panel I’m moderating (what? How could you miss that?) but had a burning question you wanted to ask one of the panelists, let me know. Leave a comment, send a tweet, compose an e-mail, write me a letter (better hurry, Green Bay is a long way from everywhere). If we can fit it in (how many people actually ask live questions during the Q&A, right?) we’ll get it answered, and tweet or post a follow-up next week.

Come to think of it, even if you are attending and just can’t make the panel, or you’re like me and don’t like asking questions in public, send your question anyway.

Here are the details of Lori’s panel from the Cloud Connect site:

Interoperability between networks has fueled the growth of applications that has in turn spurred the growth of networks and internetworking. This cycle has led to increasing strain on networks, applications, and people who manage them. The 'virtualization' of networks, servers, storage, and applications as well as cloud computing not only quickens growth rates, but changes the nature of demand placed on infrastructure. Virtualization and cloud computing ultimately require new kinds of interoperability to reduce the burdens imposed by these technologies. This panel of cloud computing vendors and users will review the challenges of dynamic infrastructure design -- infrastructure capable of sustaining growth while relieving stress -- and will suggest the types of standards necessary to make those infrastructures a reality.

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.

Abstract: SQL Azure is Microsoft’s cloud database service offering. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. In this presentation we will cover an architectural overview of SQL Azure and describe how you can use SQL Azure to augment your existing on-premises data infrastructure or as a complete database solution. We will walk through a live demo on creating a cloud based SQL Azure application. Other issues around cloud based computing including security and database maintenance will be discussed.

Law enforcement is no different in its IT needs and constraints from many other enterprises - they run legacy hardware. Migrating to a new on-premises system and hardware can be cost-prohibitive. This compels decision-makers to look for advice and evidence on how to successfully evolve existing solutions via the cloud.

In their session at the 5th International Cloud Expo, Al Perez, Chief Software Architect at Total Computer, and Gunther Lenz, ISV Architect Evangelist at Microsoft, will discuss how Total Computer migrated its existing law enforcement solution to Windows Azure, the effort needed, the obstacles faced. The upshot - lower upfront investment, pay as you go, a new business model, scalability and reliability as well as familiar tools. …

Alex Williams delivered a list of cloud-related sessions at SXSW in his SXSW 2010 for Cloud Lovers post of 3/11/2010 to the ReadWriteCloud:

Fascinated by the cloud and what it means for the future of Web apps, social gaming, open-source and the after life? Then you have plenty to keep you busy if you are heading to SXSW this year.

This discussion looks like one of the better cloud panels at SXSW. It features panelists such as Amazon CTO Werner Vogels, who is there to discuss how cloud computing platforms have evolved so that it is possible to run a "serverless" business with confidence.

How does the cloud affect accessibility? Will cloud computing allow for a metaphorical curb cut, allowing access to rich Internet applications? Ahhh - another example of how cloud computing is affecting all aspects of our world. From the SXSW guide:

"Could a Software as a Service (SaaS) model deliver assistive technologies as a cloud-based service? The National Public Inclusive Infrastructure (NPII) is trying to do just that. As a facilitator for more rapid deployment of assistive technologies and a means to prototype new business models for emerging assistive technologies, AT could become part of an extensive infrastructure of readily available, electronic curb cuts that allow for seeamless access for a broader range of users than have been included to date."

Pantheon is an open-source cloud hosting initiative for the Drupal development community. Josh Koenig will examine "The Cloud" as a concept, look at the marketplace for cloud services, and dig into what it takes to build an application on a cloud-based platform.

We're not sure about this one. It looks like it's made for SXSW with its discussions about how business and love meet. We're uncertain how the cloud plays into this one, but just about everything does these days, doesn't it? Next!

Now talk about ghosts in the machine. Just think, when we die our identities will float in cloud networks all over the world. This may be the most philosophical discussion around cloud computing that we see at SXSW. It's time to explore the digital beyond!

The cloud is everywhere and here at Microsoft we’re flying high with our cloud computing release, Windows Azure. As most of you saw at the Professional Developers Conference, the reaction to Windows Azure has been nothing short of “wow” – and based on your feedback, we’ve organized this special, all-day Windows Azure Firestarter event to help you take full advantage of the cloud.

Maybe you've already watched a webcast, attended a recent MSDN Event on the topic, or done your own digging on Azure. Well, here's your chance to go even deeper. This one-of-a-kind event will focus on helping developers get ‘cloud ready’ with concrete details and hands-on tactics. We’ll start by revealing Microsoft’s strategic vision for the cloud, and then offer an end-to-end view of the Windows Azure platform from a developer’s perspective. We’ll also talk about migrating your data and existing applications (regardless of platform) onto the cloud. We’ll finish up with an open panel and lots of time to ask questions.

SQL was the first-generation Big Data tool, and MapReduce/Hadoop was the second-generation tool. Unfortunately, neither of these tools has the characteristics required to break into the mainstream of data analytics, where there are now over 100 million business professionals (non-programmers) grappling with exponentially growing data volumes that they simply can't handle.

However, a new third generation of tools for Big Data is now emerging that offer the scalability, parallelism, performance and data flexibility of tools like Hadoop, but, unlike Hadoop, can also continuously process realtime data streams. Moreover, these new tools are as easy to use as a spreadsheet.

Cloudscale is the world's first realtime, massively parallel cloud platform for Big Data. The Cloudscale platform can be used by anyone who can use a spreadsheet - no databases, no programming - but is as powerful as anything used by the top parallel programmers at Google or the most experienced realtime analytics experts on Wall Street. Over 100 million Excel power users now have a seamless extension from their Excel desktop to the full power of cloud computing to handle their big data. The platform provides the elastic scalability and performance required to cope with anything from one-off personal data analysis tasks up to the most demanding large-scale analytics apps required by the world's leading organizations in business, web, finance, scientific research, and government.

With Cloudscale, an ordinary business user can now develop realtime analytics apps in minutes that would take a team of expert programmers months. Anyone can now sit in Excel and, with one click, launch an app on billions of rows of data that would normally take days to compute, and get the answer from the cloud in two minutes.

Bill McColl is Founder and CEO, Cloudscale Inc. In order to found Cloudscale he left Oxford University, where for over twenty years he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty.

This is part 1 of 2 screencasts I recorded that show a Microsoft .NET developer how to use the new Windows Azure CloudDrive feature. In this video I go over the details of how to create a VHD file in Windows 7 and the code you need to upload it as a Page Blob in Windows Azure Storage.

A few days ago Scott and I recorded a "[H]anselminutes" episode where we spent some time talking about OData. Scott came armed with a whole bunch of questions to help tease apart the current transition we're going through regarding product names. We also touched on various topics, ranging from the layering of the system in our .NET implementation to the relationship of OData to the REST architectural style.

This was a good warm-up exercise for next week. We'll have a lot to say about OData at Mix 2010 in Las Vegas. If you are planning on attending Mix you won't want to miss the keynotes and OData sessions that are happening on Tuesday. We'll also be hanging out in the common areas during the event so if you're into this topic and would like to chat definitely come and find us.

From Show #205’s description:

Astoria, ADO.NET Data Services and OData - what's the difference and the real story? How does OData work and when should I use it? When do I use OData and when do I use WCF? Scott gets the scoop from the architect himself, Pablo Castro.

This week Telerik released a new LINQ implementation that is simple to use and produces domain models very fast. Built on top of the enterprise-grade OpenAccess ORM, you can connect to any database that OpenAccess can connect to, such as SQL Server, MySQL, Oracle, SQL Azure, VistaDB, etc. While this is a separate LINQ implementation from traditional OpenAccess Entities, you can use the visual designer without ever interacting with OpenAccess; however, you can always hook into advanced ORM features like caching and fetch plan optimization if needed.

Just to show off how easy our LINQ implementation is to use, I will walk you through building an OData feed using “Data Services Update for .NET Framework 3.5 SP1”. (Memo to Microsoft: P-L-E-A-S-E hire someone from Apple to name your products.) How easy is it? If you have a fast machine, are skilled with the mouse, and type fast, you can do this in about 60 seconds via three easy steps. (I promise that in about 2-3 weeks you will be able to do this in less than 30 seconds. Stay tuned for that.)

Steve continues with step-by-step instructions for creating an OData feed from the Northwind sample database.

In this blog post, we'll give an introduction to working with data in cloud solutions.

Overview

Working with data is a critical part of most solutions. In a cloud solution, we can adopt most of the guidelines we already have for on-premises solutions. However, cloud solutions also have their own unique use cases for working with data. In this post, we will discuss the following use cases:

Expose your cloud data to the rest of the world.

Expose your on-premises data to your cloud applications.

Common considerations

In either use case, there are a few common considerations that you need to settle before going on.

Choose a protocol

In an SOA world, the most important concept is the contract. In a cloud world, when it comes to communication, the most important concept is also the contract. When a common contract is adopted by lots of cloud applications, we call it a protocol.

In the data communication scenario, if you choose Microsoft cloud solution, the recommended protocol is the Open Data Protocol (OData). Based on open standards such as HTTP and AtomPub, OData provides a consistent solution to deliver data across multiple platforms. If your cloud service exposes data using the OData protocol, the rest of the world can consume your data using the same solution as they consume other OData compatible cloud services. Likewise, OData provides the ability for your cloud applications to consume your on-premises data in a consistent manner.

A lot of products are already using OData. Just to name a few: Windows Azure Table Storage, Dallas, SharePoint 2010, SQL Server 2008 R2, and so on.

If you want to choose other protocols, it is important to investigate how scalable the protocol is, what's the adoption rate, and so on.
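Since OData expresses queries as URL conventions over plain HTTP, the protocol choice above is easy to illustrate. Here is a minimal sketch, in Python, of composing query URLs from the standard $-options; the service root in the example is hypothetical:

```python
# Sketch: composing OData query URLs from the standard $-options.
# The service root used in the example is hypothetical.
from urllib.parse import quote, urlencode

def odata_url(service_root, entity_set, filter_expr=None, top=None, orderby=None):
    """Build an OData query URL such as .../Products?$filter=...&$top=..."""
    options = {}
    if filter_expr:
        options["$filter"] = filter_expr
    if top is not None:
        options["$top"] = str(top)
    if orderby:
        options["$orderby"] = orderby
    url = f"{service_root.rstrip('/')}/{quote(entity_set)}"
    if options:
        # Keep the leading '$' of each option literal; spaces encode as '+'.
        url += "?" + urlencode(options, safe="$")
    return url

print(odata_url("https://example.org/Northwind.svc", "Products",
                filter_expr="UnitPrice gt 20", top=5, orderby="ProductName"))
# https://example.org/Northwind.svc/Products?$filter=UnitPrice+gt+20&$top=5&$orderby=ProductName
```

Any HTTP client can then GET such a URL and receive an AtomPub (or JSON) representation of the matching entities, which is what makes the protocol consumable from virtually any platform.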

Colbertz continues with these topics:

Choose a technology

Expose your cloud data to the rest of the world

Expose your on-premises data to your cloud applications

You can download a sample from All-In-One Code Framework (Azure).zip that demonstrates how to expose your cloud data stored in Windows Azure Table Storage to the rest of the world using WCF Data Services. The sample name is: CSAzureTableStorageWCFDS/VBAzureTableStorageWCFDS.

Ian Robinson’s slide deck from his The Counterintuitive Web presentation about REST principles at QCon London 2010 on 3/11/2010 illustrates this abstract:

The Web doesn't care for your finely-honed application architecture principles - for your orthodox tell-don't-ask, information hiding dictums, separated concerns, and guaranteed and reliable delivery strategies. It's an irresponsible place, where exposing your data, polling for results and making your errors the client's problem are considered acceptable behaviour. If it wasn't so successful, it'd be dismissed as an architectural clown. But despite its disregard for polite architectural society, it consistently beats your enterprise application efforts - and all at massive scale. It's time to find out why.

Ian Robinson is a RESTful development specialist and a Principal Consultant with ThoughtWorks, where he specializes in the design and delivery of service-oriented and distributed systems.

He has written guidance for Microsoft on implementing integration patterns with Microsoft technologies, and has published articles on business-oriented development methodologies and distributed systems design - most recently in The ThoughtWorks Anthology (Pragmatic Programmers, 2008). He is currently co-authoring a book on RESTful enterprise integration.

I’m just back from QCon London, a software development conference with an agile flavour that I enjoy because it is not vendor-specific. Conferences like this are energising; they make you re-examine what you are doing and may kick you into a better place. Here’s what I noticed this year.

Robert C Martin from Object Mentor gave the opening keynote, on software craftsmanship. His point is that code should not just work; it should be good. He is delightfully opinionated. Certification, he says, provides value only to certification bodies. If you want to know whether someone has the skills you want, talk to them.

Martin also came up with a bunch of tips for how to write good code, things like not having more than two arguments to a function and never a boolean. I’ve written these up elsewhere.

Next I looked into the non-relational database track and heard Geir Magnusson explain why he needed Project Voldemort, a distributed key-value storage system, to get his ecommerce site to scale. Non-relational, or NoSQL, is a big theme these days; database managers like CouchDB and MongoDB are getting a lot of attention. I would like to have spent more time on this track, but there was too much else on, a perennial problem with QCon. …

So, this post will really serve two purposes. The first is to serve as a shameless plug for some of the sessions coming up this week in Las Vegas for SQL Azure. My friend and colleague David Robinson will be in town giving what I can only tell you will be an EXTREMELY cool demo of some upcoming features that we’ll be releasing over the next couple of weeks. Trust me… you will want to attend. I’ve linked some of the relevant SQL Azure sessions below to make it easier to locate them for those of you attending.

The second purpose here is to really go over some usability things that we’ve added to the service recently (mostly in our last service update, but it continues in our upcoming one). The first is that you now have the ability to easily change the SKU for any databases you previously created. This was a pretty big piece of feedback we heard from you, and it ended up being a holiday project of mine.

As you know, you can easily determine the number of databases you have of a particular SKU using a query like the following in your master database.

select * from sys.database_usage where time = '2010-03-13'

So, let’s say that you presently have a Web SKU database and you would like to switch to a Business SKU database. You can now easily do this using the following syntax:

alter database <database_name> modify (MAXSIZE=10GB)

Pretty simple right? Hopefully you think it is.
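If you end up scripting these SKU changes, it helps to generate the statement rather than hand-edit it each time. A small sketch follows; the list of valid MAXSIZE values is an assumption for illustration, so check the current documentation for the sizes actually on offer:

```python
# Sketch: generating the ALTER DATABASE statement for a SQL Azure SKU change.
# VALID_MAXSIZES is an assumption about the available sizes, not an official list.
VALID_MAXSIZES = ("1GB", "5GB", "10GB", "20GB", "30GB", "40GB", "50GB")

def alter_database_sql(database_name, maxsize):
    """Build the T-SQL that moves a database to the SKU implied by MAXSIZE."""
    if maxsize not in VALID_MAXSIZES:
        raise ValueError(f"unsupported MAXSIZE: {maxsize}")
    return f"ALTER DATABASE [{database_name}] MODIFY (MAXSIZE={maxsize})"

print(alter_database_sql("mydb", "10GB"))
# ALTER DATABASE [mydb] MODIFY (MAXSIZE=10GB)
```

The generated statement would then be executed against the master database over your usual connection, just like the usage query above.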

The team is beginning to look at opening other alter capabilities that make sense. Some of the ones high on our list, and I’d like to hear feedback on this, include:

DB Rename, Read-Only DB access, DBO access only mode, etc.

Again, I can’t stress this enough. We love to hear feedback on the features we take on, the priorities, and the pain points you have with the service. As always, please feel free to drop me feedback here or even better here for others to comment on.

Over the last few months I have had the opportunity to ramp up significantly on SQL Azure. In fact, I will be the co-author of Pro SQL Azure, published by Apress. This is going to be a book on how to best leverage SQL Azure, both from a technology and a design standpoint. Talking about design, one of the things I realized is that understanding the key limitations and boundary parameters of Azure in general, and more specifically SQL Azure, will play an important role in making sound design decisions that both meet reasonable performance requirements and minimize the costs associated with running a cloud computing solution.

The book touches on many design considerations including link encryption, pricing model, design patterns, and also some important performance techniques that need to be leveraged when developing in Azure, including Caching, Lazy Properties and more.

Finally I started working with Shards and how to implement them in Azure to ensure database scalability beyond the current size limitations. Implementing shards is not simple, and the book will address how to create a shard technology within your code to provide a scale-out mechanism for your SQL Azure databases.
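To make the idea concrete, the core of a shard scheme is just a deterministic mapping from a partitioning key to one of several databases. A minimal hash-based sketch, with placeholder connection strings, might look like this:

```python
# Sketch: deterministic hash-based shard routing.
# The connection strings below are placeholders, not real servers.
import hashlib

SHARDS = [
    "Server=shard0.example.net;Database=app0",
    "Server=shard1.example.net;Database=app1",
    "Server=shard2.example.net;Database=app2",
]

def shard_for(key):
    """Route a partitioning key (e.g. a customer ID) to the same shard every time."""
    digest = hashlib.sha1(key.encode("utf-8")).digest()
    # Use the first 4 bytes of the digest as a stable integer, then take modulo.
    return SHARDS[int.from_bytes(digest[:4], "big") % len(SHARDS)]
```

The catch, and one reason sharding is not simple, is that adding a shard remaps most keys under plain modulo, which is why production shard maps tend to use lookup tables or consistent hashing instead.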

As you can see, there are many factors to consider when designing a SQL Azure database. While we can think of SQL Azure as a cloud version of SQL Server, it is best to look at it as a new platform to make sure you don’t make any assumptions on how to best leverage it.

You can download a sample from All-In-One Code Framework (Azure).zip that demonstrates how to expose your on-premises data stored in SQL Server to the cloud. The sample name is: CSAzureServiceBusWCFDS/VBAzureServiceBusWCFDS. The sample also provides an ASP.NET client that you can use to test the service.

Today the AppFabric team has released the AppFabric LABS environment. This is a new environment which the team will use to showcase some early bits and get feedback from the community. Usage for this environment will not be billed.

AppFabric Labs provide a way for customers to test out and play with experimental AppFabric technologies. These are upcoming capabilities that excite us, and we want to get your feedback on them as soon as possible. As a result, there is no support or SLA associated with the LABS environment, but in return you will be able to preview the future of AppFabric while helping us shape it. Though similar to a Community Technology Preview, LABS technologies may occasionally be even farther away from commercial availability.

In this release of the LABS environment, we’re shipping two features:

Silverlight support: we’ve added the ability for Silverlight clients to make cross-domain calls to the Service Bus and Access Control Services.

Multicast with Message Buffers: we’ve added the ability for Message Buffers to attach to a multicast group. A message sent to the multicast group is delivered to every Message Buffer that is attached to it.

A few people have been getting the following error when trying to Build a project after installing Visual Studio 2010 RC:

The OutputPath property is not set for project 'CloudService7.ccproj'. Please check to make sure that you have specified a valid combination of Configuration and Platform for this project. Configuration='Debug' Platform='MCD'.

There are variations on this “Platform=” bit, as I got HCD on my machine.

After about an hour of diving in to my build properties and reading up on “Any CPU” configuration settings, it turns out that a pretty simple solution is to be had.

It turns out that HP machines come from the factory with several global variables set for HP's own update software, including things such as “PCBRAND”. One of those variables is, you guessed it, PLATFORM.

In VS 2010 RC, the build environment started respecting “Platform” as a compile-time system variable. Because no such platform exists in your deployment configuration, the build fails.

It’s a pretty simple fix, just delete the PLATFORM variable.

Brandon continues with the instructions to “delete the PLATFORM variable” that HP adds.

Today at South by Southwest, I stopped by the Microsoft booth to sample the goods and get some swag. Bing, WebsiteSpark, and Surface were all on display alongside Azure. Being interested in how SQL Server will be moving into the cloud, I struck up a chat. No news there that I didn’t already know (core database first, then more sophisticated offerings like Analysis Services later).

The Toolkit contains all the basic pieces needed to construct a Facebook app, with some links to How-To resources. This isn’t for beginners from the looks of things. But a lot of experienced .NET developers will surely benefit from this collection.

From a database perspective, the app would use SQL Azure along with table storage, message queues, and blob storage. This combination will make for a great learning experience. More to come on this subject in future posts.

NOTE: Many of the development labs in this curriculum can be done without having an account in Windows Azure by using the Windows Azure simulator that is part of the Windows Azure Software Development Kit. To deploy applications into the public Windows Azure cloud you must first establish your own Windows Azure account. If you do not already have a Windows Azure account, you can find out how to get one here.

Posted this to the MPN East blog and thought I would also share it here.

Hedgehog Development is a custom web and application development shop that provides consulting and development services for a range of different industries. They're also a former Microsoft Partner of the Year (2008) and a BizSpark Network Partner. In this interview Microsoft Evangelists John McClelland and Brian Johnson talked to Hedgehog about the Microsoft Partner Program, developing a Facebook application with Azure, and about working with BizSpark startups.

The federal government is moving to the cloud. There’s no doubt about that.

Momentum for cloud computing has been building during the past year, after the new administration trumpeted the approach as a way to derive greater efficiency and cost savings from information technology investments.

At the behest of federal Chief Information Officer Vivek Kundra, the General Services Administration became the center of gravity for cloud computing at civilian agencies, with the launch of a cloud storefront, Apps.gov, that offers business, productivity and social media applications in addition to cloud IT services.

High-profile pilot programs generated more buzz about cloud computing, including the Defense Information Systems Agency’s Rapid Access Computing Environment and NASA Ames Research Center’s Nebula, a shared platform and source repository for NASA developers that also can facilitate collaboration with scientists outside the agency. …

Rutrell continues with an analysis of the obstacles to government adoption of cloud computing that are yet to be overcome.

In this post, we’ll examine the application Adatum is considering migrating to the cloud as a proof point for their assumptions.

Adatum’s a-Expense

a-Expense is one application in Adatum’s finance support systems that helps them submit, track, and process business expenses. Everyone in Adatum is required to use this application for requesting reimbursements. a-Expense is not a mission-critical application; users could tolerate a few hours of downtime every once in a while, but it is clearly an important application nevertheless.

Adatum has a policy that all expenses are to be submitted for approval and processing before the end of each month. However, the vast majority of employees submit their expenses in the last two business days, leading to relatively high demand during a short period of time. a-Expense is sized for average use, not for peak demand; as a result, during those two days the system is slow and users complain.

a-Expense is currently deployed in Adatum’s data center and is available to users on the intranet. People traveling have to access it through a VPN. There have been requests in the past to publish a-Expense directly on the Internet, but it has never happened.

a-Expense stores quite a bit of information, as most expense receipts need to be scanned and stored for seven years. For this reason, the data stores used by a-Expense are backed up frequently.

Adatum wants to use this application as a test case for their evaluation of Windows Azure. They consider it to be a good representation of many other applications in their portfolio, surfacing many of the same challenges they are likely to encounter further on.

Vivek Kundra, the Obama official with $79 billion to spend on technology, said the government can be more efficient by putting programs on the Web, paving the way for companies like Microsoft Corp. and Google Inc. to win business.

The government wants to put data such as health-care pricing information on Internet-based systems as they grow more secure, the U.S. chief information officer said in an interview this week. The U.S. can cut costs by outsourcing that work, said Kundra, who has overseen the federal technology budget since President Barack Obama appointed him last year.

Microsoft, Google and Amazon.com Inc. are all offering more databases and programs online, allowing customers to curb storage costs. Sharing software and data that way would shrink U.S. storage needs, helping to cut expenses after previous governments spent more than $500 billion on data centers and other technology initiatives in the past decade, Kundra said.

“It’s mind-boggling,” said Kundra, a New Delhi native who previously managed information technology for the District of Columbia. “It costs a fortune, it’s duplicative and it’s an energy hog.”

The model Kundra is looking at is known as cloud computing, where users go through the Web to access computers, applications and data instead of through their own servers. He declined to say which companies are best fit to operate government clouds. He noted that Google and Redmond, Washington-based Microsoft have introduced government-focused clouds in the past few months. …

Today, the California Health and Human Services Agency convened a summit with an expected three hundred people in the interest of a state HIE (Health Information Exchange). The project is staffed by volunteers and state groups and led by Jonah Frolich, deputy secretary of California Health and Human Services. The teams formed have already cleared a series of hurdles in preparing for the next big phase of executing the next-generation system and in raising initial seed funding of $38.8M to move the effort forward.

At stake is at least $3 billion for doctors and hospitals that qualify by connecting to these services as the HIE is built. This means that doctors can bill for more Medi-Cal and Medicare payments that are expected to be available in coming years from American Recovery and Reinvestment Act funds while using HIE services. Additionally, the services being created will need to support applications that engage consumers as they play a role.

We see the opportunity for California's investment to touch many interesting areas of cloud computing, identity management, and mobile - right as it is getting interesting.

Last week, Governor Arnold Schwarzenegger and California Health and Human Services Agency Secretary Kim Belshé named a new nonprofit entity called Cal eConnect to oversee the development of Health Information Exchange services. One of the first tasks at hand is to finish the CA HIE Operational plan and to finalize details in budget, technical, and engagement plans to execute with the recent first grant by ONC for $38.8m. …

Each instance of a Windows Azure service role runs its own monitor to gather its own instance-specific diagnostic data. The problem that immediately presents itself is knowing what exactly is being collected, where the data is being saved, and how to retrieve it for inspection. The purpose of this blog post is to illuminate these areas a little bit better.

So let’s start at the beginning… When you create a new Windows Azure Web Role, Visual Studio will automatically add a boilerplate WebRole.cs file to your project. By default, the OnStart() method of the WebRole is overridden with an implementation that starts the Windows Azure Diagnostic Monitor. By default, Windows Azure will log its own diagnostics, IIS 7.0 logs, plus Windows diagnostics.

The argument to the static Start method of the DiagnosticMonitor class is the Windows Azure Data Storage connection string located in the ServiceConfiguration.cscfg file.

We can inspect the “wad-control-container” of Blob storage to find the collected diagnostic information. Run your favorite Windows Azure Storage exploration tool; in my example, I am using the Windows Azure Storage Explorer from the CodePlex site. You can use this tool to download the container and its contents to your local file system for further analysis. …

The development fabric is a high-fidelity simulation of the Windows Azure cloud environment on a local development machine. The development fabric is designed for development and testing, and all Windows Azure apps should be fully tested on the dev fabric before being deployed. The development fabric UI can be started in any of the following ways:

Debugging or running a cloud service project from Visual Studio.

Running CSRun.exe from the command line with valid parameters.

Directly from the Azure SDK program Start menu.

Running DFUI.exe from the Windows Azure SDK bin directory.

When the development fabric starts, it can be accessed from the development fabric system tray icon. Figure 3-25 illustrates the development fabric user interface hosting a cloud service.

The development fabric shows the Azure service deployments in the local environment and allows for altering the state of a running service. Services can be suspended, restarted or removed using the development fabric UI. …

As of February 1, Azure customers are being billed for their consumption of Azure resources. If you were an early adopter, you might have been spoiled by the free usage during the CTP over the past year, or even during January of this year, when “mock bills” were generated but no actual costs accrued.

Recently I’ve been fielding questions about the true expense of running Azure web roles and worker roles, including questions about Microsoft’s “free” account for MSDN Premium developers. Let me share a few tidbits here that will, hopefully, help you manage your Azure costs.

and continues with the following topics:

Roles and Virtual Machines …

Virtual Machines and Instances …

Instances and Lifetimes …

Lifetimes and Clock time …

Clock time vs CPU time …

Cost-based architecture: What to do today …

Future-thinking:

The Azure team is reaching out to the community, asking for input about future ideas, where you can suggest a new idea or vote on someone else’s idea (check out the voting site here). I want to draw your attention to a few ideas that could really help reduce cost:

Provide multiple roles per instance. The idea would be to host, say, all of your worker roles in a single instance. Maybe this wouldn’t help with your web roles, since they’ll likely be much busier, but for lower-usage worker roles, this could work out nicely.

We'd also like to hear from Voices for Innovation members about cloud computing. What are you hearing from customers, what are your concerns, and what are your plans? Please take the VFI Cloud Computing Flash Poll to share your views. Once the poll closes, we'll share aggregated results. (To sign up for VFI, click here.)

Also on the cloud: In January, Microsoft General Counsel Brad Smith spoke with VFI about the need for a positive business policy environment to enable cloud computing to thrive. You can view his video here. We have also set up a VFI Cloud Computing resource page with links to more information.

In this 90 minute webcast, the three coauthors of “Cloud Security and Privacy” (recently published by O’Reilly) take a deep dive into cloud security issues and focus on three specific aspects: (1) data security; (2) identity management in the cloud, and; (3) governance in the cloud (in the context of managing a cloud service provider with respect to security obligations).

I just had a very disturbing conversation with a Rackspace Cloud CSR. It went something like this:

CSR: Can I have your account user name and password?
Me: You want my password?
CSR: Yes sir.
Me: You know that's, like, security 101 that you should never reveal a password over the phone?
CSR: Yes sir, but in this case we need it to verify your account.
Me: OK, let me go change it to something I'm willing to tell you over the phone.
[Typety type type]
Me: OK, my password is now somereallylongbogusthing.
CSR (without any delay): Thank you. How can I help you?
Me: Wait, you must either be the world's fastest typist, or you can see my password on your screen.
CSR: That's right, sir, I can see your password.
Me: (The sound of my jaw hitting the floor.)

I am just stunned. I have used Rackspace for mission-critical servers in the past. They have always seemed reasonably competent, if not always 100% reliable. But this calls into question Rackspace's entire security policy. The first rule of computer security is that you do not store passwords in the clear. Never. Ever. No ifs ands or buts. You Just Don't Do That. And security is particularly critical in cloud computing, where your data ends up on hardware that can be reused by other people. If Rackspace is storing passwords in the clear, what else might they be screwing up? This really calls into question whether Rackspace can be trusted with mission-critical data.
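For contrast, the textbook alternative is to store only a salted one-way hash, so nobody, CSR included, can read the password back. A minimal sketch follows; the iteration count is illustrative, not a vetted recommendation:

```python
# Sketch: salted password hashing with PBKDF2; only (salt, hash) is stored.
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative work factor, tune for your hardware

def hash_password(password, salt=None):
    """Return (salt, hash); a fresh random salt defeats precomputed tables."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, expected):
    """Recompute the hash and compare in constant time."""
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, ITERATIONS)
    return hmac.compare_digest(digest, expected)
```

With a scheme like this, the support system can verify a password you type but can never display it, which is precisely the property the call above shows is missing.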

Good grief, Rackspace, I really wanted to like you. But what were you thinking?

[UPDATE:] It really is as serious as I thought. With the account password you can change contact information and reset the root password on all your servers. So unless and until this is fixed you should not use RSC for anything mission-critical. I hope they do fix this because other than that I really like RSC. Their UI is very well designed, and setting up a server was amazingly fast and painless.

This post has gathered several tweets (#rackspace) and I see no response from @Rackspace among them.

MIX 2010 kicks off next week at the Mandalay Bay Hotel & Casino. It’s not too late to register (if you can find a hotel room) but for those of you unable to attend the conference in person make sure you tune into Channel 9 Live for non-stop live coverage. Full details below:

What: Channel 9 Live is a live and interactive broadcast of interviews with executives, VIPs, industry luminaries, technology experts, session presenters, mad scientists and everyone in between. If you have a question for folks like Scott Guthrie, Joe Belfiore, Bill Buxton and Dean Hachamovich, send it through to us and we'll ask on your behalf, live on the air. It's the next best thing to being there in person.

When: Immediately following the keynotes, Channel 9 Live will be broadcasting from:

Hey there! CloudConnect is next week (already?) and while some of us are already on a plane heading to the Bay Area to kick things off (Shlomo Swidler is already on his way, according to his tweets at 36,000 feet), some of us will be lounging around preparing for our various workshops and panels until early next week.

That being the case, if you’re not going to be attending and thus missing the panel I’m moderating (what? How could you miss that?) but had a burning question you wanted to ask one of the panelists, let me know. Leave a comment, send a tweet, compose an e-mail, write me a letter (better hurry, Green Bay is a long way from everywhere). If we can fit it in (how many people actually ask live questions during the Q&A, right?) we’ll get it answered, and tweet or post a follow-up next week.

Come to think of it, even if you are attending and just can’t make the panel, or you’re like me and don’t like asking questions in public, send your question anyway.

Here are the details of Lori’s panel from the Cloud Connect site:

Interoperability between networks has fueled the growth of applications, which has in turn spurred the growth of networks and internetworking. This cycle has led to increasing strain on networks, applications, and the people who manage them. The 'virtualization' of networks, servers, storage, and applications, as well as cloud computing, not only quickens growth rates but changes the nature of demand placed on infrastructure. Virtualization and cloud computing ultimately require new kinds of interoperability to reduce the burdens imposed by these technologies. This panel of cloud computing vendors and users will review the challenges of dynamic infrastructure design -- infrastructure capable of sustaining growth while relieving stress -- and will suggest the types of standards necessary to make those infrastructures a reality.

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.

Abstract: SQL Azure is Microsoft’s cloud database service offering. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. In this presentation we will cover an architectural overview of SQL Azure and describe how you can use SQL Azure to augment your existing on-premises data infrastructure or as a complete database solution. We will walk through a live demo on creating a cloud based SQL Azure application. Other issues around cloud based computing including security and database maintenance will be discussed.
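For developers who want a concrete feel for what "augmenting your existing data infrastructure" with SQL Azure means, the entry point is an ordinary SQL Server connection string with encryption required and the `user@server` login form. A small sketch that assembles one (the server name, database, and credentials below are placeholders, not a real account):

```python
def sql_azure_connection_string(server, database, user, password):
    """Build an ADO.NET-style connection string for SQL Azure.

    SQL Azure is reached over TCP port 1433, requires SSL encryption,
    and some client libraries expect the user@server login format.
    """
    return (
        "Server=tcp:{0}.database.windows.net,1433;".format(server)
        + "Database={0};".format(database)
        + "User ID={0}@{1};".format(user, server)
        + "Password={0};".format(password)
        + "Encrypt=True;Trusted_Connection=False;"
    )

cs = sql_azure_connection_string("myserver", "mydb", "admin", "p@ss")
```

The same string works unchanged from on-premises applications, which is what makes the hybrid "augment your existing infrastructure" scenario in the abstract practical.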

Law enforcement is no different in its IT needs and constraints from many other enterprises - it runs legacy hardware. Migrating to a new on-premises system and hardware can be cost-prohibitive. This compels decision-makers to look for advice and evidence on how to successfully evolve existing solutions via the cloud.

In their session at the 5th International Cloud Expo, Al Perez, Chief Software Architect at Total Computer, and Gunther Lenz, ISV Architect Evangelist at Microsoft, will discuss how Total Computer migrated its existing law enforcement solution to Windows Azure, the effort needed, and the obstacles faced. The upshot: lower upfront investment, pay-as-you-go pricing, a new business model, scalability and reliability, as well as familiar tools. …

Alex Williams delivered a list of cloud-related sessions at SXSW in his SXSW 2010 for Cloud Lovers post of 3/11/2010 to the ReadWriteCloud:

Fascinated by the cloud and what it means for the future of Web apps, social gaming, open-source and the after life? Then you have plenty to keep you busy if you are heading to SXSW this year.

This discussion looks like one of the better cloud panels at SXSW. It features panelists such as Amazon CTO Werner Vogels, who is there to discuss how cloud computing platforms have evolved so that it is possible to run a "serverless" business with confidence.

How does the cloud affect accessibility? Will cloud computing allow for a metaphorical curb cut, allowing access to rich Internet applications? Ahhh - another example of how cloud computing is affecting all aspects of our world. From the SXSW guide:

"Could a Software as a Service (SaaS) model deliver assistive technologies as a cloud-based service? The National Public Inclusive Infrastructure (NPII) is trying to do just that. As a facilitator for more rapid deployment of assistive technologies and a means to prototype new business models for emerging assistive technologies, AT could become part of an extensive infrastructure of readily available, electronic curb cuts that allow for seamless access for a broader range of users than have been included to date."

Pantheon is an open-source cloud hosting initiative for the Drupal development community. Josh Koenig will examine "The Cloud" as a concept, look at the marketplace for cloud services, and dig into what it takes to build an application on a cloud-based platform.

We're not sure about this one. It looks like it's made for SXSW with its discussions about how business and love meet. We're uncertain how the cloud plays into this one, but just about everything does these days, doesn't it? Next!

Now talk about ghosts in the machine. Just think, when we die our identities will float in cloud networks all over the world. This may be the most philosophical discussion around cloud computing that we see at SXSW. It's time to explore the digital beyond!

The cloud is everywhere and here at Microsoft we’re flying high with our cloud computing release, Windows Azure. As most of you saw at the Professional Developers Conference, the reaction to Windows Azure has been nothing short of “wow” – and based on your feedback, we’ve organized this special, all-day Windows Azure Firestarter event to help you take full advantage of the cloud.

Maybe you've already watched a webcast, attended a recent MSDN Event on the topic, or done your own digging on Azure. Well, here's your chance to go even deeper. This one-of-a-kind event will focus on helping developers get ‘cloud ready’ with concrete details and hands-on tactics. We’ll start by revealing Microsoft’s strategic vision for the cloud, and then offer an end-to-end view of the Windows Azure platform from a developer’s perspective. We’ll also talk about migrating your data and existing applications (regardless of platform) onto the cloud. We’ll finish up with an open panel and lots of time to ask questions.

SQL was the first-generation Big Data tool, and MapReduce/Hadoop was the second-generation tool. Unfortunately, neither of these tools has the characteristics required to break into the mainstream of data analytics, where there are now over 100 million business professionals (non-programmers) grappling with exponentially growing data volumes that they simply can't handle.
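For readers who never touched the second-generation tools, the MapReduce model reduces to two user-supplied functions: a mapper that emits key/value pairs and a reducer that folds the values for each key, with a shuffle step grouping pairs in between. A toy single-machine word count illustrating the shape (real systems like Hadoop distribute each phase across a cluster):

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit (word, 1) for every word in every input line.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all values emitted under the same key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: fold each key's values into a single result.
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data big tools", "big data"])))
# counts -> {"big": 3, "data": 2, "tools": 1}
```

The programming burden is exactly the problem the paragraph above identifies: even this trivial analysis requires writing code, which is why it never reached the spreadsheet-using mainstream.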

However, a new third generation of tools for Big Data is now emerging that offers the scalability, parallelism, performance and data flexibility of tools like Hadoop but, unlike Hadoop, can also continuously process realtime data streams. Moreover, these new tools are as easy to use as a spreadsheet.

Cloudscale is the world's first realtime, massively parallel cloud platform for Big Data. The Cloudscale platform can be used by anyone who can use a spreadsheet - no databases, no programming - but is as powerful as anything used by the top parallel programmers at Google or the most experienced realtime analytics experts on Wall Street. Over 100 million Excel power users now have a seamless extension from their Excel desktop to the full power of cloud computing to handle their big data. The platform provides the elastic scalability and performance required to cope with anything from one-off personal data analysis tasks up to the most demanding large-scale analytics apps required by the world's leading organizations in business, web, finance, scientific research, and government.

With Cloudscale, an ordinary business user can now develop realtime analytics apps in minutes that would take a team of expert programmers months. Anyone can sit in Excel and, with one click, launch an app on billions of rows of data that would normally take days to compute, getting the answer back from the cloud in two minutes.
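Cloudscale's actual interface isn't shown in this post, but the "continuously process realtime data streams" claim boils down to windowed aggregation: keep a bounded window of recent events and update an aggregate incrementally as each one arrives, rather than re-running a batch job. A generic sketch of the idea, not Cloudscale's API:

```python
from collections import deque

class SlidingWindowAverage:
    """Maintain a rolling average over the last `size` events in a stream."""

    def __init__(self, size):
        self.window = deque(maxlen=size)  # oldest events fall off automatically

    def push(self, value):
        # Update with one new event and return the current aggregate.
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = SlidingWindowAverage(size=3)
results = [avg.push(v) for v in [10, 20, 30, 40]]
# windows: [10], [10,20], [10,20,30], [20,30,40] -> 10.0, 15.0, 20.0, 30.0
```

Each new event costs a constant amount of work, which is what lets a streaming engine keep answers current as data arrives instead of recomputing over the full history.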

Bill McColl is Founder and CEO, Cloudscale Inc. In order to found Cloudscale he left Oxford University, where for over twenty years he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty.

The dual Web role application has been running in Microsoft's South Central US (San Antonio) data center since September 2009. I believe it is the oldest continuously running Windows Azure application.

About Me

I'm a Windows Azure Insider, a retired Windows Azure MVP, the principal developer for OakLeaf Systems and the author of 30+ books on Microsoft software. The books have more than 1.25 million English copies in print and have been translated into 20+ languages.

Full disclosure: I make part of my livelihood by writing about Microsoft products in books and for magazines. I regularly receive free evaluation software from Microsoft and press credentials for Microsoft Tech•Ed and PDC. I'm also a member of the Microsoft Partner Network.