I have been building a website recently using Azure Web Site hosting and ASP.NET MVC 4. As someone who doesn’t usually do web development, there has been a lot of new stuff for me to learn. I wanted to allow website users to upload images, and store them in Azure. Azure blob storage is perfect for this, but I discovered that a lot of the tutorials assume you are using Azure “web roles” instead of Azure web sites, meaning that a lot of the instructions aren’t applicable. So this is my guide to how I got it working with Azure web sites.

Step 1 – Set up an Azure Storage Account

This is quite straightforward in the Azure portal. Just create a storage account. You do need to provide an account name. Each storage account can have many “containers” so you can share the same storage account between several sites if you want.

Step 2 – Install the Azure SDK

Download and install the Azure SDK for .NET, which includes both the storage client library we will be using and the storage emulator for local development.

Step 3 – Setup the Azure Storage Emulator

It seems that with Azure web role projects, you can configure Visual Studio to auto-launch the Azure Storage emulator, but I don’t think that option is available for regular ASP.NET MVC projects hosted on Azure web sites. The emulator is csrun.exe and it took some tracking down as Microsoft seem to move it with every version of the SDK. It needs to be run with the /devstore command line parameter:
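On my machine the emulator lived under the SDK's Emulator folder (the exact path varies with the SDK version, so adjust to wherever your install put it):

```shell
"C:\Program Files\Microsoft SDKs\Windows Azure\Emulator\csrun.exe" /devstore
```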

To make life easy for me, I added an option to my External Tools list in Visual Studio so I could quickly launch it. Once it starts up, a new icon appears in the system tray, giving you access to the UI, which shows you what ports it is running on:

Step 4 – Set up a Development Connection String

While we are in development, we want to use the emulator, and this requires a connection string. Again, most tutorials assume you are using an “Azure Web Role”, but for ASP.NET MVC sites, we need to go directly to our web.config and enter a new connection string ourselves. The connection string required is fairly simple:
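For the emulator, the special value `UseDevelopmentStorage=true` does everything; the connection string name below (`StorageConnectionString`) is just the one I use in these examples:

```xml
<connectionStrings>
  <add name="StorageConnectionString"
       connectionString="UseDevelopmentStorage=true" />
</connectionStrings>
```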

However, because we are using an Azure web site and not a Web Role, this throws an exception ("SetConfigurationSettingPublisher needs to be called before FromConfigurationSetting can be used"). There are a few ways to fix this, but I think the simplest is to call Parse, and pass in your connection string directly:
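Something like this (a sketch against the storage client library of the time; `StorageConnectionString` is the name I gave my connection string in web.config):

```csharp
using System.Configuration;
using Microsoft.WindowsAzure;

// read the connection string straight out of web.config and parse it,
// rather than going via RoleEnvironment / FromConfigurationSetting
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    ConfigurationManager.ConnectionStrings["StorageConnectionString"]
        .ConnectionString);
```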

Note: the code I used as the basis for this part (the Introduction to Cloud Services lab from the Windows Azure Training Kit) holds the CloudBlobClient as a static variable, and has the code to initialise the container in a lock. I don’t know if this is to avoid a race condition of trying to create the container twice, or if creating a CloudBlobClient is expensive and should only be done once if possible. Other accesses to CloudBlobClient are not done within the lock, so it appears to be thread-safe.
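For reference, here is a sketch of that initialisation pattern as I adapted it (the class and field names and the "images" container are my own choices; `CreateIfNotExist` was the method name in the 1.x StorageClient library, renamed `CreateIfNotExists` in later SDKs):

```csharp
using System.Configuration;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public static class BlobStorageProvider
{
    private static CloudBlobClient blobStorage;
    private static volatile bool initialized;
    private static readonly object initLock = new object();

    public static CloudBlobClient Get()
    {
        if (!initialized)
        {
            lock (initLock)
            {
                if (!initialized)
                {
                    var account = CloudStorageAccount.Parse(
                        ConfigurationManager
                            .ConnectionStrings["StorageConnectionString"]
                            .ConnectionString);
                    blobStorage = account.CreateCloudBlobClient();

                    // create the container once, and make its blobs publicly
                    // readable so the images can be served directly by URL
                    var container = blobStorage.GetContainerReference("images");
                    container.CreateIfNotExist();
                    container.SetPermissions(new BlobContainerPermissions
                    {
                        PublicAccess = BlobContainerPublicAccessType.Blob
                    });
                    initialized = true;
                }
            }
        }
        return blobStorage;
    }
}
```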

Step 9 – Save the image to a blob

Finally we are ready to actually save our image. We need to give it a unique name, for which we will use a Guid, followed by the original extension, but you can use whatever naming strategy you like. Including the container name in the blob name here saves us an extra call to blobStorage.GetContainer. As well as naming it, we must set its ContentType (also available on our HttpPostedFileBase) and upload the data which HttpPostedFileBase makes available as a stream.
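Putting that together, the upload looks something like this (a sketch: the action receives an `HttpPostedFileBase`, `blobStorage` is the shared `CloudBlobClient`, and "images" is my container name):

```csharp
using System;
using System.IO;
using System.Web;
using Microsoft.WindowsAzure.StorageClient;

public static Uri SaveImage(CloudBlobClient blobStorage, HttpPostedFileBase file)
{
    // unique blob name: a Guid plus the original file extension;
    // prefixing the container name avoids an extra GetContainer call
    string blobName = "images/" + Guid.NewGuid().ToString()
                      + Path.GetExtension(file.FileName);

    CloudBlockBlob blob = blobStorage.GetBlockBlobReference(blobName);
    blob.Properties.ContentType = file.ContentType; // e.g. "image/jpeg"
    blob.UploadFromStream(file.InputStream);
    return blob.Uri;
}
```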

Note: One slightly confusing choice you must make is whether to create a block blob or a page blob. Page blobs seem to be targeted at blobs that you need random access read or write (maybe video files for example), which we don’t need for serving images, so block blob seems the best choice.

Step 10 – Finding the blob Uri

Now our image is in blob storage, but where is it? We can find out after creating it, with a call to blob.Uri:
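Against the emulator, the Uri comes back pointing at the local dev storage account (the Guid and extension will obviously differ for each upload):

```csharp
// blob is the CloudBlockBlob we just uploaded to
Uri imageUri = blob.Uri;
// against the emulator this looks something like:
// http://127.0.0.1:10000/devstoreaccount1/images/0d48b1a1-....jpg
```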

Step 11 – Querying the container contents

How can we keep track of what we have put into the container? From within Visual Studio, in the Server Explorer tool window, there should be a node for Windows Azure Storage, which lets you see what containers and blobs are on the emulator. You can also delete blobs from there if you don’t want to do it in code.

The Azure portal has similar capabilities allowing you to manage your blob containers, view their contents, and delete blobs.

If you want to query all the blobs in your container from code, all you need is the following:
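A minimal sketch, again assuming the "images" container and a `blobStorage` client set up as above:

```csharp
using System;
using Microsoft.WindowsAzure.StorageClient;

CloudBlobContainer container = blobStorage.GetContainerReference("images");
// ListBlobs returns every blob (and virtual directory) in the container
foreach (IListBlobItem item in container.ListBlobs())
{
    Console.WriteLine(item.Uri);
}
```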

The account name is the one you entered in the first step, when you created your Azure storage account. The account key is available in the Azure Portal, by clicking the “Manage Keys” link at the bottom. If you are wondering why there are two keys, and which to use: there are two simply so you can regenerate one key without downtime, and you can use either.
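The live connection string has this shape (account name and key here are placeholders for your own values):

```
DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=YOUR_ACCOUNT_KEY
```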

Note: most examples show DefaultEndpointsProtocol as https, which as far as I can tell, simply means that by default the Uri it returns starts with https. This doesn’t stop you getting at the same image with http. You can change this value in your connection string at any time according to your preference.

Step 13 – Create a Release Web.config transform

To make sure our live site is running against our Azure storage account, we’ll need to create a web.config transform as the Web Deploy wizard doesn’t seem to know about Azure storage accounts and so can’t offer to do this automatically like it can with SQL connection strings.
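Inside the connectionStrings element of Web.Release.config, something along these lines does the job (again, placeholder account name and key):

```xml
<connectionStrings>
  <add name="StorageConnectionString"
       connectionString="DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=YOUR_ACCOUNT_KEY"
       xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
</connectionStrings>
```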

Step 14 – Link Your Storage Account to Your Web Site

Finally, in the Azure portal, we need to ensure that our web site is allowed to access our storage account. Go to your web site, select “Links” and add a link to your Storage Account, which will set up the necessary firewall permissions.

Now you're ready to deploy your site and use Azure blob storage with an Azure Web Site.

Comments

February 22, 2013 04:09

Great write up. Really easy to follow and I'm up and running in no time. I was prototyping a new website feature to allow users to upload files and I wanted to host it on Azure websites just to demo it to a client. But, uploading files to Azure websites was a bit of a black hole to me. Seriously, your article was perfect...Thanks!

Thank you for the post. I have a TFS project that was an MVC EF site converted to Azure. I couldn't actually get it to run in my Visual Studio 2013 development environment. I can tell by your examples that the previous developer did what you did. My problem however is that we are moving the website and database off Azure and quite frankly it isn't going well, and I believe it's due to blob storage and local development emulators. Anyhow, your post is helping me greatly in reverse engineering the solution to non-Azure. Why get off Azure, you may ask? We are on free trials for everything except a backup database. The bill for that was 256 for one month. Once the bills for everything else start kicking in, Microsoft will own our business.

About Mark Heath

I'm a Microsoft MVP and software developer based in Southampton, England, currently working as a Software Architect for NICE Systems. I create courses for Pluralsight and am the author of several open source libraries. I currently specialize in architecting Azure based systems and audio programming. You can find me on: