I recently needed a way to upload images and other files to Windows Azure Storage. I like the way some sites implement ActiveX controls or other plugins that allow you to drag and drop files onto the browser itself for upload. Naturally, when I read about the new capabilities of Silverlight 4 to accept dropped files, I was dying to write a little upload control. This blog is hosted on Windows Azure, and as mentioned in previous posts, the blog-entry content files (images/xap files) are stored in Windows Azure Storage.

Fig 1. Screenshot of the upload control with 4 files

I did a quick search for Silverlight upload controls and came across an interesting project on CodePlex by Darrick C (thanks!). This project saved me quite a bit of time, as it already had code for transferring a file from Silverlight to IIS. The code supported very large files and larger chunk sizes than a web service upload would generally support. There were some additional features there as well that I didn't need but will mention in case anyone else is interested, such as image resizing and upload resume.

Now on to the interesting parts.

Drag and Drop support - accepting a file drop event in SL4 is extremely easy. You can accept a drop on any control that derives from UIElement by setting AllowDrop=true.
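A minimal sketch of what the drop handler might look like in Silverlight 4 (the control name, icon resource, and FileUpload constructor here are illustrative, not the exact CodePlex code):

```csharp
// XAML: <StackPanel x:Name="UploadArea" AllowDrop="True" Drop="UploadArea_Drop" />
private void UploadArea_Drop(object sender, DragEventArgs e)
{
    // Silverlight 4 surfaces dropped files as a FileInfo[] in the FileDrop format
    var files = e.Data.GetData(DataFormats.FileDrop) as FileInfo[];
    if (files == null) return;

    foreach (FileInfo file in files)
    {
        BitmapImage thumbnail = new BitmapImage();
        try
        {
            // try to decode the file as an image for the thumbnail
            using (FileStream stream = file.OpenRead())
            {
                thumbnail.SetSource(stream);
            }
        }
        catch
        {
            // not an image - fall back to a generic icon (hypothetical resource)
            thumbnail.UriSource = new Uri("GenericFileIcon.png", UriKind.Relative);
        }
        // add to the collection the UI is databound to (hypothetical helper class)
        _files.Add(new FileUpload(file, thumbnail));
    }
}
```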

You can see that we try to load the files as images so that we can show thumbnails in the UI, defaulting to a generic icon when the file is not an image. Through databinding, the files are then shown in the UI as a thumbnail and a progress bar. The work of uploading the file is performed by the FileUpload class, which communicates with the HTTPHandler using querystring parameters. It also raises events when the upload of a file is complete. There are a couple of things I don't like about this class, one being that the upload logic is intermingled with UI code telling the user the file already exists on the server. Since this was quick and dirty for my own personal use I haven't refactored it yet (this code is largely unchanged from the CodePlex project), but I would put it high on the list.

Storing the File in Azure - the HTTP handler does a couple of things.

Creates a Container in Azure for holding uploaded files if it does not already exist.

Checks to see if the file already exists in Azure

Listens for chunks of an incoming file, writing them to a temporary file in the filesystem. I wanted to avoid this step, but I couldn't find a way to append to a file in Azure storage.

Writes the file to Azure Storage and deletes the temporary file.
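The steps above can be sketched roughly as follows (the querystring parameter names and _container field are illustrative, not the exact code from the project):

```csharp
public void ProcessRequest(HttpContext context)
{
    string filename = context.Request.QueryString["filename"];
    bool lastChunk = context.Request.QueryString["last"] == "true";
    string tempPath = Path.Combine(Path.GetTempPath(), filename);

    // append the incoming chunk to the temporary file on the web server
    using (var stream = new FileStream(tempPath, FileMode.Append))
    {
        context.Request.InputStream.CopyTo(stream);
    }

    if (lastChunk)
    {
        // push the fully assembled file into blob storage, then clean up
        CloudBlob blob = _container.GetBlobReference(filename.ToLower());
        blob.UploadFile(tempPath);
        File.Delete(tempPath);
    }
}
```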

The heavy lifting is done in a class called AzureFileUploadProcess. A couple of interesting bits of code:

//The first time this method is called, make sure our container is set up properly
//so that non-authenticated users have permission to access files in the Azure container.
void Initialize(HttpContext ctx)
{
    CloudBlobClient client = CloudStorageAccount.FromConfigurationSetting("StorageAccount").CreateCloudBlobClient();
    //container names must be lowercase - this seems strange to me but I cannot get it to work with uppercase
    CloudBlobContainer container = new CloudBlobContainer(_containerName, client);
    container.CreateIfNotExist();
    container.SetPermissions(new BlobContainerPermissions() { PublicAccess = BlobContainerPublicAccessType.Blob });
    _container = container;
}

The above code is responsible for creating the Azure container. Containers in Azure can best be thought of as specialized file directories. They can define metadata that applies to the files they contain and can also have permissions. In the above code I create the container and allow public access to it (it wouldn't do me much good to upload images no-one could get to).

Next is some code demonstrating how to check Azure Storage for the existence of a file:

CloudBlob blob = Container.GetBlobReference(filename.ToLower());
try
{
    blob.FetchAttributes(); //this will throw an exception if the blob does not exist
    context.Response.Write(blob.Attributes.Properties.Length); //report the size of the file
}
catch (Exception)
{
    //I don't like checking for existence via exception handling,
    //but that was the recommended/only approach on the MSFT blogs
    context.Response.Write("0");
}
context.Response.Flush();

And finally we write the file to Azure and delete the temporary file:

CloudBlob blob = Container.GetBlobReference(filename);
blob.UploadFile(filePath);
File.Delete(filePath); //we stored this in a temporary area - delete it now that we are done
FileUploadCompletedEventArgs args = new FileUploadCompletedEventArgs(filename, blob.Uri.ToString());
FileUploadCompleted(this, args);

In the attached project I also included an UploadProcess that works against the file system, for those of you who aren't using Azure. As you can imagine, the code is not all that different from the Azure version.

Obviously this is not intended as the be-all and end-all of file upload controls. There is a lot of room here for neat additions:

Show the list of files already on Azure, presumably so you could delete/move/rename them.

The ability to pause an upload via the UI

Set metadata in Azure storage to further categorize the files

This should really be implemented as a custom control so that the UI can be re-styled more easily.

A Visual Studio 2010 project with all the code can be found here. I should note that you will need to have the Azure tools installed for this to work. Presently the project is set up to use the local Development Storage, so you can test it without having to set up a service on Azure. Also - for some reason on my machine I cannot drop files onto the control while debugging (the mouse cursor even indicates it won't work). The workaround is to open an entirely new browser and test there.

11 Comments

This is a great sample of using SL4 and Azure. I have the same issue when I debug the code in VS2010. Yes, I can run a second browser window to drop files, but I cannot debug in that case. Have you resolved this problem? Is there any workaround? I am thinking of enhancing your sample into a fully functional photo album management site. Of course, for free. What's on your mind?

KevinZ (kevin.zhang.canada at gmail dot com)

posted by
KevinZ
on
Sun Mar 14th 2010 at 7:34 AM

I set Chrome as the default browser and started debugging your application. I can drag and drop files into Chrome, but this won't work in IE (Windows 7). I think there may be a problem in IE with Silverlight 4.

BTW, I think FileUploadProcess.cs is unused. You did not actually use the class; it has been replaced by the Azure version.

posted by
KevinZ
on
Mon Mar 15th 2010 at 5:09 AM

Hi Kevin - Did you try this on the SL4 RC yet? Since this was written on the SL4 Beta, I wonder if the debugging issue has since been corrected. I plan on trying it out later this week.

BTW - I included FileUploadProcess.cs just in case someone wanted to use it in a non-Azure scenario.

posted by
Brett
on
Thu Mar 18th 2010 at 12:50 PM

I tried in VS2010 RC with SL4 RC, still the same problem. You can double check.

posted by
kevinz
on
Thu Mar 18th 2010 at 11:41 PM

I was wondering if uploading a large file in chunks will break in a load balanced scenario. Since each post could go to a different server, isn't it possible that individual chunks could get written to different temporary files?

posted by
Ozzy
on
Wed Mar 31st 2010 at 10:38 AM

Hi Ozzy - A load balancer would probably have to be taken into account unless it supports sticky sessions. If it doesn't, one simple approach would be to write the chunks to a shared network location instead of a local path on the balanced server. Of course that would introduce a single point of failure, but you could always program in a rollover location in case the first server couldn't be reached.

posted by
Brett
on
Thu Apr 1st 2010 at 1:22 PM

Kevin - Regarding debugging drag and drop: I was debugging a totally unrelated Outlook add-in, and while in my debugging session I couldn't drag and drop items into Outlook either. I looked into it, and it turns out the problem is that UAC prevents drag-drop operations between processes executed by different accounts. In my case I was running Visual Studio as Administrator (elevated). Once I switched to running Visual Studio in non-elevated mode I was fine. I just tried it for this project and it also worked - sort of. Because the Azure tools require VS to run elevated, you cannot debug the solution end-to-end. Kind of a catch-22. Supposedly if you disable UAC then this is a non-issue.

posted by
Brett
on
Thu Apr 1st 2010 at 1:39 PM

Brett, this is a great post, and a great example. I have a follow up question regarding downloading as opposed to uploading. I want the Silverlight 4 app to be able to download a file given that the SL app knows the blob container and filename on Azure. However, I don't want to leak any Azure keys to the SL4 client.

Specifically, let's say that the SL4 app accesses a record from SQL Azure, which contains the blob filename. Similar to your upload example, the SL4 app would communicate with the server, passing the filename, and have it downloaded into SL4's memory for displaying. Only the server would know how to specifically access blob storage (just like your upload example). Any examples or direction you could give would be appreciated.

posted by
spbgeb
on
Fri Jul 16th 2010 at 10:21 PM

Ozzy & Brett, to solve the load-balancing issue, what about copying the files to a container that contains ONLY temp files, then moving (copying) the file to the destination container once it is uploaded? Can you think of any downsides to this approach? I believe this solves the load-balancing issue in that the destination is always known and is not local to the server, and it does not clutter up the destination container.

posted by
spbgeb
on
Sat Jul 17th 2010 at 11:10 PM

spbgeb: Regarding downloading files: I assume the files you want the SL client to download are non-anonymous; otherwise you wouldn't need the key - just the URL. Assuming that is the case, you could generate a shared access signature that you would pass to the SL client for download. I haven't done this myself, but here is an article that discusses the concept. http://blog.smarx.com/posts/new-storage-feature-signed-access-signatures
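With the StorageClient library, generating such a signature on the server would look roughly like this (the policy values are illustrative; again, I haven't done this myself):

```csharp
CloudBlob blob = container.GetBlobReference(filename);

// generate a signature granting read access for a short window
string sas = blob.GetSharedAccessSignature(new SharedAccessPolicy
{
    Permissions = SharedAccessPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)
});

// hand this URL to the Silverlight client; no storage key is exposed
string downloadUrl = blob.Uri.AbsoluteUri + sas;
```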

Regarding load balancing: Yes - I think that would work. The only downside is that you are going to generate a lot of extra traffic between the webservers and the shared location, since every chunk has to travel across the network, unless you can append to the file in the shared location. I was just doing some reading, and it looks like some folks have tried an approach using PutBlock to append to an Azure blob. http://social.msdn.microsoft.com/forums/en-us/windowsazure/thread/C60EB581-A503-463F-A102-E4A54099EB18 That would be the best way to handle the situation, other than the fact that you would have to delete incomplete uploads; otherwise they would sit in Azure Storage forever.
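A rough sketch of the PutBlock approach (the chunks variable is illustrative; block IDs must be Base64-encoded and all the same length):

```csharp
// Upload each chunk as an uncommitted block, then commit the block list in order.
var blockIds = new List<string>();
for (int i = 0; i < chunks.Count; i++)
{
    // fixed-length, Base64-encoded block ID derived from the chunk index
    string blockId = Convert.ToBase64String(BitConverter.GetBytes(i));
    using (var ms = new MemoryStream(chunks[i]))
    {
        blockBlob.PutBlock(blockId, ms, null);
    }
    blockIds.Add(blockId);
}
// until PutBlockList is called, the blocks remain uncommitted
// (and are eventually garbage-collected if the upload is abandoned)
blockBlob.PutBlockList(blockIds);
```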

posted by
Brett
on
Mon Jul 19th 2010 at 3:32 PM

Hi everyone,

Thanks a lot Brett for this post, this is of considerable help. I am currently working on a web app that used "standard" upload to a server, and I could get it working with Azure storage thanks to your article.

I noticed the server returns to the client when it has received the file, but the blob may not have been uploaded to Azure storage yet. Using this proxy, there is a delay which comes from the asynchronous call (in the AzureFileUploadProcess handler): blob.UploadFile(tempFileOnServer)...

Is there a way for the client to wait for the blob to be created on Azure storage before saying "congrats, you successfully uploaded N documents..." ?

Again, thanks for this great post. Any help on this would be particularly appreciated!

Cheers,
Raphaël