This post is one of those brief ones I put out here as a reminder to myself and as a potential help to anyone else experiencing similar grief.

Recently, out of nowhere, I started getting the error “No valid combination of account information found” when parsing a connection string for an Azure Storage Account. Specifically, the failure occurred on

CloudStorageAccount.Parse(storageSetting);

Come to find out, in the midst of a huge code merge, someone had inadvertently changed the connection string in the Web.config file of my ASP.NET site, leaving it in a form that CloudStorageAccount.Parse could not understand.
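One defensive sketch (assuming the WindowsAzure.Storage library and a hypothetical connection string name) is to use CloudStorageAccount.TryParse instead, so a mangled setting fails fast with a message that actually points at the config:

```csharp
using System.Configuration;
using Microsoft.WindowsAzure.Storage;

static class StorageAccountLoader
{
    // Parse the connection string up front so a bad merge surfaces
    // immediately with a descriptive error instead of a vague parse failure.
    // "StorageConnectionString" is a hypothetical setting name.
    public static CloudStorageAccount Load(string settingName)
    {
        var setting = ConfigurationManager.ConnectionStrings[settingName]?.ConnectionString;

        CloudStorageAccount account;
        if (string.IsNullOrWhiteSpace(setting) ||
            !CloudStorageAccount.TryParse(setting, out account))
        {
            throw new ConfigurationErrorsException(
                "Connection string '" + settingName + "' is missing or malformed; " +
                "expected DefaultEndpointsProtocol=...;AccountName=...;AccountKey=...");
        }
        return account;
    }
}
```

This way the error names the offending setting instead of surfacing deep inside whatever code first touches storage.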

Data stored in Azure Storage accounts is very securely protected by Microsoft. There is very little (if any) reason to worry about catastrophic equipment failure causing your data to be lost. It’s the beauty of the cloud! However, Microsoft can’t protect our data from inadvertent user errors that we or our software might make that could corrupt or destroy our data. Because of this, backups are still a very necessary part of life in the cloud.

Unfortunately, there really wasn’t an easy or cost-effective way to back up table data from an Azure Storage Account (as of 3/1/2016). Microsoft doesn’t offer an app on Azure for tables and, while there are a few backup players, their solutions are pretty costly. Through searches on the interwebs, I have found that I am not the only person with this dilemma, so I set out to figure out how to do this quickly, effectively, reliably, and cheaply.

My first stop was at the new and cool Azure Storage Data Movement Library (DML). My thought was that I could use the DML library in an Azure Web Job. Everything would be contained, everything would be in the cloud, everything would be tight. However, I was disappointed to find that the DML does not yet support tables. Further, community offers to add table support to the open-source DML were met with delays, as Microsoft’s reps said to wait because table support would be forthcoming. That’s great, but it doesn’t help me right now.

So, with some poking around, I put together a solution that doesn’t cause too much pain and gets the job done effectively. My solution uses Microsoft’s command-line AzCopy.exe tool and a batch file with a few tweaks to back up a list of tables and blob containers. To make my backup work, I spun up a virtual machine in Azure, using the cheapest available configuration (A0), and loaded AzCopy. I also copied the backup.bat file onto the machine. I then used the Task Scheduler to call my backup.bat file at a given interval. When the scheduler hits the bat file (in my case, once a day at midnight), it pulls all of the table and blob data to my virtual machine and then pushes the data back out to a backup storage account for safekeeping.
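The heart of a backup.bat like this boils down to a handful of AzCopy invocations; the account names, keys, and table names below are placeholders, and the flags assume the classic AzCopy (v5-era) command line:

```bat
@echo off
rem Export each table to local JSON manifests, then push the export
rem into a blob container on the backup account for safekeeping.
rem Placeholders: sourceacct, backupacct, the keys, and the table names.
set AZCOPY="C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
set SRCKEY=<source-account-key>
set DSTKEY=<backup-account-key>

for %%T in (Customers Orders Events) do (
  %AZCOPY% /Source:https://sourceacct.table.core.windows.net/%%T ^
           /Dest:D:\backup\%%T /SourceKey:%SRCKEY% /PayloadFormat:JSON
  %AZCOPY% /Source:D:\backup\%%T ^
           /Dest:https://backupacct.blob.core.windows.net/table-backups/%%T ^
           /DestKey:%DSTKEY% /S
)

rem Blob containers can be copied account-to-account directly.
%AZCOPY% /Source:https://sourceacct.blob.core.windows.net/images ^
         /Dest:https://backupacct.blob.core.windows.net/images-backup ^
         /SourceKey:%SRCKEY% /DestKey:%DSTKEY% /S
```

Wire a Task Scheduler job at the batch file and the whole thing runs unattended on the A0 box.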

Later, if I experience a catastrophic screw-up, whether of my own making or of an infinite loop’s, I can restore the data to a new storage account, do some testing, and then cut my web app over to the new storage.

You can check out the backup and restore batch files here. Bear in mind that I would like to improve the restore batch file at some point so that it is not necessary to spell out every table manifest file to restore. If you have ideas or solutions, please contribute back.

Microsoft’s Azure Search service is an incredible way to provide your users with a very powerful data navigation tool. With its REST API and .NET library, it is platform agnostic, meaning that you can utilize it in your web app, mobile client, or desktop app.

As I type this, the sweet, sweet clicky sound of my new CODE Keyboard fills the air and my fingers, typing speed, and general demeanor couldn’t be happier. I have been debating purchasing a mechanical keyboard for a while now and decided to treat myself to a nice keyboard for Christmas.

My prior keyboard was a Logitech KB350, a wireless, rubber-dome keyboard. I used it for at least five years and it served me well. However, I had been hearing all of the hype about mechanical keyboards for quite some time. A couple of friends had gotten them for gaming, but swore by them for general typing and coding as well. After some research, I discovered that rubber-dome keyboards require the key to travel the full distance to the PCB (printed circuit board) to make contact with the circuit on the board. Supposedly, this longer key travel causes more fatigue and more typing errors.

Logitech KB350 and CODE Keyboard

I have been using this keyboard for only a day, so I can’t speak to its magical powers of productivity increase, but I can say that it is very solidly built. Here are my first impressions:

WASD Keyboards takes pride in its devices. You need to go check them out. They make custom mechanical keyboards that are solid; this keyboard is heavy. The possibilities are nearly endless: you can go off-script and build any custom mechanical keyboard you want, with a different color for every key if you like, and you can even have them print images across your keys. Further, you can specify which Cherry MX switch type you want on your keyboard.

Cherry MX mechanical keys are cool. The most important customization you can make on a mechanical keyboard is the type of switch used for each key. After a lot of consideration, I went with the Cherry MX blue switches. I thought that I would like the travel and satisfying click of the keys, and so far my assumption has proven correct. Some folks like the harder-to-push greens or the no-click browns. To me, these blues are perfect, but you can choose from blue, green, brown, or clear to suit your particular typing style.

I really like the LED backlighting. You can choose among six levels of brightness, and it’s just a cool, clean look that is nice in a dark office. You should know that (at the time of this writing) WASD doesn’t offer a backlighting option on its keyboards other than the CODE. So on those other models you can print your cat’s picture across your keys; you just can’t have the keys backlit.

The CODE Keyboard is a collaboration between WASD Keyboards and Jeff Atwood of the popular Coding Horror blog. It looks and feels like a clean, professional keyboard. There aren’t any crazy colors to distract you; it’s just a solid keyboard for getting code done.

The multimedia controls on the keys are understated but very convenient. On my old KB350, the multimedia controls sit above the keyboard; they stand out very prominently and add about two inches to the top of the keyboard. On the CODE, the multimedia controls are in the Home cluster as function keys, so you can control them with one hand very easily. The nice thing about having them as function keys is that there isn’t a lot of extra clutter making the keyboard larger. This isn’t a big deal to me, but I like the nice, sleek form factor of the CODE.

The CODE is configurable for Windows or Mac, and you can tweak the function of the Caps Lock and OS keys via the DIP switches on the back.

Will the CODE Keyboard make my C# code more efficient? Will it make my Azure PowerShell scripts more readable? Will it make my writing more interesting? Probably not. However, in the journey from my head to the screen, my fingers and ears will experience a minuscule bit of tactile and audible happiness, which, multiplied by each click, will make me happier. So I think the CODE was a good purchase.

Microsoft has provided a great amount of tooling for accessing and modifying resources on its Azure platform. There is the Azure Portal, and there is also a comprehensive set of cmdlets for Azure PowerShell. The Azure CLI is yet another way to script changes to your Azure resources.

There comes a time, though, when you need tooling to modify Azure resources in a specific way. This has led to the creation of the BlackBarLabs.Tools.Azure library.

Currently, the library contains a single tool, the AzureTableAndBlobCopier. This tool is a wrapper around some great work by Alexandre Brisebois. With this tool, you can specify a source and target Azure Storage account and the tool will copy the tables and blobs from source to target. This is a great tool to use when moving data from, say, a staging environment to a user acceptance testing environment.

To use the tool, pull the code, update the app.config to have the connection string for your source and target storage accounts, then run the exe. That’s all there is to it.
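Assuming the tool follows the standard app.config pattern, the edit might look something like this; the key names and account names here are illustrative, so check the actual app.config in the repo for the exact names:

```xml
<configuration>
  <appSettings>
    <!-- Illustrative keys; the real app.config in the repo defines the exact names. -->
    <add key="SourceStorageConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=stagingacct;AccountKey=..." />
    <add key="TargetStorageConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=uatacct;AccountKey=..." />
  </appSettings>
</configuration>
```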

I recently set up a project on Visual Studio Online. After pushing the source code to a VSO-hosted Git repository, I went to set up the VSO-hosted CI builder. Setting it up was a breeze. Microsoft has really made this a great product.

The only problem that I found was that when the build ran, the project build failed because MSBuild couldn’t find a dll that I had referenced.

This all led to a StackOverflow Q&A post – one of those where you post a question and answer it at the same time. Check it out here.

The mountain of tasks in front of you looks insurmountable… You see the sheer size of the project, you step back, and you still can’t seem to see your way to the boundaries of the task at hand. You wonder how you can ever learn the laundry list of new libraries and tools. That backlog of tasks stares back. Delivery pressures loom. The weight of carrying your team presses down on you.

You wonder if you can ever reach your goals… You begin to realize that you have no idea what the finish line looks like. Can you develop that solution using tools you already have? How in the world will you find the time to learn the new tech required to pull this thing off? You wonder these things…

And you don’t even recognize the poisoning, limiting fear of failure as it seeps into your mind…

You realize that you may be in over your head… Your team is counting on your leadership. As a developer, you feel as if you are a total impostor.

Paralysis sets in… Which way to go? What if I blow it this time? What if we slip the ship date – again? Will I be able to see my kids this week? How in the world can we get this done?

So you sell out… There’s definitely a right way to do this, but the way I/we have done it before will work well enough. We could meet the deadlines for our deliverables. I could grow as a developer, but not right now.

And then you truly fail… Your software kind-of works, but doesn’t meet your product owner’s expectations. You ship, but your iPhone app retention rate is abysmal. You could have learned something new, but instead you chose to use that old tech that you already know and you slipped that much further behind the curve. Your company could have delivered something truly unique, but you warmed up last year’s tech, your customers rejected it, and you lost.

There was a time when I was terrified of failure. But then I found myself in an environment where failure was not considered the end, but rather, was considered one more step towards success. And not just any success, ultimate success. There is a difference between the two.

You or your team can have successes, but to be ultimately successful, you have to grow. You have to build success on success so that you can keep growing, stay apprised of the latest technologies, and outpace your competitors. You have to change your mind about how you play the game; you have to play to actually win, not just to avoid losing.

You see, most companies, teams, and people let the fear of failure cause them to play safely, just enough that they don’t lose – this time. But that ultimately leads to losing anyway. The pattern goes something like this:

If you play just enough not to lose…
You will be afraid to fail.
If you avoid failure…
You will avoid learning.
If you avoid learning…
You won’t grow.
If you don’t grow…
Your competition will outpace you and you will eventually lose anyway.

So how do you help your team overcome the fear of failure?

Play to win – Don’t be satisfied with getting through the next day or the next project. Don’t write code just to satisfy the minimum requirements. Go beyond. Reach. Extend yourself and your team beyond your competitors. Don’t let them outpace you. Win.

Foster an environment of learning – As a leader, you must assure your team that you are interested in their personal growth. As a team member, you have to look for ways to absorb something new with every passing project. Let your failures become learning moments, too. Every failure is just one more step towards victory.

Complacency is the enemy – The problem with achieving the mountaintop is that you can’t stay there forever. There is always another hill to climb. Stop and enjoy the view as you encounter your successes, but don’t get comfortable. Seek out the next hill. Find the next waypoint and get your team there. Keep moving the ball forward.

We’ve all heard the story of Edison and his reply to a reporter when he was asked how it felt to have failed 1000 times in constructing the first commercially viable electric light bulb. In his reply, “I didn’t fail 1000 times. The light bulb was an invention with 1000 steps,” Edison demonstrated the spirit that we and our teams should have. So, go forth, embrace your failures, play to win, learn, and ever combat the loathsome enemy that is complacency. You never know where your 1000 steps may lead you.

As I sit in the corner of this Starbucks, I am doing something that I haven’t had the chance to sit down and do for a while. I’m taking a minute to reflect. A moment to sit still, drink a venti white chocolate mocha and contemplate where I have been over the past year and a moment still to ponder the future. You see, after a year-and-a-half of work on the coolest project of my life, the core development team was cut loose yesterday – and that’s okay. It’s okay because projects come and go. It’s okay because the skills I picked up are mine now – forever. It’s okay because the team I worked with is still cohesive and is actively working to help each other move on to new opportunities. I’ve made friends. I’ve made money. It’s all good.

Flash back to July 2014… In a moderately trendy office in a renovated old factory, I sat with the CTO with the sound of ping pong clacking in the background. I expressed my concerns about the viability and stability of a startup, but I was sold, not on the company, but the idea of the job when he promised that in coming there, I would be miles ahead in technology and would be moved from the middle to the forefront of the technological curve. He was right.

It is insane what our team accomplished since July 2014. We wrote, tore down, refactored, debated, and rebuilt. We turned up servers, ran up an impressive Azure bill, and moved the ball down the field. We changed storage no less than three times. We tore out the unscalable and made it gorgeously asynchronous and modularly distributable. Aided largely by a genius-level CTO, the team moved to a lean, functional-code writing set of ninjas in a time-frame that would both impress and amaze insiders and (I am completely confident) would stand up to any Silicon Valley team anywhere, any day.

And the team! What a great set of people. Sitting literally feet away from your nearest colleague in open-format seating for days on end has a way of coalescing a team. With few exceptions (the exceptions were exiled), this group of once disparate developers came together to build something pretty amazing. I have never been in such a collaborative environment and I attribute the successes that we had to the selfless personalities that were around those tables. Even now that we aren’t directly influencing the technology, our collective spirit is still moving as we are coordinating our efforts to help one another turn the page to the next adventure.

And that next adventure. What will it be? I’m not sure yet – and that’s okay. It’s okay because I personally have grown in ways that I didn’t know were possible. It’s okay because I am part of a team that is still collaborating to solve the next problem, finding our next gig. It’s a positive future and I can’t wait to see what happens next.

In continuing with the theme of this week, it looks like Starbucks is the next company in my life that wants their chair back, so I’ll be moving on now. So, to the past, the good, the not-so-good, the fun, the learning, the toil, and the now legendary Christmas refactor of 2014, I say so long, and thanks for all the fish.

Microsoft Azure PowerShell is a very powerful way to script deployments and tweaks to your environment in Azure. There are cmdlets for creating web apps, storage, and other resources. You can also use PowerShell to automate modifications to app settings on your sites. The hardest part is getting started, so here are a few steps to give it a shot:

Launch PowerShell and run the cmdlet Get-AzurePublishSettingsFile. This will launch your default browser and take you to a page to download your Azure subscription file. You need to do this step so that PowerShell will have your credentials.

Follow the instructions by running Import-AzurePublishSettingsFile with the path to your settings file. This will return the names and IDs of your environments. You can now use these to run the many cmdlets available via Azure.

One other important thing to keep in mind for later, when you may be managing multiple Azure subscriptions, is that you need to make sure Azure PowerShell has the subscription you want to work in set as the default. To find out which subscription is the default, run Get-AzureSubscription. In my example, PowerShell knows about two subscriptions, and the Free Trial subscription is set as the default.

To switch to the other account so that my scripting would run against it instead of the Free Trial subscription, I ran Select-AzureSubscription, which allowed me to select my Pay-As-You-Go subscription by name. Running Get-AzureSubscription after this showed that the Pay-As-You-Go subscription was selected, and all subsequent commands acted upon this account.
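Put together, the session looks roughly like this (the file path and subscription name are from my example and will differ for you; these are the classic Azure Service Management cmdlets, which predate the newer AzureRM/Az modules):

```powershell
# Download the .publishsettings file (opens a browser), then import it.
Get-AzurePublishSettingsFile
Import-AzurePublishSettingsFile "C:\secure\MySubscriptions.publishsettings"

# List the subscriptions PowerShell knows about; the default is flagged.
Get-AzureSubscription

# Target the Pay-As-You-Go subscription instead of the Free Trial.
Select-AzureSubscription -SubscriptionName "Pay-As-You-Go"

# Confirm the switch before running anything destructive.
Get-AzureSubscription
```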

This is (by no means) meant to be a complete post on all of the powerful things available via the Azure PowerShell, but rather is a quick start to get your settings file and get moving. For more information, read up at the Azure PowerShell page.

I recently worked with an endpoint in an MVC project that intentionally returns a 409 Conflict HTTP status code when a user posts a model with an Id that already exists in the database. Upon encountering this conflict, the server is supposed to return the 409 status, but the client also expects to receive the conflicting record in the response body. When running this code in the debugger and hitting the endpoint on localhost via Postman, the 409 status is returned and the existing record is shown in Postman, as it is passed in the body. However, this was not the behavior I encountered when hitting this endpoint on a site deployed to Azure.
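The shape of such an endpoint, sketched against Web API 2 with hypothetical model and store names (not the actual project code), is roughly:

```csharp
using System.Collections.Generic;
using System.Net;
using System.Web.Http;

public class WidgetsController : ApiController
{
    // Hypothetical in-memory store standing in for the real database.
    private static readonly Dictionary<string, string> store =
        new Dictionary<string, string>();

    public IHttpActionResult Post(string id, [FromBody] string value)
    {
        string existing;
        if (store.TryGetValue(id, out existing))
        {
            // Return 409 *with* the conflicting record in the body,
            // so the client can inspect what it collided with.
            return Content(HttpStatusCode.Conflict, existing);
        }
        store[id] = value;
        return Ok(value);
    }
}
```

Locally this returns both the status and the body, which is exactly what made the Azure behavior below so surprising.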

I deployed the same code that I ran in debug to a WebApp in Azure. When I did a post to this same endpoint in the Azure site, the 409 status was returned. However, instead of seeing the conflicting record in the response body, I received this error:

The page cannot be displayed because an internal server error has occurred.

This was strange, considering the fact that I had just seen the conflicting record returned when posting to the same endpoint on a server running in debug on my machine. After some research, I learned that Azure is suppressing the body on purpose to mask error details (e.g. stack traces) from being shown to consumers of the API.

I found a way to ensure that the body is returned from the site on Azure when a 400- or 500-series status is encountered. It is to be used with caution, because you could expose more information than you want about 500-series errors, such as stack trace information. To ensure that the body is returned in these cases, add these lines to your web.config:
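The configuration commonly used for this (the original snippet appears to have been lost from the post) sets existingResponse to PassThrough on the httpErrors element under system.webServer, which tells IIS to leave whatever response body your application produced alone:

```xml
<configuration>
  <system.webServer>
    <!-- PassThrough makes IIS return the response body the application set,
         even for 4xx/5xx statuses. Use with caution: it also passes through
         any error details your code leaks on 500s. -->
    <httpErrors existingResponse="PassThrough" />
  </system.webServer>
</configuration>
```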

