Over the last weekend I decided it would be worth trying out SignalR for Dollars2Pounds, as it appears to be an ideal technology for pushing updated exchange rates down to the client.

Behind the scenes I had my RabbitMQ client watching for a broadcast exchange rate message; when one was received it updated the local cache and then pushed a message out through a SignalR group, based on the currency pair, to those clients subscribed to that group/pair. Nice and simple! I was even able to re-use the view model I had for the Ajax service, which serialized nicely to a JSON object on the client.
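The routing idea, delivering each rate only to the clients subscribed to that currency pair, can be sketched as a simple group map. This is an illustrative stand-in, not the actual SignalR hub code; all names here are made up:

```javascript
// Minimal pub/sub group router: clients join a group named after the
// currency pair and only receive updates for that pair.
function createRateRouter() {
  const groups = new Map(); // pair -> Set of subscriber callbacks

  return {
    // A client subscribes to one pair, e.g. "USDGBP".
    subscribe(pair, onRate) {
      if (!groups.has(pair)) groups.set(pair, new Set());
      groups.get(pair).add(onRate);
    },
    // The server publishes a broadcast rate; only that pair's group
    // is notified. Returns the number of clients notified.
    publish(pair, rate) {
      const subs = groups.get(pair);
      if (!subs) return 0;
      for (const cb of subs) cb(rate);
      return subs.size;
    },
  };
}
```

In the real setup SignalR's groups play the role of the map, and the RabbitMQ message handler plays the role of `publish`.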

Sadly this evening I had to roll back to my previous implementation. The main issue came as the site was taking the strain of the busy part of the day at a steady 800 connections, when suddenly that jumped to 2,000 with a 20,000 backlog. I'm not sure exactly what happened; it wasn't a sudden burst of traffic as far as I can tell. The net result was the exchange rate sites being unavailable, with IIS overloaded.

There are a few other issues that mean SignalR isn't suited to Dollars2Pounds and the network of sites, particularly when running the sites load balanced across more than one server: subscribing to a group, which is how the client gets only the exchange rate it is interested in, becomes difficult with the current version.

Below are a few graphs from EC2 monitoring. The final two days (6 and 7 of Feb) are with SignalR running; on the previous days I had a 10s poll to the server implemented.

CPU usage: as you can see, with the SignalR implementation the average load was a bit higher.

Network out is about the same, just a bit noisier. It's not easy to separate out the sending of small data packets for the exchange rate from the HTML, JavaScript, CSS and images that are served. Most visitors stay on the site for about 3-4 minutes, so there would typically be 18-24 exchange rate packets (hundreds of bytes per packet) for every page, CSS file, image (ca. 100k) and chart data (300k) combination (400-800k).

Network in is one area where you might expect a big difference going from polling to the push model, but it's difficult to see any change. However, given that the exchange rates refresh every 10s, SignalR would get the response and then open a new connection at about the same interval the 10s polling worked at, so it's not a surprise.

Whilst it's possible to look on this experience as a failure of SignalR, I think it's more of a compliment that I could spend a Sunday afternoon starting with no knowledge, put in a server-side hub and a little bit of JavaScript, and have a push-driven exchange rate update live by the evening. I had already put a lot of effort into the polling and caching, so for SignalR to be comparable out of the box with no tuning says a lot.

I've been really impressed with how simple it was to get the solution up and running. It's a great technology which is being actively developed over on GitHub, so I'm going to keep an eye on it with a view to re-enabling it.

Currently Dollars2Pounds uses regular polling to get the latest rate from the server. The ASP.NET MVC endpoint is cached for a short time so that it can serve the rates quickly. At most I update the rate every 10s (for USD to GBP); other currencies are refreshed between every 60s and daily, and the browser polls for the rate at 10s to 30s intervals, again based on the currency. This appears to be working nicely, so in the tradition of "if it ain't broke don't fix it", I'm, erm, un-breaking the fix I put in for the thing that wasn't broken in the first place.

So, no real change for Dollars2Pounds, but some good fun with SignalR along the way.

One advantage the Arduino has over the Netduino is a lot of high-power GPIO lines: 13 of them, each capable of 40mA, which powers 2 LEDs per IO line fairly brightly, and 6 of these can be run as PWM, which makes the LEDs appear to dim or brighten.

Armed with a spare Arduino, a little too much time on my hands, a ghost that needed adapting from 110V to work in the UK and a bunch of white LEDs I set about making an Arduino powered Ghost ready for Halloween.

Here's the result (first 12 seconds are dull, hold on in there):

Warning: This video contains some images of bright flashing LEDs, don't watch if you are affected by that!

The .Net Micro Framework has been around for some time now (it's at version 4.1), and as with all things Microsoft it's really taking off post the V3 release.

Sadly the development boards for the Micro Framework have been fairly limited in their range and price (read: expensive!), until recently, with the release of the Netduino, a .Net version of the popular Arduino platform.

This fantastic board is ideal for hobbyist electronics hackers like myself; it's cheap, easy to connect to and fun.

I've got a few projects I'm tinkering with for the Netduino and Meridian/P using the .Net Micro Framework, and I'm writing up my experiences over on my IImplement.Net blog; if you're interested in the Micro Framework go have a look.

This Friday (17 Sept 2010) Dollars2Pound.com (as well as the network of other forex websites) and ThreeTomatoes.co.uk (because it's on the same server) suffered a degradation of performance. This was due to me rolling out a new feature which included a nice little bug. Sorry to those of you affected by this.

As it turns out, when you use jQuery's .fadeOut() with a CSS class selector, the fade-complete callback gets called for each element that matches the class. Well, Dollars2Pounds has three of them, so the fade-complete callback got called three times. This caused a bit of a problem, because I had it set up so that when the fade out was complete it would get the latest exchange rate from the server and fade that in. Net result: three rapid-fire hits to the server for every browser that had Dollars2Pounds open, and that was quite a few, so I inadvertently launched a distributed denial of service attack on myself!
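The behaviour is easy to reproduce: jQuery runs the completion callback once per matched element. Here's a minimal stand-alone sketch of the bug and the one-shot guard that fixes it, with plain JavaScript standing in for the jQuery animation (all names are illustrative, not the site's actual code):

```javascript
// Simulates jQuery's behaviour: the complete callback fires once for
// every element matched by the selector.
function fadeOutAll(elements, onComplete) {
  elements.forEach(() => onComplete());
}

// Buggy version: every callback triggers a server fetch, so three
// matched elements mean three fetches per fade.
function countFetchesBuggy(elements) {
  let fetches = 0;
  fadeOutAll(elements, () => { fetches += 1; });
  return fetches;
}

// Fixed version: a guard ensures the fetch happens only once per fade,
// no matter how many elements the selector matches.
function countFetchesFixed(elements) {
  let fetches = 0;
  let fetched = false;
  fadeOutAll(elements, () => {
    if (fetched) return;
    fetched = true;
    fetches += 1;
  });
  return fetches;
}
```

(Later jQuery versions also offer `.promise().done(...)`, which fires once for the whole matched set rather than per element.)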

Fortunately I was able to pick this up quickly, as I run continuous monitors for site performance and page content under a local CCNET box (in addition to the basic SiteUpTime monitor, which messages me if the site goes down hard).

The performance monitor is fairly relaxed, allowing about a 5 second time-out for the page to load, but on Friday that went red. When I connected to the server it was running at 100% CPU, and by the end of the day the log file was nearly ten times its normal size, most of that generated during the few hours it took to spot and fix the problem.

The good news is that this is now fixed, and Dollars2Pounds updates the exchange rate it shows every minute (other sites update less frequently, as I fetch their exchange rates less often).

I've also become a big fan of Clicky recently; one thing I find fascinating is how repetitive the traffic to Dollars2Pounds is. Here's a typical chart showing the same weekday one week previous; I'm amazed how similar the pattern is.

On the other hand, here's Friday's chart and the one for the previous Friday; you can see the effect the performance issues caused. I would have expected the traffic to be similar, but as it happens lots of people gave up waiting for the page to load. So, as Google tells us, clearly performance matters!

Previously myself, Alan and Alastair had pitched in down the pub to create a simple MSBuild to S3 publisher task, mainly as a way for us to learn the Amazon Web Services (AWS) API. It turns out I wasn't the only one wanting an MSBuild task for S3/AWS; the week that followed saw Roy Osherove tweet that he was looking for something similar, and Neil Robbins was looking for EC2 automation.

So with that, and a bank holiday weekend at hand the S3 publisher task got extended. It now supports:

More S3 commands

CreateS3BucketTask

DeleteS3BucketTask

PutS3FileObjectTask

PutS3TextObjectTask

DeleteS3ObjectTask

SetS3ObjectAclTask

S3BuildPublisher

EC2 Automation

AssociateIpAddressTask

AttachVolumeTask

CreateSnapShotTask

CreateVolumeFromSnapshotTask

CreateVolumeTask

DeleteSnapShotTask

DeleteVolumeTask

DetachVolumeTask

DisassociateIpAddressTask

RebootEC2InstancesTask

RunEC2InstancesTask

StartEC2InstancesTask

StopEC2InstancesTask

TerminateEC2InstancesTask

SimpleDB

CreateSimpleDBDomainTask

DeleteSimpleDBDomainTask

PutSimpleDBAttributeTask

GetSimpleDBAttributeTask

DeleteSimpleDBAttributesTask

Simple Notification Service

AddSNSPermissionsTask

CreateSNSTopicTask

DeleteSNSTopicTask

PublishSNSNotificationTask

SubscribeToSNSTopicTask

UnsubscribeFromSNSTopicTask

Simple Queue Service

CreateSQSQueueTask

DeleteSQSQueueTask

SendSQSMessageTask

DeleteSQSMessageTask

ReceiveSQSMessageTask

WaitForSQSMessageTask

If you want to see how to use the .NET SDK for AWS, have a look at the code; it gives a really simple insight, and the code for the MSBuild task library is licensed under MS-PL so you can take what you want.

Recently a few of us got together in a local pub to enjoy a brief glimpse of the British sunshine, some code and the nice beer served at the Tram Depot.

The Goal: (Other than to drink beer)

We wanted to learn about the AWS API and create an MSBuild target that could copy files to S3 as part of a build. I'm currently using S3, CloudFront and EC2 for hosting and storage to support Dollars2Pounds and my various other websites, but much of the deployment still requires a variety of tools and manual steps, so an automated build for this would make life simpler.

After a lot of flapping around trying to get the Three portable WiFi hotspot to work, some food and a drop of beer the coding began.

A few lines of code (and another beer!) later we had the basics in place: an MSBuild task that would create an S3 bucket and add an object to it. Sadly we had to stop coding at that point so we could catch the first part of Dr Who's Weeping Angels (did I mention we are all geeks?)

I finished the S3 publisher off later that evening, so here's how to use it.

Use this task to store your AWS credentials on the machine. They get stored in the registry, and the secret access key gets encrypted on the way in using the EncryptionContainerName. This saves you having to embed your AWS credentials in build scripts.

S3BuildPublisher

This is the main task which will copy the files to the appropriate S3 bucket, and takes the following parameters:

EncryptionContainerName – this is the container name you used when storing the client details, without this the secret key cannot be decrypted.

SourceFiles – a single filename or an MSBuild array (i.e. @(files)) of source files from an ItemGroup.

DestinationBucket – the bucket to copy the files to, will be created if it does not exist.

PublicRead – if set to “true” the files will be marked as public readable, otherwise they will be private.
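Put together, a publish target using these parameters might look something like this (the bucket name, file paths and key container name are placeholders, and you'll also need a UsingTask element pointing at the task assembly):

```xml
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- Files to push to S3 -->
    <PublishFiles Include="bin\Release\*.dll" />
  </ItemGroup>

  <Target Name="PublishToS3">
    <S3BuildPublisher
        EncryptionContainerName="MyKeyContainer"
        SourceFiles="@(PublishFiles)"
        DestinationBucket="my-example-bucket"
        PublicRead="true" />
  </Target>
</Project>
```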

The source contains two examples of using these tasks: the PublishData.proj file is used for debugging the tasks, and the Publish.proj file is used during a CCNET build to publish the binaries to S3, using the task itself.

It currently has some limitations: sub folders are not supported, error handling is very lightweight, and there is no choice of AWS region.

We have released this under the MS-PL license, so do as you please with it. If you like it, why not join us for some more CAMDUG pub coding sessions, perhaps next time in a pub with WiFi.

Thanks to Alan for his help and driving the keyboard and Alastair for the photos and input.

As a software developer it's all too easy to turn up to work, do what's required, go home in the evening and once a month pick up your pay. Just where does that get you?

Well, maybe you're happy doing that. Then 5-10 years down the line you find you've been doing the same thing, the same way, never really pushing the boundaries or questioning if what you are doing is the best way to do it. Net result: you've failed yourself and your employer, and if you end up needing a new job, well, we all know how picky the tech industry is about having a perfect match of acronyms on your CV.

Ever since I had my first computer I've always had a personal project, be it software or hardware. These projects are for me to solve problems I have, every now and then one of them makes it out the door and I share it with the rest of the internet.

Ten years ago I was a Lab Rat, working as a scientist testing blood gas analyzers and pH meters. In my spare time I started buying bits from the US (I'm in the UK), and trying to figure out the cost in pounds wasn't that easy (open calculator, figure out if it's divide or multiply, press buttons, hit equals). So I started Dollars2Pounds.com, mainly for myself and the 10 other people who initially visited the site every month. That's now grown a lot, but it's still true to its roots: an easy way to calculate the cost of something in your local currency.

If I hadn't started Dollars2Pounds.com I wouldn't have touched any web programming. In 2002 I went a step further and built a beast of a site in PHP. Now that taught me one thing: I'm not a fan of lots of PHP. It's great to get started with, but once things get busy, ohh boy, it's not so great.

The thing is, during the time I was building those sites my day job involved connecting up scientific instruments to computers, one way or another. I'd never have touched web-based programming if I'd only done my day job. And you know what, that would have been a huge mistake. I've learnt so much from escaping the binds of the day-to-day work and trying new things.

I can't emphasise enough how eye-opening it is as a developer to release a product or service yourself. It's all too easy to sit in our cubicles wearing our Dilbert curled-up ties and laugh about marketing and sales, but until you've actually tried it and seen the results you will never understand how frustrating it can be, how amazing the feeling is to see people using YOUR product, how strange the things your users do truly are, and the actual challenges your product faces on the internet, not just the perceived problems where you think you need to shave a microsecond off some routine that it turns out people don't use anyway.

Over the years my websites have grown: the original Dollars2Pounds.com was shortly followed by Pounds2Euro.com, Dollars2Euro.com and Dollars2Yen.com, as well as Yuan, Won and Rupee versions; combined, these make a network of around 23 websites.

When these sites were in PHP I was maintaining a core set of functionality plus some individual domain-specific pages. Over the years I've put a bit of work in here and there (for the most part the sites just run themselves), and since moving over to ASP.NET (thanks to listening to DotNetRocks) I've consolidated the code into one single codebase with just a separate configuration file, database and installer for each site.

That's been the story for about a year now. My builds were taking >25 minutes to build just the installers (3 mins for the main app), and the web MSI installer that comes as part of VS2008 has problems: I've found installing upgrades on the site would often not update files, so upgrading the websites involved running un-install on 23 MSIs, installing 23 new MSIs, and then saving them away to be able to automate the un-install next time around.

I also have a CCNET project that runs some basic tests on each of the 23 websites to ensure that the correct configuration got installed, that the site is performing OK and that it is not down.

So here's the problem:

I was kind of happy with that. It wasn't ideal; I didn't like the 30-ish minutes between me checking in a change and being able to start a roll out, not to mention the trouble of updating 23 different websites, but it worked.

More and more I see company-specific sub-domains on websites (i.e. analysisuk.fogbugz.com for my bug tracking) and I'm a big fan of that style. It makes me feel special and unique, and I know from Spolsky's ramblings that I have my own little database for my bugs, so a bug in the FogBugz code base isn't likely to show another company or competitor my bugs.

I had always wondered how sub-domaining a website was done. Was a new IIS/Apache website created, the appropriate files copied over, and a new database created in the background (which, by the sounds of it, is how StackExchange currently works)? How about DNS updates?

The Solution:

I was investigating another project that's been on my mind; this project would greatly benefit from sub-domains like the analysisuk.fogbugz.com one. So I did a bit of research. Ideally it would be one website that connected to a different database, or just had an additional identifier in the database, based on the subdomain (the analysisuk bit).

Well, it turns out that's all fairly easy.

On IIS you can add bindings for each domain you want to resolve to the website (e.g. bindings of CompanyA.AnalysisUK.com and CompanyB.AnalysisUK.com for a single website will send both CompanyA and CompanyB to the one site; or you can have a single IP-based site and have the DNS records point to that IP address, so there's no need to set up bindings).

In ASP.NET you can get the Host server variable to tell you the full host name, possibly including the port, so you might get something like "CompanyA.AnalysisUK.com:80" (HttpContext.Current.Request.Headers["HOST"]).
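The only fiddly part is that the Host header may carry a port, so it needs stripping and normalising before the lookup. A sketch of the idea (in JavaScript for brevity; the site itself does this in ASP.NET, and the site table here is illustrative):

```javascript
// Resolve a Host header value to a site identifier.
// Strips an optional :port and normalises case before looking the host
// up in a table keyed by lowercase domain name. (Ignores IPv6 literals.)
function resolveSite(hostHeader, siteTable) {
  const host = hostHeader.split(':')[0].toLowerCase();
  return siteTable[host] || null; // null => unknown domain
}
```

The same table can map several domains to one site, e.g. both a company subdomain and its www variant pointing at the same identifier.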

It's also really important to include VaryByHeader="host" in any page caching you do, otherwise CompanyA might see the page for CompanyB, and you really don't want that happening!

Bingo: multiple sub-domains on one IIS website, with a master database giving a host lookup to get some kind of unique database or identifier.

Shortly after finding this out it occurred to me that this solution would work well for Dollars2Pounds.com and its network of sites. The host variable gives the whole domain name, not just the subdomain; I can't believe I didn't think of that before. This was the missing link between one way of working and another.

And so the result of a Saturday's worth of coding is a new table in the master database to give domain name resolution (e.g. Dollars2Pounds.com and Pounds2Dollars.com both resolve to the Dollars2Pounds website). Configuration was shifted from web.config files to a simple database table keyed by the website, and strings for the website are now keyed by the website and a string key.

Twenty-three separate databases merged into one database, twenty-three IIS websites merged into one website, a 30 minute build became 3 minutes, a 30 minute deploy became a 5 minute deploy, and the web server CPU baseline load went from 20%+ to about 10% just by removing those 22 extra websites.

And whilst the benefit of a quicker build and deploy sounds nice, what's missing from that picture is that it also means I'm more likely to add a small feature to Dollars2Pounds and roll it out quickly, which is a massive benefit. I'd tinkered with advertising changes some time ago but never released the code.

The downside? Well, there is a small one. I used to like to roll out changes to one of the quieter websites first (Dollars2Yen.com or something like that) to check that I hadn't missed something in testing that, when deployed on the production server, would stuff up the website. Now if I stuff it up I stuff them all up!

What's my point?

Do something different, it might just pay off big time for your main work.

If your job is writing embedded code, go build a website in your spare time. It doesn't matter if it's a failure; you will learn, and that means it's not a failure: failure is not doing it. If your job is writing websites, go grab an Arduino and play; it's fun, and again, you might just learn something.

If you are an employer, take Google's advice: let your developers have 20% time to try something totally different and radical; you might just find your main product benefits. How many big companies do we see spending a fortune to buy a small company with some technology that's different from their own? Maybe if the devs had 20% time that would have grown in house and saved a fortune.

And if you are a job seeker, avoid like the plague companies that don't like you having projects in your own time; they are welcome to the 9-5 developers who go home and watch TV, or who go down the pub in the evening and don't touch a computer until the next morning.

(1) AWS itself backs up what I'm trying to say here. Look at Amazon, look at AWS: the bookseller who's now selling compute and bytes by the hour. Who'd have thought it, but it's massive. Again, Amazon tried something different outside of their day-to-day work. If Microsoft had done it, well, no one would be surprised; the surprise is they didn't, and now they are playing catch-up with Azure.

Naturally it would have helped if I'd remembered that PowerShell uses New- as well as Add-; had I not got my brain stuck thinking of the bindings as a collection/dictionary, I might have noticed New-WebBinding earlier!

If like me you have a lot of domain names pointing to a single web application (more on this in a future post) you will want to add the binding information for each of the domain names, as well as the www subdomain.

So, here's how to do it (yes, this is more of a reminder to myself for next time!)

a) Fire up the IIS PowerShell console.

b) Check your websites and current binding information by browsing to IIS:\sites
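From there, adding the bindings looks roughly like this (the site name and domains are placeholders, and this assumes the WebAdministration module is loaded, which the IIS PowerShell console does for you):

```powershell
# List the existing bindings for a site
Get-WebBinding -Name "MySite"

# Add a binding for the bare domain and the www subdomain
New-WebBinding -Name "MySite" -Protocol http -Port 80 -HostHeader "example.com"
New-WebBinding -Name "MySite" -Protocol http -Port 80 -HostHeader "www.example.com"
```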

Like me, you've probably heard various voices on the internet speaking of the importance of A/B testing. I've listened and agreed and tried a few bits here and there, but I've never seen such an obvious result as with my latest AdWords.

You may have noticed I recently launched RememberTheBin.com, a reminder service so you don't forget to put the bin out. Initially I set up a Google Adwords campaign with one advert and that didn't get much interest at all so I added a second very similar ad.

Here are my two ads:

Initially these ads were getting served equally; fortunately Google has kicked in, realised it's not making money from the second one, and stopped serving it so often.

The “Free SMS, Twitter and email reminders to put the bin out” ad was my first shot, and am I glad I decided to do an A/B test on it. Talk about a useless ad: no clicks whatsoever, zero, nothing, nada, zilcho, and that's over about a month!

It's interesting to note how similar the text of the two ads is and how different the responses are; they have the same keywords, the same cost and even many of the same words!

My question to you is this: are you running AdWords, or even SEO, or specific landing pages? Have you tried A/B testing? If not, go, go now and try! I'll wait for you...

Which brings me nicely onto SEO A/B testing. That's a lot more difficult and time-consuming, because you have to wait for the search engines to update their index with what you want them to see. Instead, invest some cash in Google AdWords, play with the A/B testing through that, see what people click on and what gets served more, then use that information in your SEO campaign.