Bambooee: Breaking the sales barrier @ 83 OPM (Orders per minute)

When I heard the news that one of our clients would be featured on ABC’s SharkTank, a show with 7.9 million weekly viewers, I knew things were going to get interesting. The site was to be broadcast to the world and we had only one chance to catch a massive swarm of orders – if only we could design the right net.

“4,000 orders during the west coast showing, and another 3,000 when it aired on the east coast.”

I originally built and configured Bambooee.com a few years ago. I was handed a group of beautiful comp PSDs and told I had the weekend to get the site online. No problem. By sheer luck, I had just programmed my own custom, lightweight shopping cart in the language required. I chopped up the images, created a template, and got the shopping cart hooked up. That was Friday. On Saturday, I added order tracking, some basic admin tools (such as reports and postage services), and worked on a unique and intuitive checkout screen. I polished, finished, and beta-tested on Sunday. The end result was a solid, fast, good-looking site. I didn’t know it at the time, but what would make me the most proud of this site was its built-from-scratch infrastructure.

Late last year I was told that Bambooee would be featured on the hit ABC Show “SharkTank.” The shark that invested in them, Lori Greiner, gave us a stern warning.

Greiner, known as the “Queen of QVC,” warned: “Every site that’s been featured on SharkTank has crashed due to the massive influx of visitors….”

A well-designed site can only perform as well as the host that serves it, so we needed to align ourselves with the right partners. Azure, one of the world’s biggest web hosts, would provide an automatically scaling, stable platform. CloudFlare, a cached content delivery provider, would ensure all code and images were quickly delivered around the world. And lastly, Pingdom, a site pulse monitoring service, would ensure we received notice the second there was a problem.

We opted not for a custom-configured virtual machine, but for the simple Azure Websites service. I configured the site to “scale by metric” with a target CPU range of 30–50%, and set the instance size to Large (4 cores, 7 GB of memory). It’s important to note that when you have multiple server instances, you need to configure Azure Cache so the instances can talk to each other. That way, a user who started an order for Jane Smith on server #1 could continue the order if traffic was directed to server #2. Because the database was relatively small, I opted for their Standard 1 GB offering.
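The idea behind the shared cache is that the cart is keyed by session, not by server. Here is a minimal Python sketch of that pattern – the dict stands in for the distributed cache (Azure Cache in our setup), and the session id and SKU are made up for illustration:

```python
import json

# A plain dict stands in for the shared, distributed cache here.
# In production this would be Azure Cache (or any Redis-style store)
# that every server instance points at.
shared_cache = {}

def save_cart(session_id, cart):
    # Serialize so the entry is portable between instances.
    shared_cache[session_id] = json.dumps(cart)

def load_cart(session_id):
    raw = shared_cache.get(session_id)
    return json.loads(raw) if raw else {"items": []}

# Server #1 starts Jane's order...
save_cart("jane-smith-123", {"items": [{"sku": "BAMBOO-TOWEL", "qty": 2}]})
# ...and server #2 can pick it up by the same session id.
cart = load_cart("jane-smith-123")
```

Because every instance reads and writes the same store, the load balancer is free to route each request wherever capacity exists.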

But I didn’t stop there – for visitors who weren’t on the site to order something, I wanted to make sure the HTML and images loaded lightning fast. We signed up for a distributed, cached network delivery provider that “spreads” static content throughout the world. CloudFlare does this really well. They even included a free basic SSL certificate, which suited our needs. It was a no-brainer. All the non-dynamic about and info pages would now be spread around the world, delivered at the snap of a finger from CloudFlare’s network of servers. From there it was just a matter of squeezing every JPEG and script down to its absolute smallest size.
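To get a feel for what that squeezing buys you, here is a small illustrative sketch of how much a typical repetitive script shrinks under gzip (the kind of compression a CDN edge applies before serving; the script content and sizes here are invented, not Bambooee’s real assets):

```python
import gzip

# Fake "script": repetitive text compresses very well, which is
# exactly why minified JS and CSS are cheap to ship from a CDN edge.
script = ("function addToCart(sku, qty) { /* ... */ }\n" * 200).encode()
compressed = gzip.compress(script)
print(f"{len(script)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(script):.0%} of original)")
```

Pair that with long cache lifetimes on the static pages and the origin servers barely see the traffic at all.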

The big day came. I anxiously watched Azure scale from one server to six. After the east coast airing, I broke the news to my client: they had taken over 7,000 orders in two hours – a feat even Amazon would be proud of, and a SharkTank record.

Of course there were some hiccups (a few hundred people didn’t receive their email receipts due to an SMTP send cap), but it was still a massive success. If this were a WordPress, DotNetNuke, Joomla, or any other CMS/template/cookie-cutter site, Bambooee would have buckled under the pressure. But because this site was built from the ground up, and then put on a fully scalable solution, the sky was the limit. This was an impeccably tailored web-suit on a very uniquely shaped client-body. Not a single line of code was without a purpose.
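In hindsight, the receipt problem has a standard fix: queue the emails and drain the queue at a rate under the provider’s cap, re-queuing anything that fails, instead of firing them inline at order time. A hypothetical sketch (the cap value and message format are assumptions, not our actual provider’s limits):

```python
# Assumed provider limit, purely illustrative.
SEND_CAP_PER_MINUTE = 100

def drain(queue, send, cap=SEND_CAP_PER_MINUTE):
    """Send queued receipts; anything over the cap or that fails
    transiently goes into the retry list for the next pass."""
    sent, retry = 0, []
    for msg in queue:
        if sent >= cap:
            retry.append(msg)      # over the cap: defer a minute
            continue
        try:
            send(msg)              # e.g. an SMTP call in production
            sent += 1
        except OSError:            # transient SMTP failure
            retry.append(msg)
    return sent, retry

# Usage: 150 receipts queued, only 100 go out this minute.
sent_log = []
queue = [f"receipt-{i}" for i in range(150)]
sent, retry = drain(queue, sent_log.append)
```

A scheduled job running `drain` once a minute would have trickled every receipt out within a few minutes of the spike instead of silently dropping a few hundred.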

There were a lot of moving parts. As part of the planning process, we even created a simple backup site that our DNS could quickly be updated to route to. It contained a simple form where we would apologize, collect e-mails, and later send “We’re sorry” coupons in case something went really, really wrong….