On February 1st, additional pricing tiers for high volume users of
Amazon CloudFront go into effect. We've
been working to reduce our costs and to pass our savings along to you, our customers.
If you are in the top bandwidth tier you can deliver content to customers in the
United States and Europe for just $0.05 per GB
(one US nickel).

The existing tiers apply at the 10 TB, 50 TB, and 150 TB transfer levels. We've added
new levels and corresponding price breaks at 250 TB, 500 TB, 750 TB, and 1 PB. You
can visit the CloudFront home page
to see all of the pricing tiers.
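To see how graduated tiers like these combine into a monthly bill, here is a minimal sketch. The tier boundaries match the ones mentioned above, but the per-GB rates are illustrative placeholders (only the $0.05 top-tier figure comes from this post), so check the CloudFront pricing page for the real numbers:

```python
TB = 1024  # GB per TB

# (upper bound of tier in GB, illustrative $/GB rate) -- only the final
# $0.05 rate is from the post; the others are made-up examples.
TIERS = [
    (10 * TB, 0.170),
    (50 * TB, 0.140),
    (150 * TB, 0.120),
    (250 * TB, 0.100),
    (500 * TB, 0.080),
    (750 * TB, 0.060),
    (1024 * TB, 0.055),
    (float("inf"), 0.050),  # top tier: one US nickel per GB
]

def monthly_transfer_cost(gb: float) -> float:
    """Charge each GB at the rate of the tier it falls into (graduated pricing)."""
    cost, prev_bound = 0.0, 0
    for bound, rate in TIERS:
        if gb <= prev_bound:
            break
        billable = min(gb, bound) - prev_bound
        cost += billable * rate
        prev_bound = bound
    return round(cost, 2)
```

With these sample rates, 15 TB of transfer would bill the first 10 TB at the first rate and the remaining 5 TB at the second, which is the key property of graduated tiers: crossing a boundary only discounts the traffic above it.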

I would also like to call your attention to a number of useful CloudFront resources:

The class will take place on February 17th in Oakland, California. Chris, the author of
POJOs in Action, creator of the
Cloud Tools, and proprietor of the
Cloud Foundry
(recently reviewed in
eWeek), will cover Amazon-style cloud computing, deep-dive into AWS, talk about the
use of
Amazon EC2,
provide an overview of the Cloud Tools, and then jump into developing and
deploying on EC2.

Tom Lounibos, CEO of
SOASTA, dropped me a note to tell me that they recently
used the EC2-powered
CloudTest system to simulate the effect of 500,000
concurrent users downloading songs from a major media site. In an
entry
on the SOASTA blog, Tom notes that

Companies such as Hallmark.com,
Genentech, and Procter & Gamble have discovered Cloud Testing as an
affordable and scalable alternative to traditional stress testing. For a
few thousand dollars, huge stress tests can be achieved in as little as one hour.

These tests generate a substantial amount of data, ranging from 100 GB up to
1 TB in certain cases. SOASTA's testing tool also provides real-time analytic processing
of this huge amount of data.

Prabhakar from
Ylastic
wrote in to tell me that they have added full support for
EC2's new
EU region.
They provide access to
all EC2 resources (AMIs, access keys, security groups, elastic IPs, EBS volumes, and EBS snapshots),
along with a set of filtering and searching tools. Support for mobile (iPhone and
Android) versions is almost ready and will be rolled out in the near future. Ylastic
also tracks and organizes all EC2 alerts.

I met
Yuvi Kochar, CTO of
The Washington Post,
at a DC-area CTO event a year or
two ago. We've stayed in touch and I paid a visit to him and his team last year
on one of my trips east.

A few months ago Yuvi let me know that he needed a Wiki running in the EC2
cloud for a new project. After getting a good understanding of his needs
I pointed him at the
Mindtouch Deki product
(previously blogged here).
Yuvi
wrote about his experiences
in selecting and setting up the site,
noting that "Our dev and prod sites were up and running in days! Incredible!"

Last week, in conjunction with the inauguration day
festivities, they launched
Whorunsgov.com. On the site you can get an
insight into the people behind the scenes in Washington DC. You can see how deals
get made and how policy is shaped. Read more on the site's
blog.

While I was putting the finishing touches on this post, I saw a
Tweet from the
folks at Mindtouch. They put together
an excellent
press release
with even more information about the project and the selection criteria.

Jamal from
Kaavo
wrote in to tell me about the recent release of their
IMOD
product. IMOD stands for Infrastructure on Demand. IMOD uses an
application-centric view of lifecycle management, managing anything
from a single server to a complex multi-server system using their
dynamic template technology. They also support goodies such as
AES-256 encryption of EBS volumes, monitoring of CPU, I/O, and memory
usage, alerts for application service levels, and automatic backup
of EBS volumes.

On Wednesday, February 4th, AWS evangelist Jinesh Varia will talk about use of AWS
within the biotech community. Co-sponsored by
Cirrhus9,
the event will take place at Pfizer's lab in La Jolla, California. Attendance
is free but you need to register.
You'd better hurry, there are just 17 seats left.

Simone and Alexis will each speak for 20-25 minutes, and there will be time for some
Q & A after that. The meeting will then adjourn to the
Crown, Clerkenwell Green,
and Alexis will buy the first couple of rounds! Once again,
registration is a must.

The Amazon WS Tools provide full support for
CloudFront,
SimpleDB,
S3,
SQS, and
EC2. Each tool provides one or more views into
the corresponding Amazon web service. For example, you can enter SimpleDB queries in
the Query Editor, and you can see the results in the SimpleDB View, as you can see
here.

The tools can be downloaded here. Bob provided
me with the following installation instructions:

Pete from
Juice Analytics
wrote to tell me about their new
Concentrate
application. Concentrate is a search analytics tool for SEO
and paid search professionals. It allows them to make better
decisions, target SEO efforts, understand paid search campaigns,
and better understand customer needs. Concentrate discovers
and visualizes interesting patterns in the search queries used
to locate a site. The word tree shown at right is just one
form of output.

Concentrate is available under a number of different
pricing plans, ranging from
Free
all the way up to Max. You can also start out by looking at the
online demo.

On the technical side, Concentrate runs on EC2. Here's what Pete told me:

The front end is powered by
the Django framework
and
the jQuery JavaScript library,
all load balanced atop a number of EC2 instances.
They run
MySQL
and store their data on
EBS volumes.

The back end consists of an EC2 cluster running Concentrate's text
mining algorithms. It accepts requests using a REST API and returns
data in
JSON format. The back end uses
Amazon SQS and
S3, and was
patterned after the model found in our
Grep The Web
example. Scaling is handled using a combination of
iClassify,
Puppet, and
Capistrano. The deployment
infrastructure was built by
HJK Solutions of Seattle.

Andrew from
Job Bounty Hunter wrote to tell me
that they'd launched the site and that every last bit of it is running on top of
AWS.

Employers and recruiters can use the site to publish job advertisements, along with
a cash bounty on each job. Bloggers and web publishers can post the ads on their
site using a variety of
widgets.
Job seekers apply for the jobs via the widgets.
If they are placed in the job and stay there for 90 days, the blogger or web
publisher collects the bounty! As you can see from their
bounty chart,
there's currently $7,800 up for grabs just a few days after their launch. This
is a really cool way to monetize web traffic while also providing a very
interesting service for the site's readers.

Because the widgets are embedded in other sites, Andrew has no idea when a
traffic surge might hit. They must
be able to scale to match the traffic of all of the sites that are
displaying their widgets. Of course, this need for elastic capacity made
it a perfect fit for AWS!

They decided to serve up all of the widgets as static HTML. The widgets are rendered
using PHP and then captured as static HTML using curl. The static HTML
is then deployed to
CloudFront (via
S3) and
Nginx. The front end of the site was implemented
using HTML, CSS,
Adobe Flex, JavaScript, and (again)
jQuery.

Job Bounty Hunter currently runs on three EC2 instances. The first runs the
free version of IBM's DB2 database.
The second runs
Apache,
Postfix,
PHP, and some web service code. The third instance
runs Nginx and serves up static content such as widgets, JavaScript, images,
HTML, and SWF files.

Andrew also sent along a very nice architecture diagram as an
image
and as
PDF.

In the
post,
he talks about how Glue is used to connect people and things,
recognizing books, music, movies, and other everyday topics in hundreds
of web sites and connecting people around the topics. He then goes on
to discuss the reasons why a relational database won't solve his
problems, including scale and automatic partitioning and replication
of his data. From there he describes his use of 30 SimpleDB domains
to hold information about People, and another 30 about Things. He
uses the
djb2
hash algorithm to spread the information out across
the available domains, and also stores data redundantly to
obviate the need for joins.
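The sharding scheme he describes can be sketched in a few lines: hash a key with djb2 and use the result, mod 30, to pick a domain. The domain-name pattern below ("people_00" through "people_29") is a guess for illustration; only the djb2-plus-30-domains idea comes from the post.

```python
NUM_DOMAINS = 30  # the post describes 30 domains for People, 30 for Things

def djb2(key: str) -> int:
    """Classic djb2 string hash (Dan Bernstein): h = h * 33 + c."""
    h = 5381
    for ch in key:
        h = ((h * 33) + ord(ch)) & 0xFFFFFFFF  # keep the hash in 32 bits
    return h

def domain_for(key: str, prefix: str = "people") -> str:
    """Map a key to one of the NUM_DOMAINS SimpleDB domains for its record type."""
    return f"{prefix}_{djb2(key) % NUM_DOMAINS:02d}"
```

Because the hash is deterministic, any client can compute which domain holds a given record without a lookup table, which is what makes this a cheap way to spread load across SimpleDB's per-domain limits.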

Lasso2GGO uses a number of different services
including
S3,
EC2, and the
Mechanical Turk. This service makes it easy to
convert "analog" business cards into handy digital data.

You can take a picture of the card with your cell phone, webcam, or other device and upload
it to Lasso. After some image enhancement on EC2, the image is transcribed by a
Mechanical Turk worker and deposited in Salesforce,
an email, or a spreadsheet. Read all about this in the recent Information Week story,
How The Cloud Enables A New Set Of Personal Applications.

The SimpleDB Management Tools integrate directly into Visual Studio 2008 and above. The
tools support direct addition, editing, and deletion of SimpleDB data and domains and
can also run SimpleDB queries. The tools plug into the O/R Mapper, with direct dragging
of SimpleDB domains from the management tool into the mapper.

And that's all that I have time for today. I hope that you've enjoyed this glimpse
into some of the cool stuff that our developer community (now 490,000 members strong)
has been up to. Send me
information about what you are doing with
AWS and I'll do my best to fit you in.

Late last year an entrepreneur from Turkey visited me at Amazon HQ in Seattle.
We talked about his plans to use AWS as part of his new social video
portal startup. I won't spill any beans before he's ready to talk
about it himself, but I will say that he has a really good concept,
strong backers, and infectious enthusiasm for the online world.

He's now ready to hire a software architect and designer in order to
bring his vision to life. I've posted the job below; you can
send your resume to apply@web.tv
if you are interested, qualified, and located in the right part
of the world.

Software Architect & Designer

We are a reputable Internet technology, software services and e-commerce company based
in
Istanbul and
Bursa,
Turkey.
We are looking for a talented Software Architect who will
be working in Istanbul for a certain period of time, for our new global scale
"social video portal" project. The required qualifications and the job
description for the position are below.

Qualifications:

Extensive knowledge of web technologies.

Experienced in web based application design and development.

Solid background in object oriented design and development.

Preferably experienced in live broadcasting over the Internet, video streaming, video sharing, and social networking web site development and design.

Knowledge and experience of design and development of multi-tier, distributed, massively multi-user systems.

Experienced in Cloud Computing applications (preferably with AWS).

Very good command of PHP or Python.

Experienced in relational database design.

Familiarity with Erlang, and knowledge or experience of Java, C/C++, Ajax, Adobe Flex, and MySQL is a plus.

Self motivated, enthusiastic, team player.

Job Description:

Will be mainly responsible for designing the overall system for a multi-tier, massively multi-user live video multi-casting and video sharing web site which will also have features of a social network.

Will be involved in Design and Development phases of software development cycle. Will contribute to the Analysis phase.

Will lead the Software Development Team for the period of the contract and report to the Project Coordinator.

Almost! Micro Focus just deployed a managed mainframe emulation environment that runs on top of Amazon EC2. The beauty of this environment is that you can execute CICS or IMS code in the cloud, essentially unmodified. Micro Focus’ innovation has three significant benefits: (a) your existing code investment is protected, (b) costs are much, much lower than an in-house mainframe, and (c) the application execution environment is managed, complete with the same sort of service level agreements that are expected in the mainframe world.

Mark Haynie, the Micro Focus CTO of Application Modernization, says that in his opinion “Cloud computing is about services, not languages”. Since Micro Focus Enterprise Cloud Services allows COBOL CICS and IMS applications to run in the cloud as easily as they run in an on-premises datacenter, the consumer of those services will not know or care what language they were written in.

Darryl Taft at eWeek also wrote an article about this, which you can read here.

The announcement is about more than bringing mainframe applications to the Amazon Cloud. In my opinion, it is also another example of an entire ISV ecosystem that is cloud-enabled. Because Micro Focus’ own development community is supported by Enterprise Cloud Services, there is a new business opportunity for many developers that blends tradition with innovation. It’s easy to forget that mainframes are at the core of many enterprises; and for that reason I believe that interest will be significant.
You can learn more at cloudservices.microfocus.com.

You are invited to join the Amazon SimpleDB team on Tuesday, January 20, 2009 at 9am PST for the first session of our new Developers’ Brown Bag. During these once monthly webinar sessions, developers will hear from the technical experts behind SimpleDB, and have the opportunity to engage in live Q&A.

In addition, developers are encouraged to pre-submit any questions they may have, to allow for a more thorough response during the live webinar. For those struggling with the development of a new application, sample code and a description of the intended application may also be submitted for review and discussion.

Amazon
CloudFront
was designed to make it really easy to distribute content to users at high speed
with low latency. Here are some new tools which provide a nice end-user interface
to CloudFront.

The newest Freeware release of the
CloudBerry Explorer
now includes CloudFront support. You can create and manage distributions,
assign CNAMEs, and even
automate the entire process using the
Windows PowerShell. CloudBerry Explorer also includes some powerful support for
batched changes to S3 object Access Control Lists.
There are a couple of helpful videos here.

StreamInCloud is a free
FLV (Flash Video) encoder.
You simply create an S3 bucket and give StreamInCloud permission to read and
write it. It then monitors the bucket for new videos, encodes them into
the FLV format, and places the encoded version in the bucket. Of course, if
the bucket is part of a CloudFront distribution, the encoded content
is then available worldwide at high speed with low latency.
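The "monitor the bucket for new videos" step can be sketched as a pure function that compares a bucket listing against already-processed keys. The extension list and the same-name-with-.flv output convention below are assumptions for illustration; StreamInCloud defines its own rules.

```python
# Source-video extensions we watch for -- an illustrative guess, not
# StreamInCloud's actual list.
VIDEO_EXTS = (".mov", ".avi", ".mp4", ".wmv")

def videos_to_encode(bucket_keys, processed):
    """Return source videos that have no FLV output in the bucket yet and
    were not already handled on an earlier polling pass."""
    flv_outputs = {k for k in bucket_keys if k.endswith(".flv")}
    pending = []
    for key in bucket_keys:
        if not key.lower().endswith(VIDEO_EXTS):
            continue  # not a source video (e.g. it's an encoded output)
        stem = key.rsplit(".", 1)[0]
        if key in processed or f"{stem}.flv" in flv_outputs:
            continue  # already encoded or already queued
        pending.append(key)
    return pending
```

In a real service this function would be fed the result of an S3 LIST call on each polling pass, and each returned key would be downloaded, encoded to FLV, and written back to the same bucket.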

StreamInCloud encodes the videos at 512kbps and leaves the size as-is.
This service is free; an advanced version with additional features and options will
be available later at an additional charge.

As I noted earlier this week, Ylastic
allows you to manage your CloudFront distributions
from your iPhone.
There's now support for the Google Android Phone as well.
Watch the screencasts to learn more.

On the surface,
CloudBuddy
looks like a free S3 bucket explorer tool with full support for CloudFront.
However, there's quite a bit more beneath the surface. It is actually a
platform with a highly refined
architecture.
All CloudBuddy operations are exposed as
APIs.

The distribution includes a
Microsoft Office
plug-in to help you to manage your documents,
workbooks, emails, presentations, and projects in the cloud.
Source code
is available.

Bucket Explorer also has a number of unique and very handy features including the ability
to copy objects from one S3 account to another along with
timed backups to S3. It is available
for Windows, Mac, and Linux.

Today we’re announcing the availability of the Web-based AWS Management Console, which in this first release provides management of your Amazon EC2 environment via a point-and-click interface. A number of management tools already exist: for example a popular Firefox extension known as Elasticfox; however, as you read more of this post I believe you’ll agree that the new console is compelling--especially when it’s time to log in as a new AWS developer.

For starters, it’s easier than ever to gain access to your Amazon EC2 environment. The console provides access via your Amazon username and password. No more certificates or public/secret keys to manage! If you’re like me, I never seem to have my own computer at hand when I need to check the status of the Amazon EC2 farm, or for that matter when I need to launch a new instance. It’s a lot easier to log in with a username and password than to use those same credentials to retrieve my keys, configure Firefox (if it’s even on the borrowed computer) and then log in.

Then there’s the new point-and-click AJAX user interface for managing Amazon EC2 resources. No more page refreshes every time something updates: a timer refreshes management console components, such as the status of running instances, every few seconds.

The AWS community creates an amazing selection of innovative Amazon Machine Images, or AMIs. In fact, the count is now a staggering 1200 AMIs and growing! That’s quite a menu to choose from—especially if you are a first-time user. The new Launch Instance Wizard walks you through starting your first instance, offering a short list of Linux and Windows server choices. Choose one of these AMIs, and then the wizard even suggests which ports to open in the firewall. It’s smart enough to suggest that you open SSH (port 22) for Linux images, and RDP (port 3389) for Windows instances. The wizard even suggests settings that restrict Amazon EC2 access to “your computer only”.
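The wizard's suggestion boils down to a mapping from AMI platform to the right remote-access port, plus a source restriction to a single address. The sketch below illustrates that logic; it is not the console's actual implementation, and the function name is made up.

```python
def suggest_firewall_rule(platform: str, my_ip: str):
    """Suggest (port, source CIDR) for a freshly launched instance:
    SSH for Linux, RDP for Windows, restricted to "your computer only"."""
    port = {"linux": 22, "windows": 3389}[platform.lower()]
    return port, f"{my_ip}/32"  # /32 = exactly one source address
```

For example, a Linux AMI launched from the address 203.0.113.7 would get a suggested rule of port 22 open to 203.0.113.7/32 only.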

And as I hinted in the opening paragraph, this is just the first in a set of Console interfaces that will provide a UI layer on top of AWS infrastructure services. We’ll be adding additional Amazon Web Services in the future.

The console feature list is extensive, and provides intuitive management of all these things:

As usual, I've got plenty to blog about. Here's a glimpse at some of
the interesting things that have recently landed in my inbox.

On Thursday, January 8th, Information Week and Amazon
will present a Webcast titled
How To Plug Into The Cloud.
Attendees
will learn how Eli Lilly
has used Amazon's
EC2 servers and
S3 storage
to support its pharmaceuticals research. Presenters will
include Dave Powers from Eli Lilly and Adam Selipsky from
Amazon Web Services. Attendance is free but you do need
to preregister.

The folks at rPath will be conducting a series of tech events on
alternate Tuesdays starting on January 13th. Attendees
at the first event will learn how to build a virtual appliance using
rBuilder Online
and launch it on Amazon EC2. Once again, attendance
is free but preregistration is a must!

Content Workspace has built a document scanning
and indexing service around
Amazon S3 and the
Amazon Mechanical Turk.
Access to the Mechanical Turk workforce gives them the ability to
offer document and form classification, document indexing, data entry,
translation, detection of duplicates, and
more.

I met
Chris Richardson a year or two
ago at a conference in Philadelphia. Since then
we've stayed in touch and have bumped into each other at conferences on the east
and west coast. His Cloud Tools
project makes it easy to deploy, test, and manage Java EE applications on
EC2. Now, Chris has created the
Cloud Foundry in order to make this
process even easier. You can simply drop your application's ".war" files
into a container managed by Cloud Foundry and running on EC2. You can choose
between a number of runtime topologies and can manage your entire cluster
with one button. Learn more by watching the
Cloud Foundry screencast.

Mike from Cirrhus9 wrote to tell me about
his company and their cloud computing service offerings. They focus on the
Life Sciences industry, with 3 of the 4 founders coming from the pharmaceuticals
business. They got together as a "Cloud Integrator" to help IT managers navigate
and leverage cloud computing. Their
offerings
include an evaluation methodology
to evaluate the feasibility of moving individual applications to the cloud. They
also understand the FDA Qualification and Validation process and can work
with companies that would like to move health care applications into the
cloud.

They've also created
CloudBuntu, a
lightweight managed desktop running on EC2 and accessible through a
web browser.

I now have proof that cloud computing can be hereditary! My son
Stephen is studying Applied Math and Economics at the University of Washington. He's
also working on a really interesting research project in the Economics department. The
project makes use of the Mechanical Turk in a unique way. The paper is still
being written and I'll be sure to link to it when it finally arrives. Over the holiday
break Stephen needed to parse some large XML files. The parsing was taking forever
on his laptop, so he launched an Extra Large EC2
instance and did his work there instead. In
his post, he explains what he did and
also provides a photo-illustrated tour of how he set up EC2 and
EBS.

Finally (I am out of time, not material), researchers in the
Physics Department at the University
of Washington have prepared a
report on the topic of
Scientific Computing in
the Cloud.
They investigated the feasibility of performing scientific
computing in the cloud (using EC2) as an alternative to
traditional computational tools. They note that
"For research groups, cloud computing provides convenient access
to reliable, high performance clusters and storage, without the
need to purchase and maintain sophisticated hardware. For
developers, virtualization allows scientific codes to be optimized
and pre-installed on machine images, facilitating control over the
computational environment."

And that's it for today. Send me information about what you are doing with
AWS and I'll do my best to fit you in.

We rolled out a powerful new feature for Amazon S3 in the final hours of 2008.

This new feature, dubbed Requester Pays, works at the level of an S3 bucket. If the bucket's owner flags it as Requester Pays, then all data transfer and request costs are paid by the party accessing the data.

The Requester Pays model can be used in two ways.

First, by simply marking a bucket as Requester Pays, data owners can provide access to large data sets without incurring charges for data transfer or requests. For example, they could make available a 1 GB dataset at a cost of just 15 cents per month (18 cents if stored in the European instance of S3). Requesters use signed and specially flagged requests to identify themselves to AWS, paying for S3 GET requests and data transfer at the usual rates — 17 cents per GB for data transfer (even less at high volumes) and 1 cent for every 10,000 GET requests. The newest version of the
S3 Developer Guide contains the information needed to make use of S3 in this way.

Second, the Requester Pays feature can be used in conjunction with Amazon DevPay. Content owners charge a markup for access to the data. The price can include a monthly fee, a markup on the data transfer costs, and a markup on the cost of each GET. The newest version of the
DevPay Developer Guide has all of the information needed to set this up, including some helpful diagrams. Organizations with large amounts of valuable data can now use DevPay to expose and monetize the data, with payment by the month or by access (or some combination). For example, I could create a database of all dog kennels in the United States, and make it available for $20 per month, with no charge for access. My AWS account would not be charged for the data transfer and request charges, only for the data storage.
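The cost split described above is easy to put into numbers using the rates quoted in this post ($0.15/GB-month for US storage, $0.18 in Europe, $0.17/GB for transfer, and $0.01 per 10,000 GETs). This is just a back-of-the-envelope sketch of the example; real bills involve volume tiers and other factors.

```python
def owner_monthly_cost(dataset_gb: float, eu: bool = False) -> float:
    """With Requester Pays, the owner pays only for storage of the dataset."""
    rate = 0.18 if eu else 0.15  # $/GB-month, per the rates quoted in the post
    return round(dataset_gb * rate, 2)

def requester_cost(gb_transferred: float, num_gets: int) -> float:
    """The requester pays data transfer plus per-request charges."""
    transfer = gb_transferred * 0.17          # $0.17 per GB transferred
    requests = (num_gets / 10_000) * 0.01     # $0.01 per 10,000 GETs
    return round(transfer + requests, 4)
```

So the 1 GB dataset from the example costs its owner 15 cents per month (18 cents in Europe), while a requester who pulls the whole gigabyte in 10,000 GETs pays 18 cents: 17 for transfer and 1 for requests.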

I firmly believe that business model innovation is as important as technical innovation. This new feature gives you the ability to create the new, innovative, and very efficient business models that you will need to have in order to succeed in 2009!