My counterpart in Germany, AWS Evangelist Steffen Krause, has put together the blog post below to show you how to launch and exercise SAP HANA One on AWS. Steffen is also responsible for the German language AWSAktuell blog.

-- Jeff;

An interesting software product that has created a lot of buzz lately is SAP HANA, a database built on in-memory technology. The data to be analyzed is held in memory in compressed form, which allows for very fast analytics.

But SAP HANA is sold by SAP as an appliance, a combination of hardware and software, so using it requires a big initial investment. SAP HANA One, on the other hand, is a version of SAP HANA that runs on Amazon Web Services, so anyone can test and use this technology and develop software for it.

SAP HANA One is available on AWS Marketplace with hourly billing. The software currently costs $0.99 per hour, plus the cost of a cc2.8xlarge EC2 instance and the required EBS storage. Until May 30th, there is a promotion where you can get $120 in credits if you use SAP HANA One for at least 10 hours. This video shows the deployment of SAP HANA One on AWS:

After deployment, your HANA instance needs to be configured. To do this, you access the instance at https://ip-address, using the Elastic IP address (EIP) that you assigned during deployment. First, you need to enter the AWS credentials (Access Key and Secret Access Key) of an AWS account. HANA uses this information to configure itself on AWS and for operations like backup. I recommend against using your AWS root account credentials. Instead, create an IAM user, assign it to a group with Power Users permissions, and enter that user's credentials in the HANA console. You can easily delete this IAM user after you are done testing SAP HANA One. The following video shows the configuration of SAP HANA One on AWS:
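The Power Users permissions recommended above boil down to a simple idea: allow every AWS action except IAM itself, so the HANA console can manage resources without being able to grant itself more access. As a minimal sketch, here is what that policy document looks like (the JSON mirrors AWS's managed Power User policy as I understand it; treat it as illustrative, not canonical):

```python
import json

# The classic "Power User" policy: allow every AWS action except IAM itself.
# NotAction/Resource wildcards are the standard IAM policy grammar.
power_user_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "NotAction": "iam:*",   # everything but IAM management
            "Resource": "*",
        }
    ],
}

print(json.dumps(power_user_policy, indent=2))
```

You would attach a policy like this to the group via the IAM console, then create the user and generate its access keys there as well.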

After configuring the HANA instance, you can use the product. You can either use SAP HANA Studio for software development (a sample Android app is here) or the graphical analytics tool SAP Visual Intelligence. You can find a trial of this tool in the download tab of the console. In SAP Visual Intelligence, you connect to your HANA instance using the assigned Elastic IP, the instance number 00, the user SYSTEM, and the password that you assigned to SYSTEM. Then you select one of the sample datasets to analyze it interactively. The following demo shows how this works:
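The same connection parameters work from any SQL client. HANA derives its SQL port from the instance number using the pattern 3&lt;NN&gt;15, so instance 00 listens on port 30015. A small sketch (the commented-out connection uses hdbcli, SAP's Python driver; the address is a placeholder for your Elastic IP):

```python
def hana_sql_port(instance_number: str) -> int:
    # HANA's SQL port follows the pattern 3<NN>15, where NN is the
    # two-digit instance number (00 for a default HANA One install).
    return int(f"3{instance_number:0>2}15")

port = hana_sql_port("00")  # instance 00 -> 30015

# Connecting would then look roughly like this (requires the hdbcli driver):
# from hdbcli import dbapi
# conn = dbapi.connect(address="YOUR-ELASTIC-IP", port=port,
#                      user="SYSTEM", password="your-password")
```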

After testing SAP HANA One, you should clean up so that you stop paying for unused resources. You should:

Earlier this year, IDC interviewed 11 organizations that use AWS in an effort to understand the long-term economic implications of moving their workloads to the cloud. As part of the study they also looked for changes in developer productivity, business agility, and the ability to deliver new applications that could be attributed to AWS. The AWS customers that they talked to included Samsung, BankInter, Fox, Netflix, Tomlinson Real Estate Group, United States Tennis Association, and Cycle Computing.

The paper contains a complete recitation of their findings. To summarize:

The five-year TCO of developing, deploying, and managing critical applications on AWS represents a 70% savings compared to deploying the same resources on-premises or in hosted environments.

The average five-year ROI from using AWS is 626%. Interestingly enough, the return grows (when measured in dollars of benefit for each dollar invested) over time. After 36 months, the organizations interviewed were realizing $3.50 in benefits for each $1 invested in AWS. After 60 months, the benefit grew to $8.40 for every $1 invested.

Over a five year period, the companies saw cumulative savings that averaged over $2.5 million per application. This included savings in development and deployment costs (reduced by 80%), application management costs (reduced by 52%), and infrastructure support costs (reduced by 56%). Again on average, these organizations were able to replace $1.6 million in infrastructure with $302,000 in AWS costs.
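As a quick sanity check on those averages, the infrastructure replacement figure alone works out to roughly an 81% reduction:

```python
# Average figures from the IDC study quoted above.
on_prem = 1_600_000   # infrastructure costs replaced
aws = 302_000         # AWS costs that replaced them

reduction = 1 - aws / on_prem
print(f"{reduction:.0%}")  # -> 81%
```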

Our customers ran (and measured) both steady-state and variable-state workloads, and ranked these workloads as very critical (4.5 out of 5). In addition to cost savings, they were able to increase their business agility and bring their applications to market far more quickly.

Additional AWS Direct Connect locations are planned for Los Angeles, London, Tokyo and Singapore in the next several months. Please feel free to use the contact page to express your interest in connections to these locations.

The first release of VM Import handled Windows 2008 images in the VMware ESX VMDK format. You can now import Windows 2003 and Windows 2008 images in any of the following formats:

VMware ESX VMDK

Citrix XenServer VHD

Microsoft Hyper-V VHD

I see VM Import as a key tool that will help our enterprise customers move into the cloud. There are many ways to use this service – two popular ones are extending data centers into the cloud and using the cloud as a disaster recovery repository.

You can use the EC2 API tools or, if you use VMware vSphere, the EC2 VM Import Connector to import your VM into EC2. Once the import process is done, you will be given an instance ID that you can use to boot your new EC2 instance. You have complete control of the instance size, security group, and (optionally) the VPC destination. Here's a flowchart of the import process:

You can also import data volumes, turning them into Elastic Block Storage (EBS) volumes that can be attached to any instance in the target Availability Zone.
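With the EC2 API tools, the import boils down to a single command that names the disk image, its format, the target instance type, and an S3 bucket for the upload. A hedged sketch of assembling that invocation (flag names follow the VM Import tools of the era; the file, bucket, and type values are placeholders):

```python
def build_import_command(disk_image, fmt, instance_type, bucket):
    # The supported source formats at this point: VMware ESX VMDK,
    # plus Citrix XenServer and Microsoft Hyper-V VHD.
    supported = {"VMDK", "VHD"}
    if fmt not in supported:
        raise ValueError(f"unsupported image format: {fmt}")
    return [
        "ec2-import-instance", disk_image,
        "-f", fmt,              # source image format
        "-t", instance_type,    # EC2 instance type to launch as
        "-b", bucket,           # S3 bucket for the upload
    ]

cmd = build_import_command("win2008.vmdk", "VMDK", "m1.xlarge", "my-import-bucket")
print(" ".join(cmd))
```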

As I've said in the past, we plan to support additional operating systems, versions, and virtualization platforms in the future. We are also planning to enable the export of EC2 instances to common image formats.

VPC Everywhere - The Virtual Private Cloud (VPC) is now generally available in every AWS Region. A VPC can now span multiple Availability Zones, and each AWS account can create multiple VPCs. Windows 2008 R2 is now supported, as are Reserved Instances for Windows with SQL Server.

AWS Direct Connect - Enterprises can now create a connection to an AWS Region via dedicated 1 Gbit and 10 Gbit network circuits in order to enhance privacy and reduce network latency.

Identity Federation - Enterprises can now create temporary security credentials for AWS, allowing existing identities (from, for example, an LDAP server) to make use of IAM's fine-grained access controls.
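Under the hood, identity federation is the STS GetFederationToken call: you pass the federated identity's name plus an optional scoping policy and get temporary credentials back. A sketch of assembling such a request (the bucket-scoped policy is purely illustrative; effective permissions are the intersection of this policy and the caller's own IAM permissions):

```python
import json

def federation_request(user_name, bucket):
    # Scope the temporary credentials down to a single S3 bucket.
    policy = {
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }]
    }
    return {
        "Name": user_name,            # e.g. the LDAP account name
        "Policy": json.dumps(policy),
        "DurationSeconds": 3600,      # one hour of temporary access
    }

req = federation_request("jdoe", "example-bucket")
```

With a modern SDK such as boto3, you would pass these parameters as `sts.get_federation_token(**req)`.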

I've written an entire post for each of these new features. Check them out, and let me know what you think!

The assessment covered the solution's management, operational, and technical security controls and was performed under the Federal Information Security Management Act of 2002 (FISMA for those of you inside of the Capital Beltway). The FISMA framework is used for managing information security for all information systems used or operated by a U.S. federal government agency or by a contractor or other organization on behalf of a federal agency.

Net-net, government agencies will now have access to a cloud computing solution from Appian and Amazon Web Services that has gone through the government's rigorous FISMA-based Certification and Accreditation process.

Late last week I met Jim Kaskade of SIOS at a Seattle-area Starbucks for a meeting and a product demo. With the very cool (and appropriate) title "Chief of Cloud", Jim was the right person to demonstrate his company's new cloud-powered high availability and disaster recovery solution.

Jim's Mac laptop was running CentOS. He used Xen and Red Hat's Virtual Machine Manager to host a couple of virtual machines representing the web, application, and database tiers of a SugarCRM installation. Each of the guest operating systems was running a copy of the new SIOS CloudStation product. Each copy of CloudStation was configured (using a web-based GUI) to replicate the state of the virtual machine to an Amazon EC2 instance running in a user-selected Region.

Once everything was up and running, Jim showed me how he could selectively kill the local virtual machines while keeping the application running. The demo was designed to feature a very short RPO (Recovery Point Objective) so that changes made locally just seconds before the database was killed were available from the cloud-based virtual mirror. Jim walked me through a number of different failure and recovery scenarios.

It was quite impressive and makes a great demo of the cloud-based DR (Disaster Recovery) and HA (High Availability) that I've been telling my audiences about for the last couple of years. Once configured, CloudStation can fail over from local processing to the cloud, from one cloud region to another, or even from one cloud provider to another. It can also be used as a migration tool, for what is sometimes called P2V (Physical to Virtual) or P2C (Physical to Cloud).

Good day, everyone. The Software and Information Industry Association (SIIA) is hosting a webinar about cloud security on Tuesday 19 January 2010 at 12:30 PM EST/9:30 AM PST. I'm one of the panelists. Here's a brief blurb and a list of the participants:

Cloud webinar series: Cloud Security for Dummies

Security and cloud computing have come a long way in just a few years. Understanding these issues becomes vital as cloud computing expands into government and the large enterprise. New trends -- like the emergence of private clouds -- are changing the way companies think about their security strategy. In this webinar, you'll hear perspectives from service providers, platforms, pure-play firms, and other players in the cloud security space.

Good day, everyone. I'm Steve Riley. In July 2009 I joined the AWS evangelism team. I spent my first few months absorbing information about all our offerings and am now getting back on the road again, speaking at various events and user groups and meeting with customers. I came from Microsoft, where I was in the telecommunications consulting practice for three years and in the Trustworthy Computing group for seven. I was a global security evangelist there and also worked closely with our chief security officer and enterprise security architect communities. I'm continuing that work here at Amazon Web Services, concentrating on enterprise deployment of cloud computing, all things cloud security, and of course the Windows Server aspects of our offerings.

I'm very excited to be part of AWS. The cloud is the future, and I look forward to meeting many of you and working together. As with all of us on the team, I'm here to help you succeed. More information in the links below.

Adoption of the AWS Cloud by mainstream ISVs is underway as you read this. There are numerous posts about IBM’s work to bring their product line into the AWS environment, and today’s is no exception. IBM Tivoli Monitoring is now available as an Amazon Machine Image (AMI) that runs as a virtual computer in the AWS environment. It’s one more example of enterprise-class applications from household-name ISVs that run in the Amazon Cloud.

And it's simple to use - IBM provides self-install scripts for the data collection agents and self-help guides, and maintains the infrastructure for delivery of the Tivoli software. Because there is no hardware or software to purchase – and because the hourly price for Tivoli on EC2 includes an IBM license – it’s super easy to get data collection up and running. At the end of the day, it’s the same enterprise-class software that organizations used to buy traditional licenses for – but without the big PO approval. In fact, it’s as simple as logging in to the AWS Console and then searching for AMIs with “Tivoli” in the name.
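That console search maps directly onto the EC2 DescribeImages filter syntax. A small sketch of the request structure (shown as a modern boto3-style call would take it; the wildcard pattern is illustrative):

```python
def ami_name_filter(keyword):
    # EC2's DescribeImages call accepts a "name" filter with * wildcards,
    # matching the console search described above.
    return [{"Name": "name", "Values": [f"*{keyword}*"]}]

filters = ami_name_filter("Tivoli")
# With boto3: ec2.describe_images(Filters=filters)
```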