The Problem with S3 Bucket Data Leaks

It seems like every couple of weeks there is an announcement of another exposed Amazon S3 bucket. Recently a branch of the military, the United States Army Intelligence and Security Command (INSCOM), was caught by security firm UpGuard with a wide-open S3 bucket. Why are Amazon buckets exposed, and what can IT do about it?

The INSCOM Leak

The INSCOM leak is similar to many others. According to UpGuard, either INSCOM or a contractor created an S3 bucket and, probably for convenience of access, loosened its security settings so that anyone with the URL could access it. The repository contained numerous viewable files and folders. It also contained three downloadable files, one of which was an Oracle Virtual Appliance (.ova). The properties of this volume included areas clearly marked top secret.

How’d We Get Here?

Amazon Simple Storage Service (S3) is an object storage system accessible in the public cloud. When a user requests storage from Amazon, the service creates a bucket, which is somewhat similar to a volume on a traditional disk array. By default, a bucket is available only to the AWS account that created it. Making a bucket accessible to anyone with the URL takes deliberate work; it cannot happen accidentally.
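The "anyone with the URL" condition corresponds to an ACL grant on the bucket for one of S3's global groups. A minimal sketch of that distinction, using dictionaries shaped like the grants the S3 GetBucketAcl API returns (the helper names here are illustrative, not AWS APIs):

```python
# Sketch: telling a default-private bucket ACL apart from a public one.
# Grant dictionaries mirror the shape of the S3 GetBucketAcl response.

ALL_USERS_URI = "http://acs.amazonaws.com/groups/global/AllUsers"
AUTH_USERS_URI = "http://acs.amazonaws.com/groups/global/AuthenticatedUsers"

def grant_is_public(grant):
    """True if this grant opens the bucket to everyone on the internet."""
    grantee = grant.get("Grantee", {})
    return (grantee.get("Type") == "Group"
            and grantee.get("URI") in (ALL_USERS_URI, AUTH_USERS_URI))

def bucket_is_public(grants):
    return any(grant_is_public(g) for g in grants)

# A newly created bucket's ACL grants FULL_CONTROL to the owner and
# nothing else -- which is why a fresh bucket is private by default.
default_acl = [{
    "Grantee": {"Type": "CanonicalUser", "ID": "owner-canonical-id"},
    "Permission": "FULL_CONTROL",
}]

# Opening the bucket means deliberately adding a grant like this one:
public_read_acl = default_acl + [{
    "Grantee": {"Type": "Group", "URI": ALL_USERS_URI},
    "Permission": "READ",
}]
```

In practice the grants would come from a real ACL lookup, e.g. `boto3.client("s3").get_bucket_acl(Bucket=name)["Grants"]`.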

The fault for all of these exposed buckets is not Amazon's. Amazon is an infrastructure provider; its job is to provide the tools, and it is IT's job to implement those tools correctly. Amazon arguably does more than it needs to by making sure the bucket is secure at the point of creation.

Why Leave Buckets Unprotected?

The top reason (excuse) given for leaving a bucket unprotected is convenience: simple URL access enables multiple people and organizations to operate on a set of data at the same time. Another reason may be disgruntled employees who want to expose organizational information without being caught walking out of the building with a hard drive. They may do this so they can download the information once they are outside the organization, or simply to embarrass their employer.

How to Stop Unprotected Buckets

There are two obvious ways to prevent unsecured S3 buckets. The first is not to use the public cloud at all and instead use an on-premises system that shares data only within the organization. That way, if a volume is incorrectly configured, the scope of exposure is limited to the organization, not the whole world. The second is simply not to change the Amazon defaults when creating a bucket.

There are legitimate reasons to open up an S3 bucket, though how often a bucket legitimately needs to be wide open is questionable. And there is the real danger of a rogue employee using Amazon as a way to leak data out of the organization. In either case, IT needs to implement a policy of routinely scanning the status of its public cloud utilization. It should know when a public storage area has been created, when access rights have changed, and who is accessing those buckets externally.
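A routine scan of this kind can be sketched in a few lines. This assumes boto3 is installed and configured with read access to the account's buckets; the function names are illustrative, and the flagging logic is kept separate from the AWS calls so it can be exercised offline:

```python
# Sketch of a periodic S3 exposure scan (assumes boto3 credentials).
# public_permissions() is pure logic over GetBucketAcl-shaped grants;
# scan_account() wires it to the live API.

PUBLIC_GROUP_URIS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_permissions(grants):
    """Return the permissions (e.g. READ, WRITE) granted to everyone."""
    return sorted(
        g["Permission"]
        for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g["Grantee"].get("URI") in PUBLIC_GROUP_URIS
    )

def scan_account():
    """List every bucket in the account and report public ones."""
    import boto3  # assumed available; uses the environment's credentials
    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        grants = s3.get_bucket_acl(Bucket=bucket["Name"])["Grants"]
        exposed = public_permissions(grants)
        if exposed:
            print(f"ALERT {bucket['Name']}: public {', '.join(exposed)}")
```

Scheduling `scan_account()` to run on a timer, and diffing its output between runs, is one way to learn when a public storage area appears or access rights change.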

To learn more on exposed S3 buckets and how to secure them, join us for our live webinar. Tune in with George Crump, Chief Steward, Storage Switzerland, David Linthicum, Cloud Computing Visionary, Author & Speaker, Charles Goldberg, Sr. Director of Product Marketing, Thales e-Security, and Mark Carlson, Co-Chair, SNIA Technical Council & Cloud Storage Initiative. They will have a live panel discussion on this ever-important topic.


Twelve years ago George Crump founded Storage Switzerland with one simple goal: to educate IT professionals about all aspects of data center storage. He is the primary contributor to Storage Switzerland and is a heavily sought-after public speaker. With over 25 years of experience designing storage solutions for data centers across the US, he has seen the birth of such technologies as RAID, NAS, SAN, virtualization, cloud and enterprise flash. Prior to founding Storage Switzerland he was CTO at one of the nation's largest storage integrators, where he was in charge of technology testing, integration and product selection.