This tutorial shows you how to write a simple Python program that performs basic Google Cloud Storage operations
using the XML API. This document assumes you are familiar with Python and the Google Cloud Storage
concepts and operations presented in the Hello Google Cloud Storage! guide.

Note: This tutorial has been tested with Python 2.6.5, but it should also work with other Python 2.6.x and 2.7.x releases.

Setting up your environment

boto is an open source Python library that is used
as an interface to Google Cloud Storage.
gcs-oauth2-boto-plugin
is an authentication plugin for the boto auth plugin framework. It provides OAuth 2.0 credentials
that can be used with Google Cloud Storage.

How you set up the boto library and the oauth2 plugin depends on the system you are using.
Use the setup examples below as guidance. These commands install
pip and then use pip to install other packages.
The last three commands test the import of the two modules to verify the installation.
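For example, on a Debian-based Linux system the setup might look like the following transcript (the package manager commands are an assumption; adjust them for your platform):

```shell
# Install pip using the system package manager (Debian/Ubuntu example).
sudo apt-get install python-pip
# Install the oauth2 plugin; this also installs boto as a dependency.
sudo pip install gcs-oauth2-boto-plugin
# Start the interpreter and import the two modules to verify the install.
python
>>> import boto
>>> import gcs_oauth2_boto_plugin
```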

You can configure your boto configuration file to use service account or user account credentials.
Service account credentials are the preferred type of credential to use when authenticating on
behalf of a service or application. User account credentials are the preferred type of credentials
for authenticating requests on behalf of a specific user (i.e., a human). For more information about
these two credential types, see
Supported Credential Types.

Using service account credentials

Use an existing service account or create a new one, and download the associated private key.

Configure the .boto file with the service account. You can do this with
gsutil:

$ gsutil config -e

The command will prompt you for the service account email address and the location of the
service account private key (.p12). Be sure to have the private key on the computer where you
are running the gsutil command.

Using user account credentials

If you don't already have a .boto file, create one. You can do this
with gsutil.

Edit the .boto file. In the [OAuth2] section, specify the
client_id and client_secret values with the ones you generated.
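After editing, the [OAuth2] section of your .boto file might look like the following fragment (both values are placeholders for the client ID and secret you generated):

```
[OAuth2]
client_id = YOUR_CLIENT_ID.apps.googleusercontent.com
client_secret = YOUR_CLIENT_SECRET
```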

Run the gsutil config command again to generate a refresh token based on the
client ID and secret you entered.

If you get an error message indicating that the .boto file cannot be backed up,
remove or rename the backup configuration file .boto.bak.

Configure refresh token fallback logic.

The gcs-oauth2-boto-plugin requires fallback logic for generating auth tokens
when you are using user account credentials. Fallback logic is not needed when you use a service account.

You have the following options for enabling fallback:

Set the client_id and the client_secret in the .boto
config file. This is the recommended option, and it is required for using gsutil
with your new .boto config file.

Set environment variables OAUTH2_CLIENT_ID and OAUTH2_CLIENT_SECRET.

Use the SetFallbackClientIdAndSecret function as shown in the examples
below.
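For the third option, a minimal sketch looks like this (the client ID and secret are placeholders you must replace with the values you generated; the gcs-oauth2-boto-plugin package must be installed):

```python
import gcs_oauth2_boto_plugin

# Placeholder values; substitute the client ID and secret you generated.
CLIENT_ID = 'your_client_id'
CLIENT_SECRET = 'your_client_secret'

# Register fallback credentials to be used when no client ID and secret
# are found in the .boto file or in environment variables.
gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
```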

Setting up your Python source file

To start this tutorial, use your favorite text editor to create a new Python file.
Then, add the following directives, import statements, configuration, and constant
assignments.

Note that in the code here, we use the SetFallbackClientIdAndSecret function as
a fallback for generating refresh tokens. See Using user account credentials for
other ways to specify a fallback. If you are using a service account to authenticate,
you do not need to include the fallback logic.
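As a sketch, the top of the source file might look like the following. The GOOGLE_STORAGE and LOCAL_FILE constants are URI schemes used in the rest of the tutorial; the client ID and secret are placeholders you must replace with your own values:

```python
#!/usr/bin/python
import os
import shutil
import StringIO
import tempfile
import time

import boto
import gcs_oauth2_boto_plugin

# URI scheme for Google Cloud Storage.
GOOGLE_STORAGE = 'gs'
# URI scheme for accessing files on the local file system.
LOCAL_FILE = 'file'

# Fallback logic for generating refresh tokens; substitute your own
# client ID and secret. Omit this if you authenticate with a service account.
CLIENT_ID = 'your_client_id'
CLIENT_SECRET = 'your_client_secret'
gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
```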

Creating buckets

This code creates two buckets. Because bucket names must be globally unique
(see the naming guidelines), a timestamp is appended
to each bucket name to help guarantee uniqueness.

If these bucket names are already in use, you'll need to modify the code to generate unique bucket names.

Note:
Google Cloud Storage has kept the concept of a default project from earlier versions of the product. A
default project exists for interoperability reasons. For more information and to learn how to set a
default project, see Setting a default project.
The existence of a default project affects the way the following code is written.

now = time.time()
CATS_BUCKET = 'cats-%d' % now
DOGS_BUCKET = 'dogs-%d' % now
# Your project ID can be found at https://console.developers.google.com/
# If there is no domain for your project, then project_id = 'YOUR_PROJECT'
project_id = 'YOUR_DOMAIN:YOUR_PROJECT'

for name in (CATS_BUCKET, DOGS_BUCKET):
    # Instantiate a BucketStorageUri object.
    uri = boto.storage_uri(name, GOOGLE_STORAGE)
    # Try to create the bucket.
    try:
        # If the default project is defined,
        # you do not need the headers.
        # Just call: uri.create_bucket()
        header_values = {"x-goog-project-id": project_id}
        uri.create_bucket(headers=header_values)
        print 'Successfully created bucket "%s"' % name
    except boto.exception.StorageCreateError, e:
        print 'Failed to create bucket:', e

Listing buckets

To retrieve a list of all buckets, call storage_uri() to instantiate a BucketStorageUri object, specifying the empty string as the URI. Then, call the get_all_buckets() instance method.
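A minimal sketch, assuming GOOGLE_STORAGE = 'gs' and an authenticated boto configuration as set up earlier:

```python
# An empty URI with the 'gs' scheme refers to the storage service itself.
uri = boto.storage_uri('', GOOGLE_STORAGE)
# get_all_buckets() lists the buckets in your default project. If you have
# no default project, pass headers={'x-goog-project-id': project_id}.
for bucket in uri.get_all_buckets():
    print bucket.name
```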

Uploading objects

To upload objects, create a file object (opened for read) that points to your local file and a storage URI object that points to the destination object on Google Cloud Storage. Call the set_contents_from_file() instance method, specifying the file handle as the argument.
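A sketch of the upload step, assuming DOGS_BUCKET was created as above and that the two source files exist in the current working directory (the filenames are an assumption for illustration):

```python
for filename in ('collie.txt', 'labrador.txt'):
    with open(filename, 'r') as localfile:
        dst_uri = boto.storage_uri(DOGS_BUCKET + '/' + filename, GOOGLE_STORAGE)
        # new_key() creates the destination object; set_contents_from_file()
        # streams the local file's contents into it.
        dst_uri.new_key().set_contents_from_file(localfile)
    print 'Created "%s/%s"' % (dst_uri.bucket_name, dst_uri.object_name)
```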

Listing objects

To list all objects in a bucket, call storage_uri() and specify the bucket's URI and the Google Cloud Storage URI scheme as the arguments. Then, retrieve a list of objects using the get_bucket() instance method.
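A minimal sketch, assuming DOGS_BUCKET and GOOGLE_STORAGE are defined as above:

```python
# The bucket URI names the bucket; get_bucket() returns an iterable
# of the objects it contains.
uri = boto.storage_uri(DOGS_BUCKET, GOOGLE_STORAGE)
for obj in uri.get_bucket():
    print '%s://%s/%s' % (uri.scheme, uri.bucket_name, obj.name)
```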

Downloading and copying objects

This code reads objects in DOGS_BUCKET and copies them to both your home directory and CATS_BUCKET. It also demonstrates that you can use the boto library to operate against both local files and Google Cloud Storage objects using the same interface.
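A sketch of the copy step, assuming the constants and buckets from the earlier sections (StringIO, os, DOGS_BUCKET, CATS_BUCKET, GOOGLE_STORAGE, and LOCAL_FILE):

```python
# Copy every object in DOGS_BUCKET to the home directory and to CATS_BUCKET.
dest_dir = os.getenv('HOME')
for obj in boto.storage_uri(DOGS_BUCKET, GOOGLE_STORAGE).get_bucket():
    # Read the object's contents into an in-memory buffer.
    contents = StringIO.StringIO()
    obj.get_file(contents)
    # A 'file' URI and a 'gs' URI expose the same new_key() /
    # set_contents_from_file() interface, so both copies use the same code.
    local_dst = boto.storage_uri(os.path.join(dest_dir, obj.name), LOCAL_FILE)
    cloud_dst = boto.storage_uri(CATS_BUCKET + '/' + obj.name, GOOGLE_STORAGE)
    for dst in (local_dst, cloud_dst):
        contents.seek(0)
        dst.new_key().set_contents_from_file(contents)
    contents.close()
```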