Buckets#

Create / interact with gcloud storage buckets.

class gcloud.storage.bucket.Bucket(name=None)[source]#

Bases: gcloud.storage._helpers._PropertyMixin

A class representing a Bucket on Cloud Storage.

Parameters:
  • name (string) – The name of the bucket.
  • properties (dictionary or NoneType) – The properties associated with the bucket.
acl#

Create our ACL on demand.

configure_website(main_page_suffix=None, not_found_page=None)[source]#

Configure website-related properties.

See: https://developers.google.com/storage/docs/website-configuration

Note

This only works if your bucket name is a domain name (which requires that you first verify ownership of the domain).

If you want this bucket to host a website, just provide the name of an index page and a page to use when a blob isn’t found:

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket(bucket_name, connection=connection)
>>> bucket.configure_website('index.html', '404.html')

You probably should also make the whole bucket public:

>>> bucket.make_public(recursive=True, future=True)

This says: “Make the bucket public, and all the stuff already in the bucket, and anything else I add to the bucket. Just make it all public.”

Parameters:
  • main_page_suffix (string) – The page to use as the main page of a directory. Typically something like index.html.
  • not_found_page (string) – The file to use when a page isn’t found.
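The two parameters map onto the `website` sub-resource of the bucket in the JSON API. A minimal sketch of that payload, assuming the JSON API field names `mainPageSuffix` and `notFoundPage` (the helper name is ours, not the library's):

```python
# Sketch of the website sub-resource configure_website() patches onto the
# bucket (field names per the JSON API; illustrative only).
def build_website_config(main_page_suffix=None, not_found_page=None):
    """Build the 'website' payload for a buckets.patch request."""
    return {
        'website': {
            'mainPageSuffix': main_page_suffix,
            'notFoundPage': not_found_page,
        },
    }

config = build_website_config('index.html', '404.html')
```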
static copy_blob(blob, destination_bucket, new_name=None, connection=None)[source]#

Copy the given blob to the given bucket, optionally with a new name.

Parameters:
  • blob (gcloud.storage.blob.Blob) – The blob to be copied.
  • destination_bucket (gcloud.storage.bucket.Bucket) – The bucket into which the blob should be copied.
  • new_name (string) – Optional. The new name for the copied file.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

gcloud.storage.blob.Blob

Returns:

The new Blob.

cors#

Retrieve CORS policies configured for this bucket.

See: http://www.w3.org/TR/cors/ and
https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type:list of dictionaries
Returns:A sequence of mappings describing each CORS policy.
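Each mapping in the returned list follows the `cors` entry format of the JSON API bucket resource. A sketch of one policy entry, with field names taken from the JSON API docs and illustrative values:

```python
# One CORS policy entry as it appears in the bucket resource's 'cors' list
# (field names per the JSON API; values are illustrative).
cors_policy = {
    'origin': ['http://example.appspot.com'],
    'method': ['GET', 'HEAD'],
    'responseHeader': ['Content-Type'],
    'maxAgeSeconds': 3600,
}
policies = [cors_policy]  # bucket.cors returns a list shaped like this
```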
create(project=None, connection=None)[source]#

Creates current bucket.

If the bucket already exists, will raise gcloud.exceptions.Conflict.

This implements “storage.buckets.insert”.

Parameters:
  • project (string) – Optional. The project to use when creating bucket. If not provided, falls back to default.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

gcloud.storage.bucket.Bucket

Returns:

The newly created bucket.

Raises:

EnvironmentError if the project is not given and can’t be inferred.

default_object_acl#

Create our defaultObjectACL on demand.

delete(force=False, connection=None)[source]#

Delete this bucket.

The bucket must be empty in order to submit a delete request. If force=True is passed, this will first attempt to delete all the objects / blobs in the bucket (i.e. try to empty the bucket).

If the bucket doesn’t exist, this will raise gcloud.exceptions.NotFound. If the bucket is not empty (and force=False), will raise gcloud.exceptions.Conflict.

If force=True and the bucket contains more than 256 objects / blobs this will cowardly refuse to delete the objects (or the bucket). This is to prevent accidental bucket deletion and to prevent extremely long runtime of this method.

Parameters:
  • force (boolean) – If True, empties the bucket’s objects then deletes it.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Raises:

ValueError if force is True and the bucket contains more than 256 objects / blobs.
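The 256-object guard described above can be sketched as a standalone check. This is an illustration of the documented behavior, not the library's internal code; the constant name is ours:

```python
_MAX_OBJECTS_FOR_ITERATION = 256  # documented refusal threshold

def check_force_delete(blob_names, force=False):
    """Mirror the guard delete() applies before emptying a bucket."""
    if force and len(blob_names) > _MAX_OBJECTS_FOR_ITERATION:
        raise ValueError(
            'Refusing to delete bucket with more than %d objects'
            % _MAX_OBJECTS_FOR_ITERATION)
    return True
```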

delete_blob(blob_name, connection=None)[source]#

Deletes a blob from the current bucket.

If the blob isn’t found (backend 404), raises a gcloud.exceptions.NotFound.

For example:

>>> from gcloud.exceptions import NotFound
>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> print bucket.list_blobs()
[<Blob: my-bucket, my-file.txt>]
>>> bucket.delete_blob('my-file.txt')
>>> try:
...   bucket.delete_blob('doesnt-exist')
... except NotFound:
...   pass
Parameters:
  • blob_name (string) – A blob name to delete.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Raises:

gcloud.exceptions.NotFound. To suppress the exception, call delete_blobs, passing a no-op on_error callback, e.g.:

>>> bucket.delete_blobs([blob], on_error=lambda blob: None)
delete_blobs(blobs, on_error=None, connection=None)[source]#

Deletes a list of blobs from the current bucket.

Uses Bucket.delete_blob() to delete each individual blob.

Parameters:
  • blobs (list of string or of gcloud.storage.blob.Blob) – A list of blob names or blob objects to delete.
  • on_error (a callable taking (blob)) – If not None, called once for each blob raising gcloud.exceptions.NotFound; otherwise, the exception is propagated.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Raises:

gcloud.exceptions.NotFound (if on_error is not passed).

disable_logging()[source]#

Disable access logging for this bucket.

See: https://cloud.google.com/storage/docs/accesslogs#disabling

disable_website()[source]#

Disable the website configuration for this bucket.

This is really just a shortcut for setting the website-related attributes to None.

enable_logging(bucket_name, object_prefix='')[source]#

Enable access logging for this bucket.

See: https://cloud.google.com/storage/docs/accesslogs#delivery

Parameters:
  • bucket_name (string) – name of bucket in which to store access logs
  • object_prefix (string) – prefix for access log filenames
etag#

Retrieve the ETag for the bucket.

See: http://tools.ietf.org/html/rfc2616#section-3.11 and
https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type:string or NoneType
Returns:The bucket etag or None if the property is not set locally.
exists(connection=None)[source]#

Determines whether or not this bucket exists.

Parameters:connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:boolean
Returns:True if the bucket exists in Cloud Storage.
get_blob(blob_name, connection=None)[source]#

Get a blob object by name.

This will return None if the blob doesn’t exist:

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> print bucket.get_blob('/path/to/blob.txt')
<Blob: my-bucket, /path/to/blob.txt>
>>> print bucket.get_blob('/does-not-exist.txt')
None
Parameters:
  • blob_name (string) – The name of the blob to retrieve.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

gcloud.storage.blob.Blob or None

Returns:

The blob object if it exists, otherwise None.

get_logging()[source]#

Return info about access logging for this bucket.

See: https://cloud.google.com/storage/docs/accesslogs#status

Return type:dict or None
Returns:A dict with keys logBucket and logObjectPrefix (if logging is enabled), or None (if not).
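Interpreting the return value is a simple None-check. A minimal sketch (the helper name is ours, not part of the library):

```python
def describe_logging(logging_info):
    """Render the mapping get_logging() returns; None means disabled."""
    if logging_info is None:
        return 'access logging disabled'
    return 'logs delivered to gs://%s with prefix %r' % (
        logging_info['logBucket'], logging_info['logObjectPrefix'])
```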
id#

Retrieve the ID for the bucket.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:string or NoneType
Returns:The ID of the bucket or None if the property is not set locally.
lifecycle_rules#

Lifecycle rules configured for this bucket.

See: https://cloud.google.com/storage/docs/lifecycle and
https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type:list(dict)
Returns:A sequence of mappings describing each lifecycle rule.
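Each mapping follows the `lifecycle.rule` entry format of the JSON API bucket resource. A sketch of one rule, with field names taken from the lifecycle docs and illustrative values:

```python
# One lifecycle rule as it appears in the bucket resource (field names per
# the JSON API lifecycle docs; values are illustrative).
delete_old_objects = {
    'action': {'type': 'Delete'},
    'condition': {'age': 365},  # days since object creation
}
rules = [delete_old_objects]  # bucket.lifecycle_rules returns a list like this
```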
list_blobs(max_results=None, page_token=None, prefix=None, delimiter=None, versions=None, projection='noAcl', fields=None, connection=None)[source]#

Return an iterator used to find blobs in the bucket.

Parameters:
  • max_results (integer or NoneType) – maximum number of blobs to return.
  • page_token (string) – opaque marker for the next “page” of blobs. If not passed, will return the first page of blobs.
  • prefix (string or NoneType) – optional prefix used to filter blobs.
  • delimiter (string or NoneType) – optional delimiter, used with prefix to emulate hierarchy.
  • versions (boolean or NoneType) – whether object versions should be returned as separate blobs.
  • projection (string or NoneType) – If used, must be ‘full’ or ‘noAcl’. Defaults to ‘noAcl’. Specifies the set of properties to return.
  • fields (string or NoneType) – Selector specifying which fields to include in a partial response, as a comma-separated list of field names. For example, to get a partial response with just the next page token and the language of each blob returned: 'items/contentLanguage,nextPageToken'.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

_BlobIterator.

Returns:

An iterator of blobs.
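The prefix/delimiter grouping happens server-side; a pure-Python illustration of the same logic (this is a sketch of the behavior, not the library's code): names containing the delimiter past the prefix collapse into "folder" prefixes, and the rest come back as items.

```python
def emulate_hierarchy(names, prefix='', delimiter='/'):
    """Group flat object names the way prefix/delimiter listing does."""
    items, prefixes = [], set()
    for name in names:
        if not name.startswith(prefix):
            continue  # outside the requested prefix
        remainder = name[len(prefix):]
        if delimiter in remainder:
            # collapse everything past the next delimiter into a prefix
            prefixes.add(prefix + remainder.split(delimiter, 1)[0] + delimiter)
        else:
            items.append(name)
    return items, prefixes
```

For example, `emulate_hierarchy(['photos/a.jpg', 'photos/b.jpg', 'readme.txt'])` yields `(['readme.txt'], {'photos/'})`, matching what a delimiter-based listing would return at the bucket root.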

location#

Retrieve location configured for this bucket.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets and https://cloud.google.com/storage/docs/concepts-techniques#specifyinglocations

If the property is not set locally, returns None.

Return type:string or NoneType
make_public(recursive=False, future=False, connection=None)[source]#

Make a bucket public.

If recursive=True and the bucket contains more than 256 objects / blobs this will cowardly refuse to make the objects public. This is to prevent extremely long runtime of this method.

Parameters:
  • recursive (boolean) – If True, this will make all blobs inside the bucket public as well.
  • future (boolean) – If True, this will make all objects created in the future public as well.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
metageneration#

Retrieve the metageneration for the bucket.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:integer or NoneType
Returns:The metageneration of the bucket or None if the property is not set locally.
owner#

Retrieve info about the owner of the bucket.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:dict or NoneType
Returns:Mapping of owner’s role/ID. If the property is not set locally, returns None.
path#

The URL path to this bucket.

static path_helper(bucket_name)[source]#

Relative URL path for a bucket.

Parameters:bucket_name (string) – The bucket name in the path.
Return type:string
Returns:The relative URL path for bucket_name.
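The bucket path is the JSON API resource path. A minimal re-statement of what path_helper computes:

```python
def path_helper(bucket_name):
    """Relative URL path for a bucket: '/b/<bucket_name>'."""
    return '/b/%s' % bucket_name
```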
project_number#

Retrieve the number of the project to which the bucket is assigned.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:integer or NoneType
Returns:The project number that owns the bucket or None if the property is not set locally.

self_link#

Retrieve the URI for the bucket.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:string or NoneType
Returns:The self link for the bucket or None if the property is not set locally.
storage_class#

Retrieve the storage class for the bucket.

See: https://cloud.google.com/storage/docs/storage-classes,
https://cloud.google.com/storage/docs/nearline-storage, and
https://cloud.google.com/storage/docs/durable-reduced-availability

Return type:string or NoneType
Returns:If set, one of “STANDARD”, “NEARLINE”, or “DURABLE_REDUCED_AVAILABILITY”, else None.
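A small validity check over the three documented values (the tuple and function names are ours, not the library's):

```python
# The three storage-class values this property documents.
VALID_STORAGE_CLASSES = ('STANDARD', 'NEARLINE', 'DURABLE_REDUCED_AVAILABILITY')

def is_valid_storage_class(value):
    """True if value is one of the documented storage classes."""
    return value in VALID_STORAGE_CLASSES
```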
time_created#

Retrieve the timestamp at which the bucket was created.

See: https://cloud.google.com/storage/docs/json_api/v1/buckets

Return type:datetime.datetime or NoneType
Returns:Datetime object parsed from RFC3339 valid timestamp, or None if the property is not set locally.
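An RFC3339 timestamp like the one backing this property can be parsed with strptime; the format string below assumes fractional seconds and a trailing 'Z', as the JSON API emits:

```python
from datetime import datetime

# Illustrative RFC3339 timestamp in the shape the JSON API returns.
stamp = '2014-11-05T20:34:37.000Z'
created = datetime.strptime(stamp, '%Y-%m-%dT%H:%M:%S.%fZ')
```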
upload_file(filename, blob_name=None, connection=None)[source]#

Shortcut method to upload a file into this bucket.

Use this method to quickly put a local file in Cloud Storage.

For example:

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file('~/my-file.txt', 'remote-text-file.txt')
>>> print bucket.list_blobs()
[<Blob: my-bucket, remote-text-file.txt>]

If you don’t provide a blob name, we will try to upload the file using the local filename (not the complete path):

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file('~/my-file.txt')
>>> print bucket.list_blobs()
[<Blob: my-bucket, my-file.txt>]
Parameters:
  • filename (string) – Local path to the file you want to upload.
  • blob_name (string) – The name of the blob to upload the file to. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

Blob

Returns:

The updated Blob object.

upload_file_object(file_obj, blob_name=None, connection=None)[source]#

Shortcut method to upload a file object into this bucket.

Use this method to quickly put a local file in Cloud Storage.

For example:

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file_object(open('~/my-file.txt'), 'remote-text-file.txt')
>>> print bucket.list_blobs()
[<Blob: my-bucket, remote-text-file.txt>]

If you don’t provide a blob name, we will try to upload the file using the local filename (not the complete path):

>>> from gcloud import storage
>>> connection = storage.get_connection()
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file_object(open('~/my-file.txt'))
>>> print bucket.list_blobs()
[<Blob: my-bucket, my-file.txt>]
Parameters:
  • file_obj (file) – A file handle open for reading.
  • blob_name (string) – The name of the blob to upload the file to. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
  • connection (gcloud.storage.connection.Connection or NoneType) – Optional. The connection to use when sending requests. If not provided, falls back to default.
Return type:

Blob

Returns:

The updated Blob object.

versioning_enabled#

Is versioning enabled for this bucket?

See: https://cloud.google.com/storage/docs/object-versioning for details.

Return type:boolean
Returns:True if enabled, else False.