Buckets
Create / interact with gcloud storage buckets.
If you want to check whether a blob exists, you can use the in operator in Python:
>>> print 'kitten.jpg' in bucket
True
>>> print 'does-not-exist' in bucket
False
If you want to get all the blobs in the bucket, you can use get_all_blobs:
>>> blobs = bucket.get_all_blobs()
You can also use the bucket as an iterator:
>>> for blob in bucket:
... print blob
class gcloud.storage.bucket.Bucket(connection=None, name=None, properties=None)
Bases: gcloud.storage._helpers._PropertyMixin
A class representing a Bucket on Cloud Storage.
Parameters:
- connection (gcloud.storage.connection.Connection) – The connection to use when sending requests.
- name (string) – The name of the bucket.
acl
Create our ACL on demand.
configure_website(main_page_suffix=None, not_found_page=None)
Configure website-related properties.
See: https://developers.google.com/storage/docs/website-configuration
Note
This only works if your bucket name is a domain name (which requires that you verify ownership of the domain).
If you want this bucket to host a website, just provide the name of an index page and a page to use when a blob isn’t found:
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket(bucket_name, connection=connection)
>>> bucket.configure_website('index.html', '404.html')
You probably should also make the whole bucket public:
>>> bucket.make_public(recursive=True, future=True)
This says: “Make the bucket public, and all the stuff already in the bucket, and anything else I add to the bucket. Just make it all public.”
Parameters:
- main_page_suffix (string) – The page to use as the main page of a directory. Typically something like index.html.
- not_found_page (string) – The file to use when a page isn’t found.
connection
Getter property for the connection to use with this Bucket.
Return type: gcloud.storage.connection.Connection
Returns: The connection to use.
copy_blob(blob, destination_bucket, new_name=None)
Copy the given blob to the given bucket, optionally with a new name.
Parameters:
- blob (string or gcloud.storage.blob.Blob) – The blob to be copied.
- destination_bucket (gcloud.storage.bucket.Bucket) – The bucket into which the blob should be copied.
- new_name (string) – (optional) the new name for the copied file.
Return type: gcloud.storage.blob.Blob
Returns: The new Blob.
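For example, a sketch of copying a blob between two buckets (the bucket and blob names here are hypothetical, and both buckets are assumed to exist):
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> backups = storage.get_bucket('backup-bucket', connection=connection)
>>> blob = bucket.get_blob('my-file.txt')
>>> print bucket.copy_blob(blob, backups, new_name='my-file-copy.txt')
<Blob: backup-bucket, my-file-copy.txt>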
default_object_acl
Create our defaultObjectACL on demand.
delete(force=False)
Delete this bucket.
The bucket must be empty in order to submit a delete request. If force=True is passed, this will first attempt to delete all the objects / blobs in the bucket (i.e. try to empty the bucket).
If the bucket doesn’t exist, this will raise gcloud.exceptions.NotFound. If the bucket is not empty (and force=False), will raise gcloud.exceptions.Conflict.
If force=True and the bucket contains more than 256 objects / blobs, this will cowardly refuse to delete the objects (or the bucket). This is to prevent accidental bucket deletion and to prevent extremely long runtime of this method.
Parameters: force (boolean) – If True, empties the bucket’s objects then deletes it.
Raises: ValueError if force is True and the bucket contains more than 256 objects / blobs.
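For example, a sketch of emptying and deleting a small bucket (assuming it holds fewer than 256 blobs; the name is hypothetical):
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.delete(force=True)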
delete_blob(blob)
Deletes a blob from the current bucket.
If the blob isn’t found, raises gcloud.exceptions.NotFound.
For example:
>>> from gcloud.exceptions import NotFound
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> print bucket.get_all_blobs()
[<Blob: my-bucket, my-file.txt>]
>>> bucket.delete_blob('my-file.txt')
>>> try:
...   bucket.delete_blob('doesnt-exist')
... except NotFound:
...   pass
Parameters: blob (string or gcloud.storage.blob.Blob) – A blob name or Blob object to delete.
Return type: gcloud.storage.blob.Blob
Returns: The blob that was just deleted.
Raises: gcloud.exceptions.NotFound. To suppress the exception, call delete_blobs, passing a no-op on_error callback:
>>> bucket.delete_blobs([blob], on_error=lambda blob: None)
delete_blobs(blobs, on_error=None)
Deletes a list of blobs from the current bucket.
Uses Bucket.delete_blob() to delete each individual blob.
Parameters:
- blobs (list of string or gcloud.storage.blob.Blob) – A list of blob names or Blob objects to delete.
- on_error (a callable taking (blob)) – If not None, called once for each blob raising gcloud.exceptions.NotFound; otherwise, the exception is propagated.
Raises: gcloud.exceptions.NotFound (if on_error is not passed).
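For example, a sketch that deletes two (hypothetical) blobs and ignores any that are already gone:
>>> bucket.delete_blobs(['file-1.txt', 'file-2.txt'],
...                     on_error=lambda blob: None)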
disable_logging()
Disable access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#disabling
disable_website()
Disable the website configuration for this bucket.
This is really just a shortcut for setting the website-related attributes to None.
enable_logging(bucket_name, object_prefix='')
Enable access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#delivery
Parameters:
- bucket_name (string) – name of bucket in which to store access logs
- object_prefix (string) – prefix for access log filenames
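For example, a sketch that routes access logs for this bucket into a (hypothetical) 'logs-bucket':
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.enable_logging('logs-bucket', object_prefix='my-bucket-access')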
etag
Retrieve the ETag for the bucket.
See: http://tools.ietf.org/html/rfc2616#section-3.11 and https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
exists()
Determines whether or not this bucket exists.
Return type: boolean
Returns: True if the bucket exists in Cloud Storage.
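For example, a sketch constructing a Bucket directly via the constructor documented above, so the check is meaningful for a name that may not exist:
>>> from gcloud import storage
>>> from gcloud.storage.bucket import Bucket
>>> connection = storage.get_connection(project)
>>> bucket = Bucket(connection=connection, name='might-not-exist')
>>> print bucket.exists()
False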
get_all_blobs()
List all the blobs in this bucket.
This will not retrieve all the data for all the blobs; it will only retrieve the blob paths.
This is equivalent to:
blobs = [blob for blob in bucket]
Return type: list of gcloud.storage.blob.Blob
Returns: A list of all the Blob objects in this bucket.
get_blob(blob)
Get a blob object by name.
This will return None if the blob doesn’t exist:
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> print bucket.get_blob('/path/to/blob.txt')
<Blob: my-bucket, /path/to/blob.txt>
>>> print bucket.get_blob('/does-not-exist.txt')
None
Parameters: blob (string or gcloud.storage.blob.Blob) – The name of the blob to retrieve.
Return type: gcloud.storage.blob.Blob or None
Returns: The blob object if it exists, otherwise None.
get_cors()
Retrieve CORS policies configured for this bucket.
Return type: list(dict)
Returns: A sequence of mappings describing each CORS policy.
get_default_object_acl()
Get the current Default Object ACL rules.
If the ACL isn’t available locally, this method will reload it from Cloud Storage.
Return type: gcloud.storage.acl.DefaultObjectACL
Returns: A DefaultObjectACL object for this bucket.
get_lifecycle()
Retrieve lifecycle rules configured for this bucket.
See: https://cloud.google.com/storage/docs/lifecycle and https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: list(dict)
Returns: A sequence of mappings describing each lifecycle rule.
get_logging()
Return info about access logging for this bucket.
See: https://cloud.google.com/storage/docs/accesslogs#status
Return type: dict or None
Returns: a dict with keys logBucket and logObjectPrefix (if logging is enabled), or None (if not).
id
Retrieve the ID for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
iterator(prefix=None, delimiter=None, max_results=None, versions=None)
Return an iterator used to find blobs in the bucket.
Parameters:
- prefix (string or None) – optional prefix used to filter blobs.
- delimiter (string or None) – optional delimiter, used with prefix to emulate hierarchy.
- max_results (integer or None) – maximum number of blobs to return.
- versions (boolean or None) – whether object versions should be returned as separate blobs.
Return type: _BlobIterator
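For example, a sketch that lists only blobs under a (hypothetical) 'photos/' prefix, using delimiter to emulate one directory level:
>>> for blob in bucket.iterator(prefix='photos/', delimiter='/'):
...   print blob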
location
Retrieve location configured for this bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets and https://cloud.google.com/storage/docs/concepts-techniques#specifyinglocations
Return type: string
make_public(recursive=False, future=False)
Make a bucket public.
Parameters:
- recursive (boolean) – If True, this will make all blobs inside the bucket public as well.
- future (boolean) – If True, this will make all objects created in the future public as well.
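For example (a sketch; compare the configure_website example above, which combines both flags):
>>> bucket.make_public()                # only the bucket's own ACL
>>> bucket.make_public(recursive=True)  # also make every existing blob public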
metageneration
Retrieve the metageneration for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: integer
new_blob(blob)
Given a path name (or Blob), return a Blob object.
This is really useful when you’re not sure if you have a Blob instance or a string path name. Given either of those types, this returns the corresponding Blob.
Parameters: blob (string or gcloud.storage.blob.Blob) – A path name or actual blob object.
Return type: gcloud.storage.blob.Blob
Returns: A Blob object with the path provided.
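For example, a sketch (the blob name is hypothetical):
>>> print bucket.new_blob('my-file.txt')
<Blob: my-bucket, my-file.txt>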
owner
Retrieve info about the owner of the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: dict
Returns: mapping of owner’s role/ID.
path
The URL path to this bucket.
static path_helper(bucket_name)
Relative URL path for a bucket.
Parameters: bucket_name (string) – The bucket name in the path.
Return type: string
Returns: The relative URL path for bucket_name.
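For example, a sketch (treat the exact '/b/...' output as an assumption based on the JSON API bucket paths linked throughout this page):
>>> from gcloud.storage.bucket import Bucket
>>> print Bucket.path_helper('my-bucket')
/b/my-bucket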
project_number
Retrieve the number of the project to which the bucket is assigned.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: integer
self_link
Retrieve the URI for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
storage_class
Retrieve the storage class for the bucket.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets and https://cloud.google.com/storage/docs/durable-reduced-availability
Return type: string
Returns: Currently one of “STANDARD”, “DURABLE_REDUCED_AVAILABILITY”
time_created
Retrieve the timestamp at which the bucket was created.
See: https://cloud.google.com/storage/docs/json_api/v1/buckets
Return type: string
Returns: timestamp in RFC 3339 format.
update_cors(entries)
Update CORS policies configured for this bucket.
Parameters: entries (list(dict)) – A sequence of mappings describing each CORS policy.
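For example, a sketch that allows cross-origin GET requests from any origin; the mapping keys follow the JSON API bucket resource's cors field, and the specific values are illustrative assumptions:
>>> bucket.update_cors([{
...     'origin': ['*'],
...     'method': ['GET'],
...     'responseHeader': ['Content-Type'],
...     'maxAgeSeconds': 3600,
... }])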
update_lifecycle(rules)
Update lifecycle rules configured for this bucket.
See: https://cloud.google.com/storage/docs/lifecycle and https://cloud.google.com/storage/docs/json_api/v1/buckets
Parameters: rules (list(dict)) – A sequence of mappings describing each lifecycle rule.
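For example, a sketch that deletes blobs once they are a year old; the rule mapping follows the lifecycle documentation linked above:
>>> bucket.update_lifecycle([{
...     'action': {'type': 'Delete'},
...     'condition': {'age': 365},
... }])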
upload_file(filename, blob=None)
Shortcut method to upload a file into this bucket.
Use this method to quickly put a local file in Cloud Storage.
For example:
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file('~/my-file.txt', 'remote-text-file.txt')
>>> print bucket.get_all_blobs()
[<Blob: my-bucket, remote-text-file.txt>]
If you don’t provide a blob value, we will try to upload the file using the local filename as the blob (not the complete path):
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file('~/my-file.txt')
>>> print bucket.get_all_blobs()
[<Blob: my-bucket, my-file.txt>]
Parameters:
- filename (string) – Local path to the file you want to upload.
- blob (string or gcloud.storage.blob.Blob) – The blob (either an object or a remote path) of where to put the file. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
Return type: Blob
Returns: The updated Blob object.
upload_file_object(file_obj, blob=None)
Shortcut method to upload a file object into this bucket.
Use this method to quickly put a local file in Cloud Storage.
For example:
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file_object(open('~/my-file.txt'), 'remote-text-file.txt')
>>> print bucket.get_all_blobs()
[<Blob: my-bucket, remote-text-file.txt>]
If you don’t provide a blob value, we will try to upload the file using the local filename as the blob (not the complete path):
>>> from gcloud import storage
>>> connection = storage.get_connection(project)
>>> bucket = storage.get_bucket('my-bucket', connection=connection)
>>> bucket.upload_file_object(open('~/my-file.txt'))
>>> print bucket.get_all_blobs()
[<Blob: my-bucket, my-file.txt>]
Parameters:
- file_obj (file) – A file handle open for reading.
- blob (string or gcloud.storage.blob.Blob) – The blob (either an object or a remote path) of where to put the file. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.
Return type: Blob
Returns: The updated Blob object.
versioning_enabled
Is versioning enabled for this bucket?
See: https://cloud.google.com/storage/docs/object-versioning for details.
Return type: boolean
Returns: True if enabled, else False.