Buckets

Create / interact with Google Cloud Storage buckets.

If you want to check whether a blob exists, you can use the in operator in Python:

>>> print('kitten.jpg' in bucket)
True
>>> print('does-not-exist' in bucket)
False

If you want to get all the blobs in the bucket, you can use list_blobs:

>>> blobs = bucket.list_blobs()

You can also use the bucket as an iterator:

>>> for blob in bucket:
...     print(blob)
class gcloud.storage.bucket.Bucket(name=None, connection=None)

    Bases: gcloud.storage._helpers._PropertyMixin

    A class representing a Bucket on Cloud Storage.

    Parameters:
        - name (string) – The name of the bucket.
        - connection (gcloud.storage.connection.Connection) – The connection to use when sending requests.
        - properties (dictionary or NoneType) – The properties associated with the bucket.
acl
    Create our ACL on demand.
configure_website(main_page_suffix=None, not_found_page=None)
    Configure website-related properties.

    See: https://developers.google.com/storage/docs/website-configuration

    Note: This (apparently) only works if your bucket name is a domain name
    (and to do that, you need to get approved somehow...).

    If you want this bucket to host a website, just provide the name of an
    index page and a page to use when a blob isn't found:

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket(bucket_name, connection=connection)
    >>> bucket.configure_website('index.html', '404.html')

    You probably should also make the whole bucket public:

    >>> bucket.make_public(recursive=True, future=True)

    This says: "Make the bucket public, and all the stuff already in the
    bucket, and anything else I add to the bucket. Just make it all public."

    Parameters:
        - main_page_suffix (string) – The page to use as the main page of a directory. Typically something like index.html.
        - not_found_page (string) – The file to use when a page isn't found.
connection
    Getter property for the connection to use with this Bucket.

    Return type: gcloud.storage.connection.Connection
    Returns: The connection to use.
copy_blob(blob, destination_bucket, new_name=None)
    Copy the given blob to the given bucket, optionally with a new name.

    Parameters:
        - blob (string or gcloud.storage.blob.Blob) – The blob to be copied.
        - destination_bucket (gcloud.storage.bucket.Bucket) – The bucket into which the blob should be copied.
        - new_name (string) – (optional) The new name for the copied file.

    Return type: gcloud.storage.blob.Blob
    Returns: The new Blob.
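    For example, a minimal sketch (assuming both buckets already exist,
    'my-bucket' holds a blob named 'my-file.txt', and 'backup-bucket' is a
    hypothetical destination):

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> backup = storage.get_bucket('backup-bucket', connection=connection)
    >>> blob = bucket.get_blob('my-file.txt')  # fetch the source blob
    >>> new_blob = bucket.copy_blob(blob, backup, new_name='my-file-copy.txt')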
cors
    Retrieve CORS policies configured for this bucket.

    Return type: list of dictionaries
    Returns: A sequence of mappings describing each CORS policy.
create(project=None)
    Create the current bucket.

    If the bucket already exists, this will raise gcloud.exceptions.Conflict.

    This implements "storage.buckets.insert".

    Parameters:
        project (string) – Optional. The project to use when creating the bucket. If not provided, falls back to the default.

    Return type: gcloud.storage.bucket.Bucket
    Returns: The newly created bucket.
    Raises: EnvironmentError if the project is not given and can't be inferred.
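    For example, a minimal sketch (assuming your default project can be
    inferred, and 'my-new-bucket' is a hypothetical, globally unique name):

    >>> from gcloud import storage
    >>> from gcloud.exceptions import Conflict
    >>> from gcloud.storage.bucket import Bucket
    >>> connection = storage.get_connection()
    >>> bucket = Bucket('my-new-bucket', connection=connection)
    >>> try:
    ...     bucket.create()
    ... except Conflict:
    ...     pass  # a bucket by this name already exists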
default_object_acl
    Create our defaultObjectACL on demand.
delete(force=False)
    Delete this bucket.

    The bucket must be empty in order to submit a delete request. If
    force=True is passed, this will first attempt to delete all the
    objects / blobs in the bucket (i.e. try to empty the bucket).

    If the bucket doesn't exist, this will raise gcloud.exceptions.NotFound.
    If the bucket is not empty (and force=False), this will raise
    gcloud.exceptions.Conflict.

    If force=True and the bucket contains more than 256 objects / blobs,
    this will cowardly refuse to delete the objects (or the bucket). This is
    to prevent accidental bucket deletion and to prevent extremely long
    runtimes for this method.

    Parameters:
        force (boolean) – If True, empties the bucket's objects, then deletes it.

    Raises: ValueError if force is True and the bucket contains more than 256 objects / blobs.
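    For example, a minimal sketch (assuming 'my-old-bucket' exists and holds
    fewer than 256 blobs):

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-old-bucket', connection=connection)
    >>> bucket.delete(force=True)  # empty the bucket first, then delete it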
delete_blob(blob_name)
    Delete a blob from the current bucket.

    If the blob isn't found (backend 404), this raises
    gcloud.exceptions.NotFound.

    For example:

    >>> from gcloud.exceptions import NotFound
    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> print(list(bucket.list_blobs()))
    [<Blob: my-bucket, my-file.txt>]
    >>> bucket.delete_blob('my-file.txt')
    >>> try:
    ...     bucket.delete_blob('doesnt-exist')
    ... except NotFound:
    ...     pass

    Parameters:
        blob_name (string) – A blob name to delete.

    Raises: gcloud.exceptions.NotFound. To suppress the exception, call
    delete_blobs, passing a no-op on_error callback, e.g.:

    >>> bucket.delete_blobs([blob], on_error=lambda blob: None)
delete_blobs(blobs, on_error=None)
    Deletes a list of blobs from the current bucket.

    Uses Bucket.delete_blob() to delete each individual blob.

    Parameters:
        - blobs (list of string or gcloud.storage.blob.Blob) – A list of blob names or Blob objects to delete.
        - on_error (a callable taking (blob)) – If not None, called once for each blob raising gcloud.exceptions.NotFound; otherwise, the exception is propagated.

    Raises: gcloud.exceptions.NotFound (if on_error is not passed).
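    For example, a minimal sketch (assuming 'file-1.txt' and 'file-2.txt'
    are hypothetical blob names that may or may not exist in the bucket):

    >>> bucket.delete_blobs(['file-1.txt', 'file-2.txt'],
    ...                     on_error=lambda blob: None)  # skip missing blobs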
disable_logging()
    Disable access logging for this bucket.

    See: https://cloud.google.com/storage/docs/accesslogs#disabling
disable_website()
    Disable the website configuration for this bucket.

    This is really just a shortcut for setting the website-related
    attributes to None.
enable_logging(bucket_name, object_prefix='')
    Enable access logging for this bucket.

    See: https://cloud.google.com/storage/docs/accesslogs#delivery

    Parameters:
        - bucket_name (string) – Name of the bucket in which to store access logs.
        - object_prefix (string) – Prefix for access log filenames.
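    For example, a minimal sketch (assuming a hypothetical 'logs-bucket'
    already exists and is set up to receive access logs; the dict shown is
    illustrative):

    >>> bucket.enable_logging('logs-bucket', object_prefix='my-bucket-access')
    >>> bucket.get_logging()
    {'logBucket': 'logs-bucket', 'logObjectPrefix': 'my-bucket-access'}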
etag
    Retrieve the ETag for the bucket.

    See: http://tools.ietf.org/html/rfc2616#section-3.11 and
         https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: string or NoneType
    Returns: The bucket etag or None if the property is not set locally.
exists()
    Determines whether or not this bucket exists.

    Return type: boolean
    Returns: True if the bucket exists in Cloud Storage.
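    For example, a minimal sketch (assuming no bucket by this name exists):

    >>> from gcloud import storage
    >>> from gcloud.storage.bucket import Bucket
    >>> connection = storage.get_connection()
    >>> Bucket('bucket-that-does-not-exist', connection=connection).exists()
    False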
get_blob(blob_name)
    Get a blob object by name.

    This will return None if the blob doesn't exist:

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> print(bucket.get_blob('/path/to/blob.txt'))
    <Blob: my-bucket, /path/to/blob.txt>
    >>> print(bucket.get_blob('/does-not-exist.txt'))
    None

    Parameters:
        blob_name (string) – The name of the blob to retrieve.

    Return type: gcloud.storage.blob.Blob or None
    Returns: The blob object if it exists, otherwise None.
get_logging()
    Return info about access logging for this bucket.

    See: https://cloud.google.com/storage/docs/accesslogs#status

    Return type: dict or None
    Returns: A dict with keys logBucket and logObjectPrefix (if logging is enabled), or None (if not).
id
    Retrieve the ID for the bucket.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: string or NoneType
    Returns: The ID of the bucket or None if the property is not set locally.
lifecycle_rules
    Lifecycle rules configured for this bucket.

    See: https://cloud.google.com/storage/docs/lifecycle and
         https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: list(dict)
    Returns: A sequence of mappings describing each lifecycle rule.
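    For example, a minimal sketch (the rule shown is illustrative, in the
    JSON API lifecycle format):

    >>> for rule in bucket.lifecycle_rules:
    ...     print(rule)
    {'action': {'type': 'Delete'}, 'condition': {'age': 365}}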
list_blobs(max_results=None, page_token=None, prefix=None, delimiter=None, versions=None, projection='noAcl', fields=None)
    Return an iterator used to find blobs in the bucket.

    Parameters:
        - max_results (integer or NoneType) – Maximum number of blobs to return.
        - page_token (string) – Opaque marker for the next "page" of blobs. If not passed, will return the first page of blobs.
        - prefix (string or NoneType) – Optional prefix used to filter blobs.
        - delimiter (string or NoneType) – Optional delimiter, used with prefix to emulate hierarchy.
        - versions (boolean or NoneType) – Whether object versions should be returned as separate blobs.
        - projection (string or NoneType) – If used, must be 'full' or 'noAcl'. Defaults to 'noAcl'. Specifies the set of properties to return.
        - fields (string or NoneType) – Selector specifying which fields to include in a partial response. Must be a list of fields. For example, to get a partial response with just the next page token and the language of each blob returned: 'items/contentLanguage,nextPageToken'.

    Return type: _BlobIterator
    Returns: An iterator of blobs.
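    For example, a minimal sketch (assuming 'my-bucket' stores blobs under a
    hypothetical 'photos/' prefix):

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> for blob in bucket.list_blobs(prefix='photos/', delimiter='/'):
    ...     print(blob)  # blobs directly under photos/, hierarchy emulated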
location
    Retrieve the location configured for this bucket.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets and
         https://cloud.google.com/storage/docs/concepts-techniques#specifyinglocations

    If the property is not set locally, returns None.

    Return type: string or NoneType
make_public(recursive=False, future=False)
    Make a bucket public.

    Parameters:
        - recursive (boolean) – If True, this will make all blobs inside the bucket public as well.
        - future (boolean) – If True, this will make all objects created in the future public as well.
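    For example, a minimal sketch contrasting the flags:

    >>> bucket.make_public()                # the bucket itself only
    >>> bucket.make_public(recursive=True)  # plus every blob already in it
    >>> bucket.make_public(future=True)     # plus blobs added from now on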
metageneration
    Retrieve the metageneration for the bucket.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: integer or NoneType
    Returns: The metageneration of the bucket or None if the property is not set locally.
owner
    Retrieve info about the owner of the bucket.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: dict or NoneType
    Returns: Mapping of owner's role/ID. If the property is not set locally, returns None.
path
    The URL path to this bucket.
static path_helper(bucket_name)
    Relative URL path for a bucket.

    Parameters:
        bucket_name (string) – The bucket name in the path.

    Return type: string
    Returns: The relative URL path for bucket_name.
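    For example, a minimal sketch (the '/b/my-bucket' output assumes the
    standard JSON API path layout):

    >>> from gcloud.storage.bucket import Bucket
    >>> Bucket.path_helper('my-bucket')
    '/b/my-bucket'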
project_number
    Retrieve the number of the project to which the bucket is assigned.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: integer or NoneType
    Returns: The project number that owns the bucket or None if the property is not set locally.
self_link
    Retrieve the URI for the bucket.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: string or NoneType
    Returns: The self link for the bucket or None if the property is not set locally.
storage_class
    Retrieve the storage class for the bucket.

    See: https://cloud.google.com/storage/docs/storage-classes,
         https://cloud.google.com/storage/docs/nearline-storage, and
         https://cloud.google.com/storage/docs/durable-reduced-availability

    Return type: string or NoneType
    Returns: If set, one of "STANDARD", "NEARLINE", or "DURABLE_REDUCED_AVAILABILITY", else None.
time_created
    Retrieve the timestamp at which the bucket was created.

    See: https://cloud.google.com/storage/docs/json_api/v1/buckets

    Return type: datetime.datetime or NoneType
    Returns: Datetime object parsed from an RFC 3339 valid timestamp, or None if the property is not set locally.
upload_file(filename, blob_name=None)
    Shortcut method to upload a file into this bucket.

    Use this method to quickly put a local file in Cloud Storage.

    For example:

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> bucket.upload_file('~/my-file.txt', 'remote-text-file.txt')
    >>> print(list(bucket.list_blobs()))
    [<Blob: my-bucket, remote-text-file.txt>]

    If you don't provide a blob name, we will try to upload the file using
    the local filename (not the complete path):

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> bucket.upload_file('~/my-file.txt')
    >>> print(list(bucket.list_blobs()))
    [<Blob: my-bucket, my-file.txt>]

    Parameters:
        - filename (string) – Local path to the file you want to upload.
        - blob_name (string) – The name of the blob to upload the file to. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.

    Return type: Blob
    Returns: The updated Blob object.
upload_file_object(file_obj, blob_name=None)
    Shortcut method to upload a file object into this bucket.

    Use this method to quickly put a local file in Cloud Storage.

    For example:

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> bucket.upload_file_object(open('~/my-file.txt'), 'remote-text-file.txt')
    >>> print(list(bucket.list_blobs()))
    [<Blob: my-bucket, remote-text-file.txt>]

    If you don't provide a blob name, we will try to upload the file using
    the local filename (not the complete path):

    >>> from gcloud import storage
    >>> connection = storage.get_connection()
    >>> bucket = storage.get_bucket('my-bucket', connection=connection)
    >>> bucket.upload_file_object(open('~/my-file.txt'))
    >>> print(list(bucket.list_blobs()))
    [<Blob: my-bucket, my-file.txt>]

    Parameters:
        - file_obj (file) – A file handle open for reading.
        - blob_name (string) – The name of the blob to upload the file to. If this is blank, we will try to upload the file to the root of the bucket with the same name as on your local file system.

    Return type: Blob
    Returns: The updated Blob object.
versioning_enabled
    Is versioning enabled for this bucket?

    See: https://cloud.google.com/storage/docs/object-versioning for details.

    Return type: boolean
    Returns: True if enabled, else False.
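    For example, a minimal sketch (assuming versioning is on; with
    versions=True, each generation of a blob is listed separately):

    >>> bucket.versioning_enabled
    True
    >>> for blob in bucket.list_blobs(versions=True):
    ...     print(blob)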