Gcloud::Bigquery::Job
Represents a generic Job that may be performed on a Table.
See Managing Jobs, Datasets, and Projects for an overview of BigQuery jobs, and the Jobs API reference for details.
The subclasses of Job represent the specific BigQuery job types: CopyJob, ExtractJob, LoadJob, and QueryJob.
A job instance is created when you call Gcloud::Bigquery::Project#query_job, Gcloud::Bigquery::Dataset#query_job, Gcloud::Bigquery::Table#copy, Gcloud::Bigquery::Table#extract, Gcloud::Bigquery::Table#load, or Gcloud::Bigquery::View#data.
require "gcloud"

gcloud = Gcloud.new
bigquery = gcloud.bigquery
q = "SELECT COUNT(word) as count FROM publicdata:samples.shakespeare"
job = bigquery.query_job q
job.wait_until_done!
if job.failed?
  puts job.error
else
  puts job.query_results.first
end
Methods
Public Instance Methods
configuration()
The configuration for the job. Returns a hash. See the Jobs API reference.
done?()

Checks if the job's state is DONE. When true, the job has stopped running. However, a DONE state does not mean that the job completed successfully. Use failed? to detect whether an error occurred or the job was successful.
error()

The last error for the job, if any errors have occurred. Returns a hash. See the Jobs API reference.

Returns

Hash:

{ "reason"=>"notFound", "message"=>"Not found: Table publicdata:samples.BAD_ID" }
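The error hash can be inspected like any Ruby hash. A minimal sketch of pulling out its fields; the hash literal below is a hypothetical example in the shape shown above, not live API output:

```ruby
# Hypothetical error hash in the shape returned by Job#error.
error = {
  "reason"  => "notFound",
  "message" => "Not found: Table publicdata:samples.BAD_ID"
}

# "reason" is the machine-readable code; "message" is human-readable.
reason  = error["reason"]
message = error["message"]

puts "Job failed (#{reason}): #{message}"
```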
errors()

The errors for the job, if any errors have occurred. Returns an array of hashes. See error.
reload!()
Reloads the job with current data from the BigQuery service.
started_at()

The time when the job was started. This field is present after the job's state changes from PENDING to either RUNNING or DONE.
state()

The current state of the job. The possible values are PENDING, RUNNING, and DONE. A DONE state does not mean that the job completed successfully. Use failed? to discover whether an error occurred or the job was successful.
statistics()
The statistics for the job. Returns a hash. See the Jobs API reference.
wait_until_done!()

Refreshes the job until the job is DONE. The delay between refreshes increases incrementally.
Example
require "gcloud"

gcloud = Gcloud.new
bigquery = gcloud.bigquery
dataset = bigquery.dataset "my_dataset"
table = dataset.table "my_table"
extract_job = table.extract "gs://my-bucket/file-name.json",
                            format: "json"
extract_job.wait_until_done!
extract_job.done? #=> true
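The incrementally increasing delay that wait_until_done! performs can be sketched in plain Ruby. This is a hypothetical illustration using a stub job object, not the library's actual implementation; the backoff constants are assumptions:

```ruby
# Stub standing in for a Gcloud::Bigquery::Job: it reports DONE
# after it has been reloaded three times. Hypothetical, for
# illustration only.
class StubJob
  def initialize
    @reloads = 0
  end

  def reload!
    @reloads += 1
  end

  def done?
    @reloads >= 3
  end
end

# Poll the job, sleeping a little longer after each refresh,
# until its state is DONE -- the pattern wait_until_done! uses.
def wait_until_done(job, initial_delay: 0.01, max_delay: 1.0)
  delay = initial_delay
  until job.done?
    sleep delay
    job.reload!
    delay = [delay * 2, max_delay].min # incremental backoff, capped
  end
end

job = StubJob.new
wait_until_done job
job.done? #=> true
```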