
gcloud CLI Cheat Sheet


When you are working with Google Cloud, it comes in handy to use the terminal from time to time. On this page, you will find various commands and code snippets that have been very useful to me in the past.

See the current project

The following command will show the current account and project.

gcloud config list
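
If you only need the project ID, for example in a shell script, you can read a single configuration value:

gcloud config get-value project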

List all available projects under the current account

gcloud projects list
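
If you manage many projects, you can filter the list and reduce the output to the project IDs only (adjust the filter expression to your own naming scheme):

gcloud projects list --filter="projectId:my-app*" --format="value(projectId)"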

Change the current project

Set <project_id> as the active project for all subsequent commands.

gcloud config set project <project_id>
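
For example, with a hypothetical project ID:

gcloud config set project my-sgtm-project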

Configure server-side Google Tag Manager in App Engine (SGTM)

If you have SGTM running in Google Cloud App Engine, you can use this command to reconfigure it; it will guide you through the update. The same command also works for the initial installation.

bash -c "$(curl -fsSL https://googletagmanager.com/static/serverjs/setup.sh)"
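
After the script finishes, you can verify that the App Engine services are running (assuming the active gcloud project is the one hosting SGTM):

gcloud app services list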

Export BigQuery Table to Cloud Storage

Export a complete table from BigQuery to Cloud Storage. For an export size above 1 GB, a wildcard filename must be used. The format here is newline-delimited JSON with GZIP compression; this can be changed according to the documentation.

bq extract --location=eu \
--destination_format NEWLINE_DELIMITED_JSON \
--compression GZIP \
<project>:<dataset>.<table> \
"gs://<bucket>/<filename>_*.gzip"

Read the full documentation here: https://cloud.google.com/bigquery/docs/exporting-data#bq

Export from SQL as AVRO

A query result can also be exported directly to Cloud Storage. Here, the result is exported as AVRO immediately after the SQL query runs.

EXPORT DATA
  OPTIONS(
    uri='gs://<bucket>/<filename>_*.gzip',
    format='AVRO')
  AS SELECT * FROM `<project>.<dataset>.<table>`
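
The documentation also lists optional settings such as compression and overwrite. A variant that writes Snappy-compressed AVRO files and overwrites a previous export could look like this (the .avro filename pattern is only a suggestion):

EXPORT DATA
  OPTIONS(
    uri='gs://<bucket>/<filename>_*.avro',
    format='AVRO',
    compression='SNAPPY',
    overwrite=true)
  AS SELECT * FROM `<project>.<dataset>.<table>`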

Read the full documentation here: https://cloud.google.com/bigquery/docs/exporting-data#sql

Download or upload files between Cloud Storage and your local machine

Download or upload a single file or a whole folder from or to Google Cloud Storage. The syntax is gsutil cp <source> <destination>. The -r flag copies a whole folder recursively.

gsutil cp -r gs://some_bucket/folder /Users/currentUser/Downloads
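
The upload direction works the same way, just with source and destination swapped:

gsutil cp -r /Users/currentUser/Downloads/folder gs://some_bucket/folder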

You can also move data between Amazon AWS S3 and Google Cloud Storage. For that, you need to provide your AWS S3 access keys in your .boto file (see the snippet after the command below). You can then use an S3 path as source or destination, like this:

gsutil -m cp -r gs://sourceBucket/folder s3://destinationBucket/folder
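
The relevant part of the .boto file looks roughly like this, with the values replaced by your own AWS credentials:

[Credentials]
aws_access_key_id = <your_access_key_id>
aws_secret_access_key = <your_secret_access_key>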

The files will be temporarily routed through your local machine, even with -m enabling parallel transfers. Keep this in mind when moving large amounts of data.

Read the full documentation here: https://cloud.google.com/storage/docs/gsutil/commands/cp
