Consuming a data transfer in GCS
Updated on 04 Jul 2024
This article describes how to consume a Bobsled data transfer in Google Cloud Storage, that is, how to access the data delivered to the Bobsled-managed destination.
It walks through three simple examples of accessing a data transfer from your computer's command-line interface, which is ideal for viewing, copying, and syncing the data transfer from the Bobsled-managed bucket to your own bucket. However, a data transfer can be accessed and consumed in production from any GCP client using the account that has been granted access to the share.
Prerequisites
Before consuming a data transfer, the transfer must be delivered to the destination and access must be configured in Bobsled for the identity that will consume the data. To learn how to configure access to the destination, please visit Configure a Google Cloud Storage Destination.
If you are accessing the data via the GCP command-line tools, you must install the CLI. Visit Installing the GCP CLI for more information.
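As an optional sanity check, you can confirm the tools are installed and available on your PATH before continuing:
gcloud --version
gsutil version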
Consuming a data transfer
From the Shares list page, click on the share that you would like to access.
Once a data transfer in the share has completed, select the Access Data button.

Option 1: Accessing Data via Web Console
You can easily access the data using the web console link to view and download it. To do so, please be sure to log in to the Google Cloud console with the account that has been configured to access the Bobsled share.
Select the Web console tab in the access dialog.
Click the link icon to view the data in the GCP Web Console.
Option 2: Accessing Data via Command line
Using the Google Cloud Storage command-line tools (gsutil or gcloud), you can list, copy, and sync the contents of the data transfer in Google Cloud Storage. To use the following commands, you will need to copy the Cloud Storage URI shown in the Access Data dialog.
Log in to the GCP command line
Run the login command:
gcloud auth login
If you would like to access the data as a service account, run the command:
gcloud auth activate-service-account [service-account-email] --key-file=[path-to-private-key-file]
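For example, a service-account login might look like the following; the service-account email and key-file path shown here are placeholders, not values provisioned by Bobsled. You can then confirm which account is active:
gcloud auth activate-service-account data-consumer@my-project.iam.gserviceaccount.com --key-file=/path/to/key.json
gcloud auth list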
Access via the gcloud command-line tool
NOTE:
To use the following commands, you will need to copy the Cloud Storage bucket URI shown in the Access Data dialog.
List the contents. To list the data in the bucket, use the command gcloud storage ls ↗ (see the example after the parameter list).
gcloud storage ls -r <storage-bucket-URI>
Parameters used:
-r (recursive) lists all objects in the bucket
Optional parameters to use with the list command:
-l (long) shows additional information about each object (size, creation time, etc.)
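For example, assuming the Cloud Storage URI copied from the Access Data dialog were gs://bobsled-share-example (a placeholder name), a recursive long listing would look like:
gcloud storage ls -r -l gs://bobsled-share-example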
Copy the contents to your own bucket. To copy the data in the bucket, use the command gcloud storage cp ↗ (see the example after the parameter list).
gcloud storage cp -r <storage-bucket-URI> <your bucket/path>
Parameters used:
-r (recursive) copies the entire directory tree
Optional parameters to use with the copy command:
-n (no-clobber) prevents overwriting existing files at the destination.
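For example, to copy everything from the shared bucket into your own bucket without overwriting files that already exist there (gs://bobsled-share-example and gs://my-company-bucket/bobsled-data are placeholder names):
gcloud storage cp -r -n gs://bobsled-share-example gs://my-company-bucket/bobsled-data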
Sync the contents. Use gsutil, described below, to sync the contents of the data transfer to your bucket.
Access via the gsutil command-line tool
List the contents. To list the data in the bucket, use the command gsutil ls ↗ (see the example after the parameter list).
gsutil ls -r <storage-bucket-URI>
Parameters used:
-r (recursive) lists all objects in the bucket
Optional parameters to use with the list command:
-l (long) shows additional information about each object (size, creation time, etc.)
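For example, using the same placeholder URI as above:
gsutil ls -r -l gs://bobsled-share-example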
Copy the contents to your own bucket. To copy the data in the bucket, use the command gsutil cp ↗ (see the example after the parameter list).
gsutil cp -r <storage-bucket-URI> <your bucket/path>
Parameters used:
-r (recursive) copies the entire directory tree
Optional parameters to use with the copy command:
-n (no-clobber) prevents overwriting existing files at the destination.
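For example, again with placeholder bucket names:
gsutil cp -r -n gs://bobsled-share-example gs://my-company-bucket/bobsled-data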
Sync the contents. Use rsync if you would like to copy only files that are new or updated. Learn more about the command rsync ↗ (see the example after the parameter list).
gsutil rsync -r <storage-bucket-URI> <your bucket/path>
Parameters used:
-r (recursive) syncs the entire directory tree.
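For example, to mirror the share into a prefix of your own bucket (placeholder names as above):
gsutil rsync -r gs://bobsled-share-example gs://my-company-bucket/bobsled-data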