@@ -49,7 +49,7 @@ The [Config Sync Operator](https://cloud.google.com/kubernetes-engine/docs/add-o
cd ~/$GROUP_NAME/acm
mkdir setup
cd setup/
- gsutil cp gs://config-management-release/released/latest/config-management-operator.yaml config-management-operator.yaml
+ gcloud storage cp gs://config-management-release/released/latest/config-management-operator.yaml config-management-operator.yaml

for i in "dev" "prod"; do
gcloud container clusters get-credentials ${i} --zone=$ZONE
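
The substitutions throughout this PR follow a small set of mechanical `gsutil` → `gcloud storage` mappings. A reference sketch — flag spellings assume the current `gcloud storage` CLI, so verify against your SDK version:

```bash
# gsutil -> gcloud storage equivalents used in this PR.
# gsutil's -m (parallel transfer) flag has no gcloud counterpart:
# gcloud storage parallelizes transfers by default.
gsutil cp SRC DST              # gcloud storage cp SRC DST
gsutil -m cp -r SRC DST        # gcloud storage cp --recursive SRC DST
gsutil ls PATH                 # gcloud storage ls PATH
gsutil cat PATH                # gcloud storage cat PATH
gsutil mv SRC DST              # gcloud storage mv SRC DST
gsutil rm -r PATH              # gcloud storage rm --recursive PATH
gsutil mb gs://BUCKET          # gcloud storage buckets create gs://BUCKET
gsutil rsync -r -c -x RE S D   # gcloud storage rsync --recursive --checksums-only --exclude=RE S D
```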
10 changes: 5 additions & 5 deletions examples/cloud-composer-cicd/cloudbuild.yaml
@@ -65,15 +65,15 @@ steps:
find /workspace/airflow/logs/ -type f -name *.log | sort | xargs cat
# Fail if the backfill failed.
exit $ret
-  - name: gcr.io/cloud-builders/gsutil
+  - name: gcr.io/cloud-builders/gcloud
# Deploy the DAGs to your composer environment DAGs GCS folder
id: Deploy DAGs
args:
-  - -m
+  - storage
   - rsync
-  - -r
-  - -c
-  - -x
+  - --recursive
+  - --checksums-only
+  - --exclude
- .*\.pyc|airflow_monitoring.py
- /workspace/${_DIRECTORY}/dags/
- ${_DEPLOY_DAGS_LOCATION}
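
For reference, the Cloud Build args above assemble into roughly the following command; `${_DIRECTORY}` and `${_DEPLOY_DAGS_LOCATION}` are trigger substitutions, and `--exclude` takes a regex just as gsutil's `-x` did:

```bash
# Sketch of the assembled command; gsutil's -m is simply dropped because
# gcloud storage runs transfers in parallel by default.
gcloud storage rsync --recursive --checksums-only \
  --exclude='.*\.pyc|airflow_monitoring.py' \
  /workspace/${_DIRECTORY}/dags/ ${_DEPLOY_DAGS_LOCATION}
```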
@@ -98,6 +98,6 @@ The following high-level steps describe the setup needed to run this example:
The workflow is automatically triggered by a Cloud Function that is invoked when a new file is uploaded into the *input-gcs-bucket*.
For this example workflow, the [usa_names.csv](resources/usa_names.csv) file can be uploaded into the *input-gcs-bucket*

- `gsutil cp resources/usa_names.csv gs://` **_input-gcs-bucket_**
+ `gcloud storage cp resources/usa_names.csv gs://` **_input-gcs-bucket_**

***
@@ -84,7 +84,7 @@ gcloud config set project $PROJECT
```
1. Create a Cloud Storage (GCS) bucket for receiving input files (*input-gcs-bucket*).
```bash
- gsutil mb -c regional -l us-central1 gs://$PROJECT
+ gcloud storage buckets create --default-storage-class=regional --location=us-central1 gs://$PROJECT
```
2. Export the public BigQuery Table to a new dataset.
```bash
@@ -131,13 +131,13 @@ gcloud composer environments run demo-ephemeral-dataproc \

7. Upload the PySpark code [spark_avg_speed.py](composer_http_examples/spark_avg_speed.py) into a *spark-jobs* folder in GCS.
```bash
- gsutil cp ~/professional-services/examples/cloud-composer-examples/composer_http_post_example/spark_avg_speed.py gs://$PROJECT/spark-jobs/
+ gcloud storage cp ~/professional-services/examples/cloud-composer-examples/composer_http_post_example/spark_avg_speed.py gs://$PROJECT/spark-jobs/
```

8. The DAG folder is essentially a Cloud Storage bucket. Upload the [ephemeral_dataproc_spark_dag.py](composer_http_examples/ephemeral_dataproc_spark_dag.py) file into the folder:

```bash
- gsutil cp ~/professional-services/examples/cloud-composer-examples/composer_http_post_example/ephemeral_dataproc_spark_dag.py gs://<dag-folder>/dags
+ gcloud storage cp ~/professional-services/examples/cloud-composer-examples/composer_http_post_example/ephemeral_dataproc_spark_dag.py gs://<dag-folder>/dags
```
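
If the `<dag-folder>` value is not known, it can be looked up from the Composer environment with the same query this PR uses in tools/airpiler/README.md; the environment name and location below are placeholders:

```bash
gcloud composer environments describe <YOUR_ENV> --location us-central1 \
  --format="get(config.dagGcsPrefix)"
```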
***

@@ -126,14 +126,14 @@
# Delete gcs files in the timestamped transformed folder.
delete_transformed_files = BashOperator(
task_id='delete_transformed_files',
bash_command="gsutil -m rm -r gs://{{ var.value.gcs_bucket }}" +
bash_command="gcloud storage rm --recursive gs://{{ var.value.gcs_bucket }}" +
"/{{ dag_run.conf['transformed_path'] }}/")

# If the spark job or BQ Load fails we rename the timestamped raw path to
# a timestamped failed path.
move_failed_files = BashOperator(
task_id='move_failed_files',
bash_command="gsutil mv gs://{{ var.value.gcs_bucket }}" +
bash_command="gcloud storage mv gs://{{ var.value.gcs_bucket }}" +
"/{{ dag_run.conf['raw_path'] }}/ " +
"gs://{{ var.value.gcs_bucket}}" +
"/{{ dag_run.conf['failed_path'] }}/",
2 changes: 1 addition & 1 deletion examples/cloudml-bee-health-detection/README.md
@@ -9,7 +9,7 @@ The code leverages pre-trained TF Hub image modules and uses Google Cloud Machin
JOB_NAME = ml_job$(date +%Y%m%d%H%M%S)
JOB_FOLDER = MLEngine/${JOB_NAME}
BUCKET_NAME = bee-health
- MODEL_PATH = $(gsutil ls gs://${BUCKET_NAME}/${JOB_FOLDER}/export/estimator/ | tail -1)
+ MODEL_PATH = $(gcloud storage ls gs://${BUCKET_NAME}/${JOB_FOLDER}/export/estimator/ | tail -1)
MODEL_NAME = prediction_model
MODEL_VERSION = version_1
TEST_DATA = data/test.csv
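
The `| tail -1` idiom works because exports land in epoch-timestamped subdirectories and `gcloud storage ls` returns them in lexicographic (here, chronological) order, so the last entry is the newest export. A sketch with a hypothetical job name:

```bash
gcloud storage ls gs://bee-health/MLEngine/ml_job20230101120000/export/estimator/
# gs://bee-health/MLEngine/ml_job20230101120000/export/estimator/1672574400/
# gs://bee-health/MLEngine/ml_job20230101120000/export/estimator/1672578000/   <- picked by `tail -1`
```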
2 changes: 1 addition & 1 deletion examples/cloudml-churn-prediction/README.md
@@ -152,7 +152,7 @@ The SavedModel was saved in a timestamped subdirectory of model_dir.
```shell
MODEL_NAME="survival_model"
VERSION_NAME="demo_version"
- SAVED_MODEL_DIR=$(gsutil ls $MODEL_DIR/export/export | tail -1)
+ SAVED_MODEL_DIR=$(gcloud storage ls $MODEL_DIR/export/export | tail -1)

gcloud ai-platform models create $MODEL_NAME \
--regions us-east1
@@ -34,7 +34,7 @@ PROJECT_ID="$(get_project_id)"
VERSION_NAME="v${MODEL_OUTPUTS_DIR}_${TRIAL}"
INPUT_BUCKET="gs://${PROJECT_ID}-bucket"
MODEL_OUTPUTS_PATH="${INPUT_BUCKET}/${USER}/${MODEL_DIR}/${MODEL_OUTPUTS_DIR}/${TRIAL}/export/export"
MODEL_PATH="$(gsutil ls ${MODEL_OUTPUTS_PATH} | tail -n1)"
MODEL_PATH="$(gcloud storage ls ${MODEL_OUTPUTS_PATH} | tail -n1)"

gcloud ai-platform models create "${MODEL_NAME}" \
--regions us-east1 \
2 changes: 1 addition & 1 deletion examples/cloudml-energy-price-forecasting/README.md
@@ -30,7 +30,7 @@ The code takes in raw data from BigQuery, transforms and prepares the data, uses
JOB_NAME = ml_job$(date +%Y%m%d%H%M%S)
JOB_FOLDER = MLEngine/${JOB_NAME}
BUCKET_NAME = energyforecast
- MODEL_PATH = $(gsutil ls gs://${BUCKET_NAME}/${JOB_FOLDER}/export/estimator/ | tail -1)
+ MODEL_PATH = $(gcloud storage ls gs://${BUCKET_NAME}/${JOB_FOLDER}/export/estimator/ | tail -1)
MODEL_NAME = forecaster_model
MODEL_VERSION = version_1
TEST_DATA = data/csv/DataTest.csv
6 changes: 3 additions & 3 deletions examples/cloudml-fraud-detection/README.MD
@@ -131,7 +131,7 @@ Different versions of the same model can be stored in the ML-engine. The ML-engine
MODEL_NAME=fraud_detection
MODEL_VERSION=v_$(date +"%Y%m%d_%H%M%S")
TRIAL_NUMBER=1
- MODEL_SAVED_NAME=$(gsutil ls ${TRAINING_OUTPUT_DIR}/trials/${TRIAL_NUMBER}/export/exporter/ | tail -1)
+ MODEL_SAVED_NAME=$(gcloud storage ls ${TRAINING_OUTPUT_DIR}/trials/${TRIAL_NUMBER}/export/exporter/ | tail -1)
gcloud ml-engine models create $MODEL_NAME \
--regions us-central1
gcloud ml-engine versions create $MODEL_VERSION \
@@ -162,11 +162,11 @@ Assess the model's performance on out-of-sample data. Compute precision-recall curv
```
ANALYSIS_OUTPUT_PATH=.
mkdir ${ANALYSIS_OUTPUT_PATH}/labels
- gsutil cp gs://${BUCKET_ID}/${DATAFLOW_OUTPUT_DIR}split_data/split_data_TEST_labels.txt* labels/
+ gcloud storage cp gs://${BUCKET_ID}/${DATAFLOW_OUTPUT_DIR}split_data/split_data_TEST_labels.txt* labels/
cat ${ANALYSIS_OUTPUT_PATH}/labels/* > ${ANALYSIS_OUTPUT_PATH}/labels.txt

mkdir ${ANALYSIS_OUTPUT_PATH}/predictions
- gsutil cp ${PREDICTIONS_OUTPUT_PATH}/prediction.results* ./predictions/
+ gcloud storage cp ${PREDICTIONS_OUTPUT_PATH}/prediction.results* ./predictions/
cat ${ANALYSIS_OUTPUT_PATH}/predictions/* > ${ANALYSIS_OUTPUT_PATH}/predictions.txt
```

8 changes: 4 additions & 4 deletions examples/cloudml-sentiment-analysis/README.md
@@ -49,7 +49,7 @@ gcloud config set project $PROJECT_ID
### Move data to GCP.

```sh
- gsutil -m cp -r $DATA_PATH/aclImdb $BUCKET_PATH
+ gcloud storage cp --recursive $DATA_PATH/aclImdb $BUCKET_PATH
GCP_INPUT_DATA=$BUCKET_PATH/aclImdb/train
```

@@ -121,12 +121,12 @@ tensorboard --logdir=$TRAINING_OUTPUT_DIR
**With HP tuning:**
```sh
TRIAL_NUMBER=''
- MODEL_SAVED_NAME=$(gsutil ls ${TRAINING_OUTPUT_DIR}/${TRIAL_NUMBER}/export/exporter/ | tail -1)
+ MODEL_SAVED_NAME=$(gcloud storage ls ${TRAINING_OUTPUT_DIR}/${TRIAL_NUMBER}/export/exporter/ | tail -1)
```

**Without HP tuning:**
```sh
- MODEL_SAVED_NAME=$(gsutil ls ${TRAINING_OUTPUT_DIR}/export/exporter/ | tail -1)
+ MODEL_SAVED_NAME=$(gcloud storage ls ${TRAINING_OUTPUT_DIR}/export/exporter/ | tail -1)
```

```sh
@@ -159,7 +159,7 @@ gcloud ml-engine predict \

```sh
PREDICTION_DATA_PATH=${BUCKET_PATH}/prediction_data
- gsutil -m cp -r ${DATA_PATH}/aclImdb/test/ $PREDICTION_DATA_PATH
+ gcloud storage cp --recursive ${DATA_PATH}/aclImdb/test/ $PREDICTION_DATA_PATH
```

### Make batch predictions with GCP.
2 changes: 1 addition & 1 deletion tools/agile-machine-learning-api/update.sh
@@ -22,6 +22,6 @@ TRAINER_PACKAGE='trainer-0.0.tar.gz'
cd codes/
python setup.py sdist
export GOOGLE_APPLICATION_CREDENTIALS=$service_account_json_key
- gsutil cp -r dist/$TRAINER_PACKAGE $bucket_name/$TRAINER_PACKAGE
+ gcloud storage cp --recursive dist/$TRAINER_PACKAGE $bucket_name/$TRAINER_PACKAGE

echo "INFO: Please make sure that train.yaml and config.yaml have same name for trainer file"
9 changes: 4 additions & 5 deletions tools/airpiler/README.md
@@ -128,7 +128,7 @@ example_dag:
Now let's copy that to the **data** folder in our GCS bucket:

```bash
- > gsutil cp dag-factory.yaml gs://us-central1-test-692672b8-bucket/data
+ > gcloud storage cp dag-factory.yaml gs://us-central1-test-692672b8-bucket/data
Copying file://dag-factory.yaml [Content-Type=application/octet-stream]...
/ [1 files][ 386.0 B/ 386.0 B]
Operation completed over 1 objects/386.0 B.
@@ -191,7 +191,7 @@ Run the following to get your GCS Bucket
gcloud composer environments describe <YOUR_ENV> --location us-central1 --format="get(config.dagGcsPrefix)"

Run the following to upload the dag-factory yaml file to the bucket:
- gsutil cp use-case2.yaml gs://<YOUR_ENV>/data
+ gcloud storage cp use-case2.yaml gs://<YOUR_ENV>/data

Then run the following to upload the airflow dag python script to your composer environment:
gcloud composer environments storage dags import --environment <YOUR_ENV> --location us-central1 --source use-case2-dag.py
@@ -205,7 +205,7 @@ Then visit the URL and trigger your DAG
Then following the instructions we can run the following to upload the files:

```bash
- gsutil cp use-case2.yaml gs://us-central1-test-692672b8-bucket/data
+ gcloud storage cp use-case2.yaml gs://us-central1-test-692672b8-bucket/data
gcloud composer environments storage dags import --environment test --location us-central1 --source use-case2-dag.py
```

@@ -269,7 +269,7 @@ digraph USE_CASE2_TG_DAG {
All the logs are written to the GCS bucket and you can check them out by putting all the above information together ([Log folder directory structure](https://cloud.google.com/composer/docs/concepts/logs#log_folder_directory_structure) describes the format):

```bash
- > gsutil cat gs://us-central1-test-692672b8-bucket/logs/example_dag/task_3/2021-05-12T15:19:58+00:00/1.log
+ > gcloud storage cat gs://us-central1-test-692672b8-bucket/logs/example_dag/task_3/2021-05-12T15:19:58+00:00/1.log
[2021-05-12 15:21:01,602] {taskinstance.py:671} INFO - Dependencies all met for <TaskInstance: example_dag.task_3 2021-05-12T15:19:58+00:00 [queued]>@-@{"workflow": "example_dag", "task-id": "task_3", "execution-date": "2021-05-12T15:19:58+00:00"}
[2021-05-12 15:21:01,733] {taskinstance.py:671} INFO - Dependencies all met for <TaskInstance: example_dag.task_3 2021-05-12T15:19:58+00:00 [queued]>@-@{"workflow": "example_dag", "task-id": "task_3", "execution-date": "2021-05-12T15:19:58+00:00"}
[2021-05-12 15:21:01,734] {taskinstance.py:881} INFO -
@@ -311,4 +311,3 @@ https://tddbc3f0ad77184ffp-tp.appspot.com
```

Upon visiting the above page and authenticating using IAP, you will see a list of the available DAGs and can check out the logs as well.

6 changes: 3 additions & 3 deletions tools/airpiler/airpiler.py
@@ -315,8 +315,8 @@ def parse_jil(input_file):
gcloud_gcs_command = (
f"gcloud composer environments describe {ENV_TEMPL}"
f" --location us-central1 --format=\"get(config.dagGcsPrefix)\"")
-     gsutil_cp_command = (
-         f"gsutil cp {dag_factory_yaml_file} gs://{ENV_TEMPL}/data")
+     gcloud_storage_cp_command = (
+         f"gcloud storage cp {dag_factory_yaml_file} gs://{ENV_TEMPL}/data")

gcloud_upload_command = (
f"gcloud composer environments storage dags import --environment"
@@ -327,7 +327,7 @@
f"Run the following to get your GCS Bucket \n"
f"{gcloud_gcs_command}\n\n"
f"Run the following to upload the dag-factory yaml file to the "
f"bucket:\n{gsutil_cp_command}\n\n"
f"bucket:\n{gcloud_storage_cp_command}\n\n"
f"Then run the following to upload the airflow dag python"
f" script to your composer environment: \n"
f"{gcloud_upload_command}\n\n"
@@ -34,7 +34,7 @@

- name: Download bmctl
shell:
-     cmd: gsutil cp {{ bmctl_download_url }} .
+     cmd: gcloud storage cp {{ bmctl_download_url }} .

- name: Make bmctl executable
ansible.builtin.file:
2 changes: 1 addition & 1 deletion tools/anthosvmware-ansible-module/README.md
@@ -34,7 +34,7 @@ Consider these assumptions when you wonder how certain tasks are implemented or
## Prerequisites

- Ansible
- - Authenticate `gcloud` on jumphost with `gcloud auth login` so that Ansible can run the `gsutil` command on the jumphost
+ - Authenticate `gcloud` on jumphost with `gcloud auth login` so that Ansible can run the `gcloud storage` command on the jumphost
- vSphere: Create VM-Folder for Anthos VMs
- vSphere: Create folder on vSAN for Anthos Admin Cluster (if using vSAN).
Consider using value of Ansible variable `{{ ac_name }}` as the vSAN folder name to be consistent.
@@ -44,7 +44,8 @@
ansible.builtin.command: # noqa 204 301
chdir: "{{ yamldestpath }}"
argv:
-  - gsutil
+  - gcloud
+  - storage
- cp
- gs://gke-on-prem-release/gkeadm/{{ glb_anthos_version }}/linux/gkeadm
- "{{ yamldestpath }}/gkeadm-{{ glb_anthos_version }}"
@@ -63,7 +64,8 @@
ansible.builtin.command: # noqa 204 301
chdir: "{{ yamldestpath }}"
argv:
-  - gsutil
+  - gcloud
+  - storage
- cp
- gs://gke-on-prem-release/gkeadm/{{ glb_anthos_version }}/linux/gkeadm.1.sig
- "{{ yamldestpath }}/gkeadm-{{ glb_anthos_version }}.1.sig"
3 changes: 2 additions & 1 deletion tools/anthosvmware-ansible-module/roles/ais/tasks/upload.yml
@@ -32,7 +32,8 @@
- name: "[ais] Upload login config file"
ansible.builtin.command:
argv:
-  - gsutil
+  - gcloud
+  - storage
- cp
- "{{ [yamldestpath, ais_login_config_file] | join('/') }}"
- "{{ ais_gcsbucket }}/{{ uc_name if uc_name is defined else ac_name }}/{{ ais_login_config_file }}"
@@ -141,7 +141,8 @@
- name: "[upload_artifactory] Download files from GCS"
ansible.builtin.command: # noqa 305 301 no-changed-when
argv:
-  - gsutil
+  - gcloud
+  - storage
- cp
- "{{ item.item.src }}"
- "{{ workdir }}/{{ item.item.file }}"
4 changes: 2 additions & 2 deletions tools/anthosvmware-ansible-module/roles/usercluster/README.md
@@ -164,12 +164,12 @@ ASM Service Mesh version, revision, and network ID information.

The available `asmcli` versions can be listed with the command below:
```
- gsutil ls gs://csm-artifacts/asm/
+ gcloud storage ls gs://csm-artifacts/asm/
```

You can filter for a specific revision with `grep`. For example:
```
- gsutil ls gs://csm-artifacts/asm/ | grep 1.14
+ gcloud storage ls gs://csm-artifacts/asm/ | grep 1.14
```

> **Note:** `asm_network_id` is used for configuring a multi-cluster mesh. It *must be unique* for proper
2 changes: 1 addition & 1 deletion tools/asset-inventory/README.md
@@ -81,7 +81,7 @@ It's suggested to create a new project to hold the asset inventory resources. Es

```
export BUCKET=gs://${ORGANIZATION_ID}-assets
- gsutil mb $BUCKET
+ gcloud storage buckets create $BUCKET
```

1. Create the dataset to hold the resource tables in BigQuery.
4 changes: 2 additions & 2 deletions tools/asset-inventory/asset_inventory/export.py
@@ -139,8 +139,8 @@ def add_argparse_args(ap, required=False):
' the project that you enabled the API on, then you must also grant the'
' "service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com" account'
' objectAdmin privileges to the bucket:\n'
-         'gsutil iam ch serviceAccount:service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com:objectAdmin '
-         'gs://<bucket>\n'
+         'gcloud storage buckets add-iam-policy-binding gs://<bucket> '
+         '--member=serviceAccount:service-<project-id>@gcp-sa-cloudasset.iam.gserviceaccount.com --role=roles/storage.objectAdmin\n'
'\n\n')
ap.add_argument(
'--parent',
3 changes: 1 addition & 2 deletions tools/bigdata-generator/README.md
@@ -82,7 +82,7 @@ When running the program using Dataflow, the config file needs to be stored in G
Upload the config file to GCS, for example:
```
CONFIG_FILE_PATH=gs://${TMP_BUCKET}/config.json
- gsutil cp config_file_samples/sales_sample_bigquery.json $CONFIG_FILE_PATH
+ gcloud storage cp config_file_samples/sales_sample_bigquery.json $CONFIG_FILE_PATH
```

Submitting the Dataflow job
@@ -107,4 +107,3 @@ This project was developed using a GCP sandbox that has policies that make the c
Given these restrictions, a custom Dataflow container is being used (defined by the [Dockerfile](Dockerfile)) that installs the dependencies. The Dataflow job is submitted to run inside a VPC with no public IP address.

Feel free to run the data generator process as best fits your needs.

2 changes: 1 addition & 1 deletion tools/bigquery-s3tobq/README.md
@@ -81,7 +81,7 @@ $gcloud projects add-iam-policy-binding [PROJECT_ID] \
--member=serviceAccount:service-[PROJECT_NUMBER]@gcp-sa-pubsub.iam.gserviceaccount.com \
--role=roles/iam.serviceAccountTokenCreator

SERVICE_ACCOUNT="$(gsutil kms serviceaccount -p [PROJECT_ID])"
SERVICE_ACCOUNT="$(gcloud storage service-agent --project [PROJECT_ID])"

$gcloud projects add-iam-policy-binding [PROJECT_ID] \
--member="serviceAccount:${SERVICE_ACCOUNT}" \
2 changes: 1 addition & 1 deletion tools/cloud-composer-backup-restore/composer_br/app.py
@@ -60,7 +60,7 @@ def _upload_blob(bucket_name: str, source_file_name: str,


def _copy_gcs_folder(bucket_from: str, bucket_to: str) -> None:
-     command_utils.sh(['gsutil', '-m', 'rsync', '-r', bucket_from, bucket_to])
+     command_utils.sh(['gcloud', 'storage', 'rsync', '--recursive', bucket_from, bucket_to])


def _check_cli_depdendencies() -> None:
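
For reference, the argv list above corresponds to the following shell invocation — a minimal sketch with placeholder bucket paths; as elsewhere in this PR, gsutil's `-m` disappears because `gcloud storage` parallelizes transfers by default:

```bash
gcloud storage rsync --recursive gs://source-environment-bucket gs://backup-bucket
```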
@@ -78,7 +78,7 @@ def import_db(username: str, password: str, host: str, port: str, database: str,
"""
Extracts a SQL file from a GCS path and imports it into Postgres
"""
-     command_utils.sh(['gsutil', 'cp', gcs_sql_file_path, '/tmp/'])
+     command_utils.sh(['gcloud', 'storage', 'cp', gcs_sql_file_path, '/tmp/'])

split_path = gcs_sql_file_path.split('/')

4 changes: 2 additions & 2 deletions tools/cloud-composer-dag-validation/cloudbuild.yaml
@@ -24,11 +24,11 @@ steps:
dir: /workspace
- id: deploy-dags
dir: tools/cloud-composer-dag-validation
-    name: 'gcr.io/cloud-builders/gsutil'
+    name: 'gcr.io/cloud-builders/gcloud'
entrypoint: bash
args:
- '-c'
- |
-       gsutil -m cp -r 'dags/' '${_COMPOSER_BUCKET}/dags_export'
+       gcloud storage cp --recursive 'dags/' '${_COMPOSER_BUCKET}/dags_export'
options:
logging: CLOUD_LOGGING_ONLY