This exporter allows you to export logs, metrics, and traces to Google Cloud Storage. Telemetry is exported in OpenTelemetry Protocol JSON format.
- Introduced: v1.72.0
- Logs
- Metrics
- Traces
- The exporter marshals any telemetry sent to it into OTLP JSON format, applying compression if configured to do so.
- Exports telemetry to Google Cloud Storage. See Object Path for more information on the expected object path format.
- The exporter will create the bucket if it doesn't exist. Bucket names in GCS are globally unique, so if any project in any organization already owns a bucket with the given name, creation will fail and subsequent writes will likely return 403 Forbidden. You can manually create the bucket in GCS ahead of time to ensure the name is not taken.
- You can authenticate to Google Cloud using the provided credentials, credentials_file, or Application Default Credentials.
- Your authentication credentials must have the Storage Admin role in order to create buckets, folders, and objects.
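Because a name collision only surfaces at runtime as a failed creation or a 403, it can help to sanity-check a bucket name locally against GCS's published naming rules before deploying. The sketch below is illustrative only: it covers the common rules (length, character set, reserved "goog" prefix) and cannot verify global uniqueness, which only GCS itself can do.

```python
import re

def is_plausible_bucket_name(name: str) -> bool:
    """Check a name against the common GCS bucket naming rules.

    Covers the basic rules only: 3-63 characters; lowercase letters,
    digits, dashes, underscores, and dots; must start and end with a
    letter or digit; must not begin with the reserved "goog" prefix.
    Does NOT verify global uniqueness -- only GCS can do that.
    """
    if not 3 <= len(name) <= 63:
        return False
    if name.startswith("goog"):
        return False
    return re.fullmatch(r"[a-z0-9][a-z0-9._-]*[a-z0-9]", name) is not None
```

A passing check does not guarantee the bucket can be created, but a failing check guarantees it cannot.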
| Field | Type | Default | Required | Description |
|---|---|---|---|---|
| bucket_name | string | | true | The name of the bucket to store objects in. |
| project_id | string | | false | The ID of the Google Cloud project the bucket belongs to. The exporter will read this value from the credentials if it is not configured. |
| bucket_location | string | | false | The location of the bucket. Uses the GCS default location if not set. Can only be set during bucket creation. |
| bucket_storage_class | string | | false | The storage class of the bucket. Uses the GCS default storage class if not set. Can only be set during bucket creation. |
| folder_name | string | | false | An optional folder to put the objects in. |
| object_prefix | string | | false | An optional prefix to prepend to the object file name. |
| credentials | string | | false | Optional credentials to provide authentication to Google Cloud. Mutually exclusive with credentials_file. |
| credentials_file | string | | false | Optional file path to credentials to provide authentication to Google Cloud. Mutually exclusive with credentials. |
| partition | string | minute | false | Time granularity of the object name. Valid values are hour or minute. |
| compression | string | none | false | The type of compression applied to the data before sending it to storage. Valid values are none and gzip. |
| timeout | string | | false | See doc for details. |
| sending_queue | map | | false | See doc for details. |
| retry_on_failure | map | | false | See doc for details. |
Object paths will be in the form:
{folder_name}/year=XXXX/month=XX/day=XX/hour=XX/minute=XX
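The path layout above can be sketched as a small helper. This is a hypothetical illustration of the documented layout, not the exporter's actual code; the function name, parameters, and the fixed random_id are assumptions for the example.

```python
from datetime import datetime

def build_object_path(telemetry_type: str, ts: datetime,
                      partition: str = "minute", folder_name: str = "",
                      object_prefix: str = "", random_id: str = "abcd1234",
                      compression: str = "none") -> str:
    """Sketch of the documented object path layout (illustrative only)."""
    # Time-based portion of the path; the minute segment is present
    # only when partitioning by minute.
    parts = [f"year={ts.year:04d}", f"month={ts.month:02d}",
             f"day={ts.day:02d}", f"hour={ts.hour:02d}"]
    if partition == "minute":
        parts.append(f"minute={ts.minute:02d}")
    # gzip compression appends .gz to the .json extension.
    ext = ".json.gz" if compression == "gzip" else ".json"
    name = f"{object_prefix}{telemetry_type}_{random_id}{ext}"
    prefix = f"{folder_name}/" if folder_name else ""
    return prefix + "/".join(parts) + "/" + name
```

With the defaults, build_object_path("metrics", datetime(2021, 1, 1, 1, 0)) produces a path matching the example object names shown below.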
This configuration only specifies the bucket name. The exporter uses the default partition of minute and does not apply compression. It uses the Google Cloud default location and storage class for the bucket, adds no folder name or object prefix, and authenticates to Google Cloud using Application Default Credentials instead of credentials or credentials_file. The project ID is read from the credentials.
```yaml
googlecloudstorage:
  bucket_name: "my-bucket-name"
```
Example Object Names:
year=2021/month=01/day=01/hour=01/minute=00/metrics_{random_id}.json
year=2021/month=01/day=01/hour=01/minute=00/logs_{random_id}.json
year=2021/month=01/day=01/hour=01/minute=00/traces_{random_id}.json
This configuration specifies a partition of hour, so the minute=XX portion of the object path is omitted. It also enables gzip compression, reducing the object size significantly.
```yaml
googlecloudstorage:
  bucket_name: "my-bucket-name"
  partition: "hour"
  compression: "gzip"
```
Example Object Names:
year=2021/month=01/day=01/hour=01/metrics_{random_id}.json.gz
year=2021/month=01/day=01/hour=01/logs_{random_id}.json.gz
year=2021/month=01/day=01/hour=01/traces_{random_id}.json.gz
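Once exported, a gzip-compressed object can be decompressed and parsed with standard tooling. A minimal sketch using only the Python standard library; the sample payload is a stand-in, not a complete OTLP JSON document:

```python
import gzip
import json

# Stand-in for bytes downloaded from GCS (e.g. via the
# google-cloud-storage client). Real payloads are OTLP JSON.
sample = json.dumps({"resourceLogs": []}).encode("utf-8")
compressed = gzip.compress(sample)

# Decompress and parse, as you would for a logs_{random_id}.json.gz object.
payload = json.loads(gzip.decompress(compressed))
print(payload)
```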
This configuration shows all fields filled out.
```yaml
googlecloudstorage:
  project_id: "my-project-id-18352"
  bucket_name: "my-bucket-name"
  bucket_location: "US-EAST1"
  bucket_storage_class: "NEARLINE"
  credentials_file: "/path/to/googlecloud/credentials/file"
  folder_name: "my-folder-name"
  object_prefix: "object-prefix_"
  partition: "hour"
  compression: "gzip"
```
Example Object Names:
my-folder-name/year=2021/month=01/day=01/hour=01/object-prefix_metrics_{random_id}.json.gz
my-folder-name/year=2021/month=01/day=01/hour=01/object-prefix_logs_{random_id}.json.gz
my-folder-name/year=2021/month=01/day=01/hour=01/object-prefix_traces_{random_id}.json.gz