[yaml] Iceberg add files yaml transform and test #37938
Changes to the standard IO providers file, registering the new `IcebergAddFiles` transform:

```yaml
@@ -403,3 +403,27 @@
       'ReadFromIcebergCDC': 'beam:schematransform:org.apache.beam:iceberg_cdc_read:v1'
     config:
       gradle_target: 'sdks:java:io:expansion-service:shadowJar'

 # IcebergAddFiles
 - type: renaming
   transforms:
     'IcebergAddFiles': 'IcebergAddFiles'
   config:
     mappings:
       'IcebergAddFiles':
         table: 'table'
         catalog_properties: 'catalog_properties'
         config_properties: 'config_properties'
         triggering_frequency_seconds: 'triggering_frequency_seconds'
         append_batch_size: 'append_batch_size'
         location_prefix: 'location_prefix'
         partition_fields: 'partition_fields'
         table_properties: 'table_properties'
         error_handling: 'error_handling'
   underlying_provider:
     type: beamJar
     transforms:
       'IcebergAddFiles': 'beam:schematransform:iceberg_add_files:v1'
     config:
       gradle_target: 'sdks:java:io:expansion-service:shadowJar'
```

> **Contributor** (on lines +413 to +419): We added a few more properties to the provider: …
>
> **Author (Collaborator):** done, thanks
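A `renaming` provider is essentially a key-for-key remapping of the user-facing YAML config onto the underlying schema transform's parameters; here the mapping is an identity, but the mechanism also guards against unknown keys. A minimal plain-Python sketch of the idea (hypothetical helper, not Beam's actual provider code):

```python
def rename_config(user_config, mappings):
    # Remap user-facing config keys onto the underlying transform's
    # parameter names; reject unknown keys so typos surface early.
    # (Illustrative sketch -- Beam's real renaming provider differs.)
    renamed = {}
    for key, value in user_config.items():
        if key not in mappings:
            raise ValueError(f"unknown config key: {key}")
        renamed[mappings[key]] = value
    return renamed

# The IcebergAddFiles entry above is an identity mapping:
mappings = {"table": "table", "location_prefix": "location_prefix"}
print(rename_config({"table": "default.table"}, mappings))
# → {'table': 'default.table'}
```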
New test file (`@@ -0,0 +1,115 @@`):

```yaml
#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

fixtures:
  - name: TEMP_DIR
    type: "tempfile.TemporaryDirectory"
```
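The `TEMP_DIR` fixture gives each test run a scratch directory and substitutes its path wherever `{TEMP_DIR}` appears in the YAML. The substitution is roughly equivalent to this sketch (illustrative, not the test harness itself):

```python
import tempfile

with tempfile.TemporaryDirectory() as temp_dir:
    # The harness replaces "{TEMP_DIR}" with the directory's real path.
    path = "{TEMP_DIR}/data/data".format(TEMP_DIR=temp_dir)
    assert path == temp_dir + "/data/data"
# The directory, and everything written under it, is removed on exit.
```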
```yaml
pipelines:

  # Pipeline 1: Write a dummy Parquet file
  - pipeline:
      type: chain
      transforms:
        - type: Create
          config:
            elements:
              - {label: "11a", rank: 0, bool: true}
              - {label: "37a", rank: 1, bool: false}
              - {label: "389a", rank: 2, bool: false}
              - {label: "3821b", rank: 3, bool: true}
              - {label: "990c", rank: 4, bool: true}
              - {label: "1024d", rank: 5, bool: false}
        - type: WriteToParquet
          config:
            path: "{TEMP_DIR}/data/data"
            file_name_suffix: ".parquet"

  # Pipeline 2: Add our generated file to the Iceberg table
  - pipeline:
      type: chain
      transforms:
        - type: Create
          config:
            elements:
              # By default, Beam writes a sharded file like <prefix>-00000-of-00001
              - {file: "{TEMP_DIR}/data/data-00000-of-00001.parquet"}
        - type: IcebergAddFiles
          config:
            table: "default.table"
            location_prefix: "{TEMP_DIR}/data/"
            catalog_properties:
              type: "hadoop"
              warehouse: "{TEMP_DIR}/dir"
```

> **Contributor:** Let's also add a test that configures a DLQ (to GCS), since the AddFiles transform may push some of the unprocessed files to a DLQ.
>
> **Author (Collaborator):** done, thanks
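The file name hard-coded in Pipeline 2 comes from Beam's default shard template, `-SSSSS-of-NNNNN`, inserted between the path prefix and the file name suffix. Reconstructing it by hand:

```python
def sharded_name(prefix, shard, num_shards, suffix):
    # Beam's default shard name template: <prefix>-SSSSS-of-NNNNN<suffix>
    return f"{prefix}-{shard:05d}-of-{num_shards:05d}{suffix}"

print(sharded_name("data", 0, 1, ".parquet"))
# → data-00000-of-00001.parquet
```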
```yaml
  # Pipeline 3: Read from Iceberg and verify the contents
  - pipeline:
      type: chain
      transforms:
        - type: ReadFromIceberg
          config:
            table: "default.table"
            catalog_properties:
              type: "hadoop"
              warehouse: "{TEMP_DIR}/dir"
        - type: AssertEqual
          config:
            elements:
              - {label: "11a", rank: 0, bool: true}
              - {label: "37a", rank: 1, bool: false}
              - {label: "389a", rank: 2, bool: false}
              - {label: "3821b", rank: 3, bool: true}
              - {label: "990c", rank: 4, bool: true}
              - {label: "1024d", rank: 5, bool: false}
```
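A check like `AssertEqual` has to be insensitive to element order, since a distributed read gives no ordering guarantee. The equivalent multiset comparison in plain Python (hypothetical helper name, not Beam's implementation):

```python
def assert_equal_any_order(actual, expected):
    # Compare as multisets: canonicalize each row into a sorted tuple
    # of items, then sort the rows, so element order never matters.
    canon = lambda rows: sorted(tuple(sorted(r.items())) for r in rows)
    assert canon(actual) == canon(expected)

rows_read = [{"label": "37a", "rank": 1}, {"label": "11a", "rank": 0}]
assert_equal_any_order(
    rows_read, [{"label": "11a", "rank": 0}, {"label": "37a", "rank": 1}])
```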
```yaml
  # Pipeline 4: Add an invalid file to trigger the DLQ
  - pipeline:
      type: composite
      transforms:
        - type: Create
          config:
            elements:
              - {file: "gs://dummy-bucket/does-not-exist.txt"}
        - type: IcebergAddFiles
          name: AddInvalidFile
          input: Create
          config:
            table: "default.table"
            location_prefix: "gs://dummy-bucket/"
            catalog_properties:
              type: "hadoop"
              warehouse: "{TEMP_DIR}/dir"
            error_handling:
              output: error_output
        - type: WriteToJson
          name: WriteErrorsToJson
          input: AddInvalidFile.error_output
          config:
            path: "{TEMP_DIR}/error.json"
```

> **Contributor:** Thanks, this test made me realize I forgot to handle FileNotFound errors! Your test will still pass though, because …
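The `error_handling` block is an instance of the dead-letter-queue pattern: per-element failures are routed to a named error output rather than failing the whole pipeline. A plain-Python sketch of the routing idea (illustrative logic, not Beam's actual code; the error message mirrors the one Pipeline 5 asserts on):

```python
def add_files_with_dlq(files):
    # Route per-element failures to an error output instead of
    # failing the pipeline; good elements continue downstream.
    good, errors = [], []
    for f in files:
        if f.endswith(".parquet"):
            good.append({"file": f})
        else:
            errors.append({"file": f,
                           "error": "Could not determine the file's format"})
    return good, errors

good, errors = add_files_with_dlq(["gs://dummy-bucket/does-not-exist.txt"])
print(errors)
# → [{'file': 'gs://dummy-bucket/does-not-exist.txt', 'error': "Could not determine the file's format"}]
```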
```yaml
  # Pipeline 5: Ensure errors were written
  - pipeline:
      type: chain
      transforms:
        - type: ReadFromJson
          config:
            path: "{TEMP_DIR}/error.json*"
        - type: AssertEqual
          config:
            elements:
              - {file: "gs://dummy-bucket/does-not-exist.txt", error: "Could not determine the file's format"}
```

> **Contributor** (on lines +104 to +110): Do we need to verify with an …
>
> **Author (Collaborator):** Good catch, added. Thanks!
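Pipeline 5 reads with the glob `{TEMP_DIR}/error.json*` because `WriteToJson`, like `WriteToParquet` above, may append a shard suffix to the configured path. A stand-alone check of what such a glob matches:

```python
import glob
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Simulate a sharded output file next to an unrelated file.
    pathlib.Path(tmp, "error.json-00000-of-00001").touch()
    pathlib.Path(tmp, "other.txt").touch()
    matches = sorted(glob.glob(f"{tmp}/error.json*"))
    names = [pathlib.Path(m).name for m in matches]
    assert names == ["error.json-00000-of-00001"]
```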
> **Reviewer:** I think we have missed `partition_fields` and `table_properties`, supported by AddFilesSchemaTransform.
>
> **Author:** done, thanks
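For reference, the two fields flagged in this thread (and present in the final provider mapping above) could be exercised with a config along these lines; the field names come from the diff, but the values here are purely illustrative:

```yaml
- type: IcebergAddFiles
  config:
    table: "default.table"
    catalog_properties:
      type: "hadoop"
      warehouse: "{TEMP_DIR}/dir"
    partition_fields: ["rank"]            # illustrative value
    table_properties:                     # illustrative value
      "write.format.default": "parquet"
```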