[Bug]: BigtableIO integration tests not actually testing #32071
Status: Open
Description
What happened?
BigtableReadIT and BigtableWriteIT are written incorrectly. Instead of using a TestPipeline, they first get the PipelineOptions from TestPipeline.testingPipelineOptions() in @Before setup(), then assemble a plain Pipeline and call Pipeline.run(). This has a couple of problems:
- It does not check whether the pipeline failed. In fact, testE2EBigtableSegmentRead has been consistently failing on the Direct Runner due to OOM. It was not until Add Lineage metrics for BigtableIO #32068 tried to assert Lineage metrics that it was discovered the pipeline had actually failed and did not report the metrics.
- On the Dataflow runner, pipeline.run() is non-blocking, so the test just submits a pipeline and ends without waiting for the result.
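Both problems above can be avoided by blocking on the pipeline result. A minimal sketch, assuming Beam's Java SDK and JUnit 4; the class and method names here are illustrative, not the actual IT code:

```java
import static org.junit.Assert.assertEquals;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.options.PipelineOptions;
import org.apache.beam.sdk.testing.TestPipeline;
import org.junit.Rule;
import org.junit.Test;

public class BigtableReadITSketch {

  // Option 1: use TestPipeline as a @Rule. Its run() blocks and throws
  // if the pipeline fails, so a failing job fails the test automatically.
  @Rule public final transient TestPipeline testPipeline = TestPipeline.create();

  @Test
  public void readWithTestPipeline() {
    // ... apply BigtableIO.read() and PAssert checks here ...
    testPipeline.run().waitUntilFinish();
  }

  // Option 2: if a plain Pipeline must be kept, block on the result and
  // assert the terminal state instead of fire-and-forget run().
  @Test
  public void readWithPlainPipeline() {
    PipelineOptions options = TestPipeline.testingPipelineOptions();
    Pipeline pipeline = Pipeline.create(options);
    // ... apply transforms ...
    PipelineResult result = pipeline.run();
    PipelineResult.State state = result.waitUntilFinish();
    assertEquals(PipelineResult.State.DONE, state);
  }
}
```

With waitUntilFinish(), the Dataflow submission no longer returns immediately, and an OOM or any other pipeline failure surfaces as a test failure instead of being silently dropped.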
Issue Priority
Priority: 2 (default / most bugs should be filed as P2)
Issue Components
- Component: Python SDK
- Component: Java SDK
- Component: Go SDK
- Component: Typescript SDK
- Component: IO connector
- Component: Beam YAML
- Component: Beam examples
- Component: Beam playground
- Component: Beam katas
- Component: Website
- Component: Infrastructure
- Component: Spark Runner
- Component: Flink Runner
- Component: Samza Runner
- Component: Twister2 Runner
- Component: Hazelcast Jet Runner
- Component: Google Cloud Dataflow Runner