[Bug]: BigtableIO integration tests not actually testing #32071

@Abacn

Description

What happened?

BigtableReadIT and BigtableWriteIT are written incorrectly. Instead of using a TestPipeline, each test first gets PipelineOptions from TestPipeline.testingPipelineOptions() in its @Before setup(), then assembles a plain Pipeline and calls Pipeline.run(). This has a couple of problems:

  • It does not check whether the pipeline failed. In fact, testE2EBigtableSegmentRead has been consistently failing on the Direct Runner due to OOM. It was not until Add Lineage metrics for BigtableIO #32068 tried to assert Lineage metrics that it was discovered the pipeline had actually failed and did not report the metrics.

  • On the Dataflow runner, pipeline.run() is non-blocking, so the test merely submits a pipeline and ends without waiting for, or checking, the result.
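To illustrate the second point, here is a minimal sketch of how such a test could block on completion and assert the final pipeline state. The class and method names are illustrative only, not Beam's actual test code; in practice the TestPipeline JUnit rule handles this automatically when used as intended.

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

public class BigtableReadITSketch {
  // Hypothetical helper: run the pipeline, block until it terminates,
  // and fail the test if it did not finish successfully.
  static void runAndVerify(Pipeline pipeline) {
    // run() is non-blocking on Dataflow; waitUntilFinish() blocks until
    // the job terminates and returns the terminal state.
    PipelineResult result = pipeline.run();
    PipelineResult.State state = result.waitUntilFinish();
    if (state != PipelineResult.State.DONE) {
      throw new AssertionError("Pipeline did not succeed; final state: " + state);
    }
  }
}
```

Using the TestPipeline rule (via TestPipeline.create()) instead of a hand-assembled Pipeline would also surface failures, since test runners built on it verify the run's outcome.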

Issue Priority

Priority: 2 (default / most bugs should be filed as P2)

Issue Components

  • Component: Python SDK
  • Component: Java SDK
  • Component: Go SDK
  • Component: Typescript SDK
  • Component: IO connector
  • Component: Beam YAML
  • Component: Beam examples
  • Component: Beam playground
  • Component: Beam katas
  • Component: Website
  • Component: Infrastructure
  • Component: Spark Runner
  • Component: Flink Runner
  • Component: Samza Runner
  • Component: Twister2 Runner
  • Component: Hazelcast Jet Runner
  • Component: Google Cloud Dataflow Runner
