
File#createReadStream() can throw ERR_STREAM_UNABLE_TO_PIPE if the consumer closes before the response stream is piped #7603

@michaellatman

Description


Bug shape

If a caller uses file.createReadStream() and the downstream consumer closes early, the library can throw:

Error [ERR_STREAM_UNABLE_TO_PIPE]: Cannot pipe to a closed or destroyed stream

The stack points to the pipeline(...) call inside the createReadStream() response path.

Stack trace

node:internal/streams/pipeline:264
        throw new ERR_STREAM_UNABLE_TO_PIPE();
              ^

Error [ERR_STREAM_UNABLE_TO_PIPE]: Cannot pipe to a closed or destroyed stream
    at pipelineImpl (node:internal/streams/pipeline:264:15)
    at pipeline (node:internal/streams/pipeline:183:10)
    at onResponse (file:///prod/backend/node_modules/@google-cloud/storage/build/esm/src/file.js:1067:13)
    at Util.handleResp (file:///prod/backend/node_modules/@google-cloud/storage/build/esm/src/nodejs-common/util.js:173:9)
    at Duplexify.<anonymous> (file:///prod/backend/node_modules/@google-cloud/storage/build/esm/src/file.js:1102:22)
    at Duplexify.emit (node:events:508:28)
    at PassThrough.emit (node:events:508:28)
    at onResponse (/prod/backend/node_modules/retry-request/index.js:253:19)
    at PassThrough.<anonymous> (/prod/backend/node_modules/retry-request/index.js:183:11)
    at PassThrough.emit (node:events:520:35) {
  code: 'ERR_STREAM_UNABLE_TO_PIPE'
}

Repro shape

This seems to require:

  1. create a GCS read stream with file.createReadStream()
  2. attach it to a downstream consumer that may close early
  3. let the downstream consumer close before the GCS response path finishes wiring up the internal pipeline

Examples of downstream consumers that can close early:

  • an HTTP response stream when the client disconnects
  • a proxy/load balancer closing the request
  • any writable/transform stream that is destroyed before the source finishes

Why this seems library-side

The failing stack points into the createReadStream() implementation, specifically the code path that does:

pipeline(rawResponseStream, ...transformStreams, throughStream, onComplete);

The issue appears to be that the response is handled asynchronously, so throughStream can already be closed or destroyed by the time this pipeline(...) call runs, and pipeline() then throws synchronously instead of reporting the failure through its completion callback.

Actual behavior

ERR_STREAM_UNABLE_TO_PIPE is thrown synchronously from the library's internal read path. Because the throw happens inside the library's response handler rather than in caller code, application code has no frame in which to catch it, so it can surface as an uncaught exception and crash the process.

Expected behavior

If the downstream consumer closes early, the library should surface a normal stream error or safely stop/drain the source stream, but it should not synchronously throw from the internal pipeline(...) setup path.

Version

  • @google-cloud/storage@7.17.0
  • Node.js 22.x
