
[ci-scan] Test failure: CdacXPlatDumpTest IOException: Stream was too long in runtime-diagnostics (pipeline 309) #127859

@github-actions

Description

Reasoning

The CdacXPlatDumpTest in pipeline runtime-diagnostics (def 309) has been failing in every build for the past week. The Helix payload bundles core dump files from 8 different platforms (linux-arm, linux-arm64, linux-x64, osx-arm64, osx-x64, windows-arm64, windows-x86, windows-x64) into a single zip stream. When the combined uncompressed size of all dumps exceeds 2 GB, the MemoryStream used to back the zip archive throws IOException: Stream was too long because MemoryStream capacity is limited to int.MaxValue (~2.1 GB).

This is a test infrastructure bug in how multi-platform dumps are packaged for Helix, not a product regression. The fix must restructure the test to avoid aggregating all platform dumps into a single stream.
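The failure mode described above can be mimicked with a short sketch (Python here purely for illustration; .NET's MemoryStream enforces the int.MaxValue ceiling natively, and `CappedStream` is a hypothetical stand-in that reproduces its behavior):

```python
import io

INT32_MAX = 2**31 - 1  # the capacity ceiling of .NET's MemoryStream


class CappedStream(io.BytesIO):
    """Hypothetical stand-in for .NET MemoryStream: any write that would
    push the stream past int.MaxValue bytes fails, as in the CI error."""

    def write(self, data):
        if self.tell() + len(data) > INT32_MAX:
            raise IOError("Stream was too long.")
        return super().write(data)


s = CappedStream()
s.seek(INT32_MAX - 10)       # position near the 2 GB boundary (no allocation yet)
try:
    s.write(b"x" * 100)      # crossing the boundary, as the aggregated dumps do
except IOError as e:
    print(e)                 # Stream was too long.
```

This mirrors why the error appears only once the combined dumps exceed ~2 GB: each individual platform's dump fits comfortably, but the single shared buffer cannot.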

Impact on platforms

  • runtime-diagnostics (def 309) — all 8 platform legs — CdacXPlatDumpTest failed with IOException: Stream was too long — builds 1409927, 1408350, 1406820, 1406418, 1404739

Errors log

System.IO.IOException: Stream was too long.
   at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)

The error occurs during Helix payload creation in cdac-dump-xplat-test-helix.proj when zipping multi-platform dumps into a single archive.

First build it occurred

First observed in scanned window: build 1404739 (earliest of 5 failing builds). All 5 scanned builds of pipeline 309 are affected. Build link: https://dev.azure.com/dnceng-public/public/_build/results?buildId=1404739. This is computed within the scanned window and may not be the true origin.

Recommended action

Owner: @dotnet/diagnostics (cDAC team)

File to investigate: src/native/managed/cdac/tests/DumpTests/cdac-dump-xplat-test-helix.proj — the test project that aggregates dump files from all platforms into a single Helix payload

Fix options (choose one):

  1. Split the Helix payload into per-platform jobs so each job sends only its own platform's dump (preferred — also improves isolation)
  2. Switch from MemoryStream to FileStream or a chunked approach for the zip archive
  3. Compress/filter dump files before packaging to reduce total size below 2 GB
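Option 2, writing the archive straight to disk rather than into an in-memory buffer, can be sketched with Python's zipfile for illustration (the actual fix would live in the test's C#/MSBuild tooling; `zip_dumps_to_file` and the file names are hypothetical):

```python
import os
import tempfile
import zipfile


def zip_dumps_to_file(dump_paths, out_path):
    # Stream each dump straight into an on-disk archive; peak memory stays
    # small regardless of total dump size (the FileStream analogue of the fix).
    with zipfile.ZipFile(out_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
        for path in dump_paths:
            zf.write(path, arcname=os.path.basename(path))


# Usage sketch with a stand-in dump file (names are hypothetical):
workdir = tempfile.mkdtemp()
dump = os.path.join(workdir, "core.linux-x64.dmp")
with open(dump, "wb") as f:
    f.write(b"\x00" * 4096)
payload = os.path.join(workdir, "payload.zip")
zip_dumps_to_file([dump], payload)
```

The archive's size is then bounded by free disk on the Helix client, not by a 32-bit buffer capacity, which is why option 2 removes the 2 GB ceiling even if the payload stays aggregated.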

All 5 scanned builds fail; this pipeline is 100% red in the scan window.

Note

🔒 Integrity filter blocked 6 items

The following items were blocked because they don't meet the agent's required GitHub integrity level.

  • #92420 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".
  • #123982 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".
  • #111752 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".
  • #125600 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".
  • #124206 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".
  • #126448 search_issues: has lower integrity than agent requires. The agent cannot read data with integrity below "approved".

To allow these resources, lower min-integrity in your GitHub frontmatter:

tools:
  github:
    min-integrity: approved  # merged | approved | unapproved | none

Generated by CI Outer-Loop Failure Scanner

Report

Summary

24-Hour Hit Count: 0
7-Day Hit Count: 0
1-Month Count: 0
