From 344383740317e78e133898221e86081369027800 Mon Sep 17 00:00:00 2001 From: Mytreya Kasturi Date: Thu, 7 May 2026 08:16:56 +0530 Subject: [PATCH] UPSTREAM: : automate hermetic build requirements generation MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Replace the manually maintained ~100-line bash pipeline in openshift/Dockerfile.requirements with a Python script that derives all four requirements files entirely from the Pipfile, with no hardcoded package names. openshift/hack/generate_requirements.py implements five stages: Stage 1 – pipenv install + CVE auto-fix via Safety/pipenv update, then pip freeze to capture all pinned runtime packages. Stage 2 – Iterative pip-compile with dynamic conflict exclusion to produce requirements.txt. Packages that make pip-compile fail due to incompatible declared metadata (e.g. conflicting setuptools version ranges) are detected from the error output, excluded from compilation, and appended manually. RPM-installed packages (cryptography, cffi, pycparser, maturin) are commented out in post-processing. Stage 3 – pip_find_builddeps.py is run once per runtime package so that every package's build-system requirements can be associated with it individually. Stage 4 – Conflict detection and phase splitting. Merging all build-dep constraints is attempted with pip-compile; when it fails the conflicting dependency is identified and packages split into an earlier phase (needing the older version) and a later phase (needing the newer version) using three fallback strategies in order: direct upper-bound heuristic, per-package compilation to detect transitive conflicts, and single-package bisection. N discovered phases are mapped to exactly three build files with a greedy merge that verifies compatibility before absorbing each middle phase into the main build group. Build-isolation exact-version pins (e.g. wheel==0.45.1 declared by ansible-core's pyproject.toml) are discovered automatically from pkg_constraints, injected into requirements-pre-build.txt so cachi2 pre-fetches them, and stripped from later phases so those phases resolve newer CVE-fixed versions. No version numbers are hardcoded. Stage 5 – Safety scans each generated build requirements file for CVEs and attempts to fix them by adding minimum-version constraints and re-running pip-compile. Conflicts that prevent the fix are reported with the name of the blocking constraint. openshift/hack/generate_requirements.md documents the full algorithm. openshift/Dockerfile.requirements is reduced to installing the toolchain and invoking the script. The generated requirements files and Pipfile.lock (updated by any CVE auto-fixes) are exported to the mounted volume by the ENTRYPOINT. 
The regenerated requirements files reflect: - pyasn1 0.6.2 → 0.6.3 (CVE-2026-30922) - requests 2.32.5 → 2.33.1 (CVE-2026-25645) - wheel 0.46.3 → 0.47.0 in requirements-build.txt (CVE-2026-24049; 0.45.1 is retained in requirements-pre-build.txt for ansible-core build isolation as discovered from its build-system metadata) - certifi, charset-normalizer, idna, packaging, pathspec, poetry-core, setuptools, trove-classifiers bumped to latest - kubernetes 33.1.0 build deps skipped (internal setuptools-scm conflict in the package's own build-system declaration) Co-authored-by: Cursor --- openshift/Dockerfile.requirements | 107 +- openshift/Pipfile.lock | 685 ++++++++++++ openshift/hack/generate_requirements.md | 392 +++++++ openshift/hack/generate_requirements.py | 1259 +++++++++++++++++++++++ openshift/requirements-build.txt | 68 +- openshift/requirements-build1.txt | 42 +- openshift/requirements-pre-build.txt | 25 +- openshift/requirements.txt | 92 +- 8 files changed, 2468 insertions(+), 202 deletions(-) create mode 100644 openshift/Pipfile.lock create mode 100644 openshift/hack/generate_requirements.md create mode 100644 openshift/hack/generate_requirements.py diff --git a/openshift/Dockerfile.requirements b/openshift/Dockerfile.requirements index 612d20ad..50553c11 100644 --- a/openshift/Dockerfile.requirements +++ b/openshift/Dockerfile.requirements @@ -7,6 +7,7 @@ RUN set -e && dnf clean all && rm -rf /var/cache/dnf/* \ && python3 -m ensurepip --upgrade COPY ./images/ansible-operator/Pipfile* ./ +COPY ./openshift/hack/generate_requirements.py ./ # The build dependencies are required by cachito. Following script # does exactly the same. More info at: https://github.com/containerbuildsystem/cachito/blob/master/docs/pip.md#build-dependencies @@ -15,95 +16,25 @@ RUN curl -LO https://raw.githubusercontent.com/containerbuildsystem/cachito/mast RUN python3 -m pip install --upgrade pip -# Create requirements.in file from the pipenv created using the -# same Pipfile and Pipfile.lock used for upstream image. Then -# use pip-compile to generate the requirements.txt file. +# Install tooling, then run the requirements generator. +# +# generate_requirements.py dynamically: +# 1. Resolves all runtime packages via pipenv + pip freeze. +# 2. Detects packages that make pip-compile fail due to conflicting dependency +# metadata (e.g. incompatible setuptools/setuptools-scm version ranges) and +# excludes them from compilation, appending them manually to requirements.txt. +# 3. Collects per-package build-system requirements via pip_find_builddeps.py. +# 4. Detects version conflicts across all build deps using pip-compile + error +# parsing and splits conflicting packages into ordered installation phases. +# 5. Produces requirements-pre-build.txt, requirements-build1.txt, +# requirements-build.txt, and requirements.txt with RPM-installed packages +# (cryptography, cffi, pycparser, maturin) commented out in every file. +# +# No package names or versions are hardcoded beyond the RPM exclusion list; +# build-isolation pins such as wheel==0.45.1 are discovered automatically. RUN python3 -m pip install pipenv==2023.11.15 \ && python3 -m pip install pip-tools \ - && pipenv install --deploy \ - && pipenv check \ - && pipenv run pip freeze --all > ./requirements.in \ - # NOTE: Comment out ansible-core, ansible-runner, and requests-unixsocket from - # `requirements.in` to avoid dependency conflicts during pip-compile. - # These are required to properly resolve dependency conflicts: - # 1.
Resolve conflict with `setuptools` version: - # - `setuptools>=70.1` is required by ansible-core (via markupsafe). - # - `setuptools<=69.0.2,>=45` is required by ansible-runner. - # - `setuptools>=77.0.3` is required by charset-normalizer (via types-psutil). - # 2. Resolve conflict with `setuptools_scm` version: - # - `setuptools_scm>=8` is required by requests-unixsocket. - # - `setuptools_scm<8` is required by kubernetes. - # Save the original requirements.in before commenting out - && cp ./requirements.in ./requirements.in.orig \ - && sed -i '/ansible-core==/s/^/#/g' ./requirements.in \ - && sed -i '/ansible-runner==/s/^/#/g' ./requirements.in \ - && sed -i '/requests-unixsocket==/s/^/#/g' ./requirements.in \ - && pip-compile --output-file=./requirements.txt ./requirements.in --strip-extras \ - # Pin python-dateutil to 2.9.0 instead of 2.9.0.post0 - && sed -i 's/python-dateutil==2.9.0.post0/python-dateutil==2.9.0/g' ./requirements.txt \ - # Add back the commented packages from the original requirements.in to requirements.txt - && grep "ansible-core==" ./requirements.in.orig >> ./requirements.txt || true \ - && grep "ansible-runner==" ./requirements.in.orig >> ./requirements.txt || true \ - && grep "requests-unixsocket==" ./requirements.in.orig >> ./requirements.txt || true \ - # Now comment them out again for pip_find_builddeps.py - && sed -i '/ansible-core==/s/^/#/g' ./requirements.txt \ - && sed -i '/ansible-runner==/s/^/#/g' ./requirements.txt \ - && sed -i '/requests-unixsocket==/s/^/#/g' ./requirements.txt \ - # Also comment out kubernetes to avoid setuptools-scm conflicts in build dependencies - && sed -i '/^kubernetes==/s/^/#/g' ./requirements.txt \ - # Comment out google-auth to prevent pip from pulling in cryptography (and its - # Rust/maturin build chain) as a transitive dependency during pip download - && sed -i '/^google-auth==/s/^/#/g' ./requirements.txt \ - # NOTE: Comment out cryptography and its dependencies from the requirements.txt - # files as these packages can't be installed in the isolated environment of OSBS - # image build. These packages will be installed through rpms. - && sed -i '/cryptography==/s/^/#/g' ./requirements.txt \ - && sed -i '/cffi==/s/^/#/g' ./requirements.txt \ - && sed -i '/pycparser==/s/^/#/g' ./requirements.txt \ - && sed -i '/maturin==/s/^/#/g' ./requirements.txt \ - && ./pip_find_builddeps.py requirements.txt -o requirements-build.in --append \ - # Uncomment ansible-core, ansible-runner, requests-unixsocket, and kubernetes - # so they are present in the final requirements.txt file - && sed -i '/ansible-core==/s/^#//g' ./requirements.txt \ - && sed -i '/ansible-runner==/s/^#//g' ./requirements.txt \ - && sed -i '/requests-unixsocket==/s/^#//g' ./requirements.txt \ - && sed -i '/^#kubernetes==/s/^#//g' ./requirements.txt \ - && sed -i '/^#google-auth==/s/^#//g' ./requirements.txt \ - # Comment out setuptools-scm from requirements-build.in to avoid version conflicts during pip-compile - && sed -i '/setuptools-scm/s/^/#/g' ./requirements-build.in \ - && sed -i '/setuptools_scm/s/^/#/g' ./requirements-build.in \ - && pip-compile --output-file=./requirements-build.txt ./requirements-build.in --strip-extras --allow-unsafe \ - # NOTE: Comment out cryptography and its dependencies from the requirements-build.txt - # files as these packages can't be installed in the isolated environment of OSBS image - # build. These packages will be installed through rpms. 
- && sed -i '/cryptography==/s/^/#/g' ./requirements-build.txt \ - && sed -i '/cffi==/s/^/#/g' ./requirements-build.txt \ - && sed -i '/pycparser==/s/^/#/g' ./requirements-build.txt \ - && sed -i '/maturin==/s/^/#/g' ./requirements-build.txt \ - # Add ansible-core into a separate requirements-build1-temp.in file to include it - # into the requirements-build1.in file. - && grep "urllib3==" ./requirements.in >> ./requirements-build1-temp.in || true \ - && ./pip_find_builddeps.py requirements-build1-temp.in -o requirements-build1.in --append \ - && pip-compile --output-file=./requirements-build1.txt ./requirements-build1.in --strip-extras --allow-unsafe \ - # NOTE: Comment out cryptography and its dependencies from the requirements-build1.txt - # files as these packages can't be installed in the isolated environment of OSBS - # image build. These packages will be installed through rpms. - && sed -i '/cryptography==/s/^/#/g' ./requirements-build1.txt \ - && sed -i '/cffi==/s/^/#/g' ./requirements-build1.txt \ - && sed -i '/pycparser==/s/^/#/g' ./requirements-build1.txt \ - && sed -i '/maturin==/s/^/#/g' ./requirements-build1.txt \ - # Add ansible-runner into a separate requirements-pre-build-temp.in - # file to include it into the requirements-pre-build.in file. - && grep "ansible-runner==" ./requirements.txt >> ./requirements-pre-build-temp.in || true \ - # Add flit-core to requirements-pre-build.in file as this package is part of the - # build dependencies of some packages in requirements-build.txt file. - && grep "flit-core==" ./requirements-build.txt >> ./requirements-pre-build.in || true \ - && ./pip_find_builddeps.py requirements-pre-build-temp.in -o requirements-pre-build.in --append \ - && pip-compile --output-file=./requirements-pre-build.txt ./requirements-pre-build.in --strip-extras --allow-unsafe \ - # Pin wheel to 0.45.1 in pre-build so cachi2 fetches it for ansible-core's - # build isolation (which requires exactly wheel==0.45.1). - # requirements-build.txt still has wheel==0.46.3 which upgrades it later. 
- && sed -i 's/^wheel==.*/wheel==0.45.1/' ./requirements-pre-build.txt + && python3 generate_requirements.py VOLUME /tmp/requirements -ENTRYPOINT ["cp", "./requirements.txt", "./requirements-build.txt", "./requirements-build1.txt", "./requirements-pre-build.txt", "/tmp/requirements/"] +ENTRYPOINT ["cp", "./requirements.txt", "./requirements-build.txt", "./requirements-build1.txt", "./requirements-pre-build.txt", "./Pipfile.lock", "/tmp/requirements/"] diff --git a/openshift/Pipfile.lock b/openshift/Pipfile.lock new file mode 100644 index 00000000..0a90ebc7 --- /dev/null +++ b/openshift/Pipfile.lock @@ -0,0 +1,685 @@ +{ + "_meta": { + "hash": { + "sha256": "88c5e57e0ba038c97ac540c13aeb739d8b613401d85c31265821db962b2c29c0" + }, + "pipfile-spec": 6, + "requires": { + "python_version": "3.12" + }, + "sources": [ + { + "name": "pypi", + "url": "https://pypi.org/simple", + "verify_ssl": true + } + ] + }, + "default": { + "ansible-core": { + "hashes": [ + "sha256:da9ff29f673e2cf2632ddd266883977b6602559c65cbd0cfc4d48e70306e85d5", + "sha256:e6b7eb39b1ffeffd229c3a88e7ea5412f184e0c7aa0577173c8a4ccb62a44633" + ], + "index": "pypi", + "markers": "python_version >= '3.11'", + "version": "==2.18.14" + }, + "ansible-runner": { + "hashes": [ + "sha256:0bde6cb39224770ff49ccdc6027288f6a98f4ed2ea0c64688b31217033221893", + "sha256:331d4da8d784e5a76aa9356981c0255f4bb1ba640736efe84b0bd7c73a4ca420" + ], + "index": "pypi", + "markers": "python_version >= '3.9'", + "version": "==2.4.2" + }, + "certifi": { + "hashes": [ + "sha256:3cb2210c8f88ba2318d29b0388d1023c8492ff72ecdde4ebdaddbb13a31b1c4a", + "sha256:8d455352a37b71bf76a79caa83a3d6c25afee4a385d632127b6afb3963f1c580" + ], + "markers": "python_version >= '3.7'", + "version": "==2026.4.22" + }, + "cffi": { + "hashes": [ + "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", + "sha256:07b271772c100085dd28b74fa0cd81c8fb1a3ba18b21e03d7c27f3436a10606b", + "sha256:087067fa8953339c723661eda6b54bc98c5625757ea62e95eb4898ad5e776e9f", + "sha256:0a1527a803f0a659de1af2e1fd700213caba79377e27e4693648c2923da066f9", + "sha256:0cf2d91ecc3fcc0625c2c530fe004f82c110405f101548512cce44322fa8ac44", + "sha256:0f6084a0ea23d05d20c3edcda20c3d006f9b6f3fefeac38f59262e10cef47ee2", + "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", + "sha256:19f705ada2530c1167abacb171925dd886168931e0a7b78f5bffcae5c6b5be75", + "sha256:1cd13c99ce269b3ed80b417dcd591415d3372bcac067009b6e0f59c7d4015e65", + "sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", + "sha256:1f72fb8906754ac8a2cc3f9f5aaa298070652a0ffae577e0ea9bd480dc3c931a", + "sha256:1fc9ea04857caf665289b7a75923f2c6ed559b8298a1b8c49e59f7dd95c8481e", + "sha256:203a48d1fb583fc7d78a4c6655692963b860a417c0528492a6bc21f1aaefab25", + "sha256:2081580ebb843f759b9f617314a24ed5738c51d2aee65d31e02f6f7a2b97707a", + "sha256:21d1152871b019407d8ac3985f6775c079416c282e431a4da6afe7aefd2bccbe", + "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", + "sha256:256f80b80ca3853f90c21b23ee78cd008713787b1b1e93eae9f3d6a7134abd91", + "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", + "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", + "sha256:2de9a304e27f7596cd03d16f1b7c72219bd944e99cc52b84d0145aefb07cbd3c", + "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", + "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", + "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", + 
"sha256:3e837e369566884707ddaf85fc1744b47575005c0a229de3327f8f9a20f4efeb", + "sha256:3f4d46d8b35698056ec29bca21546e1551a205058ae1a181d871e278b0b28165", + "sha256:44d1b5909021139fe36001ae048dbdde8214afa20200eda0f64c068cac5d5529", + "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", + "sha256:4647afc2f90d1ddd33441e5b0e85b16b12ddec4fca55f0d9671fef036ecca27c", + "sha256:4671d9dd5ec934cb9a73e7ee9676f9362aba54f7f34910956b84d727b0d73fb6", + "sha256:53f77cbe57044e88bbd5ed26ac1d0514d2acf0591dd6bb02a3ae37f76811b80c", + "sha256:5eda85d6d1879e692d546a078b44251cdd08dd1cfb98dfb77b670c97cee49ea0", + "sha256:5fed36fccc0612a53f1d4d9a816b50a36702c28a2aa880cb8a122b3466638743", + "sha256:61d028e90346df14fedc3d1e5441df818d095f3b87d286825dfcbd6459b7ef63", + "sha256:66f011380d0e49ed280c789fbd08ff0d40968ee7b665575489afa95c98196ab5", + "sha256:6824f87845e3396029f3820c206e459ccc91760e8fa24422f8b0c3d1731cbec5", + "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", + "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", + "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", + "sha256:730cacb21e1bdff3ce90babf007d0a0917cc3e6492f336c2f0134101e0944f93", + "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", + "sha256:74a03b9698e198d47562765773b4a8309919089150a0bb17d829ad7b44b60d27", + "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", + "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", + "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", + "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", + "sha256:8941aaadaf67246224cee8c3803777eed332a19d909b47e29c9842ef1e79ac26", + "sha256:89472c9762729b5ae1ad974b777416bfda4ac5642423fa93bd57a09204712322", + "sha256:8ea985900c5c95ce9db1745f7933eeef5d314f0565b27625d9a10ec9881e1bfb", + "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", + "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", + "sha256:9332088d75dc3241c702d852d4671613136d90fa6881da7d770a483fd05248b4", + "sha256:94698a9c5f91f9d138526b48fe26a199609544591f859c870d477351dc7b2414", + "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", + "sha256:9de40a7b0323d889cf8d23d1ef214f565ab154443c42737dfe52ff82cf857664", + "sha256:a05d0c237b3349096d3981b727493e22147f934b20f6f125a3eba8f994bec4a9", + "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", + "sha256:b18a3ed7d5b3bd8d9ef7a8cb226502c6bf8308df1525e1cc676c3680e7176739", + "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", + "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", + "sha256:b4c854ef3adc177950a8dfc81a86f5115d2abd545751a304c5bcf2c2c7283cfe", + "sha256:b882b3df248017dba09d6b16defe9b5c407fe32fc7c65a9c69798e6175601be9", + "sha256:baf5215e0ab74c16e2dd324e8ec067ef59e41125d3eade2b863d294fd5035c92", + "sha256:c649e3a33450ec82378822b3dad03cc228b8f5963c0c12fc3b1e0ab940f768a5", + "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", + "sha256:c6638687455baf640e37344fe26d37c404db8b80d037c3d29f58fe8d1c3b194d", + "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", + "sha256:cb527a79772e5ef98fb1d700678fe031e353e765d1ca2d409c92263c6d43e09f", + "sha256:cf364028c016c03078a23b503f02058f1814320a56ad535686f90565636a9495", + "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", + 
"sha256:d68b6cef7827e8641e8ef16f4494edda8b36104d79773a334beaa1e3521430f6", + "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", + "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", + "sha256:da68248800ad6320861f129cd9c1bf96ca849a2771a59e0344e88681905916f5", + "sha256:da902562c3e9c550df360bfa53c035b2f241fed6d9aef119048073680ace4a18", + "sha256:dbd5c7a25a7cb98f5ca55d258b103a2054f859a46ae11aaf23134f9cc0d356ad", + "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", + "sha256:de8dad4425a6ca6e4e5e297b27b5c824ecc7581910bf9aee86cb6835e6812aa7", + "sha256:e11e82b744887154b182fd3e7e8512418446501191994dbf9c9fc1f32cc8efd5", + "sha256:e6e73b9e02893c764e7e8d5bb5ce277f1a009cd5243f8228f75f842bf937c534", + "sha256:f73b96c41e3b2adedc34a7356e64c8eb96e03a3782b535e043a986276ce12a49", + "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", + "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", + "sha256:fc7de24befaeae77ba923797c7c87834c73648a05a4bde34b3b7e5588973a453", + "sha256:fe562eb1a64e67dd297ccc4f5addea2501664954f2692b69a76449ec7913ecbf" + ], + "markers": "python_full_version >= '3.9' and platform_python_implementation != 'PyPy'", + "version": "==2.0.0" + }, + "charset-normalizer": { + "hashes": [ + "sha256:007d05ec7321d12a40227aae9e2bc6dca73f3cb21058999a1df9e193555a9dcc", + "sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c", + "sha256:07d9e39b01743c3717745f4c530a6349eadbfa043c7577eef86c502c15df2c67", + "sha256:08e721811161356f97b4059a9ba7bafb23ea5ee2255402c42881c214e173c6b4", + "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0", + "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c", + "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5", + "sha256:12a6fff75f6bc66711b73a2f0addfc4c8c15a20e805146a02d147a318962c444", + "sha256:12d8baf840cc7889b37c7c770f478adea7adce3dcb3944d02ec87508e2dcf153", + "sha256:14265bfe1f09498b9d8ec91e9ec9fa52775edf90fcbde092b25f4a33d444fea9", + "sha256:16d971e29578a5e97d7117866d15889a4a07befe0e87e703ed63cd90cb348c01", + "sha256:177a0ba5f0211d488e295aaf82707237e331c24788d8d76c96c5a41594723217", + "sha256:1a87ca9d5df6fe460483d9a5bbf2b18f620cbed41b432e2bddb686228282d10b", + "sha256:1c2a768fdd44ee4a9339a9b0b130049139b8ce3c01d2ce09f67f5a68048d477c", + "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a", + "sha256:1dc8b0ea451d6e69735094606991f32867807881400f808a106ee1d963c46a83", + "sha256:1efde3cae86c8c273f1eb3b287be7d8499420cf2fe7585c41d370d3e790054a5", + "sha256:202389074300232baeb53ae2569a60901f7efadd4245cf3a3bf0617d60b439d7", + "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb", + "sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c", + "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1", + "sha256:2cd4a60d0e2fb04537162c62bbbb4182f53541fe0ede35cdf270a1c1e723cc42", + "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab", + "sha256:2fe249cb4651fd12605b7288b24751d8bfd46d35f12a20b1ba33dea122e690df", + "sha256:30b8d1d8c52a48c2c5690e152c169b673487a2a58de1ec7393196753063fcd5e", + "sha256:320ade88cfb846b8cd6b4ddf5ee9e80ee0c1f52401f2456b84ae1ae6a1a5f207", + "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18", + "sha256:36836d6ff945a00b88ba1e4572d721e60b5b8c98c155d465f56ad19d68f23734", + "sha256:38c0109396c4cfc574d502df99742a45c72c08eff0a36158b6f04000043dbf38", + 
"sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110", + "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18", + "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44", + "sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d", + "sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48", + "sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e", + "sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5", + "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d", + "sha256:4e5163c14bffd570ef2affbfdd77bba66383890797df43dc8b4cc7d6f500bf53", + "sha256:511ef87c8aec0783e08ac18565a16d435372bc1ac25a91e6ac7f5ef2b0bff790", + "sha256:532bc9bf33a68613fd7d65e4b1c71a6a38d7d42604ecf239c77392e9b4e8998c", + "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b", + "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116", + "sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d", + "sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10", + "sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6", + "sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2", + "sha256:6370e8686f662e6a3941ee48ed4742317cafbe5707e36406e9df792cdb535776", + "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a", + "sha256:65bcd23054beab4d166035cabbc868a09c1a49d1efe458fe8e4361215df40265", + "sha256:66671f93accb62ed07da56613636f3641f1a12c13046ce91ffc923721f23c008", + "sha256:6696b7688f54f5af4462118f0bfa7c1621eeb87154f77fa04b9295ce7a8f2943", + "sha256:6785f414ae0f3c733c437e0f3929197934f526d19dfaa75e18fdb4f94c6fb374", + "sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246", + "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e", + "sha256:6e0d51f618228538a3e8f46bd246f87a6cd030565e015803691603f55e12afb5", + "sha256:6ed74185b2db44f41ef35fd1617c5888e59792da9bbc9190d6c7300617182616", + "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15", + "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41", + "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960", + "sha256:750e02e074872a3fad7f233b47734166440af3cdea0add3e95163110816d6752", + "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e", + "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72", + "sha256:7641bb8895e77f921102f72833904dcd9901df5d6d72a2ab8f31d04b7e51e4e7", + "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8", + "sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b", + "sha256:813c0e0132266c08eb87469a642cb30aaff57c5f426255419572aaeceeaa7bf4", + "sha256:82b271f5137d07749f7bf32f70b17ab6eaabedd297e75dce75081a24f76eb545", + "sha256:84c018e49c3bf790f9c2771c45e9313a08c2c2a6342b162cd650258b57817706", + "sha256:8751d2787c9131302398b11e6c8068053dcb55d5a8964e114b6e196cf16cb366", + "sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb", + "sha256:87fad7d9ba98c86bcb41b2dc8dbb326619be2562af1f8ff50776a39e55721c5a", + "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e", + "sha256:8e385e4267ab76874ae30db04c627faaaf0b509e1ccc11a95b3fc3e83f855c00", + "sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f", + "sha256:94e1885b270625a9a828c9793b4d52a64445299baa1fea5a173bf1d3dd9a1a5a", + 
"sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1", + "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66", + "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356", + "sha256:a6c5863edfbe888d9eff9c8b8087354e27618d9da76425c119293f11712a6319", + "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4", + "sha256:adb2597b428735679446b46c8badf467b4ca5f5056aae4d51a19f9570301b1ad", + "sha256:ae196f021b5e7c78e918242d217db021ed2a6ace2bc6ae94c0fc596221c7f58d", + "sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5", + "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7", + "sha256:aef65cd602a6d0e0ff6f9930fcb1c8fec60dd2cfcb6facaf4bdb0e5873042db0", + "sha256:af21eb4409a119e365397b2adbaca4c9ccab56543a65d5dbd9f920d6ac29f686", + "sha256:b14b2d9dac08e28bb8046a1a0434b1750eb221c8f5b87a68f4fa11a6f97b5e34", + "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49", + "sha256:bb8cc7534f51d9a017b93e3e85b260924f909601c3df002bcdb58ddb4dc41a5c", + "sha256:bc17a677b21b3502a21f66a8cc64f5bfad4df8a0b8434d661666f8ce90ac3af1", + "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e", + "sha256:bd9b23791fe793e4968dba0c447e12f78e425c59fc0e3b97f6450f4781f3ee60", + "sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0", + "sha256:c0f081d69a6e58272819b70288d3221a6ee64b98df852631c80f293514d3b274", + "sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d", + "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0", + "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae", + "sha256:c593052c465475e64bbfe5dbd81680f64a67fdc752c56d7a0ae205dc8aeefe0f", + "sha256:cdd68a1fb318e290a2077696b7eb7a21a49163c455979c639bf5a5dcdc46617d", + "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe", + "sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3", + "sha256:cf29836da5119f3c8a8a70667b0ef5fdca3bb12f80fd06487cfa575b3909b393", + "sha256:d4a48e5b3c2a489fae013b7589308a40146ee081f6f509e047e0e096084ceca1", + "sha256:d560742f3c0d62afaccf9f41fe485ed69bd7661a241f86a3ef0f0fb8b1a397af", + "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44", + "sha256:d61f00a0869d77422d9b2aba989e2d24afa6ffd552af442e0e58de4f35ea6d00", + "sha256:d635aab80466bc95771bb78d5370e74d36d1fe31467b6b29b8b57b2a3cd7d22c", + "sha256:dca4bbc466a95ba9c0234ef56d7dd9509f63da22274589ebd4ed7f1f4d4c54e3", + "sha256:dd915403e231e6b1809fe9b6d9fc55cf8fb5e02765ac625d9cd623342a7905d7", + "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd", + "sha256:e060d01aec0a910bdccb8be71faf34e7799ce36950f8294c8bf612cba65a2c9e", + "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b", + "sha256:e17b8d5d6a8c47c85e68ca8379def1303fd360c3e22093a807cd34a71cd082b8", + "sha256:e5f4d355f0a2b1a31bc3edec6795b46324349c9cb25eed068049e4f472fb4259", + "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859", + "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46", + "sha256:e80c8378d8f3d83cd3164da1ad2df9e37a666cdde7b1cb2298ed0b558064be30", + "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b", + "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46", + "sha256:ed065083d0898c9d5b4bbec7b026fd755ff7454e6e8b73a67f8c744b13986e24", + "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a", + 
"sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24", + "sha256:f22dec1690b584cea26fade98b2435c132c1b5f68e39f5a0b7627cd7ae31f1dc", + "sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215", + "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063", + "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832", + "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6", + "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79", + "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464" + ], + "markers": "python_version >= '3.7'", + "version": "==3.4.7" + }, + "cryptography": { + "hashes": [ + "sha256:02f547fce831f5096c9a567fd41bc12ca8f11df260959ecc7c3202555cc47a72", + "sha256:039917b0dc418bb9f6edce8a906572d69e74bd330b0b3fea4f79dab7f8ddd235", + "sha256:1abfdb89b41c3be0365328a410baa9df3ff8a9110fb75e7b52e66803ddabc9a9", + "sha256:2ae6971afd6246710480e3f15824ed3029a60fc16991db250034efd0b9fb4356", + "sha256:2b7a67c9cd56372f3249b39699f2ad479f6991e62ea15800973b956f4b73e257", + "sha256:351695ada9ea9618b3500b490ad54c739860883df6c1f555e088eaf25b1bbaad", + "sha256:38946c54b16c885c72c4f59846be9743d699eee2b69b6988e0a00a01f46a61a4", + "sha256:3b4995dc971c9fb83c25aa44cf45f02ba86f71ee600d81091c2f0cbae116b06c", + "sha256:3ce58ba46e1bc2aac4f7d9290223cead56743fa6ab94a5d53292ffaac6a91614", + "sha256:3ee190460e2fbe447175cda91b88b84ae8322a104fc27766ad09428754a618ed", + "sha256:4108d4c09fbbf2789d0c926eb4152ae1760d5a2d97612b92d508d96c861e4d31", + "sha256:420d0e909050490d04359e7fdb5ed7e667ca5c3c402b809ae2563d7e66a92229", + "sha256:47fb8a66058b80e509c47118ef8a75d14c455e81ac369050f20ba0d23e77fee0", + "sha256:4c3341037c136030cb46e4b1e17b7418ea4cbd9dd207e4a6f3b2b24e0d4ac731", + "sha256:4d7e3d356b8cd4ea5aff04f129d5f66ebdc7b6f8eae802b93739ed520c47c79b", + "sha256:4d8ae8659ab18c65ced284993c2265910f6c9e650189d4e3f68445ef82a810e4", + "sha256:4e817a8920bfbcff8940ecfd60f23d01836408242b30f1a708d93198393a80b4", + "sha256:50bfb6925eff619c9c023b967d5b77a54e04256c4281b0e21336a130cd7fc263", + "sha256:556e106ee01aa13484ce9b0239bca667be5004efb0aabbed28d353df86445595", + "sha256:582f5fcd2afa31622f317f80426a027f30dc792e9c80ffee87b993200ea115f1", + "sha256:5be7bf2fb40769e05739dd0046e7b26f9d4670badc7b032d6ce4db64dddc0678", + "sha256:60ee7e19e95104d4c03871d7d7dfb3d22ef8a9b9c6778c94e1c8fcc8365afd48", + "sha256:61aa400dce22cb001a98014f647dc21cda08f7915ceb95df0c9eaf84b4b6af76", + "sha256:68f68d13f2e1cb95163fa3b4db4bf9a159a418f5f6e7242564fc75fcae667fd0", + "sha256:7d1f30a86d2757199cb2d56e48cce14deddf1f9c95f1ef1b64ee91ea43fe2e18", + "sha256:7d731d4b107030987fd61a7f8ab512b25b53cef8f233a97379ede116f30eb67d", + "sha256:803812e111e75d1aa73690d2facc295eaefd4439be1023fefc4995eaea2af90d", + "sha256:80a8d7bfdf38f87ca30a5391c0c9ce4ed2926918e017c29ddf643d0ed2778ea1", + "sha256:8293f3dea7fc929ef7240796ba231413afa7b68ce38fd21da2995549f5961981", + "sha256:8456928655f856c6e1533ff59d5be76578a7157224dbd9ce6872f25055ab9ab7", + "sha256:890bcb4abd5a2d3f852196437129eb3667d62630333aacc13dfd470fad3aaa82", + "sha256:94a76daa32eb78d61339aff7952ea819b1734b46f73646a07decb40e5b3448e2", + "sha256:9f16fbdf4da055efb21c22d81b89f155f02ba420558db21288b3d0035bafd5f4", + "sha256:a3d1fae9863299076f05cb8a778c467578262fae09f9dc0ee9b12eb4268ce663", + "sha256:a3d507bb6a513ca96ba84443226af944b0f7f47dcc9a399d110cd6146481d24c", + "sha256:abace499247268e3757271b2f1e244b36b06f8515cf27c4d49468fc9eb16e93d", + 
"sha256:ba2a27ff02f48193fc4daeadf8ad2590516fa3d0adeeb34336b96f7fa64c1e3a", + "sha256:bc84e875994c3b445871ea7181d424588171efec3e185dced958dad9e001950a", + "sha256:bfd56bb4b37ed4f330b82402f6f435845a5f5648edf1ad497da51a8452d5d62d", + "sha256:c18ff11e86df2e28854939acde2d003f7984f721eba450b56a200ad90eeb0e6b", + "sha256:c3bcce8521d785d510b2aad26ae2c966092b7daa8f45dd8f44734a104dc0bc1a", + "sha256:c4143987a42a2397f2fc3b4d7e3a7d313fbe684f67ff443999e803dd75a76826", + "sha256:c69fd885df7d089548a42d5ec05be26050ebcd2283d89b3d30676eb32ff87dee", + "sha256:ced80795227d70549a411a4ab66e8ce307899fad2220ce5ab2f296e687eacde9", + "sha256:d66e421495fdb797610a08f43b05269e0a5ea7f5e652a89bfd5a7d3c1dee3648", + "sha256:d861ee9e76ace6cf36a6a89b959ec08e7bc2493ee39d07ffe5acb23ef46d27da", + "sha256:e9251e3be159d1020c4030bd2e5f84d6a43fe54b6c19c12f51cde9542a2817b2", + "sha256:f145bba11b878005c496e93e257c1e88f154d278d2638e6450d17e0f31e558d2", + "sha256:fe346b143ff9685e40192a4960938545c699054ba11d4f9029f94751e3f71d87" + ], + "markers": "python_version >= '3.8' and python_full_version not in '3.9.0, 3.9.1'", + "version": "==46.0.5" + }, + "durationpy": { + "hashes": [ + "sha256:1fa6893409a6e739c9c72334fc65cca1f355dbdd93405d30f726deb5bde42fba", + "sha256:3b41e1b601234296b4fb368338fdcd3e13e0b4fb5b67345948f4f2bf9868b286" + ], + "version": "==0.10" + }, + "google-auth": { + "hashes": [ + "sha256:2e2a537873d449434252a9632c28bfc268b0adb1e53f9fb62afc5333a975903f", + "sha256:4f7e706b0cd3208a3d940a19a822c37a476ddba5450156c3e6624a71f7c841ce" + ], + "markers": "python_version >= '3.8'", + "version": "==2.48.0" + }, + "idna": { + "hashes": [ + "sha256:585ea8fe5d69b9181ec1afba340451fba6ba764af97026f92a91d4eef164a242", + "sha256:892ea0cde124a99ce773decba204c5552b69c3c67ffd5f232eb7696135bc8bb3" + ], + "markers": "python_version >= '3.8'", + "version": "==3.13" + }, + "jinja2": { + "hashes": [ + "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d", + "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67" + ], + "markers": "python_version >= '3.7'", + "version": "==3.1.6" + }, + "kubernetes": { + "hashes": [ + "sha256:544de42b24b64287f7e0aa9513c93cb503f7f40eea39b20f66810011a86eabc5", + "sha256:f64d829843a54c251061a8e7a14523b521f2dc5c896cf6d65ccf348648a88993" + ], + "index": "pypi", + "markers": "python_version >= '3.6'", + "version": "==33.1.0" + }, + "lockfile": { + "hashes": [ + "sha256:6aed02de03cba24efabcd600b30540140634fc06cfa603822d508d5361e9f799", + "sha256:6c3cb24f344923d30b2785d5ad75182c8ea7ac1b6171b08657258ec7429d50fa" + ], + "version": "==0.12.2" + }, + "markupsafe": { + "hashes": [ + "sha256:0303439a41979d9e74d18ff5e2dd8c43ed6c6001fd40e5bf2e43f7bd9bbc523f", + "sha256:068f375c472b3e7acbe2d5318dea141359e6900156b5b2ba06a30b169086b91a", + "sha256:0bf2a864d67e76e5c9a34dc26ec616a66b9888e25e7b9460e1c76d3293bd9dbf", + "sha256:0db14f5dafddbb6d9208827849fad01f1a2609380add406671a26386cdf15a19", + "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", + "sha256:0f4b68347f8c5eab4a13419215bdfd7f8c9b19f2b25520968adfad23eb0ce60c", + "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", + "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", + "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", + "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", + "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", + "sha256:15d939a21d546304880945ca1ecb8a039db6b4dc49b2c5a400387cdae6a62e26", + 
"sha256:177b5253b2834fe3678cb4a5f0059808258584c559193998be2601324fdeafb1", + "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", + "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", + "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", + "sha256:1ba88449deb3de88bd40044603fafffb7bc2b055d626a330323a9ed736661695", + "sha256:1cc7ea17a6824959616c525620e387f6dd30fec8cb44f649e31712db02123dad", + "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", + "sha256:26a5784ded40c9e318cfc2bdb30fe164bdb8665ded9cd64d500a34fb42067b1c", + "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", + "sha256:2a15a08b17dd94c53a1da0438822d70ebcd13f8c3a95abe3a9ef9f11a94830aa", + "sha256:2f981d352f04553a7171b8e44369f2af4055f888dfb147d55e42d29e29e74559", + "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", + "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", + "sha256:3537e01efc9d4dccdf77221fb1cb3b8e1a38d5428920e0657ce299b20324d758", + "sha256:35add3b638a5d900e807944a078b51922212fb3dedb01633a8defc4b01a3c85f", + "sha256:38664109c14ffc9e7437e86b4dceb442b0096dfe3541d7864d9cbe1da4cf36c8", + "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", + "sha256:3b562dd9e9ea93f13d53989d23a7e775fdfd1066c33494ff43f5418bc8c58a5c", + "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", + "sha256:4bd4cd07944443f5a265608cc6aab442e4f74dff8088b0dfc8238647b8f6ae9a", + "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", + "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", + "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", + "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", + "sha256:591ae9f2a647529ca990bc681daebdd52c8791ff06c2bfa05b65163e28102ef2", + "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", + "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", + "sha256:6b5420a1d9450023228968e7e6a9ce57f65d148ab56d2313fcd589eee96a7a50", + "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", + "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", + "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", + "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", + "sha256:7be7b61bb172e1ed687f1754f8e7484f1c8019780f6f6b0786e76bb01c2ae115", + "sha256:7c3fb7d25180895632e5d3148dbdc29ea38ccb7fd210aa27acbd1201a1902c6e", + "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", + "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", + "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", + "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", + "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", + "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", + "sha256:949b8d66bc381ee8b007cd945914c721d9aba8e27f71959d750a46f7c282b20b", + "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", + "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", + "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", + "sha256:a320721ab5a1aba0a233739394eb907f8c8da5c98c9181d1161e77a0c8e36f2d", + "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", + 
"sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", + "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", + "sha256:bc51efed119bc9cfdf792cdeaa4d67e8f6fcccab66ed4bfdd6bde3e59bfcbb2f", + "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", + "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", + "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", + "sha256:c0c0b3ade1c0b13b936d7970b1d37a57acde9199dc2aecc4c336773e1d86049c", + "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", + "sha256:c4ffb7ebf07cfe8931028e3e4c85f0357459a3f9f9490886198848f4fa002ec8", + "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", + "sha256:d2ee202e79d8ed691ceebae8e0486bd9a2cd4794cec4824e1c99b6f5009502f6", + "sha256:d53197da72cc091b024dd97249dfc7794d6a56530370992a5e1a08983ad9230e", + "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", + "sha256:d88b440e37a16e651bda4c7c2b930eb586fd15ca7406cb39e211fcff3bf3017d", + "sha256:de8a88e63464af587c950061a5e6a67d3632e36df62b986892331d4620a35c01", + "sha256:df2449253ef108a379b8b5d6b43f4b1a8e81a061d6537becd5582fba5f9196d7", + "sha256:e1c1493fb6e50ab01d20a22826e57520f1284df32f2d8601fdd90b6304601419", + "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", + "sha256:e2103a929dfa2fcaf9bb4e7c091983a49c9ac3b19c9061b6d5427dd7d14d81a1", + "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", + "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", + "sha256:e8fc20152abba6b83724d7ff268c249fa196d8259ff481f3b1476383f8f24e42", + "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", + "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", + "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", + "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", + "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", + "sha256:f42d0984e947b8adf7dd6dde396e720934d12c506ce84eea8476409563607591", + "sha256:f71a396b3bf33ecaa1626c255855702aca4d3d9fea5e051b41ac59a9c1c41edc", + "sha256:f9e130248f4462aaa8e2552d547f36ddadbeaa573879158d721bbd33dfe4743a", + "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50" + ], + "markers": "python_version >= '3.9'", + "version": "==3.0.3" + }, + "oauthlib": { + "hashes": [ + "sha256:0f0f8aa759826a193cf66c12ea1af1637f87b9b4622d46e866952bb022e538c9", + "sha256:88119c938d2b8fb88561af5f6ee0eec8cc8d552b7bb1f712743136eb7523b7a1" + ], + "markers": "python_version >= '3.8'", + "version": "==3.3.1" + }, + "packaging": { + "hashes": [ + "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4", + "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529" + ], + "markers": "python_version >= '3.8'", + "version": "==26.0" + }, + "pexpect": { + "hashes": [ + "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523", + "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f" + ], + "version": "==4.9.0" + }, + "ptyprocess": { + "hashes": [ + "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35", + "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220" + ], + "version": "==0.7.0" + }, + "pyasn1": { + "hashes": [ + "sha256:697a8ecd6d98891189184ca1fa05d1bb00e2f84b5977c481452050549c8a72cf", + 
"sha256:a80184d120f0864a52a073acc6fc642847d0be408e7c7252f31390c0f4eadcde" + ], + "index": "pypi", + "markers": "python_version >= '3.8'", + "version": "==0.6.3" + }, + "pyasn1-modules": { + "hashes": [ + "sha256:29253a9207ce32b64c3ac6600edc75368f98473906e8fd1043bd6b5b1de2c14a", + "sha256:677091de870a80aae844b1ca6134f54652fa2c8c5a52aa396440ac3106e941e6" + ], + "markers": "python_version >= '3.8'", + "version": "==0.4.2" + }, + "pycparser": { + "hashes": [ + "sha256:600f49d217304a5902ac3c37e1281c9fe94e4d0489de643a9504c5cdfdfc6b29", + "sha256:b727414169a36b7d524c1c3e31839a521725078d7b2ff038656844266160a992" + ], + "markers": "implementation_name != 'PyPy'", + "version": "==3.0" + }, + "python-daemon": { + "hashes": [ + "sha256:b906833cef63502994ad48e2eab213259ed9bb18d54fa8774dcba2ff7864cec6", + "sha256:f7b04335adc473de877f5117e26d5f1142f4c9f7cd765408f0877757be5afbf4" + ], + "markers": "python_version >= '3.7'", + "version": "==3.1.2" + }, + "python-dateutil": { + "hashes": [ + "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3", + "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427" + ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", + "version": "==2.9.0.post0" + }, + "pyyaml": { + "hashes": [ + "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", + "sha256:0150219816b6a1fa26fb4699fb7daa9caf09eb1999f3b70fb6e786805e80375a", + "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", + "sha256:02ea2dfa234451bbb8772601d7b8e426c2bfa197136796224e50e35a78777956", + "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", + "sha256:10892704fc220243f5305762e276552a0395f7beb4dbf9b14ec8fd43b57f126c", + "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", + "sha256:1d37d57ad971609cf3c53ba6a7e365e40660e3be0e5175fa9f2365a379d6095a", + "sha256:1ebe39cb5fc479422b83de611d14e2c0d3bb2a18bbcb01f229ab3cfbd8fee7a0", + "sha256:214ed4befebe12df36bcc8bc2b64b396ca31be9304b8f59e25c11cf94a4c033b", + "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", + "sha256:22ba7cfcad58ef3ecddc7ed1db3409af68d023b7f940da23c6c2a1890976eda6", + "sha256:27c0abcb4a5dac13684a37f76e701e054692a9b2d3064b70f5e4eb54810553d7", + "sha256:28c8d926f98f432f88adc23edf2e6d4921ac26fb084b028c733d01868d19007e", + "sha256:2e71d11abed7344e42a8849600193d15b6def118602c4c176f748e4583246007", + "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", + "sha256:37503bfbfc9d2c40b344d06b2199cf0e96e97957ab1c1b546fd4f87e53e5d3e4", + "sha256:3c5677e12444c15717b902a5798264fa7909e41153cdf9ef7ad571b704a63dd9", + "sha256:3ff07ec89bae51176c0549bc4c63aa6202991da2d9a6129d7aef7f1407d3f295", + "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", + "sha256:418cf3f2111bc80e0933b2cd8cd04f286338bb88bdc7bc8e6dd775ebde60b5e0", + "sha256:44edc647873928551a01e7a563d7452ccdebee747728c1080d881d68af7b997e", + "sha256:4a2e8cebe2ff6ab7d1050ecd59c25d4c8bd7e6f400f5f82b96557ac0abafd0ac", + "sha256:4ad1906908f2f5ae4e5a8ddfce73c320c2a1429ec52eafd27138b7f1cbe341c9", + "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", + "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", + "sha256:5498cd1645aa724a7c71c8f378eb29ebe23da2fc0d7a08071d89469bf1d2defb", + "sha256:5cf4e27da7e3fbed4d6c3d8e797387aaad68102272f8f9752883bc32d61cb87b", + "sha256:5e0b74767e5f8c593e8c9b5912019159ed0533c70051e9cce3e8b6aa699fcd69", + 
"sha256:5ed875a24292240029e4483f9d4a4b8a1ae08843b9c54f43fcc11e404532a8a5", + "sha256:5fcd34e47f6e0b794d17de1b4ff496c00986e1c83f7ab2fb8fcfe9616ff7477b", + "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", + "sha256:6344df0d5755a2c9a276d4473ae6b90647e216ab4757f8426893b5dd2ac3f369", + "sha256:64386e5e707d03a7e172c0701abfb7e10f0fb753ee1d773128192742712a98fd", + "sha256:652cb6edd41e718550aad172851962662ff2681490a8a711af6a4d288dd96824", + "sha256:66291b10affd76d76f54fad28e22e51719ef9ba22b29e1d7d03d6777a9174198", + "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", + "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", + "sha256:79005a0d97d5ddabfeeea4cf676af11e647e41d81c9a7722a193022accdb6b7c", + "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", + "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", + "sha256:8098f252adfa6c80ab48096053f512f2321f0b998f98150cea9bd23d83e1467b", + "sha256:850774a7879607d3a6f50d36d04f00ee69e7fc816450e5f7e58d7f17f1ae5c00", + "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", + "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", + "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", + "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", + "sha256:93dda82c9c22deb0a405ea4dc5f2d0cda384168e466364dec6255b293923b2f3", + "sha256:96b533f0e99f6579b3d4d4995707cf36df9100d67e0c8303a0c55b27b5f99bc5", + "sha256:9c57bb8c96f6d1808c030b1687b9b5fb476abaa47f0db9c0101f5e9f394e97f4", + "sha256:9c7708761fccb9397fe64bbc0395abcae8c4bf7b0eac081e12b809bf47700d0b", + "sha256:9f3bfb4965eb874431221a3ff3fdcddc7e74e3b07799e0e84ca4a0f867d449bf", + "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", + "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", + "sha256:b30236e45cf30d2b8e7b3e85881719e98507abed1011bf463a8fa23e9c3e98a8", + "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", + "sha256:b865addae83924361678b652338317d1bd7e79b1f4596f96b96c77a5a34b34da", + "sha256:b8bb0864c5a28024fac8a632c443c87c5aa6f215c0b126c449ae1a150412f31d", + "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", + "sha256:bdb2c67c6c1390b63c6ff89f210c8fd09d9a1217a465701eac7316313c915e4c", + "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", + "sha256:c2514fceb77bc5e7a2f7adfaa1feb2fb311607c9cb518dbc378688ec73d8292f", + "sha256:c3355370a2c156cffb25e876646f149d5d68f5e0a3ce86a5084dd0b64a994917", + "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", + "sha256:d0eae10f8159e8fdad514efdc92d74fd8d682c933a6dd088030f3834bc8e6b26", + "sha256:d76623373421df22fb4cf8817020cbb7ef15c725b9d5e45f17e189bfc384190f", + "sha256:ebc55a14a21cb14062aa4162f906cd962b28e2e9ea38f9b4391244cd8de4ae0b", + "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", + "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", + "sha256:efd7b85f94a6f21e4932043973a7ba2613b059c4a000551892ac9f1d11f5baf3", + "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", + "sha256:fa160448684b4e94d80416c0fa4aac48967a969efe22931448d853ada8baf926", + "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0" + ], + "markers": "python_version >= '3.8'", + "version": "==6.0.3" + }, + "requests": { + "hashes": [ + 
"sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517", + "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a" + ], + "index": "pypi", + "markers": "python_version >= '3.10'", + "version": "==2.33.1" + }, + "requests-oauthlib": { + "hashes": [ + "sha256:7dd8a5c40426b779b0868c404bdef9768deccf22749cde15852df527e6269b36", + "sha256:b3dffaebd884d8cd778494369603a9e7b58d29111bf6b41bdc2dcd87203af4e9" + ], + "markers": "python_version >= '3.4'", + "version": "==2.0.0" + }, + "requests-unixsocket": { + "hashes": [ + "sha256:60c4942e9dbecc2f64d611039fb1dfc25da382083c6434ac0316dca3ff908f4d", + "sha256:b2596158c356ecee68d27ba469a52211230ac6fb0cde8b66afb19f0ed47a1995" + ], + "index": "pypi", + "markers": "python_version >= '3.9'", + "version": "==0.4.1" + }, + "resolvelib": { + "hashes": [ + "sha256:04ce76cbd63fded2078ce224785da6ecd42b9564b1390793f64ddecbe997b309", + "sha256:d2da45d1a8dfee81bdd591647783e340ef3bcb104b54c383f70d422ef5cc7dbf" + ], + "version": "==1.0.1" + }, + "rsa": { + "hashes": [ + "sha256:68635866661c6836b8d39430f97a996acbd61bfa49406748ea243539fe239762", + "sha256:e7bdbfdb5497da4c07dfd35530e1a902659db6ff241e39d9953cad06ebd0ae75" + ], + "markers": "python_version >= '3.6' and python_version < '4'", + "version": "==4.9.1" + }, + "six": { + "hashes": [ + "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", + "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81" + ], + "markers": "python_version >= '2.7' and python_version not in '3.0, 3.1, 3.2'", + "version": "==1.17.0" + }, + "urllib3": { + "hashes": [ + "sha256:1b62b6884944a57dbe321509ab94fd4d3b307075e0c2eae991ac71ee15ad38ed", + "sha256:bf272323e553dfb2e87d9bfd225ca7b0f467b919d7bbd355436d3fd37cb0acd4" + ], + "index": "pypi", + "markers": "python_version >= '3.9'", + "version": "==2.6.3" + }, + "websocket-client": { + "hashes": [ + "sha256:9e813624b6eb619999a97dc7958469217c3176312b3a16a4bd1bc7e08a46ec98", + "sha256:af248a825037ef591efbf6ed20cc5faa03d3b47b9e5a2230a529eeee1c1fc3ef" + ], + "markers": "python_version >= '3.9'", + "version": "==1.9.0" + } + }, + "develop": {} +} diff --git a/openshift/hack/generate_requirements.md b/openshift/hack/generate_requirements.md new file mode 100644 index 00000000..0c190e68 --- /dev/null +++ b/openshift/hack/generate_requirements.md @@ -0,0 +1,392 @@ +# `generate_requirements.py` — ART Hermetic Build Requirements Generator + +## Background + +OpenShift's ART (Automated Release Tooling) builds container images in a +**hermetic environment** — no internet access during the actual image build. +All Python packages must be declared upfront in four requirements files so that +cachi2 (the dependency pre-fetcher) can download them before the build starts. + +The old approach used a manually maintained shell pipeline inside +`openshift/Dockerfile.requirements` that hardcoded which packages caused +conflicts and where they belonged. Every time `images/ansible-operator/Pipfile` +changed, a human had to study the new conflict, figure out which packages were +involved, and update the bash script by hand. + +`generate_requirements.py` replaces that pipeline with a fully automated tool +that detects conflicts dynamically and never needs to be told about specific +packages. 
+ +--- + +## Output Files + +The script always produces exactly four files, compatible with the install order +enforced by `openshift/install-ansible.sh`: + +``` +requirements-pre-build.txt → installed first +requirements-build1.txt → installed second +requirements-build.txt → installed third +requirements.txt → installed fourth (runtime packages) +``` + +The three build files contain **build-system tools** (setuptools, wheel, +hatchling, etc.) that pip needs when building packages from source in its PEP +517 build isolation environments. Their sequential installation upgrades +conflicting tool versions in the correct order. The runtime file contains the +actual packages that the ansible-operator image needs at run time. + +--- + +## Configuration — The Only Hardcoded Knowledge + +### `RPM_INSTALLED` + +```python +RPM_INSTALLED = frozenset({"cryptography", "cffi", "pycparser", "maturin"}) +``` + +Packages that the ART OSBS environment provides through system RPMs rather than +pip. They cannot be pip-installed in the hermetic environment. The script still +lets pip-compile **resolve** them (so that packages depending on them, e.g. +`google-auth → cryptography`, produce correct dependency graphs), then **comments +them out** in every generated file. + +--- + +## Pipeline Overview + +``` +Pipfile + Pipfile.lock + │ +Stage 1 │ pipenv install --deploy + │ ↳ CVE auto-fix (pipenv update per vulnerable package) + │ pip freeze --all → all pinned runtime packages + │ +Stage 2 │ Iterative pip-compile → requirements.txt + │ (detects runtime metadata conflicts dynamically) + │ +Stage 3 │ pip_find_builddeps.py per package → build-dep constraint map + │ +Stage 4 │ Conflict detection + phase splitting → 3 build .txt files + │ ↳ Auto-discovers build-isolation exact pins from package metadata + │ (e.g. wheel==0.45.1 from ansible-core's pyproject.toml) + │ and injects them into pre-build; later phases resolve newer versions + │ +Stage 5 │ Safety CVE scan of build files → auto-fix or warn + │ + └─ Export Pipfile.lock (captures CVE-driven runtime updates) +``` + +--- + +## Stage 1 — Resolve Runtime Packages + +```python +pipenv install --deploy # install exactly what Pipfile.lock says +auto_fix_cves(RPM_INSTALLED) # scan with Safety, pipenv update per CVE +pipenv run pip freeze --all # get all pinned versions +``` + +### CVE Auto-Fix for Runtime Packages + +After installing from the lock file, `auto_fix_cves()` runs `pipenv check` +(which uses the Safety vulnerability database) and categorises every finding: + +| Outcome | Action | +|---|---| +| Package in `RPM_INSTALLED` | Skip — the OS layer patches it | +| pip-managed package | `pipenv update ` to upgrade within Pipfile constraints | +| Pipfile constraint prevents fix | Warn with specific guidance; re-run after updating the Pipfile | + +After attempting fixes, `pipenv check` is re-run. Any remaining vulnerabilities +that `pipenv update` could not resolve are reported with the message +"Broaden the Pipfile constraint…" and require a manual Pipfile edit. + +`pip freeze` is run **after** all CVE fixes so that the pinned versions used +for the rest of the pipeline reflect the updated state. + +--- + +## Stage 2 — Generate `requirements.txt` + +Some packages declare mutually incompatible version constraints in their +metadata (e.g. package A declares `setuptools>=77` while package B declares +`setuptools<=70`). pip-compile enforces these constraints during resolution and +fails when they conflict. 
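+
+For concreteness, a minimal check with the `packaging` library shows why no
+resolution can exist. The ranges are the ones quoted in the old Dockerfile's
+comments; which package contributes which range is illustrative:
+
+```python
+from packaging.specifiers import SpecifierSet
+from packaging.version import Version
+
+# Runtime metadata declared by two different packages in the freeze set
+# (ranges taken from the old pipeline's comments; attribution illustrative).
+need_new = SpecifierSet(">=77.0.3")       # e.g. pulled in via charset-normalizer
+need_old = SpecifierSet("<=69.0.2,>=45")  # e.g. pulled in via ansible-runner
+
+combined = need_new & need_old            # intersection of both ranges
+print(Version("77.0.3") in combined)      # False -- rejected by <=69.0.2
+print(Version("69.0.2") in combined)      # False -- rejected by >=77.0.3
+# No version satisfies the combined set, so pip-compile has nothing to pick.
+```
+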
+ +The stage iteratively finds which packages cause pip-compile to fail: + +``` +attempt 1: pip-compile requirements.in → fails + parse error → identify conflicting package X +attempt 2: pip-compile requirements.in (X commented out) → succeeds +``` + +The excluded packages are **appended uncommented** at the end of +`requirements.txt` after pip-compile succeeds. They are valid runtime +dependencies — their metadata conflicts are a pip-compile artifact, not a real +runtime incompatibility. + +After that, RPM-installed packages are **commented out** in post-processing. +They remain in the file (so humans can see what version the image expects) but +pip will not install them. + +--- + +## Stage 3 — Collect Per-Package Build Dependencies + +The cachito/cachi2 build system requires every package needed to **build** each +runtime dependency from source to be pre-declared. `pip_find_builddeps.py` +(a cachito script) inspects a package's `pyproject.toml` / `setup.cfg` +build-system requirements and emits them as pip constraints. + +The script runs `pip_find_builddeps.py` **once per package** (not once for the +whole list) so that each package's build constraints can be associated with +that specific package. This per-package association is essential for the +conflict-detection algorithm in Stage 4. + +All packages from `pip freeze` are processed, including RPM-installed and +conflict-excluded ones. If `pip_find_builddeps.py` fails for a package (e.g. +because downloading its sdist triggers a Rust/maturin compilation), the package +is skipped with a warning and the rest continue. + +Result: `pkg_constraints = { "ansible-runner": ["pbr>=2.1", …], … }` + +--- + +## Stage 4 — Build Dependency Conflict Detection and Phase Splitting + +This is the core of the tool. + +### Why Phases Are Needed + +Some build tools conflict in version requirements. For example: + +- `ansible-runner` uses `pbr` as its build system; `pbr` requires + `setuptools<=70` +- Most other packages use `hatch-vcs` which requires `setuptools>=77` + +Installing both `setuptools<=70` and `setuptools>=77` simultaneously is +impossible. The solution is **sequential installation**: install the older +version first (pre-build), then upgrade to the newer version (build). pip's +`pip install` replaces the old version with the new one, and each package's +build isolation environment finds the version it needs in the cachi2 cache. + +### The `split_phases()` Algorithm + +`split_phases(packages, constraints, tmp)` recursively partitions packages into +an ordered list of groups whose build-dep constraints can each be compiled by +pip-compile without conflicts. + +``` +INPUT: list of all packages with their build-dep constraint sets +OUTPUT: ordered list of groups (earlier group = must install first) +``` + +At each recursion level, five strategies are tried in order: + +**Step 1 — Try the whole group** +Merge all constraints, run pip-compile. If it succeeds, no conflict exists and +all packages go into a single phase. + +**Step 2 — Identify the conflicting dependency** +Parse pip-compile's error output to find the package name that could not be +satisfied. pip-tools 7.5+ emits `"ERROR: Cannot install setuptools …"`; +older versions emit `"Could not find a version that matches …"`. A fallback +scans for constraint-looking lines (operators followed immediately by a digit, +to avoid matching Python code like `result = self._result`). 
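+
+A condensed, self-contained sketch of that parsing order (the regexes are the
+ones `_parse_conflict_dep` in `generate_requirements.py` uses; the sample
+stderr strings are made up for illustration):
+
+```python
+import re
+
+def conflict_dep(stderr: str) -> str | None:
+    for pat in (
+        r"ERROR: Cannot install ([A-Za-z0-9][A-Za-z0-9._-]*)[,\s!=<>]",
+        r"Could not find a version that matches ([A-Za-z0-9][A-Za-z0-9._-]*)",
+    ):
+        m = re.search(pat, stderr, re.MULTILINE)
+        if m:
+            return m.group(1).lower()
+    # Fallback: a constraint-looking line — the operator must touch a digit.
+    for line in stderr.splitlines():
+        m = re.match(r"\s+([A-Za-z0-9][A-Za-z0-9._-]*)(?:!=|==|>=|<=|~=|>|<)\d", line)
+        if m:
+            return m.group(1).lower()
+    return None
+
+print(conflict_dep("ERROR: Cannot install setuptools, setuptools!=77.0.1"))  # setuptools
+print(conflict_dep("    result = self._result"))                             # None
+```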
+
+**Step 3 — Direct upper-bound heuristic**
+Search `pkg_constraints` for any package that **directly** declares an
+upper-bound constraint (e.g. `setuptools<=70`) on the conflicting dependency.
+Those packages need the older version and go into an earlier phase.
+
+**Step 4 — Per-package compilation (transitive conflict detection)**
+When Step 3 finds nothing, the upper bound may come **transitively** — e.g.
+ansible-runner → pbr, and pbr's own build requirements restrict setuptools.
+`pip_find_builddeps.py` only captures one level of build deps, so the
+`setuptools<=70` constraint lives on pbr, not on ansible-runner.
+
+The solution: compile each package's build constraints **individually** using
+pip-compile and record what version of the conflicting dependency each resolves
+to. Packages resolving to different versions reveal the fault line:
+
+```
+ansible-runner's build deps → setuptools 70.0.0
+urllib3's build deps        → setuptools 82.0.1
+split after 70.0.0 (largest gap)
+→ earlier group: [ansible-runner]
+```
+
+The split point is the **largest version gap** in the distribution of resolved
+versions (computed as `(hi.major - lo.major) * 1000 + (hi.minor - lo.minor)`).
+
+**Step 5 — Single-package bisection**
+When per-package compilation is also inconclusive (e.g. all packages resolve to
+the same version of the conflicting dep because the upper bound is transitive
+through a different path), the algorithm removes one package at a time and
+retries pip-compile. The first removal that makes the remainder compile
+identifies the culprit, which becomes the earlier group.
+
+**Step 6 — Give up**
+If no split can be found, the whole group is kept as one phase and a warning is
+printed. This happens when a single package's own build deps are internally
+contradictory (see [Internal Conflicts](#internal-conflicts)).
+
+**Recursion**
+Once a split is found, `split_phases` is called recursively on each sub-group.
+The final result is a flat ordered list:
+
+```
+phases[0]  — needs oldest tool versions → pre-build
+phases[1]  — intermediate               → build1
+phases[-1] — needs newest tool versions → build
+```
+
+### Mapping N Phases to 3 Files
+
+The install order is fixed at four files (three build files plus the runtime
+file). When `split_phases` returns more than 3 phases, a **greedy merge** runs:
+starting from the last (newest) phase and working backwards through the middle
+phases, each middle phase is tentatively merged into the "build" group and
+pip-compile is run to verify compatibility. If the merge compiles, the phase is
+absorbed into build. If it fails, that phase becomes build1 (or remains
+separate). This prevents blindly merging phases that have their own internal
+conflicts.
+
+### Build-Isolation Pin Discovery and the Phased Upgrade Pattern
+
+Packages in non-pre-build phases often declare **exact-version** build-system
+requirements. For example, ansible-core's `pyproject.toml` pins `wheel==0.45.1`
+so its PEP 517 isolation environment gets precisely that version from the cachi2
+cache. Without special handling, this exact pin would propagate to the build
+phase's `.in` file and pin the final image's wheel to an old, potentially
+vulnerable version.
+
+Stage 4 solves this automatically with a four-step process:
+
+1. **Collect** all `pkg==X.Y.Z` constraints from every non-pre-build package's
+   build-dep set.
+2. **Probe** each candidate by adding it to the pre-build constraint set and
+   running pip-compile. Pins that conflict with the pre-build's existing
+   constraints (e.g. 
`setuptools==82.0.0` would fail against ansible-runner's
+   `setuptools<=70`) are discarded. Compatible pins are confirmed.
+3. **Inject** confirmed pins into the pre-build `.in` file — pip-compile
+   resolves them naturally there, ensuring cachi2 pre-fetches those exact
+   versions for build isolation.
+4. **Strip** the same exact pins from non-pre-build phases — pip-compile there
+   resolves the latest compatible version, giving the final image a
+   newer (possibly CVE-fixed) version.
+
+This is entirely automatic. No configuration is needed: if ansible-core changes
+its wheel requirement in a future release, the new version is discovered and
+handled on the next `make generate-requirements` run.
+
+### Phased Upgrade Pattern: Pre-Build vs Build
+
+```
+install order:   pre-build → build1 → build → runtime
+wheel:           0.45.1 — 0.47.0   (auto-discovered isolation pin)
+setuptools:      70.0.0 — 82.0.0   (phase-split conflict)
+setuptools-scm:  8.1.0  — 9.2.2    (phase-split conflict)
+```
+
+`pip install -r requirements-build.txt` after pre-build upgrades each of these
+tools. Because both versions are in the cachi2 cache:
+
+- ansible-runner's PEP 517 build isolation finds `setuptools==70.0.0` and
+  `setuptools-scm==8.1.0`
+- ansible-core's PEP 517 build isolation finds `wheel==0.45.1`
+- The final image's global site-packages has the newer CVE-fixed versions
+
+### Internal Conflicts
+
+Some packages (e.g. `kubernetes==33.1.0`) have build dependencies that are
+mutually contradictory — the same package's own build-dep set requires both
+`setuptools-scm>=8` and `setuptools_scm<8`. This is an upstream packaging issue.
+
+When a single-package phase fails to compile, the script:
+1. Detects this as an internal conflict (only one package in the phase)
+2. Writes an explanatory comment to the `.txt` file (not a warning that blocks
+   the build)
+3. Continues — the package itself remains in `requirements.txt` and pip's own
+   build isolation can pull whatever it needs from the other build phases
+
+---
+
+## Stage 5 — Build Dependency CVE Scanning
+
+After generating all build files, Safety is used to scan each one for known
+vulnerabilities (`safety check -r <file>`).
+
+For each vulnerability found:
+
+| Package type | Action |
+|---|---|
+| In `RPM_INSTALLED` | Skip — the OS layer patches it |
+| Any other pip package | Add `pkg>=min_safe_version` to the `.in` file and re-run pip-compile; if that conflicts (another dep constrains the package below the safe version), report the blocking constraint by name |
+
+The minimum safe version is extracted from Safety's `Affected spec` string
+(e.g. an affected spec of `<0.46.3` yields a minimum safe version of `0.46.3`).
diff --git a/openshift/hack/generate_requirements.py b/openshift/hack/generate_requirements.py
new file mode 100644
--- /dev/null
+++ b/openshift/hack/generate_requirements.py
@@ -0,0 +1,1259 @@
+#!/usr/bin/env python3
+"""
+Generate ART hermetic build requirements files from a Pipfile.
+Dynamically resolves conflicts and splits build deps into phases.
+"""
+
+import argparse
+import re
+import shutil
+import subprocess
+import sys
+import tempfile
+from pathlib import Path
+
+# Packages the ART OSBS environment provides as system RPMs: resolved by
+# pip-compile, then commented out in every generated file.
+RPM_INSTALLED = frozenset({"cryptography", "cffi", "pycparser", "maturin"})
+
+# cachito helper script, downloaded by openshift/Dockerfile.requirements.
+PIP_FIND_BUILDDEPS = "pip_find_builddeps.py"
+
+
+# ---------------------------------------------------------------------------
+# Helpers
+# ---------------------------------------------------------------------------
+
+
+def _norm(name: str) -> str:
+    """Canonical package name: lowercase, collapse [-_.] to '-'."""
+    return re.sub(r"[-_.]", "-", name).lower()
+
+
+def _run(cmd: list[str], *, capture: bool = False) -> subprocess.CompletedProcess[str]:
+    """Run a command, print it, and exit on non-zero return code."""
+    print(f"  $ {' '.join(str(c) for c in cmd)}", flush=True)
+    r = subprocess.run(cmd, capture_output=capture, text=True)
+    if r.returncode != 0:
+        if capture:
+            sys.stderr.write(r.stderr)
+        sys.exit(r.returncode)
+    return r
+
+
+def _pip_compile(
+    in_file: Path,
+    out_file: Path,
+    extra: list[str] | None = None,
+) -> tuple[bool, str]:
+    """
+    Attempt pip-compile. Returns (success, stderr). Never raises or exits.
+    
+ """ + cmd = [ + "pip-compile", + f"--output-file={out_file}", + "--strip-extras", + *(extra or []), + str(in_file), + ] + print(f" $ {' '.join(str(c) for c in cmd)}", flush=True) + r = subprocess.run(cmd, capture_output=True, text=True) + return r.returncode == 0, r.stderr + + +def _read_pinned(text: str) -> dict[str, str]: + """ + Parse pip-freeze / pip-compile text into {norm_name: 'OrigName==ver'}. + Skips commented-out lines, annotation lines, and non-pinned specs. + """ + result: dict[str, str] = {} + for raw in text.splitlines(): + line = raw.strip() + if not line or line.startswith("#"): + continue + m = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==(\S+)", line) + if m: + result[_norm(m.group(1))] = f"{m.group(1)}=={m.group(2)}" + return result + + +def _comment_out(text: str, norms: set[str]) -> str: + """Return text with `pkg==...` lines whose norm-name is in *norms* commented out.""" + out: list[str] = [] + for raw in text.splitlines(): + stripped = raw.lstrip() + if stripped.startswith("#") or not stripped: + out.append(raw) + continue + m = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==", stripped) + if m and _norm(m.group(1)) in norms: + out.append(f"#{raw}") + else: + out.append(raw) + return "\n".join(out) + "\n" + + + +def _normalize_quirks(text: str) -> str: + """Fix known PyPI version-string quirks that the ART build does not accept.""" + # python-dateutil ships 2.9.0.post0 on PyPI; ART requires the bare 2.9.0 form. + text = re.sub( + r"(?i)(python-dateutil)==(2\.9\.0)\.post0", + r"\1==\2", + text, + ) + return text + + +def _strip_ansi(text: str) -> str: + """Remove ANSI escape sequences (Safety colours its output by default).""" + return re.sub(r"\x1b\[[0-9;]*[mGKHF]", "", text) + + +def _parse_vuln_report(text: str) -> list[dict[str, str]]: + """ + Parse 'pipenv check' / Safety text output into a list of vulnerability dicts. + Each dict has keys: package, version, vuln_id, affected_spec. + + Uses re.search (not re.match) so that ANSI colour codes or other prefixes + on a line do not prevent the pattern from matching. + """ + text = _strip_ansi(text) + vulns: list[dict[str, str]] = [] + current: dict[str, str] = {} + for raw in text.splitlines(): + line = raw.strip() + m = re.search(r"-> Vulnerability found in (\S+) version (\S+)", line) + if m: + if current: + vulns.append(current) + current = {"package": _norm(m.group(1)), "version": m.group(2)} + continue + if current: + m = re.search(r"Vulnerability ID:\s*(\S+)", line) + if m: + current["vuln_id"] = m.group(1) + continue + m = re.search(r"Affected spec:\s*(.+)", line) + if m: + current["affected_spec"] = m.group(1).strip() + if current: + vulns.append(current) + return vulns + + +def _min_safe_version(affected_spec: str) -> str | None: + """ + Infer the minimum safe package version from a Safety 'Affected spec' string. + + Examples: + '<0.46.3' → '0.46.3' + '<2.33.0' → '2.33.0' + '>=1.0,<1.5' → '1.5' (the upper bound IS the CVE boundary) + + Returns None when the spec contains no upper bound. + """ + # Find the tightest upper bound: the version just after None: + """ + Run 'pipenv check', categorise every vulnerability by remediation type, + and attempt to auto-fix CVEs in pip-managed (non-RPM) packages. + + Three outcomes per CVE: + RPM-installed package — skip; the RPM layer provides the fix. + pip package, fix works — Pipfile.lock is updated in-place via + 'pipenv update '. 
+ pip package, Pipfile — the version constraint in Pipfile is too narrow + constraint prevents to reach the safe version; a warning is printed + the fix with instructions for manual remediation. + """ + print("\n Running pipenv check for CVEs…", flush=True) + check = subprocess.run(["pipenv", "check"], capture_output=True, text=True) + + if check.returncode == 0: + print(" No vulnerabilities found.") + return + + vulns = _parse_vuln_report(check.stdout) + if not vulns: + # Safety returned non-zero but output couldn't be parsed (network error, + # DB format change, etc.) — warn but never block requirements generation. + print( + " WARNING: pipenv check returned non-zero but output could not be parsed:\n" + + check.stdout.rstrip(), + file=sys.stderr, + ) + return + + rpm_norms = {_norm(p) for p in rpm_installed} + rpm_vulns = [v for v in vulns if v["package"] in rpm_norms] + pip_vulns = [v for v in vulns if v["package"] not in rpm_norms] + + if rpm_vulns: + print( + f" {len(rpm_vulns)} CVE(s) in RPM-installed packages" + " (fixed at the RPM layer, not via pip):" + ) + for v in rpm_vulns: + print( + f" • {v['package']} {v['version']}" + f" [{v.get('vuln_id', '?')}]" + f" affected: {v.get('affected_spec', '?')}" + ) + + if not pip_vulns: + print(" No CVEs in pip-managed packages — nothing to auto-fix.") + return + + print(f"\n Attempting to auto-fix {len(pip_vulns)} CVE(s) in pip-managed packages:") + for v in pip_vulns: + print( + f" • {v['package']} {v['version']}" + f" [{v.get('vuln_id', '?')}]" + f" affected: {v.get('affected_spec', '?')}" + ) + + for v in pip_vulns: + pkg = v["package"] + print(f"\n Updating '{pkg}' to fix {v.get('vuln_id', 'vulnerability')}…", flush=True) + r = subprocess.run( + ["pipenv", "update", pkg], + capture_output=True, text=True, + ) + if r.returncode == 0: + print(f" ✓ 'pipenv update {pkg}' succeeded") + else: + print( + f" WARNING: 'pipenv update {pkg}' failed:\n" + f" {r.stderr.strip()[:400]}", + file=sys.stderr, + ) + + # Re-check: what (if anything) is still vulnerable after updates? + print("\n Re-running pipenv check after updates…", flush=True) + recheck = subprocess.run(["pipenv", "check"], capture_output=True, text=True) + if recheck.returncode == 0: + print(" All pip-managed CVEs resolved. ✓") + return + + remaining_pip = [ + v for v in _parse_vuln_report(recheck.stdout) + if _norm(v["package"]) not in rpm_norms + ] + if remaining_pip: + print( + f"\n WARNING: {len(remaining_pip)} CVE(s) remain in pip-managed packages.\n" + " These likely require a Pipfile version constraint change:", + file=sys.stderr, + ) + for v in remaining_pip: + print( + f" • {v['package']} {v['version']}" + f" [{v.get('vuln_id', '?')}]" + f" affected: {v.get('affected_spec', '?')}\n" + f" → Broaden the Pipfile constraint so a version" + f" outside '{v.get('affected_spec', '?')}' can be installed.", + file=sys.stderr, + ) + print( + " After updating the Pipfile, re-run 'make generate-requirements'.\n" + " (The Pipfile.lock exported to the output dir reflects the current" + " best-effort fix.)", + file=sys.stderr, + ) + + +def _parse_conflict_dep(stderr: str) -> str | None: + """ + Extract the name of the dependency that pip-compile could not satisfy. 
+
+    Handles multiple pip-tools / pip error formats:
+      • pip-tools 7.5+   "ERROR: Cannot install pkg, pkg!=X, … because …"
+      • pip-tools older  "Could not find a version that matches pkg>=X,<Y"
+      • pip              "No matching distribution found for pkg"
+      • resolvelib traceback: SpecifierRequirement('pkg…')
+      • last-resort scan for constraint-looking lines such as
+        "pkg>=45" — only when the operator is followed immediately by a
+        digit so we never match Python code like 'result = …'
+    """
+    for pat in (
+        # pip-tools 7.5+: first token after "Cannot install" is the dep
+        r"ERROR: Cannot install ([A-Za-z0-9][A-Za-z0-9._-]*)[,\s!=<>]",
+        # pip-tools <7.5
+        r"Could not find a version that matches ([A-Za-z0-9][A-Za-z0-9._-]*)",
+        r"No matching distribution found for ([A-Za-z0-9][A-Za-z0-9._-]*)",
+        # resolvelib traceback: first SpecifierRequirement holds the dep name
+        r"SpecifierRequirement\('([A-Za-z0-9][A-Za-z0-9._-]*)",
+    ):
+        m = re.search(pat, stderr, re.MULTILINE)
+        if m:
+            return _norm(m.group(1))
+    # Last-resort scan: lines that look like version-constraint specs.
+    # Require the operator to be followed immediately by a digit (e.g. >=45)
+    # to avoid matching Python assignment lines (e.g. "result = self._result").
+    for line in stderr.splitlines():
+        m = re.match(
+            r"\s+([A-Za-z0-9][A-Za-z0-9._-]*)(?:!=|==|>=|<=|~=|>|<)\d",
+            line,
+        )
+        if m:
+            return _norm(m.group(1))
+    return None
+
+
+def _pkgs_needing_older(dep_norm: str, constraints: dict[str, list[str]]) -> set[str]:
+    """
+    Return source packages whose build-dep constraints include an UPPER BOUND
+    (< or <=) on *dep_norm*. These must go into an EARLIER phase because they
+    need the older version of that dependency.
+    """
+    result: set[str] = set()
+    for pkg, specs in constraints.items():
+        for s in specs:
+            m = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)(.*)$", s.strip())
+            if not m or _norm(m.group(1)) != dep_norm:
+                continue
+            # upper-bound markers: < (strict) or <=
+            if re.search(r"(?:^|,)\s*<[^=]|(?:^|,)\s*<=", m.group(2)):
+                result.add(pkg)
+    return result
+
+
+# ---------------------------------------------------------------------------
+# Core phase-splitting algorithm
+# ---------------------------------------------------------------------------
+
+
+def _find_older_group_by_compilation(
+    dep_norm: str,
+    packages: list[str],
+    constraints: dict[str, list[str]],
+    tmp: Path,
+    depth: int,
+) -> list[str]:
+    """
+    Handle transitive version conflicts that _pkgs_needing_older cannot detect.
+
+    Compiles each package's build deps **individually** and records the version
+    of *dep_norm* that each resolves to. Packages that resolve to a
+    significantly older version of *dep_norm* than the rest are the ones that
+    must go into an earlier phase.
+
+    The split point is the largest version gap in the distribution of resolved
+    versions (e.g. {70.0.0, 82.0.1} → gap of 12 → split after 70.0.0).
+
+    Returns the 'older' group as a list, or [] if no clear split is found.
+    """
+    try:
+        from packaging.version import Version
+    except ImportError:
+        return []
+
+    print(
+        f"    [depth={depth}] Per-package compilation to detect transitive"
+        f" conflict on '{dep_norm}'…",
+        flush=True,
+    )
+
+    # Compile each package's build deps in isolation and read the resolved
+    # version of dep_norm from the output.
+    #
+    # Build a regex that matches both hyphen and underscore normalizations of
+    # dep_norm (e.g. 'setuptools-scm' and 'setuptools_scm').
+    # re.escape("setuptools-scm") → "setuptools\\-scm"; we then replace the
+    # escaped hyphen/underscore sequences with a [-_] character class.
+    
+ name_re = re.escape(dep_norm).replace(r"\-", r"[-_]").replace(r"\_", r"[-_]") + dep_pattern = re.compile(r"^#?\s*" + name_re + r"==(\S+)", re.IGNORECASE) + pkg_resolved: dict[str, Version] = {} + + for pkg in packages: + specs = constraints.get(pkg, []) + if not specs: + continue + solo_in = tmp / f"ppc_{depth}_{pkg}.in" + solo_out = tmp / f"ppc_{depth}_{pkg}.txt" + solo_in.write_text("\n".join(specs) + "\n") + ok, _ = _pip_compile(solo_in, solo_out, ["--allow-unsafe"]) + if not ok or not solo_out.exists(): + continue + for line in solo_out.read_text().splitlines(): + m = dep_pattern.match(line.strip()) + if m: + try: + pkg_resolved[pkg] = Version(m.group(1)) + except Exception: + pass + break + + if not pkg_resolved: + return [] + + unique = sorted(set(pkg_resolved.values())) + print( + f" [depth={depth}] '{dep_norm}' per-package resolved versions: " + + ", ".join(str(v) for v in unique), + flush=True, + ) + + if len(unique) < 2: + return [] # all packages agree — not the source of the conflict + + # Find the largest gap between consecutive resolved versions. + # e.g. [70.0.0, 82.0.1] → gap of 12 minor/major units → split after 70.0.0 + split_after = unique[0] + max_gap = 0 + for lo, hi in zip(unique, unique[1:]): + gap = (hi.major - lo.major) * 1000 + (hi.minor - lo.minor) + if gap > max_gap: + max_gap = gap + split_after = lo + + older = [p for p, v in pkg_resolved.items() if v <= split_after] + print( + f" [depth={depth}] '{dep_norm}' split after {split_after};" + f" older group: {sorted(older)}", + flush=True, + ) + return older if older and set(older) < set(packages) else [] + + +def split_phases( + packages: list[str], + constraints: dict[str, list[str]], + tmp: Path, + _depth: int = 0, +) -> list[list[str]]: + """ + Recursively split *packages* into ordered phases whose merged build-dep + constraints are each solvable by pip-compile without conflicts. + + Returns a list of lists ordered earliest-first (pre-build → build). + + Resolution strategy (in order): + 1. Try to compile the merged constraints for the whole group. + → success: single phase, done. + 2. Parse the conflicting dependency from pip-compile's error output. + 3. Direct upper-bound heuristic: find packages that directly declare an + upper-bound constraint on the conflicting dep (e.g. setuptools<=70). + 4. Per-package compilation (transitive conflict detection): compile each + package's build deps alone, observe what version of the conflicting dep + they resolve to, and split at the largest version gap. This catches + cases where the upper bound comes transitively (e.g. ansible-runner → + pbr → setuptools<=70). + 5. Single-package bisection: remove one package at a time until the + remainder compiles — handles the case where a single package is the + sole culprit and neither of the above found it. + 6. Give up and keep the group as one phase (with a warning). 
+ """ + if not packages: + return [] + if len(packages) == 1: + return [list(packages)] + + # ── Step 1: try the whole group ───────────────────────────────────────── + all_specs = [s for pkg in packages for s in constraints.get(pkg, [])] + merged_in = tmp / f"split_{_depth}_{len(packages)}.in" + merged_out = tmp / f"split_{_depth}_{len(packages)}.txt" + merged_in.write_text("\n".join(all_specs) + "\n") + + ok, stderr = _pip_compile(merged_in, merged_out, ["--allow-unsafe"]) + if ok: + return [list(packages)] + + # ── Step 2: identify the conflicting dependency ────────────────────────── + dep = _parse_conflict_dep(stderr) + earlier: list[str] = [] + + # ── Step 3: direct upper-bound heuristic ───────────────────────────────── + if dep: + earlier = sorted( + _pkgs_needing_older(dep, {p: constraints.get(p, []) for p in packages}) + ) + + # ── Step 4: per-package compilation (transitive conflict detection) ────── + if (not earlier or set(earlier) >= set(packages)) and dep: + print( + f" [depth={_depth}] Direct upper-bound heuristic inconclusive" + f" for '{dep}'; trying per-package compilation…", + flush=True, + ) + earlier = _find_older_group_by_compilation( + dep, packages, constraints, tmp, _depth + ) + + # ── Step 5: single-package bisection ───────────────────────────────────── + if not earlier or set(earlier) >= set(packages): + print( + f" [depth={_depth}] Per-package compilation inconclusive;" + f" bisecting {len(packages)} packages…", + flush=True, + ) + for suspect in packages: + rest = [p for p in packages if p != suspect] + test_in = tmp / f"bisect_{_depth}_{_norm(suspect)}.in" + test_out = tmp / f"bisect_{_depth}_{_norm(suspect)}.txt" + specs = [s for p in rest for s in constraints.get(p, [])] + test_in.write_text("\n".join(specs) + "\n") + ok2, _ = _pip_compile(test_in, test_out, ["--allow-unsafe"]) + if ok2: + earlier = [suspect] + break + + # ── Step 6: give up ────────────────────────────────────────────────────── + if not earlier or set(earlier) >= set(packages): + print( + f" WARNING: Cannot split conflict at depth={_depth};" + " keeping group as a single phase.", + file=sys.stderr, + ) + print(f" pip-compile stderr:\n{stderr}", file=sys.stderr) + return [list(packages)] + + later = [p for p in packages if p not in set(earlier)] + return ( + split_phases(earlier, constraints, tmp, _depth + 1) + + split_phases(later, constraints, tmp, _depth + 1) + ) + + +# --------------------------------------------------------------------------- +# Stages +# --------------------------------------------------------------------------- + + +def stage1_resolve_runtime(tmp: Path) -> str: + """ + Run pipenv install --deploy, attempt to auto-fix any CVEs found by + 'pipenv check', then run pip freeze --all. + + Returns the raw pip-freeze output (all runtime packages pinned). + The Pipfile.lock may be updated in-place if CVE fixes are applied. + """ + print("\n══ Stage 1: Resolve runtime packages via pipenv ══") + _run(["pipenv", "install", "--deploy"]) + + # Check for CVEs and attempt auto-fixes for pip-managed packages. + # RPM-installed packages are skipped (fixed at the OS layer). + # Never blocks requirements generation even if CVEs remain. + auto_fix_cves(RPM_INSTALLED) + + # pip freeze after potential CVE-driven updates so we capture the final + # pinned versions (including any packages updated by auto_fix_cves). 
r = _run(["pipenv", "run", "pip", "freeze", "--all"], capture=True)
+    raw = _normalize_quirks(r.stdout)
+    n = len(_read_pinned(raw))
+    print(f"  Resolved {n} pinned packages")
+    return raw
+
+
+def stage2_runtime_txt(
+    raw: str,
+    tmp: Path,
+    out_dir: Path,
+) -> tuple[Path, set[str]]:
+    """
+    Generate requirements.txt.
+
+    Iteratively detects which packages cause pip-compile to fail due to
+    conflicting declared dependency metadata (e.g. mutually incompatible
+    setuptools version constraints across packages), excludes them from
+    compilation, and appends them manually afterwards.
+
+    RPM-installed packages are NOT pre-excluded here — they participate in
+    pip-compile's resolution so that their dependants resolve correctly.
+    They are commented out in post-processing.
+
+    Returns (requirements.txt path, set of manually-appended norm names).
+    """
+    print("\n══ Stage 2: Generate requirements.txt ══")
+
+    all_pinned = _read_pinned(raw)
+    excluded: set[str] = set()   # packages causing pip-compile failures
+    appended: set[str] = set()   # same set — appended manually at end
+
+    # Write the compile input to out_dir (not tmp) so pip-compile's
+    # "# via -r requirements.in" annotation uses a clean relative-looking path
+    # instead of a /tmp/… tempdir path.
+    compile_in = out_dir / "requirements.in"
+    out_txt = out_dir / "requirements.txt"
+
+    for attempt in range(1, 40):
+        print(f"  Attempt {attempt} — excluded from pip-compile: {sorted(excluded) or '(none)'}")
+        compile_in.write_text(_comment_out(raw, excluded))
+        ok, stderr = _pip_compile(compile_in, out_txt)
+        if ok:
+            print(f"  pip-compile succeeded on attempt {attempt}")
+            break
+
+        dep = _parse_conflict_dep(stderr)
+        new_exc: str | None = None
+
+        if dep and dep not in excluded and dep in all_pinned:
+            new_exc = dep
+        else:
+            # Broader scan: any recognisable package name from the error output
+            for line in stderr.splitlines():
+                m = re.search(r"([A-Za-z0-9][A-Za-z0-9._-]+)\s*[><=!~]", line)
+                if not m:
+                    continue
+                cand = _norm(m.group(1))
+                if cand in all_pinned and cand not in excluded:
+                    new_exc = cand
+                    break
+
+        if new_exc is None:
+            print("  ERROR: pip-compile failed and no conflicting package"
+                  " could be identified from its output:", file=sys.stderr)
+            print(stderr, file=sys.stderr)
+            sys.exit(1)
+
+        excluded.add(new_exc)
+        appended.add(new_exc)
+        print(f"  Excluding '{new_exc}' from pip-compile; it will be appended manually")
+    else:
+        print("  ERROR: pip-compile did not succeed within 40 attempts", file=sys.stderr)
+        sys.exit(1)
+
+    # Append the excluded packages uncommented at the end of requirements.txt:
+    # they are valid runtime dependencies — only their declared metadata
+    # confuses pip-compile.
+    content = out_txt.read_text()
+    if appended:
+        content = content.rstrip("\n") + "\n" + "\n".join(
+            all_pinned[n] for n in sorted(appended)
+        ) + "\n"
+
+    # Comment out RPM-installed packages in post-processing: they stay visible
+    # in the file but pip will not install them.
+    content = _normalize_quirks(content)
+    content = _comment_out(content, {_norm(p) for p in RPM_INSTALLED})
+    out_txt.write_text(content)
+
+    return out_txt, appended
+
+
+def stage3_collect_build_deps(
+    all_pinned: dict[str, str],
+    tmp: Path,
+) -> dict[str, list[str]]:
+    """
+    Run pip_find_builddeps.py for every runtime package individually.
+    Collecting build deps for ALL packages — including RPM-excluded and
+    conflict-excluded ones — ensures the phase-split algorithm has complete
+    information about which packages need which build-dep versions.
+
+    If pip_find_builddeps.py fails for a package (e.g. because downloading its
+    sdist triggers a Rust/maturin build), that package is skipped with a warning.
+
+    Returns {norm_name: [build-constraint-string, …]}.
+    
+ """ + print("\n══ Stage 3: Collect per-package build deps via pip_find_builddeps.py ══") + + pkg_constraints: dict[str, list[str]] = {} + + for norm_name in sorted(all_pinned): + pkg_line = all_pinned[norm_name] + single_in = tmp / f"bdi_{norm_name}.txt" + single_out = tmp / f"bdo_{norm_name}.in" + single_in.write_text(pkg_line + "\n") + + print(f" {pkg_line}", end=" … ", flush=True) + r = subprocess.run( + [sys.executable, PIP_FIND_BUILDDEPS, + str(single_in), "-o", str(single_out), "--append"], + capture_output=True, + text=True, + ) + + if r.returncode != 0 or not single_out.exists(): + print("SKIPPED (pip_find_builddeps failed)") + if r.stderr.strip(): + # Print first 200 chars so failures are diagnosable without flooding output + print(f" {r.stderr.strip()[:200]}", file=sys.stderr) + continue + + specs = [ + ln.strip() + for ln in single_out.read_text().splitlines() + if ln.strip() and not ln.strip().startswith("#") + ] + pkg_constraints[norm_name] = specs + print(f"{len(specs)} build constraints") + + total = sum(len(v) for v in pkg_constraints.values()) + print(f" Collected {total} build constraints across {len(pkg_constraints)} packages") + return pkg_constraints + + +def stage4_build_phases( + pkg_constraints: dict[str, list[str]], + out_dir: Path, + tmp: Path, +) -> None: + """ + Detect conflicts in the collected build deps, recursively split into + phases, and produce exactly three build requirements files. + + Phase mapping (always 3 output slots regardless of discovered phase count): + phases[0] → requirements-pre-build.txt (oldest conflicting deps) + phases[1:-1] → requirements-build1.txt (merged middle phases) + phases[-1] → requirements-build.txt (newest deps, main phase) + + Each file has RPM-installed packages commented out. Build-isolation + exact-version pins are discovered dynamically and injected into pre-build. + """ + print("\n══ Stage 4: Detect conflicts and generate build requirements ══") + + packages = list(pkg_constraints.keys()) + print(f" Running phase-split on {len(packages)} packages…") + + phases = split_phases(packages, pkg_constraints, tmp) + + print(f" Discovered {len(phases)} phase(s): {[len(p) for p in phases]} pkgs each") + for i, ph in enumerate(phases): + print(f" Phase {i}: {sorted(ph)}") + + # ---- Map N discovered phases → exactly 3 output slots ---- + # + # Strategy for N > 3 phases: keep phases[0] as pre-build and greedily + # absorb middle phases (from latest to earliest) into the "build" group. + # Any middle phase that makes the build group fail to compile is kept + # separate in "build1". This prevents blindly merging conflicting middle + # phases (e.g. a setuptools-scm<8 phase with a setuptools-scm>=8 phase). 
+ def _try_compile_group(pkgs: list[str]) -> bool: + """Return True if merging pkgs' build-dep constraints compiles cleanly.""" + if not pkgs: + return True + specs = [s for p in pkgs for s in pkg_constraints.get(p, [])] + test_in = tmp / f"maptest_{len(pkgs)}.in" + test_out = tmp / f"maptest_{len(pkgs)}.txt" + test_in.write_text("\n".join(specs) + "\n") + ok, _ = _pip_compile(test_in, test_out, ["--allow-unsafe"]) + return ok + + if not phases: + mapped: list[tuple[str, str, list[str]]] = [ + ("pre_build", "requirements-pre-build", []), + ("build1", "requirements-build1", []), + ("build", "requirements-build", []), + ] + elif len(phases) == 1: + mapped = [ + ("pre_build", "requirements-pre-build", []), + ("build1", "requirements-build1", []), + ("build", "requirements-build", phases[0]), + ] + elif len(phases) == 2: + mapped = [ + ("pre_build", "requirements-pre-build", phases[0]), + ("build1", "requirements-build1", []), + ("build", "requirements-build", phases[1]), + ] + else: + # ≥3 phases: first → pre-build, last → initial build group. + # Try to absorb middle phases into build (from latest to earliest). + # Phases that cannot be merged go to build1. + build_pkgs = list(phases[-1]) + build1_pkgs: list[str] = [] + + # Iterate middle phases from latest (closest to build) to earliest + for phase in reversed(phases[1:-1]): + candidate = phase + build_pkgs + if _try_compile_group(candidate): + build_pkgs = candidate + else: + build1_pkgs = phase + build1_pkgs # prepend to keep order + + mapped = [ + ("pre_build", "requirements-pre-build", phases[0]), + ("build1", "requirements-build1", build1_pkgs), + ("build", "requirements-build", build_pkgs), + ] + + rpm_norms = {_norm(p) for p in RPM_INSTALLED} + + # ── Dynamically discover build-isolation exact-version pins ──────────────── + # + # Packages in non-pre-build phases may declare exact-version build-system + # requirements (e.g. ansible-core's pyproject.toml says wheel==0.45.1 so + # that its PEP 517 isolation environment gets that specific version from the + # cachi2 cache). These pins must appear in requirements-pre-build.txt so + # cachi2 pre-fetches them; later phases are then free to resolve newer + # (possibly CVE-fixed) versions. + # + # Algorithm: + # 1. Collect all pkg==X.Y.Z constraints from non-pre-build packages. + # 2. For each candidate, try adding it to the pre-build phase's constraint + # set and run pip-compile. Keep it only if pre-build still compiles + # (i.e. the pin doesn't conflict with the phase-split constraint, e.g. + # setuptools==82 would rightly fail against pre-build's setuptools<=70). + # 3. Inject the confirmed pins into the pre-build .in file. + # 4. Strip those same pins from non-pre-build phases so pip-compile there + # resolves the latest compatible version. 
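+    #
+    # Concrete instance from this repo (see generate_requirements.md):
+    # ansible-core declares wheel==0.45.1 in its build-system requirements.
+    # That pin probes cleanly against pre-build's constraint set, so it is
+    # injected into requirements-pre-build.in, and requirements-build.in then
+    # resolves wheel freely (0.47.0 when this patch was generated).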
+ + pre_build_pkgs = next((pp for pk, _, pp in mapped if pk == "pre_build"), []) + pre_build_base_specs = [ + s for pkg in pre_build_pkgs for s in pkg_constraints.get(pkg, []) + ] + + # Step 1: collect candidate exact pins from non-pre-build packages + candidate_iso_pins: dict[str, str] = {} # norm → version + for pk, _, pp in mapped: + if pk == "pre_build" or not pp: + continue + for pkg in pp: + for s in pkg_constraints.get(pkg, []): + m = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==(\S+)", s.strip()) + if m: + candidate_iso_pins[_norm(m.group(1))] = m.group(2) + + # Step 2: verify compatibility with pre-build; discard conflicting pins + auto_iso_pins: dict[str, str] = {} + if candidate_iso_pins and pre_build_pkgs: + print( + f"\n Probing {len(candidate_iso_pins)} exact-version pin(s) for" + " pre-build isolation compatibility…" + ) + for norm, ver in sorted(candidate_iso_pins.items()): + test_in = tmp / f"iso_{norm}.in" + test_out = tmp / f"iso_{norm}.txt" + test_in.write_text( + "\n".join(pre_build_base_specs + [f"{norm}=={ver}"]) + "\n" + ) + ok, _ = _pip_compile(test_in, test_out, ["--allow-unsafe"]) + if ok: + auto_iso_pins[norm] = ver + print(f" ✓ build-isolation pin: {norm}=={ver}") + # If not ok: conflicts with pre-build constraints — skip + + pre_build_pin_norms: dict[str, str] = {**auto_iso_pins} + + if auto_iso_pins: + print( + f" Auto-detected build-isolation pins for pre-build: " + + ", ".join(f"{k}=={v}" for k, v in sorted(auto_iso_pins.items())) + ) + + for phase_key, base, phase_pkgs in mapped: + in_path = out_dir / f"{base}.in" + txt_path = out_dir / f"{base}.txt" + label = base.replace("requirements-", "") + + if not phase_pkgs: + print(f"\n {label}: (empty — no packages assigned)") + in_path.write_text("# No packages assigned to this phase\n") + txt_path.write_text( + f"# {base}.txt\n" + "# No packages assigned to this phase\n" + ) + continue + + print(f"\n {label}: source packages = {sorted(phase_pkgs)}") + + raw_specs = [s for pkg in phase_pkgs for s in pkg_constraints.get(pkg, [])] + + if phase_key == "pre_build": + # Step 3: inject isolation pins into pre-build. + # Only skip if the EXACT same pin is already in the constraint set; + # a looser constraint like wheel>=0.36 does not block the injection. + already_exact_pinned = { + _norm(m.group(1)) + for s in raw_specs + if (m := re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==", s.strip())) + } + additions = [ + f"{norm}=={ver}" + for norm, ver in pre_build_pin_norms.items() + if norm not in already_exact_pinned + ] + if additions: + print(f" Injecting isolation pins into pre-build: {additions}") + specs = raw_specs + additions + else: + # Step 4: strip isolation-pinned exact constraints from later phases + # so pip-compile can resolve a newer (CVE-fixed) version there. 
+ filtered_specs: list[str] = [] + stripped: list[str] = [] + for s in raw_specs: + m = re.match(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==(\S+)", s.strip()) + if m: + pkg_n = _norm(m.group(1)) + ver = m.group(2) + if pkg_n in pre_build_pin_norms and ver == pre_build_pin_norms[pkg_n]: + stripped.append(s.strip()) + continue # pre-build covers this exact pin + filtered_specs.append(s) + if stripped: + print( + f" Stripped {len(stripped)} pre-build isolation pin(s)" + f" from {label} (pre-build covers them; {label} resolves newer):" + ) + for s in stripped: + print(f" {s}") + specs = filtered_specs + + in_path.write_text("\n".join(specs) + "\n") + + ok, stderr = _pip_compile(in_path, txt_path, ["--allow-unsafe"]) + if not ok: + print(f" WARNING: pip-compile failed for '{label}':", file=sys.stderr) + # If this is a single-package phase, that package's own build deps + # are internally conflicting (e.g. kubernetes 33.1.0 requires both + # setuptools-scm>=8 and setuptools_scm<8 in its build system). + # Skip the phase rather than leaving a broken file; the package + # itself remains in requirements.txt and pip's build isolation will + # use whatever is available from the other build phases. + if len(phase_pkgs) == 1: + conflict_dep = _parse_conflict_dep(stderr) + print( + f" → '{phase_pkgs[0]}' has an internal build-dep conflict" + f" on '{conflict_dep}'; skipping its build deps.", + file=sys.stderr, + ) + txt_path.write_text( + f"# {base}.txt\n" + f"# '{phase_pkgs[0]}' has an internal build-dep conflict" + f" on '{conflict_dep}'.\n" + "# Its build deps are excluded; the package itself is in" + " requirements.txt.\n" + ) + continue + # Multi-package failure: try to further split via split_phases and + # redistribute — put packages that compile into build, leave the + # remainder in this phase. + print( + f" Attempting recursive split of {len(phase_pkgs)} packages…", + file=sys.stderr, + ) + sub_phases = split_phases(phase_pkgs, pkg_constraints, tmp, _depth=10) + if len(sub_phases) >= 2: + # Move the 'newer' (last) sub-phase into the build group by + # rewriting build specs. The 'older' sub-phase stays here. 
+ print( + f" Recursive split produced {len(sub_phases)} sub-phases;" + " merging latest into build.", + file=sys.stderr, + ) + earlier_pkgs = sub_phases[0] + later_pkgs = sum(sub_phases[1:], []) + # Rewrite this phase's .in with just the earlier sub-phase + specs_earlier = [s for p in earlier_pkgs for s in pkg_constraints.get(p, [])] + in_path.write_text("\n".join(specs_earlier) + "\n") + ok2, stderr2 = _pip_compile(in_path, txt_path, ["--allow-unsafe"]) + # Append later_pkgs' specs to the build .in file for recompile + build_in = out_dir / "requirements-build.in" + build_txt = out_dir / "requirements-build.txt" + if build_in.exists(): + existing = build_in.read_text() + extra = [s for p in later_pkgs for s in pkg_constraints.get(p, [])] + build_in.write_text(existing + "\n".join(extra) + "\n") + _pip_compile(build_in, build_txt, ["--allow-unsafe"]) + if not ok2: + print(f" Sub-split earlier phase also fails; skipping.", file=sys.stderr) + txt_path.write_text( + f"# {base}.txt\n# WARNING: pip-compile failed after sub-split\n" + ) + continue + else: + print(f" {stderr}", file=sys.stderr) + txt_path.write_text( + f"# {base}.txt\n" + "# WARNING: pip-compile failed — check pip-compile output above\n" + ) + continue + + content = txt_path.read_text() + content = _normalize_quirks(content) + content = _comment_out(content, rpm_norms) + txt_path.write_text(content) + print(f" → {txt_path.name}: {len(content.splitlines())} lines") + + +# --------------------------------------------------------------------------- +# Stage 5 — CVE checking for build requirements files +# --------------------------------------------------------------------------- + + +def stage5_check_build_dep_cves(out_dir: Path, tmp: Path) -> None: + """ + Scan each generated build requirements file for CVEs using Safety and + attempt to auto-fix them by adding minimum-version constraints and + re-running pip-compile. + + Two outcomes per vulnerability: + RPM-installed package — skip; fixed at the OS layer. + Other pip package — add 'pkg>=min_safe_version' to the .in file and + re-run pip-compile. If that conflicts (another + dep constrains the package below the safe version), + report the blocking constraint by name. + """ + print("\n══ Stage 5: Check build dep CVEs ══") + + rpm_norms = {_norm(p) for p in RPM_INSTALLED} + + # Collect (phase_key, txt_path, in_path) for every build phase that has + # both files present. 
build_phases: list[tuple[str, Path, Path]] = []
+    for phase_key, base in [
+        ("pre_build", "requirements-pre-build"),
+        ("build1", "requirements-build1"),
+        ("build", "requirements-build"),
+    ]:
+        txt = out_dir / f"{base}.txt"
+        inp = out_dir / f"{base}.in"
+        if txt.exists() and inp.exists():
+            build_phases.append((phase_key, txt, inp))
+
+    any_vuln_found = False
+
+    for phase_key, txt_path, in_path in build_phases:
+        label = txt_path.stem.replace("requirements-", "")
+        print(f"\n  Scanning {txt_path.name}…", flush=True)
+
+        # safety lives inside the pipenv virtualenv; try several invocation
+        # forms, falling through to the next one when the command is missing.
+        check: subprocess.CompletedProcess[str] | None = None
+        for safety_cmd in (
+            ["pipenv", "run", "safety", "check", "-r", str(txt_path)],
+            ["python3", "-m", "safety", "check", "-r", str(txt_path)],
+            ["safety", "check", "-r", str(txt_path)],
+        ):
+            try:
+                check = subprocess.run(safety_cmd, capture_output=True, text=True)
+            except FileNotFoundError:
+                continue  # binary not on PATH — try the next form
+            if "No such file" in check.stderr or "not found" in check.stderr.lower():
+                continue  # wrapper ran but safety itself is missing — try next form
+            break
+
+        if check is None:
+            print(
+                f"  WARNING: no working safety invocation found;"
+                f" skipping scan of {txt_path.name}.",
+                file=sys.stderr,
+            )
+            continue
+
+        if check.returncode == 0:
+            print(f"  No CVEs in {txt_path.name}")
+            continue
+
+        combined = _strip_ansi(check.stdout + "\n" + check.stderr)
+        vulns = _parse_vuln_report(combined)
+        if not vulns:
+            # safety returned non-zero for a non-CVE reason (network, db, …)
+            print(
+                f"  WARNING: safety check returned non-zero for {txt_path.name}"
+                " but output could not be parsed; skipping.",
+                file=sys.stderr,
+            )
+            continue
+
+        rpm_vulns = [v for v in vulns if _norm(v["package"]) in rpm_norms]
+        pip_vulns = [v for v in vulns if _norm(v["package"]) not in rpm_norms]
+
+        if rpm_vulns:
+            print(
+                f"  {len(rpm_vulns)} CVE(s) in RPM-installed packages"
+                " (fixed at the OS layer):"
+            )
+            for v in rpm_vulns:
+                print(
+                    f"    • {v['package']} [{v.get('vuln_id','?')}]"
+                    f" affected: {v.get('affected_spec','?')}"
+                )
+
+        if not pip_vulns:
+            continue
+
+        any_vuln_found = True
+        print(f"\n  {len(pip_vulns)} CVE(s) in pip-managed packages in {label}:")
+        for v in pip_vulns:
+            print(
+                f"    • {v['package']} {v['version']}"
+                f" [{v.get('vuln_id','?')}]"
+                f" affected: {v.get('affected_spec','?')}"
+            )
+
+        for vuln in pip_vulns:
+            pkg = _norm(vuln["package"])
+            spec = vuln.get("affected_spec", "")
+            vid = vuln.get("vuln_id", "?")
+            min_safe = _min_safe_version(spec)
+
+            if not min_safe:
+                print(
+                    f"\n  [{vid}] Cannot infer min safe version from"
+                    f" '{spec}' for '{pkg}'; skipping.",
+                    file=sys.stderr,
+                )
+                continue
+
+            # ── Try to add the min-version constraint and re-compile
+            print(
+                f"\n  [{vid}] Attempting to upgrade '{pkg}' to >={min_safe}…",
+                flush=True,
+            )
+            in_content = in_path.read_text()
+            augmented = in_content.rstrip("\n") + f"\n{pkg}>={min_safe}\n"
+
+            test_in = tmp / f"s5_{phase_key}_{pkg}.in"
+            test_out = tmp / f"s5_{phase_key}_{pkg}.txt"
+            test_in.write_text(augmented)
+
+            ok, stderr = _pip_compile(test_in, test_out, ["--allow-unsafe"])
+
+            if ok:
+                in_path.write_text(augmented)
+                content = test_out.read_text()
+                content = _normalize_quirks(content)
+                content = _comment_out(content, {_norm(p) for p in RPM_INSTALLED})
+                txt_path.write_text(content)
+                print(f"  ✓ '{pkg}' upgraded to >={min_safe} in {txt_path.name}")
+            else:
+                conflict = _parse_conflict_dep(stderr)
+                print(
+                    f"\n  WARNING [{vid}]: Cannot upgrade '{pkg}' to >={min_safe}.\n"
+                    f"    Blocking constraint on: '{conflict}'\n"
+                    f"    A build dep is pinning '{conflict}' to a version that"
+                    f" prevents '{pkg}'>={min_safe}.\n"
+                    f"    Check which package in 
{in_path.name} constrains" + f" '{conflict}' and update it.", + file=sys.stderr, + ) + + if not any_vuln_found: + print(" No CVEs found in any build requirements file.") + + +# --------------------------------------------------------------------------- +# Entry point +# --------------------------------------------------------------------------- + + +def main() -> None: + ap = argparse.ArgumentParser( + description=( + "Generate ART hermetic build requirements files from a Pipfile.\n" + "Dynamically resolves conflicts and splits build deps into phases." + ) + ) + ap.add_argument( + "--output-dir", "-o", + default=".", + help="Directory for output files (default: current dir)", + ) + args = ap.parse_args() + + out_dir = Path(args.output_dir).resolve() + out_dir.mkdir(parents=True, exist_ok=True) + + width = 60 + print("=" * width) + print(" ART Hermetic Build Requirements Generator".center(width)) + print("=" * width) + print(f" Output dir : {out_dir}") + print(f" RPM packages : {sorted(RPM_INSTALLED)}") + + if not Path(PIP_FIND_BUILDDEPS).exists(): + print( + f"\nERROR: {PIP_FIND_BUILDDEPS!r} not found.\n" + "Download it with:\n" + " curl -LO https://raw.githubusercontent.com/containerbuildsystem/" + "cachito/master/bin/pip_find_builddeps.py\n" + " chmod +x pip_find_builddeps.py", + file=sys.stderr, + ) + sys.exit(1) + + with tempfile.TemporaryDirectory(prefix="gen_reqs_") as td: + tmp = Path(td) + + # Stage 1 — resolve all runtime packages (includes CVE auto-fix) + raw = stage1_resolve_runtime(tmp) + all_pinned = _read_pinned(raw) + + # Stage 2 — generate requirements.txt with dynamic conflict exclusion + _req_txt, _appended = stage2_runtime_txt(raw, tmp, out_dir) + + # Stage 3 — collect build deps for every runtime package + pkg_constraints = stage3_collect_build_deps(all_pinned, tmp) + + # Stage 4 — detect conflicts, split phases, compile build files + stage4_build_phases(pkg_constraints, out_dir, tmp) + + # Stage 5 — check generated build requirements for CVEs and auto-fix + stage5_check_build_dep_cves(out_dir, tmp) + + # Export the (potentially CVE-updated) Pipfile.lock so the caller can + # commit it alongside the requirements files. 
+ lock_src = Path("Pipfile.lock") + lock_dst = out_dir / "Pipfile.lock" + if lock_src.exists() and lock_src.resolve() != lock_dst.resolve(): + shutil.copy2(lock_src, lock_dst) + print(f"\n Pipfile.lock exported → {lock_dst}") + print(" Commit it if CVE fixes updated it (check with git diff).") + elif lock_src.resolve() == lock_dst.resolve(): + print(f"\n Pipfile.lock already in output dir (same path), skipping copy.") + + # Summary + print("\n" + "=" * width) + print(" Summary".center(width)) + print("=" * width) + for fname in ( + "requirements-pre-build.txt", + "requirements-build1.txt", + "requirements-build.txt", + "requirements.txt", + "Pipfile.lock", + ): + p = out_dir / fname + if p.exists(): + lines = p.read_text().splitlines() + active = sum(1 for ln in lines if ln.strip() and not ln.strip().startswith("#")) + print(f" {fname:<30} {len(lines):>4} lines ({active} active)") + else: + print(f" {fname:<30} MISSING") + + +if __name__ == "__main__": + main() diff --git a/openshift/requirements-build.txt b/openshift/requirements-build.txt index 2d18f556..733a5863 100644 --- a/openshift/requirements-build.txt +++ b/openshift/requirements-build.txt @@ -2,60 +2,78 @@ # This file is autogenerated by pip-compile with Python 3.12 # by the following command: # -# pip-compile --allow-unsafe --output-file=./requirements-build.txt --strip-extras ./requirements-build.in +# pip-compile --allow-unsafe --output-file=/requirements-build.txt --strip-extras /requirements-build.in # -calver==2025.10.20 - # via -r requirements-build.in +#cffi==2.0.0 + # via -r /requirements-build.in changelog-chug==0.0.3 - # via -r requirements-build.in + # via -r /requirements-build.in cython==3.2.4 - # via -r requirements-build.in + # via -r /requirements-build.in docutils==0.22.4 # via - # -r requirements-build.in + # -r /requirements-build.in # changelog-chug flit-core==3.12.0 - # via -r requirements-build.in + # via -r /requirements-build.in hatch-vcs==0.5.0 - # via -r requirements-build.in + # via -r /requirements-build.in hatchling==1.29.0 # via - # -r requirements-build.in + # -r /requirements-build.in # hatch-vcs -packaging==26.0 +#maturin==1.13.1 + # via -r /requirements-build.in +packaging==26.2 # via - # -r requirements-build.in + # -r /requirements-build.in # hatchling # setuptools-scm + # vcs-versioning # wheel -pathspec==1.0.4 +pathspec==1.1.1 # via - # -r requirements-build.in + # -r /requirements-build.in # hatchling pbr==7.0.3 - # via -r requirements-build.in + # via -r /requirements-build.in pluggy==1.6.0 # via - # -r requirements-build.in + # -r /requirements-build.in # hatchling -poetry-core==2.3.1 - # via -r requirements-build.in +poetry-core==2.4.0 + # via -r /requirements-build.in +#pycparser==3.0 + # via + # -r /requirements-build.in + # cffi +semantic-version==2.10.0 + # via + # -r /requirements-build.in + # setuptools-rust semver==3.0.4 # via - # -r requirements-build.in + # -r /requirements-build.in # changelog-chug +setuptools-rust==1.12.1 + # via -r /requirements-build.in setuptools-scm==9.2.2 - # via hatch-vcs -trove-classifiers==2026.1.14.14 # via - # -r requirements-build.in + # -r /requirements-build.in + # hatch-vcs +trove-classifiers==2026.4.28.13 + # via + # -r /requirements-build.in # hatchling -wheel==0.46.3 - # via -r requirements-build.in +vcs-versioning==1.1.1 + # via -r /requirements-build.in +wheel==0.47.0 + # via -r /requirements-build.in # The following packages are considered to be unsafe in a requirements file: -setuptools==81.0.0 +setuptools==82.0.0 # via - # -r 
requirements-build.in + # -r /requirements-build.in # pbr + # setuptools-rust # setuptools-scm diff --git a/openshift/requirements-build1.txt b/openshift/requirements-build1.txt index 918c89df..ad489d4d 100644 --- a/openshift/requirements-build1.txt +++ b/openshift/requirements-build1.txt @@ -1,39 +1,3 @@ -# -# This file is autogenerated by pip-compile with Python 3.12 -# by the following command: -# -# pip-compile --allow-unsafe --output-file=./requirements-build1.txt --strip-extras ./requirements-build1.in -# -hatch-vcs==0.5.0 - # via -r requirements-build1.in -hatchling==1.29.0 - # via - # -r requirements-build1.in - # hatch-vcs -packaging==26.0 - # via - # -r requirements-build1.in - # hatchling - # setuptools-scm -pathspec==1.0.4 - # via - # -r requirements-build1.in - # hatchling -pluggy==1.6.0 - # via - # -r requirements-build1.in - # hatchling -setuptools-scm==9.2.2 - # via - # -r requirements-build1.in - # hatch-vcs -trove-classifiers==2026.1.14.14 - # via - # -r requirements-build1.in - # hatchling - -# The following packages are considered to be unsafe in a requirements file: -setuptools==82.0.1 - # via - # -r requirements-build1.in - # setuptools-scm +# requirements-build1.txt +# 'kubernetes' has an internal build-dep conflict on 'setuptools-scm'. +# Its build deps are excluded; the package itself is in requirements.txt. diff --git a/openshift/requirements-pre-build.txt b/openshift/requirements-pre-build.txt index 64e41f82..3c954705 100644 --- a/openshift/requirements-pre-build.txt +++ b/openshift/requirements-pre-build.txt @@ -2,37 +2,36 @@ # This file is autogenerated by pip-compile with Python 3.12 # by the following command: # -# pip-compile --allow-unsafe --output-file=./requirements-pre-build.txt --strip-extras ./requirements-pre-build.in +# pip-compile --allow-unsafe --output-file=/requirements-pre-build.txt --strip-extras /requirements-pre-build.in # changelog-chug==0.0.3 - # via -r requirements-pre-build.in + # via -r /requirements-pre-build.in cython==3.2.4 - # via -r requirements-pre-build.in + # via -r /requirements-pre-build.in docutils==0.22.4 # via - # -r requirements-pre-build.in + # -r /requirements-pre-build.in # changelog-chug flit-core==3.12.0 - # via -r requirements-pre-build.in -packaging==26.0 + # via -r /requirements-pre-build.in +packaging==26.2 # via - # -r requirements-pre-build.in + # -r /requirements-pre-build.in # setuptools-scm - # wheel pbr==7.0.3 - # via -r requirements-pre-build.in + # via -r /requirements-pre-build.in semver==3.0.4 # via - # -r requirements-pre-build.in + # -r /requirements-pre-build.in # changelog-chug setuptools-scm==8.1.0 - # via -r requirements-pre-build.in + # via -r /requirements-pre-build.in wheel==0.45.1 - # via -r requirements-pre-build.in + # via -r /requirements-pre-build.in # The following packages are considered to be unsafe in a requirements file: setuptools==70.0.0 # via - # -r requirements-pre-build.in + # -r /requirements-pre-build.in # pbr # setuptools-scm diff --git a/openshift/requirements.txt b/openshift/requirements.txt index b4d05e68..25ce7e4f 100644 --- a/openshift/requirements.txt +++ b/openshift/requirements.txt @@ -2,117 +2,135 @@ # This file is autogenerated by pip-compile with Python 3.12 # by the following command: # -# pip-compile --output-file=./requirements.txt --strip-extras ./requirements.in +# pip-compile --output-file=/requirements.txt --strip-extras /requirements.in # -certifi==2026.2.25 +ansible-core==2.18.14 + # via -r /requirements.in +ansible-runner==2.4.2 + # via -r /requirements.in 
+certifi==2026.4.22 # via - # -r requirements.in + # -r /requirements.in # kubernetes # requests #cffi==2.0.0 # via - # -r requirements.in + # -r /requirements.in # cryptography -charset-normalizer==3.4.4 +charset-normalizer==3.4.7 # via - # -r requirements.in + # -r /requirements.in # requests #cryptography==46.0.5 # via - # -r requirements.in + # -r /requirements.in + # ansible-core # google-auth durationpy==0.10 # via - # -r requirements.in + # -r /requirements.in # kubernetes google-auth==2.48.0 # via - # -r requirements.in + # -r /requirements.in # kubernetes -idna==3.11 +idna==3.13 # via - # -r requirements.in + # -r /requirements.in # requests jinja2==3.1.6 - # via -r requirements.in + # via + # -r /requirements.in + # ansible-core kubernetes==33.1.0 - # via -r requirements.in + # via -r /requirements.in lockfile==0.12.2 # via - # -r requirements.in + # -r /requirements.in # python-daemon markupsafe==3.0.3 # via - # -r requirements.in + # -r /requirements.in # jinja2 oauthlib==3.3.1 # via - # -r requirements.in + # -r /requirements.in # kubernetes # requests-oauthlib packaging==26.0 - # via -r requirements.in + # via + # -r /requirements.in + # ansible-core + # ansible-runner pexpect==4.9.0 - # via -r requirements.in + # via + # -r /requirements.in + # ansible-runner ptyprocess==0.7.0 # via - # -r requirements.in + # -r /requirements.in # pexpect -pyasn1==0.6.2 +pyasn1==0.6.3 # via - # -r requirements.in + # -r /requirements.in # pyasn1-modules # rsa pyasn1-modules==0.4.2 # via - # -r requirements.in + # -r /requirements.in # google-auth #pycparser==3.0 # via - # -r requirements.in + # -r /requirements.in # cffi python-daemon==3.1.2 - # via -r requirements.in + # via + # -r /requirements.in + # ansible-runner python-dateutil==2.9.0 # via - # -r requirements.in + # -r /requirements.in # kubernetes pyyaml==6.0.3 # via - # -r requirements.in + # -r /requirements.in + # ansible-core + # ansible-runner # kubernetes -requests==2.32.5 +requests==2.33.1 # via - # -r requirements.in + # -r /requirements.in # kubernetes # requests-oauthlib + # requests-unixsocket requests-oauthlib==2.0.0 # via - # -r requirements.in + # -r /requirements.in # kubernetes +requests-unixsocket==0.4.1 + # via -r /requirements.in resolvelib==1.0.1 - # via -r requirements.in + # via + # -r /requirements.in + # ansible-core rsa==4.9.1 # via - # -r requirements.in + # -r /requirements.in # google-auth six==1.17.0 # via - # -r requirements.in + # -r /requirements.in # kubernetes # python-dateutil urllib3==2.6.3 # via - # -r requirements.in + # -r /requirements.in # kubernetes # requests websocket-client==1.9.0 # via - # -r requirements.in + # -r /requirements.in # kubernetes # The following packages are considered to be unsafe in a requirements file: # pip -ansible-core==2.18.14 -ansible-runner==2.4.2 -requests-unixsocket==0.4.1