feat: --cwd flag + linear-memory hash cache #19
Merged
Run hashup as if invoked from the given directory. Changes where hashup.json is discovered, where relative entry/extras paths resolve, and where --out writes. Defaults to process.cwd(). Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
Drop the per-file flattened transitive hash list. Store only each file's own content hash plus its direct deps; reconstruct the transitive contribution at combine time by walking cache.deps. Final digest is now sha256(each reachable file's hash, sorted by path), so every unique file contributes exactly once regardless of how many import paths reach it. Measured on a real monorepo config (81-entry glob against shared UI code): peak RSS 9.3 GB → 125 MB, wall time ~3 min → 1.1 s. Hash output changes. Any stored 0.6.x hashes must be re-baselined. Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
📖 Docs preview: https://maastrich.github.io/hashup/branches/pr-19/
## Summary
Two commits, shipping together as 0.7.0.
**feat(cli): add `--cwd` flag.** Run hashup as if from the given directory. Changes where `hashup.json` is discovered, where relative entry/extras paths resolve, and where `--out` writes. Defaults to `process.cwd()`.

**perf: linear-memory hash cache.** `HashupCache.hashes` now stores each file's own content hash (one 64-char sha256 string) instead of the flattened transitive list. The transitive contribution is reconstructed at combine time by walking `cache.deps`. Memory drops from O(files × avg closure) to O(unique files). Measured on a real-world 81-entry glob against a large shared UI graph (bfront webapps/lcm): peak RSS 9.3 GB (the old run needed `--max-old-space-size=12288`) → 125 MB, wall time ~3 min → 1.1 s. That is ~74× less memory, ~160× faster.
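The combine-time reconstruction can be sketched roughly as follows. This is a minimal sketch: the `Cache` shape and the `combine` function name are illustrative assumptions, not hashup's actual internals.

```typescript
import { createHash } from "node:crypto";

// Hypothetical shapes for the 0.7.0 cache: one own-content hash per file,
// plus each file's direct dependencies. Names are illustrative only.
type Cache = {
  hashes: Map<string, string>; // path -> sha256 of that file's own content
  deps: Map<string, string[]>; // path -> direct imports
};

// Walk cache.deps from the entry with a visited set (so cycles terminate and
// each file is counted once), then digest every reachable file's own hash in
// sorted-path order. Each unique file contributes exactly once, no matter how
// many import paths reach it.
function combine(cache: Cache, entry: string): string {
  const reachable = new Set<string>();
  const stack = [entry];
  while (stack.length > 0) {
    const file = stack.pop()!;
    if (reachable.has(file)) continue;
    reachable.add(file);
    for (const dep of cache.deps.get(file) ?? []) stack.push(dep);
  }
  const digest = createHash("sha256");
  for (const path of [...reachable].sort()) {
    digest.update(cache.hashes.get(path) ?? "");
  }
  return digest.digest("hex");
}
```

Because every member of a cycle reaches the same file set, and the digest only depends on the deduplicated, path-sorted set, a walk of this shape yields the same hash from any cycle member.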
**Hash output changes.** Final digest is now `sha256(each reachable file's hash, concatenated in sorted-path order)`. Every unique file contributes exactly once regardless of how many import paths reach it. Any stored 0.6.x hashes must be re-baselined. Side benefit: cycles now hash the same regardless of which member was the entry point (test strengthened to assert this).

**Targeted break for direct `hashFile` callers.** Return type is now `Promise<string | null>` instead of `Promise<string[]>`. `hashup()` itself is unchanged; the option surface is additive.
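Migration for a direct caller looks roughly like this. Only the return-type change comes from this PR; the parameter list and the null-on-unreadable-file behavior shown below are assumptions, not hashup's actual implementation.

```typescript
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

// Stand-in with the 0.7.0 return shape. Only Promise<string | null>
// (vs 0.6.x Promise<string[]>) comes from the PR; returning null for an
// unreadable file is an assumption made for this sketch.
async function hashFile(path: string): Promise<string | null> {
  try {
    return createHash("sha256").update(await readFile(path)).digest("hex");
  } catch {
    return null;
  }
}

// A 0.6.x caller that consumed the flattened transitive list now receives a
// single nullable own-content hash instead of an array.
async function requireHash(path: string): Promise<string> {
  const own = await hashFile(path);
  if (own === null) throw new Error(`could not hash ${path}`);
  return own; // one 64-char sha256 string, not string[]
}
```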
## Test plan

- `vp test`: 117/117 passing
- `vp check` clean
- `tests/examples.test.ts` updated to the new algorithm's output
- `tests/circular.test.ts` strengthened: same hash from any cycle member
- `shared-cache.test.ts` still passes (shared-cache determinism holds under the new algorithm)

## Docs
Updated `docs/api/hashup.md`, `docs/api/utilities.md`, `docs/guide/how-it-works.md`, `docs/guide/cli.md` (new `--cwd` option, new algorithm description, caveats refreshed).

🤖 Generated with Claude Code