feat(cms): database export/import for non-production environments#517
Open
Replace the gateway GraphQL sync for dev/staging with a pg_dump/restore pipeline. A nightly GitHub Actions cron exports the production CMS database (excluding admin, auth, upload, and migration tables) to Railway S3. A new `pnpm data-import` script downloads, decompresses, preprocesses, and restores the backup into the target database inside a single atomic transaction. Includes a production safeguard (refuses `NODE_ENV=production`), 31 unit tests for the pure utility functions, and compound engineering solution documentation.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
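A nightly export workflow of this shape could implement the pipeline described above. This is a hedged sketch, not the PR's actual `cms-db-export.yml`: the cron time, table-exclusion patterns, and S3 key are assumptions; only the workflow name, the `workflow_dispatch` trigger, the secret names, and the `backups/cms-backup.sql.gz` path come from the PR description.

```yaml
# Illustrative sketch of a nightly export workflow (not the PR's actual file).
name: cms-db-export
on:
  schedule:
    - cron: "0 3 * * *" # nightly; actual schedule is an assumption
  workflow_dispatch: {}  # manual trigger, as described in the PR

jobs:
  export:
    runs-on: ubuntu-latest
    steps:
      - name: Dump, compress, and upload
        env:
          CMS_DATABASE_URL: ${{ secrets.CMS_DATABASE_URL }}
          AWS_ACCESS_KEY_ID: ${{ secrets.RAILWAY_S3_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.RAILWAY_S3_SECRET_ACCESS_KEY }}
        run: |
          # Table patterns below are placeholders for the PR's real exclusions
          # (admin, auth, upload, and migration tables).
          pg_dump "$CMS_DATABASE_URL" \
            --exclude-table='admin_*' \
            --exclude-table='*_migrations' \
            | gzip > cms-backup.sql.gz
          aws s3 cp cms-backup.sql.gz \
            "s3://${{ secrets.RAILWAY_S3_BUCKET }}/backups/cms-backup.sql.gz" \
            --endpoint-url "${{ secrets.RAILWAY_S3_ENDPOINT }}"
```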
🚅 Deployed to the forge-pr-517 environment in forge
2 services not affected by this PR
Summary
- A nightly GitHub Actions cron `pg_dump`s the production CMS database (excluding admin, auth, upload, and migration tables), gzips the output, and uploads it to Railway S3
- A new `pnpm data-import` script in `apps/cms` downloads the backup, decompresses it, preprocesses the SQL, and restores it into a target PostgreSQL database via `psql`

Key design decisions
- The restore runs with `--single-transaction`: if the restore fails, the DROP rolls back
- Production safeguard: the import script refuses to run when `NODE_ENV=production`
- `RAILWAY_S3_*` env var pattern (same as `apps/manager`)
- `workflow_dispatch` trigger for manual runs
- Pure utilities extracted into `data-import-utils.ts` for testability, covered by 31 unit tests

Required GitHub Secrets
Before this workflow can run, add these secrets in the repo settings:
- `CMS_DATABASE_URL`: production PostgreSQL connection string
- `RAILWAY_S3_ENDPOINT`, `RAILWAY_S3_REGION`, `RAILWAY_S3_BUCKET`
- `RAILWAY_S3_ACCESS_KEY_ID`, `RAILWAY_S3_SECRET_ACCESS_KEY`

Test plan
- `pnpm test` passes in `apps/cms` (31/31 tests)
- `tsc --noEmit` passes (0 errors)
- `gh workflow run cms-db-export.yml` produces `backups/cms-backup.sql.gz`
- `pnpm data-import` runs against a dev/staging database

🤖 Generated with Claude Code
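The pure helpers the PR extracts into `data-import-utils.ts` are not shown here, but the production safeguard, the single-transaction restore, and the env-var validation might look roughly like the sketch below. The names `assertNotProduction`, `buildRestoreArgs`, and `loadS3Config` are illustrative, not the PR's actual exports.

```typescript
/**
 * Illustrative sketches of the kind of pure, unit-testable helpers
 * described in the PR. Names and signatures are hypothetical.
 */

/** The production safeguard: refuse to import when NODE_ENV=production. */
export function assertNotProduction(nodeEnv: string | undefined): void {
  if (nodeEnv === "production") {
    throw new Error("data-import refuses to run with NODE_ENV=production");
  }
}

/** Argument list for an atomic restore via psql. */
export function buildRestoreArgs(databaseUrl: string, sqlFile: string): string[] {
  return [
    databaseUrl,
    "--single-transaction",     // all statements commit or roll back together
    "--set", "ON_ERROR_STOP=1", // abort at the first error so the rollback fires
    "--file", sqlFile,
  ];
}

/** Collect the RAILWAY_S3_* settings, failing fast on anything missing. */
export function loadS3Config(
  env: Record<string, string | undefined>,
): Record<string, string> {
  const keys = [
    "RAILWAY_S3_ENDPOINT",
    "RAILWAY_S3_REGION",
    "RAILWAY_S3_BUCKET",
    "RAILWAY_S3_ACCESS_KEY_ID",
    "RAILWAY_S3_SECRET_ACCESS_KEY",
  ];
  const missing = keys.filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing required env vars: ${missing.join(", ")}`);
  }
  return Object.fromEntries(keys.map((k) => [k, env[k] as string]));
}
```

Keeping these as pure functions (no I/O, no `process.env` access inside) is what makes the 31 unit tests possible without a database or S3 bucket in the test environment.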