feat(cms): add data snapshot for dev database seeding #519
Merged
Replaces the GitHub Actions-based export approach with a Strapi-native solution. After the nightly gateway-sync completes, Strapi runs `pg_dump` on the video/language/country content tables, compresses the dump, and uploads it to Railway S3. Developers pull the snapshot via a secret-protected API endpoint and restore it locally with `pnpm data-import`, reducing setup from 4+ hours to minutes.

- Export service with a pg_dump table allowlist (17 content tables)
- Secret-auth middleware using timing-safe comparison
- Trigger + download + status API endpoints
- Cron chaining after gateway-sync completion
- Import script adapted from the feat/cms-database-export-import branch
- 2-snapshot retention policy (always one available during export)
- postgresql-client added to the Dockerfile
- 31 unit tests for import utilities

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
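The secret-auth middleware is described as using a timing-safe comparison. A minimal sketch of that check using Node's `crypto` API — the helper name and shape are illustrative, not the PR's actual code:

```typescript
import { timingSafeEqual } from "node:crypto";

// Hypothetical helper: compares the provided secret (e.g. from the
// x-snapshot-secret header) against the configured secret without
// leaking how many leading bytes matched via response timing.
export function secretsMatch(provided: string, expected: string): boolean {
  const a = Buffer.from(provided);
  const b = Buffer.from(expected);
  // timingSafeEqual throws when lengths differ, so reject early;
  // the comparison below then runs in constant time for equal lengths.
  if (a.length !== b.length) return false;
  return timingSafeEqual(a, b);
}
```

A plain `===` comparison can short-circuit on the first mismatched character; `timingSafeEqual` compares every byte regardless, which is why it is the usual choice for secret checks.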
🚅 Deployed to the forge-pr-519 environment in forge
2 services not affected by this PR
- Change DATABASE_CLIENT default from sqlite to postgres
- Move pg to dependencies, better-sqlite3 to optionalDependencies
- Update .env.example with PostgreSQL defaults and SQLite fallback docs
- Add Strapi component tables to the snapshot (cloudflare-image, variant-download, audio-preview)
- Add Strapi join table globs (*_lnk, *_cmps) for relation and component links
- Add --sequence-data to pg_dump for correct auto-increment counter restore
- Add imports/ to .gitignore

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
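Assembling the dump arguments from the allowlist plus the new globs might look like the sketch below. The `--sequence-data` flag and the `--table` allowlist/globs come from the commit above; everything else (the `--data-only` flag, the helper name, the table names shown) is an assumption for illustration:

```typescript
// Illustrative subset of the 17-table allowlist plus the Strapi join-table
// globs; the real lists live in the export service.
const CONTENT_TABLES = ["videos", "languages", "countries"];
const JOIN_TABLE_GLOBS = ["*_lnk", "*_cmps"];

export function buildPgDumpArgs(databaseUrl: string): string[] {
  return [
    "--data-only",     // assumed: schema is owned by Strapi migrations, not the dump
    "--sequence-data", // restore auto-increment counters correctly
    ...[...CONTENT_TABLES, ...JOIN_TABLE_GLOBS].map((t) => `--table=${t}`),
    databaseUrl,
  ];
}
```

Each `--table` value is a pg_dump pattern, so the `*_lnk` and `*_cmps` globs sweep in every Strapi relation and component link table without listing them individually.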
Remove better-sqlite3 dependency, SQLite connection config, and DATABASE_CLIENT/DATABASE_FILENAME env vars. The CMS now requires PostgreSQL in all environments, matching production and enabling the data-import workflow. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
More general-purpose name for the production CMS base URL env var. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Separate naming: DATA_SNAPSHOT_SECRET protects the endpoint in production, PROD_DATA_SNAPSHOT_SECRET is what developers set locally (same value) to authenticate via pnpm data-import. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
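On the developer side, the split means the import script reads `PROD_DATA_SNAPSHOT_SECRET` locally and sends it as the `x-snapshot-secret` header. A hedged sketch — the env var and header name come from this PR, but the helper itself is illustrative:

```typescript
// Illustrative only: builds the auth header the import script would attach
// to requests against the production snapshot endpoints.
export function snapshotHeaders(
  env: Record<string, string | undefined>
): Record<string, string> {
  const secret = env.PROD_DATA_SNAPSHOT_SECRET;
  if (!secret) {
    throw new Error("PROD_DATA_SNAPSHOT_SECRET is not set; see .env.example");
  }
  return { "x-snapshot-secret": secret };
}
```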
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Switch from standalone Dockerfile build to Docker Compose so a PostgreSQL 16 sidecar runs alongside the dev container. The app service waits for the DB healthcheck before starting, and DATABASE_URL is set automatically. Also marks the CMS data snapshot plan as completed. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
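A minimal sketch of the Compose setup this commit describes — service names, image tag, and credentials are illustrative, not copied from the PR:

```yaml
services:
  app:
    build: .
    environment:
      # Wired automatically so the CMS finds the sidecar DB
      DATABASE_URL: postgres://strapi:strapi@db:5432/strapi
    depends_on:
      db:
        condition: service_healthy   # wait for the healthcheck below
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: strapi
      POSTGRES_PASSWORD: strapi
      POSTGRES_DB: strapi
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U strapi -d strapi"]
      interval: 5s
      timeout: 3s
      retries: 10
```

The `condition: service_healthy` dependency is what makes the app wait for PostgreSQL to accept connections rather than merely for the container to start.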
Summary
- Export service runs `pg_dump`, compresses, and uploads to Railway S3 after each nightly gateway-sync
- API endpoints (`/api/data-snapshot/trigger`, `/download`, `/status`) for manual triggers and pre-signed download URLs
- `pnpm data-import` CLI script that downloads the latest snapshot and restores it into a local PostgreSQL database in minutes (replaces the 4+ hour gateway-sync for local dev)

What's included
- `api::data-snapshot` — pg_dump allowlisted tables, gzip, S3 upload, 2-snapshot retention
- `x-snapshot-secret` header validated with `timingSafeEqual`
- `pnpm data-import` — downloads snapshot, preprocesses SQL, atomic `psql --single-transaction` restore
- `DATABASE_URL` wired up
- `postgresql-client` added for `pg_dump` availability

Test plan
- `pnpm test` — 31 unit tests pass for import utilities (connection string parsing, SQL filtering, byte formatting)
- `pnpm data-import` from empty local DB to working CMS dataset (requires prod snapshot in S3)
- `POST /api/data-snapshot/trigger` creates snapshot on deployed instance
- `GET /api/data-snapshot/download` returns pre-signed URL with valid `x-snapshot-secret`

Post-merge
- Add `DATA_SNAPSHOT_SECRET` to Doppler (`forge-cms/dev` and `forge-cms/prd`)
- Supersedes the `feat/cms-database-export-import` branch

🤖 Generated with Claude Code
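The atomic restore that `pnpm data-import` performs could be sketched as follows. `--single-transaction` is named in the PR; the other flags and the helper name are assumptions, and the real script may differ:

```typescript
// Builds the psql argument list for an all-or-nothing restore: if any
// statement fails, the wrapping transaction rolls back and the local
// database is left untouched.
export function restoreArgs(databaseUrl: string, sqlFile: string): string[] {
  return [
    "--single-transaction",      // wrap the entire restore in one transaction
    "--set", "ON_ERROR_STOP=1",  // assumed: abort (and roll back) on first error
    "--file", sqlFile,
    databaseUrl,
  ];
}
```

Without `ON_ERROR_STOP`, psql would keep executing after an error and `--single-transaction` alone would not guarantee a rollback, which is why the two flags are usually paired.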