A pragmatic "read later" service built in Rust to explore production-grade web service patterns:
- Authentication
- Background jobs
- Full‑text search
- Observability
- Robust operations
- REST API with generated OpenAPI docs (axum + utoipa)
- User authentication (argon2 password hashes + JWT)
- Background fetch & extract pipeline (reqwest + scraper)
- Full‑text search (tantivy)
- Structured logging & tracing (tracing) + metrics (future Prometheus endpoint)
- Database persistence (PostgreSQL via sqlx; async, compile‑time checked queries when `make prepare` is run)
- Schema documentation / ERD via SchemaSpy (`make erd` → `./erd/index.html`)
High-level flow:
- Client creates an item (URL + metadata)
- A background task fetches & normalizes HTML, stores text content
- Indexer updates tantivy with (title + site + tags + text)
- Search endpoint returns ranked results with snippets
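The flow above can be sketched as a tiny state machine. This is a std-only illustration: `NewItem`, `ItemState`, and `stage_name` are hypothetical names, not this repo's actual types.

```rust
/// An item as submitted by the client (URL + metadata).
#[derive(Debug)]
struct NewItem {
    url: String,
    tags: Vec<String>,
}

/// Lifecycle of an item as the background pipeline processes it.
#[derive(Debug)]
enum ItemState {
    /// Persisted; fetch/extract job not yet run.
    Pending,
    /// HTML fetched and normalized; extracted text stored.
    Extracted { text: String },
    /// title + site + tags + text pushed into the tantivy index.
    Indexed,
}

fn stage_name(state: &ItemState) -> &'static str {
    match state {
        ItemState::Pending => "pending",
        ItemState::Extracted { .. } => "extracted",
        ItemState::Indexed => "indexed",
    }
}

fn main() {
    let item = NewItem {
        url: "https://example.com/article".to_string(),
        tags: vec!["rust".to_string()],
    };
    // The background task walks each item through these states in order.
    let states = [
        ItemState::Pending,
        ItemState::Extracted { text: format!("extracted text for {}", item.url) },
        ItemState::Indexed,
    ];
    for s in &states {
        println!("{} -> {}", item.url, stage_name(s));
    }
}
```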
```
src/
  bin/
    api.rs          # HTTP server entrypoint
    migrate.rs      # One-shot migration runner (used in Docker / local)
  lib.rs            # (future) shared library code
migrations/          # sqlx migrations (*.up.sql / *.down.sql)
Makefile             # Developer workflow commands
Dockerfile           # Multi-stage container build (api + migrate)
docker-compose.yml   # Postgres + migrate + api + schemaspy services
scripts/db-health.sh # Wait/health checks for Postgres
erd/                 # Generated SchemaSpy output (HTML + diagrams)
docs/PROJECT.md      # Vision, roadmap, non-functional goals
```
Pre-requisites:
- Rust (see `rust-toolchain.toml`)
- Docker (for Postgres + ERD generation)
- Run `make install-tools`
Steps:

```sh
# 1. Start Postgres
make db-up

# 2. Run migrations
make db-migrate

# 3. Launch the API (defaults to 0.0.0.0:8080 via Config)
make dev

# 4. Hit the root endpoint
curl -s localhost:8080/
```

Expected response: `Hello from capsule!`
Tear down:

```sh
make db-down
```

Full reset (drops volume):

```sh
make db-reset
```

`Config::from_env()` (see `config/mod.rs`) loads environment variables. Key variable: `DATABASE_URL` (required for API & migrations), e.g. `postgres://capsule:capsule_password@localhost:5432/capsule_dev`
Additional configuration knobs (future): bind address, logging level, JWT secrets, rate limits.
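A minimal sketch of what `Config::from_env()` might look like, using only the standard library. The field names, the error type, and the `BIND_ADDR` fallback are assumptions for illustration; only `DATABASE_URL` being required comes from this README.

```rust
use std::env;

/// Hypothetical shape of the config struct in config/mod.rs.
#[derive(Debug)]
struct Config {
    database_url: String,
    bind_addr: String, // a "future knob"; defaulted here for the sketch
}

impl Config {
    /// Load configuration from environment variables.
    fn from_env() -> Result<Config, String> {
        Ok(Config {
            // Required: fail fast with a readable message if missing.
            database_url: env::var("DATABASE_URL")
                .map_err(|_| "DATABASE_URL must be set".to_string())?,
            // Optional: fall back to the default bind address.
            bind_addr: env::var("BIND_ADDR")
                .unwrap_or_else(|_| "0.0.0.0:8080".to_string()),
        })
    }
}

fn main() {
    // For demonstration only: set the required variable in-process.
    env::set_var(
        "DATABASE_URL",
        "postgres://capsule:capsule_password@localhost:5432/capsule_dev",
    );
    let cfg = Config::from_env().expect("valid config");
    println!("db = {}, bind = {}", cfg.database_url, cfg.bind_addr);
}
```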
Migrations live in `migrations/` and are executed by either:
- `make db-migrate` (sqlx-cli), OR
- the `capsule-migrate` binary (used in `docker-compose.yml` as the `migrate` service)
Generate / update sqlx offline metadata (speeds up compile-time query checking):

```sh
make prepare
```

Check database health:
```sh
make db-health # exit 0 if healthy
make db-wait   # block until healthy (used in CI / scripts)
```

Open a psql-like shell (requires pgcli installed):
```sh
make pgcli
```

Generate ERD & HTML docs (writes into `./erd`):

```sh
make erd
open erd/index.html # macOS
```

Build & run everything (Postgres + migrations + API):

```sh
docker compose up --build api
```

Services:
- `postgres` (port 5432)
- `migrate` (runs once; executes migrations then exits)
- `api` (exposes port 8080)
- `schemaspy` (on-demand ERD generation: `make erd`)

Environment is baked with `DATABASE_URL` pointing at the compose network host.
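The compose wiring might look roughly like this. This is a hypothetical sketch, not the repo's actual `docker-compose.yml`: image tags and build details are assumed; the service names and the `DATABASE_URL` credentials come from this README, with `localhost` swapped for the `postgres` compose hostname.

```yaml
services:
  postgres:
    image: postgres:16          # assumed image tag
    environment:
      POSTGRES_USER: capsule
      POSTGRES_PASSWORD: capsule_password
      POSTGRES_DB: capsule_dev
    ports: ["5432:5432"]
  migrate:
    build: .                    # runs once, then exits
    command: ["capsule-migrate"]
    environment:
      DATABASE_URL: postgres://capsule:capsule_password@postgres:5432/capsule_dev
    depends_on: [postgres]
  api:
    build: .
    ports: ["8080:8080"]
    environment:
      DATABASE_URL: postgres://capsule:capsule_password@postgres:5432/capsule_dev
    depends_on: [migrate]
```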
| Target | Purpose |
|---|---|
| `dev` | Run API locally (debug) |
| `fmt` | Format sources |
| `lint` | Clippy (deny warnings) |
| `test` | Run tests |
| `audit` | Security audit (cargo-audit) |
| `deny` | Dependency policy (cargo-deny) |
| `check` | fmt + lint + test + audit + deny |
| `db-up` | Start Postgres via Docker |
| `db-down` | Stop Postgres |
| `db-migrate` | Apply migrations |
| `db-reset` | Drop volume & reinit DB |
| `db-health` | Health probe (fast) |
| `db-wait` | Wait until healthy |
| `db-logs` | Tail Postgres logs |
| `prepare` | sqlx offline metadata |
| `erd` | Generate ERD docs |
Install tooling once:

```sh
make install-tools
```

- Unit tests for parsing, auth, extraction
- Integration tests exercising HTTP routes & DB side-effects
- Property tests (URL normalization; idempotent job enqueue)
- Fuzzing extractor inputs
Run tests:

```sh
make test
```
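As a sketch of what the URL-normalization property test might exercise: the key property is idempotence (normalizing twice equals normalizing once). The function name and the specific rules below (trim whitespace, drop the fragment, drop a trailing slash) are hypothetical, std-only illustrations, not this repo's actual normalizer.

```rust
/// Hypothetical URL normalizer; the real implementation may differ.
fn normalize_url(url: &str) -> String {
    let u = url.trim();
    // Drop any #fragment; fragments never reach the server anyway.
    let u = u.split('#').next().unwrap_or(u);
    // Drop a single trailing slash on the path.
    let u = u.strip_suffix('/').unwrap_or(u);
    u.to_string()
}

fn main() {
    let once = normalize_url("  https://example.com/post/#top");
    // Idempotence: normalizing a normalized URL changes nothing.
    assert_eq!(normalize_url(&once), once);
    assert_eq!(once, "https://example.com/post");
    println!("{}", once);
}
```

A property test (e.g. with proptest) would assert the idempotence property over generated URLs instead of hand-picked cases.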