diff --git a/SECURITY.md b/SECURITY.md new file mode 100644 index 0000000..88cf13a --- /dev/null +++ b/SECURITY.md @@ -0,0 +1,21 @@ +# Security Policy + +## Reporting A Vulnerability + +Report suspected vulnerabilities or secret exposure to `security@trustsignal.dev`. + +- Include the affected repository, environment, and any known receipt IDs, workflow IDs, or request IDs. +- Do not post sensitive findings in public issues. +- Use private evidence storage for screenshots, logs, or provider console exports. + +## Response Expectations + +- Acknowledge receipt within 3 business days. +- Triage severity and containment path before broad disclosure. +- Coordinate remediation and external communication through the incident response plan. + +## Related Documentation + +- [Repository security guidance](docs/SECURITY.md) +- [Incident response plan](docs/INCIDENT_RESPONSE_PLAN.md) +- [Security workflows](docs/security-workflows.md) diff --git a/SECURITY_CHECKLIST.md b/SECURITY_CHECKLIST.md index 07e88e5..9633521 100644 --- a/SECURITY_CHECKLIST.md +++ b/SECURITY_CHECKLIST.md @@ -22,7 +22,7 @@ | --- | -------------------------------------------- | ------ | ------------------------------------------------------------------------------------------------------------------------------------------------------- | | 2.1 | Schema uses `postgresql` provider | ✅ | `apps/api/prisma/schema.prisma` line 6. | | 2.2 | TLS enforced on DB connections in production | 🔒 | `server.ts` startup guard rejects `DATABASE_URL` without `sslmode=require\|verify-full\|verify-ca` when `NODE_ENV=production`. | -| 2.3 | Encryption at rest on DB volume | 📋 | Must be verified on the hosting provider (Render, AWS RDS, Supabase, etc.). All major providers support this — enable it. | +| 2.3 | Encryption at rest on DB volume | 📋 | Must be verified on the hosting provider (Render, AWS RDS, Supabase, etc.). 
Capture evidence using `docs/ops/db-security-evidence.md` and store the exported proof in private compliance storage. | | 2.4 | Separate DB credentials per environment | 📋 | Production, staging, and development must use distinct credentials with least-privilege grants. | | 2.5 | DB user has minimal required permissions | 📋 | Production DB user should have `SELECT, INSERT, UPDATE` only — no `DROP`, `CREATE`, or superuser. Prisma Migrate should use a separate privileged user. | | 2.6 | Connection pooling configured | 📋 | Use PgBouncer or Prisma Accelerate for connection management in production. | @@ -94,6 +94,29 @@ These cannot be verified in code and require manual confirmation: | 7.7 | **Separate staging/prod credentials** | Ops | Create distinct DB users and API keys per environment | | 7.8 | **Pre-commit secret scanning** | Dev | Install `git-secrets` or `trufflehog` as pre-commit hook (since GitHub secret scanning requires Enterprise) | +### 7.A Rotation Evidence And Cadence + +Rotation policy: + +- rotate exposed or suspected-exposed secrets immediately +- rotate standing secrets at least every 90 days unless a stricter provider or customer obligation applies +- record the operator, timestamp, and validation outcome for every rotation event + +Store rotation evidence in: + +- Vanta +- private compliance storage +- a private audit repository + +Recommended evidence bundle for each rotated secret: + +| Secret | Cadence | Evidence Required | Evidence Location | +| --- | --- | --- | --- | +| `ATTOM_API_KEY` | Immediate if exposed, otherwise every 90 days | provider rotation log, redacted screenshot, post-rotation smoke test result | Vanta or private audit repository | +| `OPENAI_API_KEY` | Immediate if exposed, otherwise every 90 days | provider rotation log, redacted screenshot, post-rotation smoke test result | Vanta or private audit repository | +| `PRIVATE_KEY` | Immediate if exposed, otherwise on key-management schedule | key replacement record, redeploy 
confirmation, receipt verification sample | private audit repository | +| `DATABASE_URL` / DB password | Immediate if exposed, otherwise every 90 days | password rotation record, redeploy confirmation, database connectivity proof | Vanta or private audit repository | + --- -_Last updated: 2026-02-18T17:25 CST by security remediation session._ +_Last updated: 2026-03-20T00:00 CST by SOC 2 remediation session._ diff --git a/apps/api/prisma/migrations/20260320050000_add_workflow_event_table/migration.sql b/apps/api/prisma/migrations/20260320050000_add_workflow_event_table/migration.sql new file mode 100644 index 0000000..7cc86c1 --- /dev/null +++ b/apps/api/prisma/migrations/20260320050000_add_workflow_event_table/migration.sql @@ -0,0 +1,23 @@ +CREATE TABLE "WorkflowEvent" ( + "id" TEXT NOT NULL, + "workflowId" TEXT NOT NULL, + "timestamp" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, + "operator" TEXT NOT NULL, + "action" TEXT NOT NULL, + "bundleId" TEXT, + "decision" TEXT, + "receiptId" TEXT, + "eventType" TEXT NOT NULL, + "runId" TEXT, + "artifactId" TEXT, + "packageId" TEXT, + "classification" TEXT, + "reason" TEXT, + "payload" JSONB NOT NULL, + "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, + + CONSTRAINT "WorkflowEvent_pkey" PRIMARY KEY ("id") +); + +CREATE INDEX "WorkflowEvent_workflowId_timestamp_idx" +ON "WorkflowEvent"("workflowId", "timestamp"); diff --git a/apps/api/prisma/schema.prisma b/apps/api/prisma/schema.prisma index d2372f2..75e5f1e 100644 --- a/apps/api/prisma/schema.prisma +++ b/apps/api/prisma/schema.prisma @@ -41,6 +41,27 @@ model VerificationRecord { @@index([apiKeyId, createdAt]) } +model WorkflowEvent { + id String @id @default(cuid()) + workflowId String + timestamp DateTime @default(now()) + operator String + action String + bundleId String? + decision String? + receiptId String? + eventType String + runId String? + artifactId String? + packageId String? + classification String? + reason String? 
+ payload Json + createdAt DateTime @default(now()) + + @@index([workflowId, timestamp]) +} + model Receipt { id String @id @default(uuid()) receiptHash String diff --git a/apps/api/src/db.ts b/apps/api/src/db.ts index 8d25b93..40cb51a 100644 --- a/apps/api/src/db.ts +++ b/apps/api/src/db.ts @@ -93,6 +93,24 @@ export async function ensureDatabase(prisma: PrismaClient) { "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, "completedAt" TIMESTAMP(3) )`, + `CREATE TABLE IF NOT EXISTS "WorkflowEvent" ( + "id" TEXT PRIMARY KEY, + "workflowId" TEXT NOT NULL, + "timestamp" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP, + "operator" TEXT NOT NULL, + "action" TEXT NOT NULL, + "bundleId" TEXT, + "decision" TEXT, + "receiptId" TEXT, + "eventType" TEXT NOT NULL, + "runId" TEXT, + "artifactId" TEXT, + "packageId" TEXT, + "classification" TEXT, + "reason" TEXT, + "payload" JSONB NOT NULL, + "createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP + )`, `CREATE UNIQUE INDEX IF NOT EXISTS "RegistryCache_sourceId_subjectHash_key" ON "RegistryCache" ("sourceId", "subjectHash")`, `CREATE INDEX IF NOT EXISTS "RegistryCache_expiresAt_idx" @@ -103,6 +121,8 @@ export async function ensureDatabase(prisma: PrismaClient) { ON "RegistryOracleJob" ("sourceId", "createdAt")`, `CREATE INDEX IF NOT EXISTS "RegistryOracleJob_status_idx" ON "RegistryOracleJob" ("status")`, + `CREATE INDEX IF NOT EXISTS "WorkflowEvent_workflowId_timestamp_idx" + ON "WorkflowEvent" ("workflowId", "timestamp")`, `CREATE INDEX IF NOT EXISTS "RegistrySource_active_idx" ON "RegistrySource" ("active")` ]; diff --git a/apps/api/src/env.ts b/apps/api/src/env.ts new file mode 100644 index 0000000..bbef345 --- /dev/null +++ b/apps/api/src/env.ts @@ -0,0 +1,74 @@ +import { existsSync, readFileSync } from 'node:fs'; +import path from 'node:path'; + +let runtimeEnvLoaded = false; + +function parseEnvFile(contents: string): Record { + const values: Record = {}; + + for (const rawLine of contents.split(/\r?\n/u)) { 
+ const line = rawLine.trim(); + if (!line || line.startsWith('#')) { + continue; + } + + const separatorIndex = line.indexOf('='); + if (separatorIndex <= 0) { + continue; + } + + const key = line.slice(0, separatorIndex).trim(); + let value = line.slice(separatorIndex + 1).trim(); + + if ( + (value.startsWith('"') && value.endsWith('"')) || + (value.startsWith("'") && value.endsWith("'")) + ) { + value = value.slice(1, -1); + } + + values[key] = value; + } + + return values; +} + +export function loadRuntimeEnv(): void { + if (runtimeEnvLoaded) { + return; + } + + const envFiles = [ + path.resolve(process.cwd(), '.env'), + path.resolve(process.cwd(), '../../.env') + ]; + + for (const envFile of envFiles) { + if (!existsSync(envFile)) { + continue; + } + + const parsed = parseEnvFile(readFileSync(envFile, 'utf8')); + for (const [key, value] of Object.entries(parsed)) { + if (process.env[key] === undefined) { + process.env[key] = value; + } + } + } + + runtimeEnvLoaded = true; +} + +export function resolveDatabaseUrl(env: NodeJS.ProcessEnv = process.env): string | undefined { + const databaseUrl = + env.DATABASE_URL || + env.SUPABASE_DB_URL || + env.SUPABASE_POOLER_URL || + env.SUPABASE_DIRECT_URL; + + if (databaseUrl && !env.DATABASE_URL) { + env.DATABASE_URL = databaseUrl; + } + + return env.DATABASE_URL; +} diff --git a/apps/api/src/server.ts b/apps/api/src/server.ts index 114648c..0834bbc 100644 --- a/apps/api/src/server.ts +++ b/apps/api/src/server.ts @@ -60,6 +60,11 @@ import { verifyRevocationHeaders } from './security.js'; import { isWorkflowError } from './workflow/errors.js'; +import { + NoopWorkflowEventSink, + PrismaWorkflowEventSink, + type WorkflowEventSink +} from './workflow/events.js'; import { WorkflowService } from './workflow/service.js'; import { readinessWorkflowRequestSchema, @@ -77,6 +82,7 @@ const REQUEST_START = Symbol('requestStartMs'); type RequestTimerState = { [REQUEST_START]?: number; }; +type PrismaWorkflowEventDelegate = 
ConstructorParameters[0]; const NOTARY_STATUSES = ['ACTIVE', 'SUSPENDED', 'REVOKED', 'UNKNOWN'] as const; const registrySourceIdEnum = z.enum(REGISTRY_SOURCE_IDS); @@ -803,6 +809,7 @@ class BlockchainVerifier { type BuildServerOptions = { fetchImpl?: typeof fetch; logger?: boolean | Record; + workflowEventSink?: WorkflowEventSink; }; type VerifyRouteInput = BundleInput & { @@ -817,7 +824,6 @@ export async function buildServer(options: BuildServerOptions = {}) { requireProductionVerifierConfig(); const app = Fastify({ logger: options.logger ?? true }); const securityConfig = buildSecurityConfig(); - const workflowService = new WorkflowService(); const propertyApiKey = resolvePropertyApiKey(); const registryAdapterService = createRegistryAdapterService(prisma, { fetchImpl: options.fetchImpl @@ -921,6 +927,18 @@ export async function buildServer(options: BuildServerOptions = {}) { ); } + const workflowEventSink = + options.workflowEventSink ?? + (databaseReady + ? new PrismaWorkflowEventSink( + (prisma as PrismaClient & { workflowEvent: PrismaWorkflowEventDelegate }).workflowEvent, + app.log + ) + : new NoopWorkflowEventSink()); + const workflowService = new WorkflowService(undefined, { + eventSink: workflowEventSink + }); + const dbOptionalRoutes = new Set([ '/api/v1/health', '/api/v1/status', @@ -930,6 +948,7 @@ export async function buildServer(options: BuildServerOptions = {}) { '/api/v1/workflows/readiness-audit', '/api/v1/workflows', '/api/v1/workflows/:workflowId', + '/api/v1/workflows/:workflowId/events', '/api/v1/workflows/:workflowId/evidence-package', '/api/v1/workflows/:workflowId/artifacts', '/api/v1/workflows/:workflowId/artifacts/:artifactId/verify', @@ -1048,6 +1067,27 @@ export async function buildServer(options: BuildServerOptions = {}) { return reply.send(state); }); + app.get('/api/v1/workflows/:workflowId/events', { + preHandler: [requireApiKeyScope(securityConfig, 'read')], + config: { rateLimit: perApiKeyRateLimit } + }, async (request, reply) 
=> { + const parsed = workflowParamsSchema.safeParse(request.params); + if (!parsed.success) { + return reply.code(400).send({ error: 'invalid_workflow_id' }); + } + + const state = workflowService.getWorkflowState(parsed.data.workflowId); + if (!state) { + return reply.code(404).send({ error: 'workflow_not_found' }); + } + + const events = await workflowEventSink.listByWorkflow(parsed.data.workflowId); + return reply.send({ + workflowId: parsed.data.workflowId, + events + }); + }); + app.get('/api/v1/workflows/:workflowId/evidence-package', { preHandler: [requireApiKeyScope(securityConfig, 'read')], config: { rateLimit: perApiKeyRateLimit } diff --git a/apps/api/src/workflow.events.test.ts b/apps/api/src/workflow.events.test.ts new file mode 100644 index 0000000..e43a907 --- /dev/null +++ b/apps/api/src/workflow.events.test.ts @@ -0,0 +1,58 @@ +import { describe, expect, it, vi } from 'vitest'; + +import { PrismaWorkflowEventSink, type StoredWorkflowEvent } from './workflow/events.js'; + +describe('PrismaWorkflowEventSink', () => { + it('persists normalized workflow audit events and returns them in timestamp order', async () => { + const rows: Array> = []; + const create = vi.fn(async ({ data }: { data: Record }) => { + const row = { + id: `event-${rows.length + 1}`, + ...data + }; + rows.push(row); + return row; + }); + const findMany = vi.fn(async () => rows); + + const sink = new PrismaWorkflowEventSink({ + create, + findMany + }); + + sink.record({ + type: 'workflow.created', + workflowId: 'workflow-1', + actor: 'operator@trustsignal.test', + timestamp: '2026-03-20T05:00:00.000Z' + }); + sink.record({ + type: 'workflow.release.evaluated', + workflowId: 'workflow-1', + artifactId: 'artifact-1', + actor: 'operator@trustsignal.test', + target: 'customer_shareable', + timestamp: '2026-03-20T05:00:01.000Z', + allowed: false + }); + + const events = await sink.listByWorkflow('workflow-1'); + + expect(create).toHaveBeenCalledTimes(2); + 
expect(findMany).toHaveBeenCalledWith({ + where: { workflowId: 'workflow-1' }, + orderBy: { timestamp: 'asc' } + }); + + const [createdEvent, decisionEvent] = events as StoredWorkflowEvent[]; + expect(createdEvent.action).toBe('workflow.created'); + expect(createdEvent.operator).toBe('operator@trustsignal.test'); + expect(createdEvent.bundleId).toBeNull(); + expect(decisionEvent.bundleId).toBe('artifact-1'); + expect(decisionEvent.decision).toBe('block'); + expect(decisionEvent.payload).toMatchObject({ + type: 'workflow.release.evaluated', + artifactId: 'artifact-1' + }); + }); +}); diff --git a/apps/api/src/workflow.test.ts b/apps/api/src/workflow.test.ts index f99b6a6..0d3256c 100644 --- a/apps/api/src/workflow.test.ts +++ b/apps/api/src/workflow.test.ts @@ -2,6 +2,7 @@ import { afterAll, beforeAll, describe, expect, it } from 'vitest'; import { FastifyInstance } from 'fastify'; import { buildServer } from './server.js'; +import { InMemoryWorkflowEventSink } from './workflow/events.js'; describe('Trust Agents workflow orchestration', () => { let app: FastifyInstance; @@ -256,4 +257,61 @@ describe('Trust Agents workflow orchestration', () => { await isolatedApp.close(); } }); + + it('returns queryable workflow events after a verification flow', async () => { + await app.close(); + + const workflowEventSink = new InMemoryWorkflowEventSink(); + app = await buildServer({ + logger: false, + workflowEventSink + }); + + const workflowRes = await app.inject({ + method: 'POST', + url: '/api/v1/workflows', + headers: { 'x-api-key': apiKey }, + payload: { createdBy: 'operator@trustsignal.test' } + }); + const workflow = workflowRes.json(); + + const artifactRes = await app.inject({ + method: 'POST', + url: `/api/v1/workflows/${workflow.id}/artifacts`, + headers: { 'x-api-key': apiKey }, + payload: { + createdBy: 'operator@trustsignal.test', + classification: 'internal', + parentIds: [], + content: { + schemaVersion: 'trustsignal.workflow.input.v1', + source: 
'audit-log-test' + } + } + }); + const artifact = artifactRes.json(); + + await app.inject({ + method: 'POST', + url: `/api/v1/workflows/${workflow.id}/artifacts/${artifact.id}/verify`, + headers: { 'x-api-key': apiKey } + }); + + const eventsRes = await app.inject({ + method: 'GET', + url: `/api/v1/workflows/${workflow.id}/events`, + headers: { 'x-api-key': apiKey } + }); + + expect(eventsRes.statusCode).toBe(200); + const body = eventsRes.json(); + expect(body.events.length).toBeGreaterThanOrEqual(3); + expect(body.events.map((event: { eventType: string }) => event.eventType)).toEqual([ + 'workflow.created', + 'workflow.artifact.created', + 'workflow.artifact.verified' + ]); + expect(body.events[0].operator).toBe('operator@trustsignal.test'); + expect(body.events[2].decision).toBe('verified'); + }); }); diff --git a/apps/api/src/workflow/events.ts b/apps/api/src/workflow/events.ts index 25507cc..2d080e5 100644 --- a/apps/api/src/workflow/events.ts +++ b/apps/api/src/workflow/events.ts @@ -1,19 +1,185 @@ export type WorkflowEvent = | { type: 'workflow.created'; workflowId: string; actor: string; timestamp: string } - | { type: 'workflow.run.started'; workflowId: string; runId: string; timestamp: string } - | { type: 'workflow.run.completed'; workflowId: string; runId: string; timestamp: string } - | { type: 'workflow.run.failed'; workflowId: string; runId: string; timestamp: string; reason: string } - | { type: 'workflow.artifact.created'; workflowId: string; artifactId: string; classification: string; timestamp: string } - | { type: 'workflow.artifact.verified'; workflowId: string; artifactId: string; timestamp: string; verified: boolean } - | { type: 'workflow.release.evaluated'; workflowId: string; artifactId: string; target: string; timestamp: string; allowed: boolean } - | { type: 'workflow.evidence_package.created'; workflowId: string; packageId: string; classification: string; timestamp: string }; + | { type: 'workflow.run.started'; workflowId: string; 
runId: string; actor: string; timestamp: string } + | { type: 'workflow.run.completed'; workflowId: string; runId: string; actor: string; timestamp: string } + | { type: 'workflow.run.failed'; workflowId: string; runId: string; actor: string; timestamp: string; reason: string } + | { type: 'workflow.artifact.created'; workflowId: string; artifactId: string; actor: string; classification: string; timestamp: string } + | { type: 'workflow.artifact.verified'; workflowId: string; artifactId: string; actor: string; timestamp: string; verified: boolean } + | { type: 'workflow.release.evaluated'; workflowId: string; artifactId: string; actor: string; target: string; timestamp: string; allowed: boolean } + | { type: 'workflow.evidence_package.created'; workflowId: string; packageId: string; actor: string; classification: string; timestamp: string }; + +export type StoredWorkflowEvent = { + id: string; + workflowId: string; + timestamp: string; + operator: string; + action: string; + bundleId: string | null; + decision: string | null; + receiptId: string | null; + eventType: WorkflowEvent['type']; + runId: string | null; + artifactId: string | null; + packageId: string | null; + classification: string | null; + reason: string | null; + payload: WorkflowEvent; +}; + +type WorkflowEventCreateInput = { + workflowId: string; + timestamp: Date; + operator: string; + action: string; + bundleId: string | null; + decision: string | null; + receiptId: string | null; + eventType: WorkflowEvent['type']; + runId: string | null; + artifactId: string | null; + packageId: string | null; + classification: string | null; + reason: string | null; + payload: WorkflowEvent; +}; + +type WorkflowEventRow = WorkflowEventCreateInput & { + id: string; +}; + +type WorkflowEventPrismaDelegate = { + create(args: { data: WorkflowEventCreateInput }): Promise; + findMany(args: { + where: { workflowId: string }; + orderBy: { timestamp: 'asc' } | { timestamp: 'desc' }; + }): Promise; +}; + +function 
toStoredWorkflowEvent(row: WorkflowEventRow): StoredWorkflowEvent { + return { + id: row.id, + workflowId: row.workflowId, + timestamp: row.timestamp.toISOString(), + operator: row.operator, + action: row.action, + bundleId: row.bundleId, + decision: row.decision, + receiptId: row.receiptId, + eventType: row.eventType, + runId: row.runId, + artifactId: row.artifactId, + packageId: row.packageId, + classification: row.classification, + reason: row.reason, + payload: row.payload + }; +} + +function toCreateInput(event: WorkflowEvent): WorkflowEventCreateInput { + const operator = event.actor || 'system'; + const artifactId = 'artifactId' in event ? event.artifactId : null; + const packageId = 'packageId' in event ? event.packageId : null; + const classification = 'classification' in event ? event.classification : null; + const runId = 'runId' in event ? event.runId : null; + const reason = 'reason' in event ? event.reason : null; + const decision = + 'allowed' in event + ? event.allowed + ? 'allow' + : 'block' + : 'verified' in event + ? event.verified + ? 'verified' + : 'not_verified' + : event.type === 'workflow.run.completed' + ? 'completed' + : event.type === 'workflow.run.failed' + ? 'failed' + : null; + + return { + workflowId: event.workflowId, + timestamp: new Date(event.timestamp), + operator, + action: event.type, + bundleId: artifactId ?? packageId, + decision, + receiptId: null, + eventType: event.type, + runId, + artifactId, + packageId, + classification, + reason, + payload: event + }; +} export interface WorkflowEventSink { record(event: WorkflowEvent): void; + listByWorkflow(workflowId: string): Promise | StoredWorkflowEvent[]; } export class NoopWorkflowEventSink implements WorkflowEventSink { record(_event: WorkflowEvent): void { - // Intentionally empty. This is the local-only default seam for future audit/event logging. + // Intentionally empty when persistence is unavailable. 
+ } + + listByWorkflow(_workflowId: string): StoredWorkflowEvent[] { + return []; + } +} + +export class InMemoryWorkflowEventSink implements WorkflowEventSink { + private readonly events: StoredWorkflowEvent[] = []; + + record(event: WorkflowEvent): void { + const input = toCreateInput(event); + this.events.push( + toStoredWorkflowEvent({ + id: `${input.workflowId}:${this.events.length + 1}`, + ...input + }) + ); + } + + listByWorkflow(workflowId: string): StoredWorkflowEvent[] { + return this.events.filter((event) => event.workflowId === workflowId); + } +} + +export class PrismaWorkflowEventSink implements WorkflowEventSink { + private pendingWrite: Promise = Promise.resolve(); + + constructor( + private readonly workflowEventDelegate: WorkflowEventPrismaDelegate, + private readonly logger: { error: (payload: unknown, message?: string) => void } = console + ) {} + + record(event: WorkflowEvent): void { + const data = toCreateInput(event); + this.pendingWrite = this.pendingWrite + .then(async () => { + await this.workflowEventDelegate.create({ data }); + }) + .catch((error) => { + this.logger.error( + { + error_name: error instanceof Error ? 
error.name : 'UnknownError', + workflow_id: event.workflowId, + event_type: event.type + }, + 'failed to persist workflow event' + ); + }); + } + + async listByWorkflow(workflowId: string): Promise { + await this.pendingWrite; + const rows = await this.workflowEventDelegate.findMany({ + where: { workflowId }, + orderBy: { timestamp: 'asc' } + }); + return rows.map(toStoredWorkflowEvent); } } diff --git a/apps/api/src/workflow/service.ts b/apps/api/src/workflow/service.ts index a1736c8..220565d 100644 --- a/apps/api/src/workflow/service.ts +++ b/apps/api/src/workflow/service.ts @@ -305,6 +305,7 @@ export class WorkflowService { type: 'workflow.artifact.created', workflowId: input.workflowId, artifactId: artifact.id, + actor: input.createdBy, classification: artifact.classification, timestamp: artifact.createdAt }); @@ -320,7 +321,7 @@ export class WorkflowService { }; } - verifyArtifact(workflowId: string, artifactId: string): VerificationRecord { + verifyArtifact(workflowId: string, artifactId: string, actor = 'system'): VerificationRecord { const artifact = this.getArtifactForWorkflow(workflowId, artifactId); const recomputedHash = keccak256Utf8(canonicalizeJson(artifact.content)); const verification: VerificationRecord = { @@ -335,6 +336,7 @@ export class WorkflowService { type: 'workflow.artifact.verified', workflowId, artifactId, + actor, timestamp: verification.timestamp, verified: verification.verified }); @@ -345,7 +347,8 @@ export class WorkflowService { evaluateReleaseDecision( workflowId: string, artifactId: string, - target: ReleaseTarget + target: ReleaseTarget, + actor = 'system' ): ReleaseDecision { const artifact = this.getArtifactForWorkflow(workflowId, artifactId); const decision = evaluateReleaseDecisionForArtifact({ @@ -359,6 +362,7 @@ export class WorkflowService { type: 'workflow.release.evaluated', workflowId, artifactId, + actor, target, timestamp: decision.timestamp, allowed: decision.allowed @@ -461,8 +465,8 @@ export class WorkflowService 
{ const summaryArtifact = this.toArtifact(this.getArtifactForWorkflow(workflow.id, summaryArtifactId)); const verificationRecords = [ - this.verifyArtifact(workflow.id, findingArtifact.id), - this.verifyArtifact(workflow.id, summaryArtifact.id) + this.verifyArtifact(workflow.id, findingArtifact.id, request.createdBy), + this.verifyArtifact(workflow.id, summaryArtifact.id, request.createdBy) ]; const releaseTargets = request.releaseTargets ?? { @@ -470,8 +474,8 @@ export class WorkflowService { summary: 'customer_shareable' as const }; const releaseDecisions = [ - this.evaluateReleaseDecision(workflow.id, findingArtifact.id, releaseTargets.findings), - this.evaluateReleaseDecision(workflow.id, summaryArtifact.id, releaseTargets.summary) + this.evaluateReleaseDecision(workflow.id, findingArtifact.id, releaseTargets.findings, request.createdBy), + this.evaluateReleaseDecision(workflow.id, summaryArtifact.id, releaseTargets.summary, request.createdBy) ]; const evidenceReferences: EvidenceReference[] = [ @@ -511,6 +515,7 @@ export class WorkflowService { type: 'workflow.evidence_package.created', workflowId: workflow.id, packageId: evidencePackage.id, + actor: request.createdBy, classification: evidencePackage.classification, timestamp: evidencePackage.createdAt }); @@ -559,6 +564,7 @@ export class WorkflowService { type: 'workflow.run.started', workflowId, runId: run.id, + actor: request.createdBy, timestamp: nowIso() }); @@ -582,7 +588,7 @@ export class WorkflowService { classification: stepRequest.classification, parameters: stepRequest.parameters ?? 
{}, createArtifact: (artifactInput) => this.createArtifact(artifactInput), - verifyArtifact: (artifactId) => this.verifyArtifact(workflowId, artifactId) + verifyArtifact: (artifactId) => this.verifyArtifact(workflowId, artifactId, request.createdBy) }); step.outputArtifactIds = [...result.outputArtifactIds]; @@ -594,6 +600,7 @@ export class WorkflowService { type: 'workflow.run.completed', workflowId, runId: run.id, + actor: request.createdBy, timestamp: nowIso() }); return { @@ -610,6 +617,7 @@ export class WorkflowService { type: 'workflow.run.failed', workflowId, runId: run.id, + actor: request.createdBy, timestamp: nowIso(), reason: error instanceof Error ? error.message : 'workflow_run_failed' }); diff --git a/docs/INCIDENT_RESPONSE_PLAN.md b/docs/INCIDENT_RESPONSE_PLAN.md new file mode 100644 index 0000000..bb9968e --- /dev/null +++ b/docs/INCIDENT_RESPONSE_PLAN.md @@ -0,0 +1,78 @@ +# TrustSignal Incident Response Plan + +This plan is the audit-facing incident response runbook for TrustSignal. It is specific to TrustSignal's verification receipts, workflow orchestration, API surface, and operational dependencies. Detailed incident records and responder notes must remain in private systems. 
+ +## Severity Levels + +| Severity | Description | Response Expectation | +| --- | --- | --- | +| `P1` | Confirmed compromise of signing keys, production database, or multi-tenant trust boundary | Immediate coordination, containment first | +| `P2` | Confirmed misuse of API credentials, repository compromise, or workflow evidence tampering with customer impact | Begin response within 4 hours | +| `P3` | Suspected replay, integrity mismatch, or monitoring alert with limited blast radius | Same business day | +| `P4` | Low-risk control drift, documentation gaps, or non-exploitable hygiene issue | Next planned remediation cycle | + +## Detection Sources + +TrustSignal incidents can be detected through: + +- GitHub Actions failures or secret-leak alerts +- security workflow findings from Trivy, dependency review, or zizmor +- API monitoring and verification lifecycle metrics +- workflow audit event review from `WorkflowEvent` persistence +- partner or user reports +- provider notifications from Vercel, Supabase, GitHub, or other infrastructure vendors + +## Roles And Responsibilities + +- Incident commander: coordinates triage, containment, owner assignment, and final timeline +- Engineering responder: scopes impact, deploys fixes, and preserves evidence +- Communications lead: prepares customer, partner, or regulator communication when needed +- Compliance owner: stores evidence, links remediation records, and tracks follow-up actions + +## Evidence Gathering + +For every incident: + +1. Preserve relevant logs before destructive changes. +2. Capture the affected receipt IDs, workflow IDs, artifact IDs, and request IDs where applicable. +3. Export related `WorkflowEvent` records, verification logs, and CI run URLs. +4. Save deployment, provider, and branch-protection evidence in private compliance storage. +5. Record who performed containment and when. 
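Step 3 above calls for exporting related `WorkflowEvent` records. A minimal TypeScript sketch of scoping persisted audit events to an incident window before export — the event shape mirrors the `StoredWorkflowEvent` fields used by the API, but the helper and sample data are illustrative, not part of the codebase:

```typescript
// Illustrative helper (not part of the codebase): narrow persisted workflow
// audit events to an incident window before exporting them as evidence.
// Field names mirror the StoredWorkflowEvent shape (ISO-8601 timestamps).
type AuditEvent = {
  workflowId: string;
  eventType: string;
  operator: string;
  timestamp: string; // ISO-8601, as persisted by the event sink
};

function eventsInWindow(
  events: AuditEvent[],
  windowStart: string,
  windowEnd: string
): AuditEvent[] {
  const start = Date.parse(windowStart);
  const end = Date.parse(windowEnd);
  return events.filter((event) => {
    const ts = Date.parse(event.timestamp);
    return ts >= start && ts <= end;
  });
}

// Sample data for illustration only.
const exported = eventsInWindow(
  [
    {
      workflowId: 'workflow-1',
      eventType: 'workflow.created',
      operator: 'op@trustsignal.test',
      timestamp: '2026-03-20T05:00:00.000Z'
    },
    {
      workflowId: 'workflow-1',
      eventType: 'workflow.run.failed',
      operator: 'op@trustsignal.test',
      timestamp: '2026-03-21T09:30:00.000Z'
    }
  ],
  '2026-03-21T00:00:00.000Z',
  '2026-03-22T00:00:00.000Z'
);
console.log(exported); // only the workflow.run.failed event falls in the window
```

Pairing an export like this with the raw `payload` column from the `WorkflowEvent` table preserves both the filtered view and the forensic source.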
+ +## Communication Plan + +- Internal: use a private incident channel and tracked incident record +- External: notify affected partners or customers after impact is confirmed and containment steps are underway +- Regulatory or contractual notifications: route through leadership and compliance review before sending + +## Containment And Recovery + +Containment priorities for TrustSignal: + +- revoke or rotate exposed API keys, signing keys, webhook secrets, or database credentials +- stop issuance of new trust artifacts if receipt integrity is uncertain +- disable affected workflows or routes if the trust boundary is compromised +- redeploy only after verification checks and smoke tests pass + +Recovery must include: + +- validation of signed receipt verification paths +- review of workflow audit events for the incident window +- confirmation that branch protection and CI controls remain intact + +## Post-Incident Review + +Every `P1` to `P3` incident requires: + +1. a written summary +2. a root-cause statement +3. corrective actions with owners +4. control updates or follow-up issues +5. evidence links stored outside the public repository + +## Related Documents + +- [docs/security/INCIDENT_RESPONSE.md](security/INCIDENT_RESPONSE.md) +- [docs/compliance/policies/incident-response-policy.md](compliance/policies/incident-response-policy.md) +- [docs/security-workflows.md](security-workflows.md) +- [docs/SECURITY.md](SECURITY.md) diff --git a/docs/SECURITY.md b/docs/SECURITY.md index 165408e..07bbe4b 100644 --- a/docs/SECURITY.md +++ b/docs/SECURITY.md @@ -18,3 +18,8 @@ The pre-commit hook rejects: ## Reporting If you discover a secret leak, notify the repo owner and rotate the credential. 
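The kind of pattern check a pre-commit secret scanner performs can be sketched as follows — the regexes below are illustrative examples only, not the hook's actual rule set:

```typescript
// Illustrative sketch only: line-level pattern matching of the sort a
// pre-commit secret scanner runs. These patterns are examples, not the
// hook's real configuration.
const SECRET_PATTERNS: RegExp[] = [
  /sk-[A-Za-z0-9]{20,}/, // OpenAI-style API key
  /postgres(ql)?:\/\/\S+:\S+@/, // database URL with inline password
  /-----BEGIN (RSA |EC )?PRIVATE KEY-----/
];

function findSecretLines(contents: string): number[] {
  const flagged: number[] = [];
  contents.split(/\r?\n/).forEach((line, index) => {
    if (SECRET_PATTERNS.some((pattern) => pattern.test(line))) {
      flagged.push(index + 1); // 1-based line numbers for reporting
    }
  });
  return flagged;
}

const flagged = findSecretLines(
  'OPENAI_API_KEY=sk-abcdefghijklmnopqrstuv\nSAFE_VALUE=hello\n'
);
console.log(flagged); // flags line 1, the API-key line
```

A hook built on a check like this exits non-zero when any line is flagged, blocking the commit until the credential is removed and rotated.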
+
+## Related Response Documentation
+
+- [Incident Response Plan](INCIDENT_RESPONSE_PLAN.md)
+- [Security workflows](security-workflows.md)
diff --git a/docs/compliance/evidence/logging-monitoring.md b/docs/compliance/evidence/logging-monitoring.md
index 7e22b82..6e69fb6 100644
--- a/docs/compliance/evidence/logging-monitoring.md
+++ b/docs/compliance/evidence/logging-monitoring.md
@@ -1,10 +1,43 @@
-# TrustSignal Logging and Monitoring Evidence Placeholder
+# TrustSignal Logging and Monitoring Evidence
+
+## Control Objective
 
-Control Objective
 Document that TrustSignal monitors security-relevant activity and retains reviewable monitoring evidence through approved operational systems.
-Evidence Expected by Auditor
-Monitoring procedures, alert review records, dashboard evidence, and logging review records.
+
+## Repository-Backed Evidence
+
+The repository now supports workflow audit trail persistence for Trust Agents orchestration:
+
+- workflow audit events are emitted through `apps/api/src/workflow/events.ts`
+- runtime persistence is backed by the `WorkflowEvent` table in Prisma
+- events are queryable by workflow ID through `GET /api/v1/workflows/:workflowId/events`
+
+Expected event fields for audit review:
+
+- `timestamp`
+- `operator`
+- `action`
+- `workflowId`
+- `bundleId` when an artifact or package identifier exists
+- `decision` for release or verification outcomes
+- `receiptId` when workflow automation is later linked to receipt issuance
+- raw event payload for reconstruction and forensic review
+
+## Auditor Evidence To Capture
+
+For audit evidence collection, capture:
+
+- one successful workflow run showing events persisted in the `WorkflowEvent` table
+- one API response from `GET /api/v1/workflows/:workflowId/events`
+- one screenshot or export from the monitoring system showing alert review or dashboard evidence
+- operator review notes for at least one verification or release-decision workflow
+
+## Where Evidence Is Stored
+
+Store production logs, screenshots, dashboard exports, and workflow-event query evidence in:
+
+- Vanta
+- a private audit repository
+- approved internal compliance storage
-Where Evidence Is Stored
-Vanta, internal compliance storage, or private audit repository. Do not store production logs, dashboard exports, alert payloads, or private system architecture in this public repository.
+
+Do not store production logs, alert payloads, or internal monitoring screenshots in this public repository.
diff --git a/docs/github-settings-checklist.md b/docs/github-settings-checklist.md
index 71b0e90..2c2f0e2 100644
--- a/docs/github-settings-checklist.md
+++ b/docs/github-settings-checklist.md
@@ -63,19 +63,38 @@ Reason:
 ### 5. Branch Protection Or Rulesets
 
-Configure branch protection or a repository ruleset for `main`:
+Configure branch protection or a repository ruleset for `master`:
 
 - require pull requests before merge
 - require at least one human PR review
 - dismiss stale approvals when new commits are pushed if that matches team policy
-- disable force pushes to `main`
-- restrict direct pushes to `main`
+- disable force pushes to `master`
+- disable branch deletion on `master`
+- restrict direct pushes to `master`
 - optionally require branches to be up to date before merge
 - add a real `CODEOWNERS` file later if the repository has stable maintainer usernames or org team slugs
 
+Recommended baseline for this repository:
+
+- `required_approving_review_count = 1`
+- `strict = true`
+- required status checks:
+  - `lint`
+  - `typecheck`
+  - `test`
+  - `secret-scan`
+  - `dependency-audit`
+  - `signed-receipt-smoke`
+
+Evidence to capture after configuration:
+
+- one `gh api` or GitHub UI export showing branch protection enabled on `master`
+- one screenshot showing the required status checks list
+- one screenshot or JSON export showing force pushes and deletions disabled
+
 ### 6. Required Status Checks
 
-After the workflows have run successfully on `main`, consider requiring these checks before merge:
+After the workflows have run successfully on `master`, consider requiring these checks before merge:
 
 - `typecheck`
 - `web-build`
@@ -100,6 +119,13 @@ Advisory only by default:
 3. Confirm Dependabot is creating update PRs on the expected schedule.
 4. Confirm the Security tab shows dependency graph, Dependabot alerts, and code scanning as enabled where supported.
 5. Add the required status checks only after at least one successful run for each target check.
+6. Save one redacted screenshot or `gh api` response showing the final `master` branch protection settings in private compliance evidence storage.
+
+## Example Verification Command
+
+```bash
+gh api /repos/TrustSignal-dev/TrustSignal/branches/master/protection
+```
 
 ## Related Documentation
diff --git a/docs/ops/db-security-evidence.md b/docs/ops/db-security-evidence.md
index d1d704c..c507e05 100644
--- a/docs/ops/db-security-evidence.md
+++ b/docs/ops/db-security-evidence.md
@@ -46,3 +46,21 @@ When the bundle is generated from staging/prod credentials:
 1. Link the evidence file in `docs/PRODUCTION_GOVERNANCE_TRACKER.md` Workstream `#3`.
 2. Record command date/time and operator.
 3. Mark status as `VERIFIED IN STAGING` only after staging checks pass.
+
+## Provider Evidence Required To Close Audit Findings
+
+In addition to the generated bundle, collect one provider-side proof showing encryption at rest is enabled for the production database volume.
+
+Accepted examples:
+
+- AWS RDS or Aurora screenshot showing `StorageEncrypted = true`
+- Supabase project/database screenshot or support confirmation showing encryption at rest is enabled
+- Render managed Postgres evidence or provider support statement confirming encryption at rest
+
+Store the provider evidence outside this public repository:
+
+- Vanta
+- private compliance storage
+- private audit repository
+
+Do not paste provider account identifiers, screenshots, or raw support transcripts into this public repository.
diff --git a/docs/security-workflows.md b/docs/security-workflows.md
index 0d86739..3798339 100644
--- a/docs/security-workflows.md
+++ b/docs/security-workflows.md
@@ -113,5 +113,6 @@ Those controls still require manual verification in GitHub after merge.
 ## Related Documentation
 
 - [GitHub settings checklist](github-settings-checklist.md)
+- [Incident response plan](INCIDENT_RESPONSE_PLAN.md)
 - [Security summary](security-summary.md)
 - [Documentation index](README.md)
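Reviewer note on the workflow audit event fields listed in `docs/compliance/evidence/logging-monitoring.md`: they can be sketched as a small TypeScript type plus a builder. This is an illustrative sketch only; the real definition lives in `apps/api/src/workflow/events.ts`, which is not part of this diff, so the interface name, field types, and the `buildAuditEvent` helper below are all assumptions.

```typescript
// Hypothetical shape mirroring the audit fields named in the evidence doc.
// The actual apps/api/src/workflow/events.ts definition may differ.
interface WorkflowEvent {
  timestamp: string;   // ISO-8601 time the event was emitted
  operator: string;    // human or service identity that acted
  action: string;      // e.g. "release.approved", "verification.completed"
  workflowId: string;  // workflow the event belongs to
  bundleId?: string;   // artifact/package identifier, when one exists
  decision?: "approved" | "rejected"; // release or verification outcome
  receiptId?: string;  // linked receipt, once receipt issuance is wired in
  payload: unknown;    // raw payload kept for reconstruction and forensics
}

// Build an event carrying the fields an auditor expects to review.
function buildAuditEvent(
  operator: string,
  action: string,
  workflowId: string,
  extras: Partial<WorkflowEvent> = {},
): WorkflowEvent {
  return {
    timestamp: new Date().toISOString(),
    operator,
    action,
    workflowId,
    payload: extras.payload ?? {},
    ...extras, // optional audit fields (bundleId, decision, receiptId, ...)
  };
}

const event = buildAuditEvent("alice", "release.approved", "wf_123", {
  decision: "approved",
  bundleId: "bundle_456",
});
console.log(event.action, event.decision);
```

The optional fields match the evidence doc's hedged wording: `bundleId` only when an artifact identifier exists, `decision` only for release or verification outcomes, `receiptId` only once receipt issuance is linked in.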