Merged
2 changes: 2 additions & 0 deletions .code-factory/config.toml
@@ -1,3 +1,5 @@
#:schema https://task-action.xmtp.team/schema.json

[sandbox]
size = "small"
docker = false
1 change: 1 addition & 0 deletions AGENTS.md
@@ -111,3 +111,4 @@ Cross-cutting:

- [GitHub App setup](docs/github-app-setup.md) — registration, installation, secrets
- [Coder API endpoints](docs/coder-api.md) — experimental tasks + stable endpoints we consume
- [Per-repo config format](docs/repo-config.md) — `.code-factory/config.toml` field reference for sandbox, harness, scheduled jobs, and event hooks
4 changes: 4 additions & 0 deletions README.md
@@ -55,6 +55,10 @@ All non-secret config lives in [`wrangler.toml`](wrangler.toml) under `[vars]`.
| `CODER_ORGANIZATION` | var | Coder organization (default: `default`) |
| `LOG_FORMAT` | var | `json` (production) or `pretty` (local dev) |

### Per-repo configuration

Consuming repositories may define a `.code-factory/config.toml` file to customize sandbox sizing, harness selection, scheduled jobs, and event hooks. See [docs/repo-config.md](docs/repo-config.md) for the full reference. A machine-readable JSON Schema is served at `/schema.json` for editor integration (Taplo, VS Code).
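
For example, a minimal config that only overrides sandbox sizing (taken from the Examples section of the reference):

```toml
[sandbox]
size = "large"
```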

## Running

```bash
2 changes: 1 addition & 1 deletion biome.json
@@ -1,5 +1,5 @@
{
"$schema": "https://biomejs.dev/schemas/2.4.7/schema.json",
"$schema": "https://biomejs.dev/schemas/2.4.12/schema.json",
"files": {
"includes": ["src/**/*.ts", "*.json", "*.md"],
"maxSize": 2097152
108 changes: 108 additions & 0 deletions docs/repo-config.md
@@ -0,0 +1,108 @@
# Repo Config (`.code-factory/config.toml`)

## Overview

Each repository that uses coder-action may include a `.code-factory/config.toml` file at the repository root; the file is optional and uses standard TOML syntax. Unknown keys are silently stripped (Zod `.strip()` behavior), so repositories can land forward-compatible config before the server knows about a new field. Validation runs on the write path: `parseRepoConfigToml` is invoked whenever a config push event is processed.
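
The forward-compatibility consequence of stripping can be illustrated with a small dependency-free sketch (the helper below is hypothetical; the real behavior comes from Zod's `.strip()` inside `parseRepoConfigToml`):

```typescript
// Hypothetical sketch of .strip() semantics: only keys the schema knows
// about survive; anything else is silently dropped rather than rejected.
type SandboxConfig = { size?: "small" | "medium" | "large"; docker?: boolean };

function stripUnknownSandboxKeys(raw: Record<string, unknown>): SandboxConfig {
  const out: SandboxConfig = {};
  if (raw.size === "small" || raw.size === "medium" || raw.size === "large") {
    out.size = raw.size;
  }
  if (typeof raw.docker === "boolean") {
    out.docker = raw.docker;
  }
  return out; // e.g. a future "gpu" key is dropped, not an error
}
```

A repository can therefore commit a config containing a field the deployed Worker has not learned yet; parsing still succeeds and the unknown field is ignored.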

## Editor integration

Point [Taplo](https://taplo.tamasfe.dev/) or the VS Code TOML extension at the served `/schema.json` endpoint to get hover docs and inline validation.

Top-of-file directive (no extra tooling required):

```toml
#:schema https://task-action.xmtp.team/schema.json
```

Or add a `.taplo.toml` file at the repository root:

```toml
[schema]
path = "https://task-action.xmtp.team/schema.json"
include = [".code-factory/config.toml"]
```
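
For VS Code specifically, a schema association can also live in `settings.json`. This assumes the Even Better TOML extension; the key below is its schema-association setting, keyed by a regex over the document path:

```json
{
  "evenBetterToml.schema.associations": {
    "\\.code-factory/config\\.toml$": "https://task-action.xmtp.team/schema.json"
  }
}
```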

The schema is generated from the fully resolved Zod shape in input mode, so `default` values are visible in editor hover tooltips.

## `[sandbox]`

| Field | Type | Default | Required | Description |
|-------|------|---------|----------|-------------|
| `size` | `"small"` \| `"medium"` \| `"large"` | `"medium"` | No | Controls sandbox instance sizing. |
| `docker` | boolean | `false` | No | Enables docker-in-docker inside the sandbox. |
| `volumes` | array of `[[sandbox.volumes]]` | `[]` | No | Persistent volumes attached to the sandbox. |

## `[[sandbox.volumes]]`

| Field | Type | Default | Required | Description |
|-------|------|---------|----------|-------------|
| `path` | string | — | Yes | Mount point inside the sandbox. |
| `size` | volume-size string | `"10Gi"` | No | Accepts common variants (`10gb`, `10GB`, `10G`, `10gi`, `10Gi`) and is always normalized to the canonical binary-SI form (`10Gi`, `500Mi`, `2Ti`, `64Ki`). Supports K/M/G/T prefixes. |
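
The acceptance and normalization rule can be sketched as follows. This is an illustrative sketch under assumed behavior: `normalizeVolumeSize` is a hypothetical name, and the real rule lives in the Zod schema consumed by `parseRepoConfigToml`.

```typescript
// Hypothetical sketch of volume-size normalization: accept common
// spellings ("10gb", "10GB", "10G", "10gi", "10Gi") for K/M/G/T prefixes
// and emit the canonical binary-SI form (digits + uppercase prefix + "i").
function normalizeVolumeSize(input: string): string {
  const match = /^(\d+)\s*([kmgt])(i|b|ib)?$/i.exec(input.trim());
  if (match === null) {
    throw new Error(`Invalid volume size: ${input}`);
  }
  return `${match[1]}${match[2].toUpperCase()}i`; // "10gb" -> "10Gi"
}
```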

## `[harness]`

| Field | Type | Default | Required | Description |
|-------|------|---------|----------|-------------|
| `provider` | `"claude_code"` \| `"codex"` | `"claude_code"` | No | Selects the code-agent harness. |

## `[[scheduled_jobs]]`

| Field | Type | Default | Required | Description |
|-------|------|---------|----------|-------------|
| `name` | string | — | Yes | Display name for the scheduled job. |
| `branch` | string | — | Yes | Branch the job runs against. |
| `schedule` | string | — | Yes | Cron expression (e.g. `"0 9 * * 1"`). |
| `prompt` | string | — | Yes | Prompt forwarded to the agent when the job fires. |

## `[[on_event.failed_run]]`

| Field | Type | Default | Required | Description |
|-------|------|---------|----------|-------------|
| `workflows` | array of strings | — | Yes | Workflow names (as they appear in GitHub Actions) to watch. Must be non-empty. |
| `branches` | array of strings | — | Yes | Branches on which a failed `workflow_run` triggers this event. Must be non-empty. |
| `prompt_additions` | string | — | No | Extra prompt context forwarded to the task. |

**Schema-only in this release — no consumer yet.** The block validates today; event dispatch is a future change.

## Examples

Minimal config (sandbox defaults apply):

```toml
[sandbox]
size = "large"
```

Full config (every section populated):

```toml
[sandbox]
size = "large"
docker = true

[[sandbox.volumes]]
path = "/home/user/data"
size = "20Gi"

[[sandbox.volumes]]
path = "/tmp/cache"
size = "5Gi"

[harness]
provider = "claude_code"

[[scheduled_jobs]]
name = "weekly-audit"
branch = "main"
schedule = "0 9 * * 1"
prompt = "Run the dependency audit and open a PR with any updates."

[[on_event.failed_run]]
workflows = ["ci.yml", "deploy.yml"]
branches = ["main", "release"]
prompt_additions = "Focus on the failing step and propose a fix."
```

## JSON Schema

The Worker serves the JSON Schema at `GET /schema.json`. The schema reflects the latest deploy of the Worker and is generated directly from the Zod validation shape, so it always matches what `parseRepoConfigToml` accepts. Use the endpoint URL in Taplo or VS Code as shown in the Editor integration section above.
220 changes: 220 additions & 0 deletions src/config/repo-config-schema.test.ts
@@ -1,5 +1,6 @@
import { describe, expect, test } from "vitest";
import {
JSON_SCHEMA,
parseRepoConfigToml,
resolveRepoConfigSettings,
} from "./repo-config-schema";
@@ -102,6 +103,7 @@ describe("resolveRepoConfigSettings — defaults applied on read", () => {
sandbox: { size: "medium", docker: false, volumes: [] },
harness: { provider: "claude_code" },
scheduled_jobs: [],
on_event: { failed_run: [] },
});
});
test("volume with path-only → size defaulted to '10Gi'", () => {
@@ -119,6 +121,145 @@
});
});

describe("resolveRepoConfigSettings — on_event defaults", () => {
test("undefined → on_event.failed_run defaults to []", () => {
const r = resolveRepoConfigSettings(undefined);
expect(r.on_event.failed_run).toEqual([]);
});

test("empty object → on_event.failed_run defaults to []", () => {
const r = resolveRepoConfigSettings({});
expect(r.on_event.failed_run).toEqual([]);
});

test("sparse on_event with no failed_run → failed_run defaults to []", () => {
const r = resolveRepoConfigSettings({ on_event: {} });
expect(r.on_event.failed_run).toEqual([]);
});

test("entries passthrough", () => {
const r = resolveRepoConfigSettings({
on_event: {
failed_run: [
{
workflows: ["CI"],
branches: ["main"],
prompt_additions: "fix it",
},
],
},
});
expect(r.on_event.failed_run).toEqual([
{ workflows: ["CI"], branches: ["main"], prompt_additions: "fix it" },
]);
});
});

describe("parseRepoConfigToml — on_event.failed_run", () => {
test("full entry with all fields → parses", () => {
const toml = `
[[on_event.failed_run]]
workflows = ["CI"]
branches = ["main"]
prompt_additions = "There was a failed run. Fix it"
`;
const parsed = parseRepoConfigToml(toml);
expect(parsed.on_event?.failed_run?.[0]).toEqual({
workflows: ["CI"],
branches: ["main"],
prompt_additions: "There was a failed run. Fix it",
});
});

test("entry without prompt_additions → parses", () => {
const toml = `
[[on_event.failed_run]]
workflows = ["CI"]
branches = ["main"]
`;
const parsed = parseRepoConfigToml(toml);
expect(parsed.on_event?.failed_run?.[0]).toEqual({
workflows: ["CI"],
branches: ["main"],
});
});

test("multiple entries → preserved in order with full shape", () => {
const toml = `
[[on_event.failed_run]]
workflows = ["CI"]
branches = ["main"]

[[on_event.failed_run]]
workflows = ["Deploy"]
branches = ["release"]
`;
const parsed = parseRepoConfigToml(toml);
expect(parsed.on_event?.failed_run).toEqual([
{ workflows: ["CI"], branches: ["main"] },
{ workflows: ["Deploy"], branches: ["release"] },
]);
});

test("unknown keys inside entry are dropped", () => {
const toml = `
[[on_event.failed_run]]
workflows = ["CI"]
branches = ["main"]
future_field = "ignored"
`;
const parsed = parseRepoConfigToml(toml);
expect(parsed.on_event?.failed_run?.[0]).toEqual({
workflows: ["CI"],
branches: ["main"],
});
});

test("missing workflows → NonRetryableError", () => {
expect(() =>
parseRepoConfigToml(`[[on_event.failed_run]]\nbranches = ["main"]`),
).toThrow(/Invalid RepoConfig/);
});

test("empty workflows array → NonRetryableError", () => {
expect(() =>
parseRepoConfigToml(
`[[on_event.failed_run]]\nworkflows = []\nbranches = ["main"]`,
),
).toThrow(/Invalid RepoConfig/);
});

test("missing branches → NonRetryableError", () => {
expect(() =>
parseRepoConfigToml(`[[on_event.failed_run]]\nworkflows = ["CI"]`),
).toThrow(/Invalid RepoConfig/);
});

test("empty branches array → NonRetryableError", () => {
expect(() =>
parseRepoConfigToml(
`[[on_event.failed_run]]\nworkflows = ["CI"]\nbranches = []`,
),
).toThrow(/Invalid RepoConfig/);
});

test("error message does not leak raw branch values on type mismatch", () => {
// Use branches = "not-an-array-SECRET" (type mismatch) so that the raw
// string itself becomes issue.input for the failing Zod issue. If the
// error builder ever started interpolating issue.input, the secret would
// surface in the message.
expect.assertions(2);
try {
parseRepoConfigToml(
`[[on_event.failed_run]]\nworkflows = ["CI"]\nbranches = "SECRET_BRANCH_VALUE"`,
);
} catch (err) {
expect((err as Error).message).not.toContain("SECRET_BRANCH_VALUE");
expect((err as Error).message).toMatch(/Invalid RepoConfig/);
}
});
});

describe("volume size normalization → canonical Kubernetes binary-SI form", () => {
test.each([
["10gb", "10Gi"],
@@ -165,3 +306,82 @@ describe("volume size normalization → canonical Kubernetes binary-SI form", ()
).toThrow(/Invalid RepoConfig/);
});
});

describe("JSON_SCHEMA export", () => {
// biome-ignore lint/suspicious/noExplicitAny: JSON Schema traversal — recursive unknown shape
const schemaAny = JSON_SCHEMA as any;

test("has required top-level metadata", () => {
expect(JSON_SCHEMA.$schema).toBe(
"https://json-schema.org/draft/2020-12/schema",
);
expect(JSON_SCHEMA.$id).toBeTypeOf("string");
expect(JSON_SCHEMA.title).toBe("code-factory repo config");
expect(typeof JSON_SCHEMA.description).toBe("string");
});

test("sandbox.size has default 'medium' and is optional", () => {
const s = schemaAny;
const sandbox = s.properties.sandbox;
expect(sandbox.properties.size.default).toBe("medium");
expect(sandbox.required ?? []).not.toContain("size");
});

test("sandbox.volumes[].path is required; size has default '10Gi'", () => {
const s = schemaAny;
const volItem = s.properties.sandbox.properties.volumes.items;
expect(volItem.required).toContain("path");
expect(volItem.properties.size.default).toBe("10Gi");
});

test("harness.provider has default 'claude_code' and is optional", () => {
const s = schemaAny;
const harness = s.properties.harness;
expect(harness.properties.provider.default).toBe("claude_code");
expect(harness.required ?? []).not.toContain("provider");
});

test("scheduled_jobs default is []", () => {
const s = schemaAny;
expect(s.properties.scheduled_jobs.default).toEqual([]);
});

test("scheduled_jobs[] requires name, branch, schedule, prompt", () => {
const s = schemaAny;
const item = s.properties.scheduled_jobs.items;
expect(item.required).toEqual(
expect.arrayContaining(["name", "branch", "schedule", "prompt"]),
);
});

test("on_event.failed_run default is []", () => {
const s = schemaAny;
const failedRun = s.properties.on_event.properties.failed_run;
expect(failedRun.default).toEqual([]);
});

test("on_event.failed_run[].workflows has minItems: 1", () => {
const s = schemaAny;
const item = s.properties.on_event.properties.failed_run.items;
expect(item.properties.workflows.minItems).toBe(1);
});

test("on_event.failed_run[].branches has minItems: 1", () => {
const s = schemaAny;
const item = s.properties.on_event.properties.failed_run.items;
expect(item.properties.branches.minItems).toBe(1);
});

test("on_event.failed_run[] requires workflows and branches but not prompt_additions", () => {
const s = schemaAny;
const item = s.properties.on_event.properties.failed_run.items;
expect(item.required).toEqual(
expect.arrayContaining(["workflows", "branches"]),
);
expect(item.required ?? []).not.toContain("prompt_additions");
});

test("is JSON-serializable", () => {
expect(() => JSON.stringify(JSON_SCHEMA)).not.toThrow();
});
});