User Request
Add OpenAPI schema for the LLM service to expose it through the gateway.
Specification
Create openapi/llm/v1/ directory with REST API schemas for all 12 LLM service operations:
Endpoints
- Providers: POST/GET /providers, GET/PATCH/DELETE /providers/{id}
- Models: POST/GET /models, GET/PATCH/DELETE /models/{id}
- Proxy: POST /responses (opaque body passthrough, supports SSE streaming)
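The proxy endpoint is the one unusual shape here, since it forwards an opaque body and may stream. A minimal sketch of what its path file could look like (the operationId and exact schema shape are assumptions, not part of the request):

```yaml
# paths/responses.yaml — opaque proxy with SSE support (sketch; operationId is illustrative)
post:
  operationId: proxyResponses
  requestBody:
    required: true
    content:
      application/json:
        schema:
          type: object
          additionalProperties: true   # opaque passthrough body, not validated by the gateway
  responses:
    "200":
      description: Upstream response, either buffered JSON or an SSE stream
      content:
        application/json:
          schema:
            type: object
            additionalProperties: true
        text/event-stream:
          schema:
            type: string
            description: Stream of server-sent events
```

Declaring both media types on the 200 response lets one operation cover the buffered and streaming cases.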
File Structure
openapi/llm/v1/
├── openapi.yaml
├── paths/
│   ├── providers.yaml
│   ├── provider-by-id.yaml
│   ├── models.yaml
│   ├── model-by-id.yaml
│   └── responses.yaml
└── components/
    ├── parameters/
    │   └── IdPath.yaml
    ├── responses/
    │   └── ProblemResponse.yaml
    └── schemas/
        ├── Problem.yaml
        ├── EntityMeta.yaml
        ├── AuthMethod.yaml
        ├── LLMProvider.yaml
        ├── LLMProviderCreateRequest.yaml
        ├── LLMProviderUpdateRequest.yaml
        ├── PaginatedLLMProviders.yaml
        ├── Model.yaml
        ├── ModelCreateRequest.yaml
        ├── ModelUpdateRequest.yaml
        └── PaginatedModels.yaml
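The root document just wires this tree together with $ref. A sketch of how openapi.yaml could look (the info title/version are placeholders):

```yaml
# openapi.yaml — root document referencing the path files above (sketch)
openapi: 3.0.3
info:
  title: LLM Service API   # placeholder title
  version: 1.0.0           # placeholder version
paths:
  /providers:
    $ref: "./paths/providers.yaml"
  /providers/{id}:
    $ref: "./paths/provider-by-id.yaml"
  /models:
    $ref: "./paths/models.yaml"
  /models/{id}:
    $ref: "./paths/model-by-id.yaml"
  /responses:
    $ref: "./paths/responses.yaml"
```

Keeping each path item in its own file and bundling at publish time matches the multi-file layout above.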
Patterns
- Follow existing openapi/files/v1/ and openapi/team/v1/ patterns
- OpenAPI 3.0.3 with $ref for components
- RFC 7807 Problem responses for errors
- Pagination: offset-based (page/perPage/total) matching team API convention
- Proxy endpoint: opaque JSON body with SSE streaming support via text/event-stream
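To make the pagination convention concrete, here is a sketch of one paginated wrapper using the page/perPage/total fields named above (the required list and constraints are assumptions to be checked against the team API):

```yaml
# components/schemas/PaginatedLLMProviders.yaml — offset pagination (sketch)
type: object
required: [items, page, perPage, total]
properties:
  items:
    type: array
    items:
      $ref: "./LLMProvider.yaml"
  page:
    type: integer
    minimum: 1
    description: 1-based page index
  perPage:
    type: integer
    description: Page size requested by the client
  total:
    type: integer
    description: Total number of matching records
```

PaginatedModels.yaml would be identical apart from the items $ref.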
CI
Update .github/workflows/openapi-publish.yml to bundle, lint, and publish the LLM spec.
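Assuming the existing workflow uses Redocly CLI (as many OpenAPI publish pipelines do; this is an assumption, not stated in the issue), the added steps could look roughly like:

```yaml
# .github/workflows/openapi-publish.yml — added steps for the LLM spec
# (sketch; tool choice and output path are assumptions)
- name: Lint LLM spec
  run: npx @redocly/cli lint openapi/llm/v1/openapi.yaml

- name: Bundle LLM spec
  run: npx @redocly/cli bundle openapi/llm/v1/openapi.yaml -o dist/llm-v1.yaml
```

Whatever tool the files/team specs already use should be reused here so all three specs are linted and bundled consistently.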
Schema Details
See the proto at proto/agynio/api/llm/v1/llm.proto for field definitions. Key schemas:
- LLMProvider: EntityMeta + endpoint (uri) + authMethod (enum: bearer)
- Model: EntityMeta + name + llmProviderId (uuid) + remoteName
- EntityMeta: id (uuid) + createdAt (date-time) + updatedAt (date-time, optional)
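The "EntityMeta + fields" composition above maps naturally onto allOf. A sketch of LLMProvider.yaml under that reading (the required list is an assumption; field names come from the summary above):

```yaml
# components/schemas/LLMProvider.yaml — EntityMeta plus provider fields (sketch)
allOf:
  - $ref: "./EntityMeta.yaml"
  - type: object
    required: [endpoint, authMethod]
    properties:
      endpoint:
        type: string
        format: uri
      authMethod:
        $ref: "./AuthMethod.yaml"   # enum with single value "bearer"
```

Model.yaml would follow the same pattern, composing EntityMeta with name, llmProviderId (uuid), and remoteName.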