@jsflax jsflax commented Dec 26, 2025

What

Add full support for OpenAI's Batch API, enabling asynchronous processing of large request volumes with 50% cost savings.

Core API methods:

  • createBatch - Create a batch from an uploaded JSONL file
  • retrieveBatch - Get batch status and details
  • listBatches - List batches with pagination
  • cancelBatch - Cancel an in-progress batch
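A rough sketch of how these calls might compose, assuming the async variants take the new `BatchQuery` type and that the enum case names shown here (`.chatCompletions`, `.twentyFourHours`, `.inProgress`) match the PR; exact signatures may differ:

```swift
// Hypothetical usage of the new core methods (names from this PR, signatures assumed).
let batch = try await openAI.createBatch(
    query: BatchQuery(
        inputFileId: uploadedFile.id,          // id of a previously uploaded JSONL file
        endpoint: .chatCompletions,            // assumed BatchEndpoint case
        completionWindow: .twentyFourHours     // assumed BatchCompletionWindow case
    )
)

// Check on the batch later, and cancel it if it is still running.
let current = try await openAI.retrieveBatch(id: batch.id)
if current.status == .inProgress {
    _ = try await openAI.cancelBatch(id: batch.id)
}
```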

File API additions:

  • retrieveFileContent - Download file content (needed for batch output)
  • deleteFile - Delete uploaded files
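Retrieving batch output might look like the following sketch. The JSONL split-and-decode step reflects the documented Batch API output format (one JSON object per line); `outputFileId` and the method signatures are assumptions about this PR's types:

```swift
// Hypothetical: download the batch output file and decode each JSONL line.
guard let outputFileId = batch.outputFileId else { return }
let data = try await openAI.retrieveFileContent(fileId: outputFileId)

let decoder = JSONDecoder()
let lines = try String(decoding: data, as: UTF8.self)
    .split(separator: "\n")
    .map { try decoder.decode(BatchResponseLine.self, from: Data($0.utf8)) }

// Clean up the output file once the results are persisted elsewhere.
_ = try await openAI.deleteFile(fileId: outputFileId)
```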

Convenience methods:

  • submitBatch - Handles JSONL encoding, file upload, and batch creation in one call
  • waitForBatch - Polls for completion and returns parsed responses
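The convenience methods collapse the upload/create/poll sequence into two calls. A sketch under the assumption that `submitBatch` accepts an array of chat queries and `waitForBatch` returns the parsed response lines:

```swift
// Hypothetical end-to-end flow using the convenience methods.
let batch = try await openAI.submitBatch(
    queries: chatQueries,                  // e.g. [ChatQuery]; encoded to JSONL internally
    completionWindow: .twentyFourHours
)

// Polls retrieveBatch until the batch reaches a terminal status,
// then downloads and decodes the output file.
let responses = try await openAI.waitForBatch(id: batch.id)
```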

New types:

  • BatchQuery, BatchResult, BatchListResult, BatchResponseLine
  • BatchEndpoint, BatchCompletionWindow, BatchStatus enums
  • BatchError for convenience method error handling
  • FileDeleteResult for file deletion responses

Why

The Batch API offers significant benefits for processing large volumes of requests:

  • 50% cost reduction compared to synchronous API calls
  • Higher rate limits, drawn from a separate quota pool
  • Completion within a 24-hour turnaround window

This is particularly useful for bulk data processing, evaluations, benchmarks, and batch content generation.
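For reference, each line of the batch input file is a self-contained request in OpenAI's documented JSONL format (the model name and prompt below are placeholders):

```jsonl
{"custom_id": "req-1", "method": "POST", "url": "/v1/chat/completions", "body": {"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}}
```

The `custom_id` is echoed back on the corresponding output line, which is what allows responses to be matched to requests after the asynchronous round trip.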

Affected Areas

  • Sources/OpenAI/Public/Models/ - New Batch API models
  • Sources/OpenAI/Public/Protocols/ - Protocol extensions
  • Sources/OpenAI/OpenAI.swift - Closure-based implementations + API paths
  • Sources/OpenAI/OpenAI+OpenAIAsync.swift - Async implementations + convenience methods
  • Sources/OpenAI/Private/ - Request builders and raw data handling
  • Tests/OpenAITests/ - Unit and integration tests
  • README.md - Documentation

More Info

  • OpenAI Batch API Documentation
  • Includes both unit tests (BatchAPITests.swift) and integration tests (BatchAPIIntegrationTests.swift)
  • Integration tests verify structured output support with JSON schemas
  • All 5 integration tests passing against live API
