# ajevans99/swift-openai

WIP Swift OpenAI API wrapper using the OpenAPI docs.
## Library Structure
| Layer | Package | Description | Use Case |
|-------|---------|-------------|----------|
| 🚀 1 — High-level API | OpenAIKit | Opinionated wrapper for conversational agents using tools, including the `Tool` protocol, `ResponseSession` actor, error policies, and streaming support. | Build assistants, chatbots, and AI-powered features with minimal setup. |
| 🧠 2 — Swifty OpenAI Client | OpenAICore | Lightweight async wrapper around the OpenAI API with ergonomic Swift types and methods like `createResponse(...)`. Built on top of OpenAIFoundation. | Interact directly with OpenAI endpoints in a clean, modern Swift style. |
| 🛠️ 3 — Low-level Generated Layer | OpenAIFoundation | Fully autogenerated models and API client from the OpenAPI spec. No opinions, no wrappers—just raw access. | Advanced integrations, custom wrappers, or OpenAPI-driven workflows. |
> 💡 All layers rely on a pluggable transport layer conforming to the `ClientTransport` protocol. You can use `OpenAPIAsyncHTTPClient`, `OpenAPIURLSession`, or bring your own.
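As a sketch, a client could be constructed with the URLSession-backed transport from the `OpenAPIURLSession` package; note that the `OpenAIClient(transport:apiKey:)` initializer shown here is a hypothetical shape, not an API confirmed by this README:

```swift
import Foundation
import OpenAPIURLSession

// URLSessionTransport() is provided by the OpenAPIURLSession package
// and conforms to ClientTransport.
// The OpenAIClient initializer below is hypothetical -- check the
// package's actual API for the real entry point.
let client = OpenAIClient(
    transport: URLSessionTransport(),
    apiKey: ProcessInfo.processInfo.environment["OPENAI_API_KEY"] ?? ""
)
```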
**`OpenAIKit`**
- Define your tools, conforming to the `Tool` protocol.
- Register your tools.
- Set up a `ResponseSession`.
- Start streaming (or send an async request).
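The `Tool` protocol's exact requirements aren't spelled out in this README; as a rough sketch, the `WeatherTool` used in the example below might look something like this (all names and signatures here are assumptions):

```swift
// Hypothetical conformance -- the real Tool protocol's requirements
// may differ; consult the OpenAIKit sources for the actual shape.
struct WeatherTool: Tool {
    let apiKey: String

    var name: String { "get_weather" }
    var description: String { "Look up current weather for a city." }

    struct Arguments: Codable {
        let city: String
    }

    func call(arguments: Arguments) async throws -> String {
        // A real implementation would call a weather API here;
        // stubbed out for the sketch.
        "Sunny, 21°C in \(arguments.city)"
    }
}
```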
```swift
let orchestrator = ToolOrchestratorPlugin(
    tools: [WeatherTool(apiKey: "...")]
)

let handle = try await session.stream(
    "What's the weather in SF?",
    plugins: TextPlugin(), orchestrator
)

let (textChannel, toolChannel) = handle.pluginEvents

try await withThrowingTaskGroup(of: Void.self) { group in
    group.addTask {
        for try await event in textChannel.events {
            switch event {
            case .delta(let chunk): print(chunk, terminator: "")
            case .completed: print()
            }
        }
    }
    group.addTask {
        for try await event in toolChannel.events {
            print(event)
        }
    }
    try await group.waitForAll()
}
```

<details>
<summary><strong>Plugin System Deep Dive</strong></summary>
**Why this design?**
`ResponseSession.stream` gives you two layers at once:

- `handle.raw`: full `StreamingResponse` protocol events.
- `handle.pluginEvents`: strongly typed plugin channels tailored to your app.
This lets you keep low-level access when needed, while still writing most app logic against clean domain events.
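For instance, the raw layer can be tapped for debugging while plugin channels drive the app logic. This sketch assumes `handle.raw` is an `AsyncSequence` of `StreamingResponse` events, which the README implies but does not spell out:

```swift
// Assumption: handle.raw is an AsyncSequence of StreamingResponse
// protocol events.
for try await event in handle.raw {
    // Log every protocol-level event, while plugin consumers
    // handle the typed domain channels elsewhere.
    debugPrint(event)
}
```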
**Compose any number of plugins**
Plugins are variadic, so you are not limited to 1-3:
```swift
let handle = try await session.stream(
    "Generate an image and explain it",
    plugins: TextPlugin(), ToolOrchestratorPlugin(), ImagePlugin(), MyPlugin()
)

let (text, tools, images, custom) = handle.pluginEvents
```

**Author your own plugin**
```swift
struct RefusalPlugin: ResponseStreamPlugin {
    enum Event: Sendable {
        case refusal(String)
    }

    func consume(
        _ event: StreamingResponse,
        context: inout StreamPluginContext
    ) async throws -> Event? {
        guard case .outputItem(.done(let item, _)) = event else { return nil }
        guard case .message(let message) = item else { return nil }
        for content in message.content {
            if case .refusal(let refusal) = content {
                return .refusal(refusal.refusal)
            }
        }
        return nil
    }
}
```

**Tool orchestration**
`ToolOrchestratorPlugin` supports plugin-local tools:
```swift
let orchestrator = ToolOrchestratorPlugin(
    tools: [WeatherTool(apiKey: "...")],
    errorPolicy: .returnAsMessage
)
```

If a tool is not found locally, it can fall back to session-level registration (`session.register(tool:)`) for compatibility. When set, the orchestrator's `errorPolicy` override is also applied on that fallback path.
**Backpressure visibility**
Each plugin channel uses bounded buffering (`bufferingNewest`). If a consumer is too slow:

- older buffered events can be dropped,
- and you can inspect loss with `channel.droppedCount()`.
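As an illustrative sketch, a consumer could check for loss after draining a channel. Only `droppedCount()` is named by the README; whether it is `async` and returns an integer is assumed here:

```swift
for try await event in textChannel.events {
    handleText(event) // hypothetical app-side handler
}

// droppedCount() is named in the README; an async integer-returning
// call is an assumption.
let dropped = await textChannel.droppedCount()
if dropped > 0 {
    print("Warning: \(dropped) events dropped due to backpressure")
}
```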
**Raw-only mode**
If you want protocol events only:
```swift
let raw = try await session.streamRaw("Debug this turn")

for try await event in raw {
    print(event.value)
}
```

</details>
**`OpenAIFoundation`**
| Endpoint | Supported? |
| --- | --- |
| /responses | [x] |
| /images | [x] |
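Since `OpenAIFoundation` is generated by swift-openapi-generator, its entry point presumably follows that generator's standard `Client` shape; the server-URL helper and operation naming below are assumptions based on the generator's conventions, not on this README:

```swift
import OpenAIFoundation
import OpenAPIURLSession

// swift-openapi-generator emits a Client type initialized with a
// server URL and a ClientTransport. The Servers helper name and the
// need for auth middleware are assumptions -- check the generated code.
let client = Client(
    serverURL: try Servers.Server1.url(),
    transport: URLSessionTransport()
)
// Operations mirror the spec's operationIds, e.g. a createResponse call.
```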
## Examples
Check out the Example CLI Project for more sample usage.
## Code Generation
> [!NOTE]
> This section is only relevant for library maintainers. If you're just using the package, you can skip this.
`swift-openai` uses `swift-openapi-generator` to generate models and endpoint definitions directly from OpenAI’s documented OpenAPI spec at `openapi.documented.yml`. This ensures maximum compatibility and future-proofing as the spec evolves.
To ensure generated code remains buildable against fast-moving upstream changes, generation runs two deterministic patch phases:
- OpenAPI spec transforms in `Scripts/apply-patches.sh` (pre-generation).
- Generated source transforms in `Scripts/apply-generated-patches.sh` (post-generation).
| Task | Command |
|------|---------|
| To check for spec changes | `make check` |
| To fetch the latest openapi.yaml | `make fetch` |
| To apply the necessary transforms | `make patches` |
| To generate the Swift types | `make generate` |
| To fetch, patch, and generate | `make all` |
## Snapshot Tests
The test suite includes fixture-based response decode snapshots and an opt-in live recorder.
| Task | Command |
|------|---------|
| Run all tests (includes snapshot replay) | `make test` |
| Run a live decode smoke test (requires `OPENAI_API_KEY`) | `make test-live-snapshots` |
| Record/update snapshot fixtures from live API | `make record-snapshots` |
## Package Metadata
Repository: ajevans99/swift-openai
Stars: 1
Forks: 0
Open issues: 3
Default branch: main
Primary language: swift
License: MIT
README: README.md