# ohitslaurence/swift-ai-sdk
A provider-agnostic Swift SDK for building AI-powered applications. Write your integration once against a unified protocol, then swap between OpenAI, Anthropic, and future providers with a single line change.
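The one-line swap works because application code targets a shared protocol rather than a concrete client. Here is a rough, self-contained sketch of that pattern with hypothetical names (`TextCompleting`, the stub providers, and `greet` are illustrative only; the SDK's real protocol is async and carries full request/response types):

```swift
// Hypothetical minimal protocol illustrating the provider-agnostic pattern.
// Not the SDK's actual types.
protocol TextCompleting {
    func complete(_ prompt: String) -> String
}

struct StubOpenAI: TextCompleting {
    func complete(_ prompt: String) -> String { "openai: \(prompt)" }
}

struct StubAnthropic: TextCompleting {
    func complete(_ prompt: String) -> String { "anthropic: \(prompt)" }
}

// Application code depends only on the protocol, so swapping providers
// means changing a single initializer.
func greet(using provider: any TextCompleting) -> String {
    provider.complete("Hello")
}

print(greet(using: StubOpenAI()))    // openai: Hello
print(greet(using: StubAnthropic())) // anthropic: Hello
```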
## Installation

Add the package to your `Package.swift`:

```swift
dependencies: [
    .package(url: "https://github.com/ohitslaurence/swift-ai-sdk", from: "0.1.0")
]
```

Then add the product to your target:
```swift
.target(
    name: "MyApp",
    dependencies: [
        .product(name: "AI", package: "swift-ai-sdk") // everything
    ]
)
```

You can also depend on individual modules if you don't need the full surface:
```swift
.product(name: "AICore", package: "swift-ai-sdk")              // core protocols only
.product(name: "AIProviderOpenAI", package: "swift-ai-sdk")    // OpenAI provider
.product(name: "AIProviderAnthropic", package: "swift-ai-sdk") // Anthropic provider
```

## Quick Start
### Completion

```swift
import AI

let provider = OpenAIProvider(apiKey: "sk-...")

let response = try await provider.complete(
    AIRequest(
        model: .gpt(.gpt5Mini),
        messages: [.user("Hello")]
    )
)

print(response.text)
print("Tokens: \(response.usage.totalTokens)")
```

### Streaming
```swift
let stream = provider.stream(
    AIRequest(model: .gpt(.gpt5Mini), messages: [.user("Tell me a story")])
)

for try await event in stream {
    if case .delta(.text(let chunk)) = event {
        print(chunk, terminator: "")
    }
}
```

### Swap Providers
```swift
// Change the provider — everything else stays the same
let anthropic = AnthropicProvider(apiKey: "sk-ant-...")

let response = try await anthropic.complete(
    AIRequest(
        model: .claude(.sonnet4_6),
        messages: [.user("Hello")]
    )
)
```

### Structured Output
```swift
struct MovieReview: AIStructured {
    let title: String
    let rating: Int
    let summary: String
}

let result = try await provider.generate(
    "Review the movie Inception",
    schema: MovieReview.self,
    model: .gpt(.gpt5Mini)
)

print(result.value.title)  // "Inception"
print(result.value.rating) // 9
```

### Tool Use
```swift
let weatherTool = AITool(
    name: "get_weather",
    description: "Get the current weather for a city",
    inputSchema: .object(
        properties: ["city": .string(description: "The city name")],
        required: ["city"]
    ),
    handler: { input in "22°C, sunny" }
)

let result = try await provider.completeWithTools(
    AIRequest(
        model: .gpt(.gpt5Mini),
        messages: [.user("What's the weather in London?")],
        tools: [weatherTool]
    )
)

print(result.response.text)
```

### Agents
```swift
let agent = Agent(
    provider: provider,
    model: .gpt(.gpt5Mini),
    systemPrompt: "You are a helpful assistant.",
    tools: [weatherTool]
)

let result = try await agent.run("What's the weather in London?")
print(result.response.text)
```

### Embeddings
```swift
let embedding = try await provider.embed(
    "swift concurrency",
    model: .openAIEmbedding(.textEmbedding3Small)
)

print("Dimensions: \(embedding.dimensions)")
```

### SwiftUI
```swift
import AI
import SwiftUI

struct ChatView: View {
    @State var conversation = AIConversation(
        provider: OpenAIProvider(apiKey: "sk-..."),
        model: .gpt(.gpt5Mini)
    )

    var body: some View {
        VStack {
            AIMessageList(conversation.messages,
                          streamingText: conversation.currentStreamText)
            Button("Send") {
                conversation.send("Hello!")
            }
            .disabled(conversation.isStreaming)
        }
    }
}
```

## Middleware
```swift
let provider = withMiddleware(
    OpenAIProvider(apiKey: "sk-..."),
    middleware: [
        DefaultSettingsMiddleware(maxTokens: 1000, temperature: 0.7),
        LoggingMiddleware { print($0) }
    ]
)
```

## Telemetry
```swift
struct MetricsSink: AITelemetrySink {
    func record(_ event: AITelemetryEvent) async {
        print("[\(event.operationKind)] \(event.kind)")
    }
}

let provider = withTelemetry(openai, configuration: AITelemetryConfiguration(
    sinks: [MetricsSink()],
    includeMetrics: true,
    includeUsage: true
))
```

## Models
Each provider ships predefined model constants for discoverability and type safety:
```swift
.gpt(.gpt5Mini)     // OpenAI GPT-5 Mini
.claude(.sonnet4_6) // Anthropic Claude Sonnet 4.6
```

You're never locked into the predefined list. Use `.custom()` for any model string — new releases, fine-tunes, or aliases:
```swift
.gpt(.custom("gpt-6-turbo"))
.claude(.custom("claude-haiku-4-5-20251001"))
```

Or skip the helpers entirely and pass a raw string:
```swift
AIModel("gpt-5-mini")
```

## Modules
| Module | Description |
|--------|-------------|
| AI | Umbrella — re-exports all modules below |
| AICore | Provider-agnostic protocols, types, streaming, retry, timeout, middleware, telemetry |
| AIProviderOpenAI | OpenAI Chat Completions and Embeddings provider |
| AIProviderAnthropic | Anthropic Messages API provider |
| AISwiftUI | Observable state types for SwiftUI chat interfaces |
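AICore's retry and timeout utilities are not demonstrated above. The general shape of retry with a capped attempt count can be sketched as follows; `withRetry` and `TransientError` are hypothetical names for illustration, not AICore's actual API (consult the DocC documentation for the real types):

```swift
// Hypothetical retry helper illustrating the pattern; not the SDK's API.
struct TransientError: Error {}

func withRetry<T>(maxAttempts: Int, _ body: (Int) throws -> T) throws -> T {
    var lastError: Error = TransientError()
    for attempt in 1...maxAttempts {
        do {
            return try body(attempt)
        } catch {
            lastError = error
            // A real implementation would sleep here with exponential
            // backoff (e.g. baseDelay * 2^attempt) before retrying.
        }
    }
    // All attempts failed: surface the most recent error.
    throw lastError
}

// Succeeds on the third attempt.
var calls = 0
let text = try! withRetry(maxAttempts: 5) { attempt -> String in
    calls += 1
    if attempt < 3 { throw TransientError() }
    return "ok"
}
print(text)  // ok
print(calls) // 3
```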
## Documentation

The package includes DocC documentation for every module. Generate it locally with `make docs` (see Local Development below) or build it in Xcode via Product ▸ Build Documentation.
## Example App

The repo includes a runnable macOS chat app at `Examples/ChatApp/`. Open it in Xcode to try the SDK interactively:

```shell
open Examples/ChatApp/Package.swift
```

Paste your API key directly in the UI, pick a provider (OpenAI or Anthropic), select a model, and start chatting. Responses stream in real time using `AIConversation`.
## Local Development

```shell
swift build       # build all targets
swift test        # run unit tests
make format       # auto-format code
make format-check # check formatting
make docs         # generate DocC documentation
```

## License
Apache License 2.0 — see LICENSE for details.