# ml-explore/mlx-swift-lm
MLX Swift LM is a Swift package to build tools and applications with large language models (LLMs) and vision language models (VLMs) in [MLX Swift](https://github.com/ml-explore/mlx-swift).
## Documentation

Developers can use these examples in their own programs; just import the Swift package!
- Porting and implementing models
- Techniques for developing in mlx-swift-lm
- MLXLMCommon: Common API for LLMs and VLMs
- MLXLLM: Large language model example implementations
- MLXVLM: Vision language model example implementations
- MLXEmbedders: Popular encoders and embedding models example implementations
## Usage

This package integrates with a variety of tokenizer and downloader packages through protocol conformance. Users can pick from three ways to integrate with these packages, which offer different tradeoffs between flexibility and convenience.

See the documentation for how to integrate mlx-swift-lm with downloader and tokenizer packages.
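As an illustrative sketch only (the protocol and type names below are hypothetical, not mlx-swift-lm's actual API), protocol conformance here means the core package declares what it needs from a downloader or tokenizer, and an integration package conforms its own types to those requirements:

```swift
// Hypothetical sketch: the names below are illustrative, not mlx-swift-lm's real API.
import Foundation

// The core package would define protocols describing the capabilities it needs...
protocol ModelDownloading {
    func download(id: String) async throws -> URL
}

protocol Tokenizing {
    func encode(_ text: String) -> [Int]
    func decode(_ tokens: [Int]) -> String
}

// ...and an integration package conforms its own types to them,
// so the core package never depends on a specific downloader directly.
struct LocalCacheDownloader: ModelDownloading {
    func download(id: String) async throws -> URL {
        // Illustrative only: a real implementation would fetch remote weights
        // into a local cache directory and return that location.
        FileManager.default.temporaryDirectory.appendingPathComponent(id)
    }
}
```

The design choice this enables: the core package stays dependency-light, while users trade convenience for control by picking which conforming packages to pull in.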
> [!NOTE]
> If the documentation link shows a 404, view the source.
## Installation

Add the core package to your `Package.swift`:

```swift
.package(url: "https://github.com/ml-explore/mlx-swift-lm", .upToNextMajor(from: "3.31.3")),
```

Then choose an integration package for downloaders and tokenizers.
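In context, a minimal manifest might look like the sketch below. The product names are taken from the module list above, the target name and platform versions are assumptions, and the integration-package dependency is a placeholder you would replace per the package you choose:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyLLMTool",  // hypothetical target name
    platforms: [.macOS(.v14), .iOS(.v17)],  // assumed minimums; check the package's requirements
    dependencies: [
        .package(url: "https://github.com/ml-explore/mlx-swift-lm", .upToNextMajor(from: "3.31.3")),
        // plus one integration package for downloaders/tokenizers (see the documentation)
    ],
    targets: [
        .executableTarget(
            name: "MyLLMTool",
            dependencies: [
                .product(name: "MLXLLM", package: "mlx-swift-lm"),
                .product(name: "MLXLMCommon", package: "mlx-swift-lm"),
            ]
        )
    ]
)
```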
## Quick Start
After installing the package you can use LLMs to generate content with only a few lines of code. (Note: the exact line to load the model depends on the integration package).
```swift
import MLXLLM
import MLXLMCommon

let modelConfiguration = LLMRegistry.gemma3_1B_qat_4bit

// customize this line per the integration package
let model = try await loadModelContainer(
    configuration: modelConfiguration
)

let session = ChatSession(model)
print(try await session.respond(to: "What are two things to see in San Francisco?"))
print(try await session.respond(to: "How about a great place to eat?"))
```

## Package Metadata
- Repository: ml-explore/mlx-swift-lm
- Default branch: main
- README: README.md