# llmfarm_core.swift

A Swift library for working with LLaMA and other large language models.
## Swift Package Manager
Add `llmfarm_core` to your project using Xcode (File > Add Packages...) or by adding it to your project's `Package.swift` file:
```swift
dependencies: [
    .package(url: "https://github.com/guinmoon/llmfarm_core.swift")
]
```

## Build and Debug
To debug the `llmfarm_core` package, remember to comment out `.unsafeFlags(["-Ofast"])` in `Package.swift`. Keep in mind that a debug build is slower than a release build.

To build with QKK_64 support, uncomment `.unsafeFlags(["-DGGML_QKK_64"])` in `Package.swift`.
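For orientation, flags like these live in a target's settings inside the manifest. The fragment below is an illustrative sketch only — the target name and setting placement are assumptions, not copied from this package's actual `Package.swift`:

```swift
// Illustrative Package.swift fragment (target name and structure assumed):
.target(
    name: "llmfarm_core",
    cSettings: [
        .unsafeFlags(["-Ofast"]),           // comment this out while debugging
        // .unsafeFlags(["-DGGML_QKK_64"]), // uncomment for QKK_64 support
    ]
)
```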
## Usage
[More examples in the examples directory](/Examples)

### Example: generate output from a prompt
```swift
import Foundation
import llmfarm_core

let maxOutputLength = 256
var total_output = 0

// Called for each piece of generated text; return true to stop generation.
func mainCallback(_ str: String, _ time: Double) -> Bool {
    print("\(str)", terminator: "")
    total_output += str.count
    return total_output > maxOutputLength
}

let input_text = "State the meaning of life."

let ai = AI(_modelPath: "llama-2-7b.q4_K_M.gguf", _chatName: "chat")
var params: ModelContextParams = .default
params.use_metal = true

try? ai.loadModel(ModelInference.LLama_gguf, contextParams: params)
ai.model.promptFormat = .LLaMa

let output = try? ai.model.predict(input_text, mainCallback)
```
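The callback passed to `predict` receives each chunk of generated text and returns `true` to stop generation, as the example does once `maxOutputLength` characters have been printed. A minimal self-contained sketch of that contract, with a hypothetical `generate` helper standing in for the model (no `llmfarm_core` dependency):

```swift
// Hypothetical generator that streams chunks to a callback and stops
// as soon as the callback returns true, mirroring predict's contract.
func generate(chunks: [String], callback: (String, Double) -> Bool) -> String {
    var out = ""
    for chunk in chunks {
        out += chunk
        if callback(chunk, 0.0) { break }  // true means "stop generating"
    }
    return out
}

let maxOutputLength = 10
var total = 0
let result = generate(chunks: ["Hello, ", "world! ", "More ", "text."]) { str, _ in
    total += str.count
    return total > maxOutputLength  // stop once enough text has arrived
}
print(result)  // "Hello, world! " — generation stopped after the second chunk
```

This is why the README example can cap output length without any support from the model itself: the cap lives entirely in the callback.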
## Projects based on this library

* **LLM Farm**: an app to run LLaMA and other large language models locally on iOS and macOS.