Contents

Prompt

A prompt from a person to the model.

Declaration

struct Prompt

Mentioned in

Overview

Prompts can contain content written by you, an outside source, or input directly from people using your app. You can initialize a Prompt from a string literal:

let prompt = Prompt("What are miniature schnauzers known for?")

Use PromptBuilder to dynamically control the prompt’s content based on your app’s state. The code below shows that if the Boolean is true, the prompt includes a second line of text:

let responseShouldRhyme = true
let prompt = Prompt {
    "Answer the following question from the user: \(userInput)"
    if responseShouldRhyme {
        "Your response MUST rhyme!"
    }
}

If your prompt includes input from people, consider wrapping the input in a string template with your own prompt to better steer the model’s response. For more information on handling inputs in your prompts, see Improving the safety of generative model output.

All input to the model contributes tokens to the context window of the LanguageModelSession — including the Instructions, Prompt, Tool, and Generable types, and the model’s responses. If your session exceeds the available context size, it throws LanguageModelSession.GenerationError.exceededContextWindowSize(_:).

Prompts can consume a lot of tokens, especially when you send multiple prompts to the same session. To reduce your prompt size when you exceed the context window size:

  • Write shorter prompts to save tokens.

  • Provide only the information necessary to perform the task.

  • Use concise and imperative language instead of indirect or jargon that the model might misinterpret.

  • Use a clear verb that tells the model what to do, like “Generate”, “List”, or “Summarize”.

  • Include the target response length you want, like “In three sentences” or “List five reasons”.

Prompting the same session eventually leads to exceeding the context window size. When that happens, create a new context window by initializing a new instance of LanguageModelSession. For more information on managing the context window size, see TN3193: Managing the on-device foundation model’s context window.

Topics

Creating a prompt

See Also

Prompting