Making onscreen content available to Siri and Apple Intelligence
Enable Siri and Apple Intelligence to respond to a person’s questions and action requests for your app’s onscreen content.
Overview
When a person asks a question about onscreen content or wants to perform an action on it, Siri and Apple Intelligence can retrieve the content to answer the question and perform the action. If the person explicitly requests it, Siri and Apple Intelligence can also send the content to supported third-party services. For example, someone viewing a website could ask Siri for a summary by saying or typing a phrase like “Hey Siri, what’s this document about?”
Create an app entity and associate it with the user activity object
To integrate your app’s onscreen content with current and upcoming personal intelligence features of Siri and Apple Intelligence, explicitly provide the onscreen content using the App Intents framework. Describe the content with an AppEntity — you might be able to reuse existing app entity code. Then, tell the system about the content when it becomes visible:
1. Create an app entity identifier using EntityIdentifier.
2. Associate the identifier with the current NSUserActivity by setting the activity’s appEntityIdentifier property.

To remove the association between the user activity and your app entity, set the user activity’s appEntityIdentifier property to nil.
The following code snippet from the Making your app’s functionality available to Siri sample code project shows how a photo-viewing app might provide a photo to Siri and Apple Intelligence by creating an app entity identifier for the asset app entity that represents a photo, and associating it with the user activity:
MediaView(
    image: image,
    duration: asset.duration,
    isFavorite: asset.isFavorite,
    proxy: proxy
)
.userActivity(
    "com.example.apple-samplecode.AssistantSchemasExample.ViewingPhoto",
    element: asset.entity
) { asset, activity in
    activity.title = "Viewing a photo"
    activity.appEntityIdentifier = EntityIdentifier(for: asset)
}

Make the app entity transferable
Associating an AppEntity with the NSUserActivity gives Siri and Apple Intelligence your app’s onscreen content so they can offer personalized intelligence assistance. To additionally enable Siri and Apple Intelligence to respond to a person’s explicit request to send the content as an attachment to other services, including third parties:

1. Update your app entity to conform to the Transferable protocol.
2. In your Transferable implementation, provide image, PDF, rich text, or plain text representations. To increase compatibility with third-party services, provide the several representations that best fit your content. For example, an email client might represent an email as rich text, plain text, and a PDF. For more on adopting Transferable, refer to Core Transferable.
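The steps above can be sketched as follows. This is a minimal example under assumptions: `NoteEntity`, its properties, and `NoteQuery` are hypothetical names, not API from this article; the rich-text export uses the standard `NSAttributedString` RTF conversion from UIKit (use AppKit on macOS).

```swift
import AppIntents
import CoreTransferable
import UniformTypeIdentifiers
import UIKit  // for NSAttributedString.DocumentType (AppKit on macOS)

// Hypothetical app entity for illustration; conforms to both
// AppEntity and Transferable.
struct NoteEntity: AppEntity, Transferable {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Note"
    static var defaultQuery = NoteQuery()

    var id: UUID
    var title: String
    var body: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    // Offer the richest representation first; a receiving service
    // picks the best format it supports.
    static var transferRepresentation: some TransferRepresentation {
        DataRepresentation(exportedContentType: .rtf) { note in
            let attributed = NSAttributedString(string: note.body)
            return try attributed.data(
                from: NSRange(location: 0, length: attributed.length),
                documentAttributes: [.documentType: NSAttributedString.DocumentType.rtf]
            )
        }
        ProxyRepresentation(exporting: \.body)  // plain-text fallback
    }
}

// A trivial query so the sketch is a complete AppEntity.
struct NoteQuery: EntityQuery {
    func entities(for identifiers: [NoteEntity.ID]) async throws -> [NoteEntity] { [] }
}
```

Declaring multiple representations in `transferRepresentation`, ordered from richest to plainest, mirrors the email-client example above: the receiver negotiates the format, so no single representation has to fit every service.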
Provide additional context to the system with an assistant schema
To enable Siri and Apple Intelligence to further process the provided onscreen content and provide a better response in iOS 18, make sure that the app entity you associate with an NSUserActivity conforms to one of the assistant schemas in the following table.
| Domain | Schema | Swift macro | Example request |
|---|---|---|---|
| Browser | tab | @AssistantEntity(schema: .browser.tab) | A person might ask Siri questions about the web page. |
| Document reader | document | @AssistantEntity(schema: .reader.document) | A person might ask Siri to explain the conclusion of a document. |
| File management | file | @AssistantEntity(schema: .files.file) | A person might ask Siri to summarize file content. |
| Mail | message | @AssistantEntity(schema: .mail.message) | A person might ask Siri to provide a summary. |
| Photos | asset | @AssistantEntity(schema: .photos.asset) | A person might ask Siri about things to do with an object in a photo. |
| Presentations | document | @AssistantEntity(schema: .presentation.document) | A person might ask Siri to suggest a creative title for a presentation. |
| Spreadsheets | document | @AssistantEntity(schema: .spreadsheet.document) | A person might ask Siri to give an overview of the spreadsheet’s data. |
| Word processor | document | @AssistantEntity(schema: .wordProcessor.document) | A person might ask Siri to suggest additional content for a text document. |
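As a sketch of adopting one of these schemas, the hypothetical browser-app entity below (the type and query names are illustrative assumptions, not from this article) applies the @AssistantEntity macro for the browser domain’s tab schema. The macro adds the AppEntity conformance and verifies at compile time that the entity declares the properties the schema expects.

```swift
import AppIntents
import Foundation

// Hypothetical entity representing an open browser tab, tagged with
// the browser/tab assistant schema.
@AssistantEntity(schema: .browser.tab)
struct TabEntity {
    static var defaultQuery = TabQuery()

    let id = UUID()
    // Properties the .browser.tab schema expects the entity to declare.
    var name: String
    var url: URL

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }

    // A trivial query so the sketch is complete.
    struct TabQuery: EntityQuery {
        func entities(for identifiers: [TabEntity.ID]) async throws -> [TabEntity] { [] }
    }
}
```

An entity like this can then be associated with the current NSUserActivity through its appEntityIdentifier property, as shown in the photo example earlier, so Siri and Apple Intelligence can reason about the open tab.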