Transcript
[ Music ]
[ Applause ]
>> Hi, everyone.
My name's Paul Salzman, and my
friend Josh Ford and I are very
excited to talk to you today
about surfacing your Siri shortcuts on the Siri watch
face.
Last year, we released the Siri
watch face, which has glanceable
information and tappable
actions, sorted by their
relevance to the user, at every
wrist raise.
That means our users have access
to dynamically updated content,
relevant to them throughout the
day without any configuration
required.
And now, in watchOS 5, we are
very excited to add your
applications as data sources to
the Siri watch face.
The shortcuts you provide will
show up on the watch face, on
these items that we call
platters.
When the user taps on one of
these platters, the underlying
shortcut will be executed.
That can do something like
launching into your application
into a specific context, or if
your shortcut supports
background execution, we can run
that inline on the watch face.
That means users can take
advantage of your application's
functionality without leaving
their watch face.
So, let's take a look at what
we're going to talk about today.
We're going to start off by
going over how content appears
on the Siri watch face.
After that, we'll talk about the
Relevant Shortcut API, which is
the API you use to provide
content to the Siri watch face.
We'll also go over how you can
use this API within your iOS
application to provide content
to the Siri watch face.
And then, I'm going to hand
things over to Josh, who'll talk
about our prediction engine, as
well as give you insight as to
how best to use these API's for
your application.
So, let's talk about how content
appears on the Siri watch face.
Everything on the Siri watch
face is sorted by its relevance
to the user.
The more relevant a piece of
content is, the higher up on the
watch face it's going to appear.
And, we calculate relevance by
incorporating a number of inputs
across the system, like the
current time of day, the user's
location, their routine, and
their engagement with a given
data source.
You'll provide this content to
us with a relevantShortcut,
which associates a shortcut with
UI customization, and the
ability to give us hints as to
when to deploy your content.
Now, we will derive implicit
relevance for the shortcuts you
provide based off of your user's
past interaction with the
shortcuts.
But, often you have much more
insightful suggestions,
especially when showing
glanceable information, or if
you want to suggest a shortcut
that hasn't yet been executed by
a user.
So, you can provide us things
called relevance providers.
And, just like a first-party
data source, users can disable
or re-enable your data source in
the Siri face customization page
in the iOS Watch app.
Now, before we get too far into
adopting the Relevant Shortcut
API, we want to make sure we're
not contending with these
relevance calculations when we
see how our relevant shortcuts
behave and look on the watch
face.
So, while we're developing,
we're going to want to go into
the iOS Settings app, into the
Developer's page, and find the
Shortcuts Testing section.
In there, we can ensure that our
most recently provided relevant
shortcuts show up at the top of
the watch face, by enabling the
Show Recent Shortcuts option.
Additionally, as we get further
into adopting this API in our
iOS application, we can cause
the periodic syncing of relevant
shortcuts from the iOS device to
the watch to occur immediately
by tapping the Force Sync
Shortcuts to Watch button.
So, now let's talk about
relevant shortcuts, and at the
core of a relevant shortcut, is
a shortcut.
Shortcuts encompass key
functionality, within your
application, that you want to
make more accessible to your
users.
And, they can access these
shortcuts by saying key phrases
into Siri, or tapping on various
system UI.
And, in the case of watchOS,
that's a platter on the Siri
watch face.
There's a lot of in-depth
discussion about how to make
great shortcuts this year.
And, I'll give a high-level overview of how they work from a watchOS perspective.
I highly recommend seeing the
"Introduction to Siri Shortcuts"
talk, and the "Building for
Voice with Siri Shortcuts" that
happened earlier in the
conference.
So, shortcuts can be made out of
one of two things.
An NSUserActivity, which
represents a state within your
application you want to
accelerate users back into, or
an intent, which can execute a
task on your user's behalf.
Now, intents are really
powerful, because they can
support background execution,
which means that users can take
advantage of your functionality
without having to launch your
app into the foreground.
In fact, users can request
background-capable intents that
are available on their iPhone
from their Apple watch or
HomePod.
And, our frameworks provide a
lot of built-in intents you can
take advantage of right now,
like sending a message, starting
a workout, or requesting a ride.
But, new in watchOS 5, and iOS
12, you can make your own custom
intents that have the
functionality that your app does
best.
There's this awesome in-depth
intents definition file and
editor built into Xcode.
And, I'll give you a couple of
pointers related to relevant
shortcuts.
But, I highly recommend seeing
those other talks for full
details.
Now, let's go over a couple of
examples of how shortcuts will
execute when requested from a
watch app.
The user generates their shortcut
request by tapping on the Siri
watch face, or by saying a key
voice phrase to Siri, and the
watch receives that request.
It'll examine it and determine,
is there an application that's
installed that can handle the
shortcut?
And, in this example, yes, there
is one that's installed.
So, we're going to dispatch that
shortcut to the appropriate
application.
If your shortcut is implemented
by an intent that can handle
background execution, the
application's intent execution
will run that shortcut.
But, if instead your shortcut is
built off of a NSUserActivity,
or an intent that can't run in
the background, the application
itself will be launched to
handle the shortcut.
When the shortcut execution is
complete, a result will be
generated, and then handed back
to the user.
Now, let's take a look at
another example.
In this case our user taps on
the Siri watch face, or says a
key phrase to Siri to generate
the shortcut request.
And, the watch will examine it.
And, in this case, it'll
determine that there isn't an
application that's installed
that can handle the shortcut.
So, we'll check with the phone
and see if it has an app that
can handle the shortcut.
And, in this case, yes there is.
So, we're going to forward that
request over to the phone.
And, the proper application, or
intents extension will handle
execution.
When execution is complete, a result will be generated on the phone, forwarded back to the watch, and conveyed to the user.
So, now that we understand the
key concepts that make up a
shortcut, and how it'll execute
on the watch, let's talk about
relevant shortcuts, which take
your shortcuts and show them on
the watch face when they're most
relevant.
We can automatically populate
the fields on the platters on
the Siri watch face, based on
your shortcut content.
But, you can also customize the
platter's display, which is
really useful for displaying
glanceable information.
And, of course when your platter
is tapped, the underlying
shortcut will be executed.
So, let's take a look at how
this will run on this watch
face.
If your shortcut is backed by a
user activity, when the user
taps on it, your app will be
launched into the appropriate
context.
If instead your shortcut is based off of an intent, when the user taps on the platter, we'll see this intent confirmation view.
And, if they tap to confirm it, and your intent runs in the background, we'll execute it inline.
If instead your intent can't run
in the background, we'll launch
your application, and hand you
the intent to continue
execution.
So, let's look at the API for
Relevant Shortcuts.
At the core of a relevant
shortcut, of course, is a
shortcut.
If you want to give us hints as
to when this content is
relevant, you can provide us
relevanceProviders, which we'll
go over soon.
And, if you want to customize
the UI beyond what your shortcut
will provide, you can give us a
defaultCardTemplate on the
watchTemplate property.
Now, once you're done creating
all of your wonderful relevant
shortcuts, you need to let us
know about them.
So, you want to inform the default relevantShortcutStore, and the way you do that is by providing an array.
And, every time you give us an
array, it'll erase the previous
contents we had in our shortcut
store, which is really useful
for invalidating stale relevant
shortcuts.
But, you just need to keep in
mind to provide us all of the
relevant shortcuts we should be
considering.
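Putting those pieces together, here is a minimal sketch of creating a relevant shortcut and handing it to the store. The activity type, titles, and asset name are hypothetical placeholders, not names from the session:

```swift
import Intents

// Build a shortcut from an NSUserActivity (an intent-backed
// INShortcut works the same way).
let activity = NSUserActivity(activityType: "com.example.checkStatus") // hypothetical type
activity.title = "Check Today's Status"

let relevantShortcut = INRelevantShortcut(shortcut: INShortcut(userActivity: activity))

// Optional hints about when this content matters.
relevantShortcut.relevanceProviders = [
    INDailyRoutineRelevanceProvider(situation: .morning)
]

// Optional UI customization for the watch face platter.
let template = INDefaultCardTemplate(title: "Today's Status")
template.subtitle = "Tap for details"
relevantShortcut.watchTemplate = template

// Each call replaces the store's previous contents, so always
// provide the full set of shortcuts to be considered.
INRelevantShortcutStore.default.setRelevantShortcuts([relevantShortcut]) { error in
    if let error = error {
        print("Failed to set relevant shortcuts: \(error)")
    }
}
```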
So, let's look at how your
content will display on the
platter.
You can see on the top left,
your application's icon will be
displayed, followed by your
app's name.
Below that is a required title
string, and below that is an
optional subtitle string, that
you can use for more context.
We'll display that in italics.
To the left of both of these
strings is an optional custom
image.
This image supports
transparency, and automatically
applies rounded corners.
You can see more about the
dimensions of these assets by
looking at the human interface
guidelines resources for
watchOS.
Now, as I mentioned, we can
automatically populate all these
fields based off of the
shortcuts you supply.
In the case of a custom intent,
every parameter combination you
supply has an associated title
and subtitle we can use.
And, as you create your intents
in code to generate a shortcut,
you can set an image for any of
the parameters.
We will choose an image for the
custom image, based off of what
is available on the most
specific parameter.
And, the specificity of a
parameter is defined by the
order of parameters you have
listed in your intents
definition file.
In the case of an NSUserActivity, when you're creating your userActivity to make a shortcut, you'll supply an activityType that you've listed in your app's Info.plist.
And, for us to be able to
display it without a default
card template, you'll need to
give us a title on the title
property.
In iOS you can also supply a
subtitle and a custom image by
creating a
CSSearchableItemAttributeSet.
And, on that attribute set,
we'll extract a subtitle from
the content description
property, and the custom image
from the thumbnailData property.
When you're done configuring this attribute set, set it on your NSUserActivity's contentAttributeSet property.
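As a sketch, attaching a subtitle and custom image to a user-activity-backed shortcut on iOS might look like this. The activity type and image asset name are hypothetical:

```swift
import UIKit
import CoreSpotlight
import MobileCoreServices

let activity = NSUserActivity(activityType: "com.example.viewItem") // hypothetical type
activity.title = "View Item"

// The subtitle is read from contentDescription, and the custom
// image from thumbnailData.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.contentDescription = "More context, shown in italics"
attributes.thumbnailData = UIImage(named: "itemThumb")?.pngData() // hypothetical asset

activity.contentAttributeSet = attributes
```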
If you don't want to give us the
content that's baked into your
shortcut, you can supply a
default card template, which has
properties for each of the
fields on this platter.
And, depending on what we have
available to us, we will display
a different layout.
You can see in the right two
configurations, that if no image
is provided, we'll lay out the
text further to the left, giving
you more space for your words.
In the bottom two cases, if
there's no subtitle, we'll allow
the title string to wrap from
the top line to the bottom line.
If your shortcut is based on an
intent, when a user taps on it,
they'll see this intent
confirmation view.
In the top left is your
application's icon, which is a
bit bigger this time, followed
by your app's name.
Below that we'll display the
title and subtitle directly from
your intents definition file.
And, at this point, the user has
three options.
If they want to run your intent,
they can click Confirm in that
middle pink button.
If they don't want to run it,
they can tap dismiss or the
digital crown.
And then, the third option is a
bit more subtle.
Sometimes users see an intent,
realize they want to tweak some
of the parameters.
So, they can tap that top
module, and we'll launch into
your application, and hand you
the intent you've customized.
So, you can present some UI to
allow your users to tweak some
of the parameters before
continuing execution.
Now, let's talk about that
middle pink Confirm button a
little bit more.
The string we show there, where
it says Action Verb, is derived
based off of the category of
intent you've defined in your
intents definition file.
Additionally, that color is
chosen from your application's
global tint color in your watch
app storyboard.
Now, it's very important to keep
in mind that the intents that
you support in your watch app
must be a subset of the intents
you support in your iOS
application.
That means you're going to be
sharing the same title strings,
and subtitle strings on iOS and
watchOS.
And, watchOS is a very
constrained canvas.
Every word counts.
So, we highly recommend using
string dictionaries, with the
NSStringVariableWidthRuleType
key.
That allows you to give us a
list of varying sizes of strings
that we can choose from,
depending on the context we're
displaying them in.
When providing content to
watchOS, we recommend supplying
a string with a width rule of 20
for a 38 millimeter watch.
And, a width rule of 24 for a 42
millimeter watch.
To get more information on
adopting this API, please see
the "Localizing with Xcode 9"
talk from last year's
conference.
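Assuming a .stringsdict entry with NSStringVariableWidthRuleType variants, a quick way to see which variant fits each watch size is Foundation's variantFittingPresentationWidth(_:). The key name here is hypothetical:

```swift
import Foundation

// Localized string whose .stringsdict entry defines
// variable-width variants under NSStringVariableWidthRuleType.
let title = NSLocalizedString("ORDER_TITLE", comment: "Platter title") // hypothetical key

// Width rule 20 targets the 38mm watch, 24 the 42mm watch.
let narrow = (title as NSString).variantFittingPresentationWidth(20)
let wide = (title as NSString).variantFittingPresentationWidth(24)
```

When the system displays your strings it picks the variant itself; this call is mainly useful for checking how each variant will read.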
So now, let's talk about
relevance providers.
Relevance providers are your way
to give us a hint as to when we
should show your content.
They really let us know how we
should incorporate inputs like
the time of day, the user's
location, or their routine, when
calculating relevance.
And, in fact, you can give more
than one relevance provider on a
given relevance shortcut.
And, we're going to take the intersection of their output. So, if you give us a relevance provider that says a certain time of day is very important, and another one that says a specific location is important, you'll get high relevance output when both it is that time of day and the user is at that location.
If instead you want the union of
these relevance providers,
you're going to want to provide
two separate relevance
shortcuts, each one with a
different relevance provider.
So, let's take a look at the
options we have available.
The first one is
INDateRelevanceProvider, which
has a required startDate
parameter, and an optional
endDate parameter.
When you create an instance of
this relevance provider, the
closer time of day gets to the
start that you've provided, the
higher the relevance output of
the provider.
After that time of day passes,
the relevance will fall off on a
curve, allowing more content to
appear on the top of the watch
face.
And, if you give us this
optional endDate, we'll adjust
that curve to accommodate.
If your content instead is more
relevant to a given location,
you can use the
INLocationRelevanceProvider.
It takes a CLRegion as its main
parameter when you create an
instance of this.
As the user gets closer and
closer to that region, the
relevance output of this
relevanceProvider gets higher
and higher.
Now, you don't always have a
specific time of day, or
location in mind, for where your
content should be relevant.
Users' schedules and their favorite locations vary all over the
place, and you don't want to
have to get in the business of
tracking all of this.
So, you can take advantage of
the Siri face's smarts, and use
an
INDailyRoutineRelevanceProvider,
where we have situations for both the times of day and the locations that might matter to your users.
For instance, if you have some
content you want to show as a
daily forecast when your user
wakes up, you don't want to have
to know if they wake up at 5
A.M. or 10 A.M.
You should just give us the
morning situation.
And, if you've got a workout you
want to suggest that requires
some gym equipment, you can pass
us the gym situation.
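The three providers might be created like this. The dates, coordinates, and region identifier are hypothetical values for illustration:

```swift
import Intents
import CoreLocation

// Relevant around a specific window of time.
let dateProvider = INDateRelevanceProvider(
    start: Date(),                       // startDate is required
    end: Date().addingTimeInterval(3600) // endDate is optional
)

// Relevant near a specific place.
let region = CLCircularRegion(
    center: CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090),
    radius: 500,
    identifier: "favoriteStore" // hypothetical
)
let locationProvider = INLocationRelevanceProvider(region: region)

// Relevant at a point in the user's routine, without having to
// track their schedule yourself.
let routineProvider = INDailyRoutineRelevanceProvider(situation: .gym)
```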
So, now that we have all of this
API in our toolbox, let's build
a couple of relevant shortcuts.
In my examples we've got a
hypothetical meal logging
application that allows users to
log their meals, and they can
opt to do challenges.
This week, our users opted into a veggie challenge.
So, we're going to create a user
activity for logging meals.
Its activityType is com.myapp.LogMeal.
And, because we're going to be
displaying this for the dinner
meal, we're going to set the
value for the meal key in our
user info dictionary to Dinner.
Once we have our userActivity
set up, we can create a
shortcut, and from that
shortcut, we can create a
relevantShortcut.
Now, we want to really let our
users know that this is showing
up on the watch face, because
they've opted into the veggie
challenge.
So, we're going to create a
default card template to
customize their UI with Log
Dinner as our title.
And, we're going to convey the
veggie challenge in our subtitle
and our custom image.
Now, our users haven't always
been necessarily logging their
dinners, but they've opted into
this challenge.
So, we want to give the system a
hint that this should be
displayed in the evening by
passing a
dailyRoutineRelevanceProvider
with the situation .evening.
And now that our
relevantShortcut is configured,
we can pass it along to the
default relevantShortcutStore.
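The steps just described can be sketched as follows; the image asset name is hypothetical, and the other names come from the example:

```swift
import Intents

// A user activity for logging the dinner meal.
let activity = NSUserActivity(activityType: "com.myapp.LogMeal")
activity.userInfo = ["meal": "Dinner"]

// Wrap it in a shortcut, then a relevant shortcut.
let relevantShortcut = INRelevantShortcut(shortcut: INShortcut(userActivity: activity))

// Customize the platter to convey the veggie challenge.
let template = INDefaultCardTemplate(title: "Log Dinner")
template.subtitle = "Veggie Challenge"
template.image = INImage(named: "veggieChallenge") // hypothetical asset
relevantShortcut.watchTemplate = template

// Hint that this belongs in the evening, whenever that is
// for this particular user.
relevantShortcut.relevanceProviders = [
    INDailyRoutineRelevanceProvider(situation: .evening)
]

INRelevantShortcutStore.default.setRelevantShortcuts([relevantShortcut], completionHandler: nil)
```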
For our next example, in our
application, users also have a
bunch of favorites they can
configure for snacks they eat
often, or perhaps a breakfast they have every day.
So, we've created a
logFavoriteMealIntent.
We want a couple of these to
surface on the watch face to
make it even easier for our
users to log their favorite
snacks and meals.
So, we'll create an instance of
our intent, and take one of our
favorites, and set it on the
favorites parameter.
In this case, our user likes to
eat cookies.
We'll also set an image for that
parameter, so that when it shows
up on the watch face, they get a
little bit more context about
what they're about to log.
From here, we can take our
intent and create a shortcut,
and from our shortcut, create a
relevantShortcut.
At this point, we're ready to
pass along our relevantShortcut
to the relevant shortcut store.
We don't need a default card
template, because the title
strings and image, the image
that we pass along to our
intent, should be sufficient.
We also don't really want to provide a relevanceProvider here, because this is something habitual for our users.
They log these often throughout
the day, and we can take
advantage of the Siri face's
prediction engine to show it
when it most matters to our
users.
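A sketch of the intent-based example might look like this. LogFavoriteMealIntent and its favorite parameter are the hypothetical custom intent from the example, generated from an intents definition file, and the image asset name is also hypothetical:

```swift
import Intents

// LogFavoriteMealIntent is a hypothetical custom intent generated
// from our intents definition file.
let intent = LogFavoriteMealIntent()
intent.favorite = "Cookies"

// Attach an image to the parameter so the platter gives the user
// context about what they're about to log.
intent.setImage(INImage(named: "cookie"), forParameterNamed: \.favorite)

// The intent-backed INShortcut initializer is failable, since not
// every intent can become a shortcut.
if let shortcut = INShortcut(intent: intent) {
    let relevantShortcut = INRelevantShortcut(shortcut: shortcut)
    // No card template or relevance providers: the intent's own
    // titles and image are enough, and the prediction engine
    // decides when to surface it.
    INRelevantShortcutStore.default.setRelevantShortcuts([relevantShortcut], completionHandler: nil)
}
```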
And, once we're done creating our relevantShortcuts, we're not actually done.
We need to be able to handle
them as well.
And, new in watchOS 5 is a method on your WKExtensionDelegate called handle(_:), which receives a user activity.
In our first example, if the
user taps on that Siri platter,
our application will be
launched, and we'll be handed a
user activity to this method,
whose activity type matches
com.myapp.LogMeal.
At this point, we want to make
sure we go into the right part
of our application.
So, we'll jump up to the
rootInterfaceController, and
push on our
logMealInterfaceController.
We've got to be sure to pass
along that context we put into
our user info dictionary, so we
know which meal we're about to
be logging.
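In the extension delegate, that handling might be sketched like this. The interface controller name "LogMeal" is hypothetical:

```swift
import WatchKit

class ExtensionDelegate: NSObject, WKExtensionDelegate {
    // New in watchOS 5: called when the system hands us a user
    // activity, such as a tapped platter on the Siri watch face.
    func handle(_ userActivity: NSUserActivity) {
        guard userActivity.activityType == "com.myapp.LogMeal" else { return }

        // Pull the context we stashed in the userInfo dictionary,
        // so we know which meal is being logged.
        let meal = userActivity.userInfo?["meal"] as? String

        // Jump to the right part of the app from the root
        // interface controller.
        WKExtension.shared().rootInterfaceController?
            .pushController(withName: "LogMeal", context: meal)
    }
}
```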
For our second example, the
common case will be that that
background-capable intent will
execute successfully in our
intents extension.
But, there are a couple
instances where our application
will get a callback directly.
The first one being if there's
an error during execution, and
we say we need to handle this
within our app.
The other one, though, might
happen if the user wants to
tweak a couple of the
parameters.
For instance, they see that
they're about to log a cookie,
when they really know they just
ate 5.
So, they'll tap on the top module, we'll launch into your application, and you'll get a callback here.
The userActivity you'll receive
will have an activity type equal
to the name of the class of
intent you gave us.
At this point, you can extract
an interaction off of
userActivity, and an intent off
of that interaction, which will
have all the parameters you set
when you created your
relevantShortcut.
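Extracting the intent from that callback might look like this, again assuming the hypothetical LogFavoriteMealIntent from the example:

```swift
import WatchKit
import Intents

func handle(_ userActivity: NSUserActivity) {
    // For intent-backed callbacks, the activity type matches the
    // name of the intent class.
    guard userActivity.activityType == "LogFavoriteMealIntent" else { return }

    // The interaction carries the intent with all the parameters
    // we set when creating the relevant shortcut.
    if let intent = userActivity.interaction?.intent as? LogFavoriteMealIntent {
        // Present UI so the user can tweak parameters
        // (e.g. the quantity) before continuing execution.
        print("Logging favorite: \(intent.favorite ?? "unknown")")
    }
}
```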
So, now we're ready to create
relevant shortcuts whenever our
application has runtime.
But, it's very important to
note, that just because your
content is showing up on the
Siri watch face, doesn't mean
your application is actively
running.
So, to help you get more runtime to provide relevant shortcuts, we've created the WKRelevantShortcutRefreshBackgroundTask.
We'll be handing this out to
applications that are providing
awesome relevant shortcuts that
our users are spending time
looking at, tapping through, but
not scrolling past.
When you get one of these
refresh background tasks, it's
your opportunity to update the
data that supplies your relevant
shortcuts, and give us a fresh
set of relevant shortcuts.
And, on a related note, if your background-capable intent executes, it does so inside of your intents extension. That means that if you update your data store, it's possible that the UI in your application has become stale.
So, in watchOS 5, we've also created the WKIntentDidRunRefreshBackgroundTask.
And, we'll hand that to you with some runtime when your intents extension completes an execution of a shortcut.
That is your opportunity to make
sure that your UI's up to date,
perhaps request an updated
snapshot, or reload, or extend
your complication timeline.
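Handling both new task types might be sketched like this in the extension delegate's background task handler:

```swift
import WatchKit

func handle(_ backgroundTasks: Set<WKRefreshBackgroundTask>) {
    for task in backgroundTasks {
        switch task {
        case let refreshTask as WKRelevantShortcutRefreshBackgroundTask:
            // Update the data behind our relevant shortcuts, then
            // call setRelevantShortcuts(_:) with a fresh set.
            refreshTask.setTaskCompletedWithSnapshot(false)
        case let intentTask as WKIntentDidRunRefreshBackgroundTask:
            // Our intents extension just ran: refresh stale UI,
            // reload or extend the complication timeline, and
            // request an updated snapshot.
            intentTask.setTaskCompletedWithSnapshot(true)
        default:
            task.setTaskCompletedWithSnapshot(false)
        }
    }
}
```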
So, let's talk about how we can
take advantage of these API's on
iOS.
And, the great news is, with the
exception of those
WKRefreshBackgroundTasks, we can
use all of the same exact API's
in our iOS application.
The relevant shortcuts we
provide in our iOS app will be
periodically synced from the
phone to the watch.
And, merged in with local relevant shortcuts for consideration to be displayed on the watch face.
You have all of the same UI
customization options available.
The only difference is if your
relevant shortcut will execute
back on the phone, we'll show
your iOS application's icon instead of your watchOS application's icon.
And now that we can be showing
relevant shortcuts that will
execute on the phone or the
watch, let's talk about where
they will execute when the user
taps on them.
If a relevant shortcut can be
handled by an application that's
installed locally on your watch,
regardless of where it came
from, we're going to execute it
in the watchOS application.
That gives the user the best
experience and the lowest
latency.
You can ensure that your watchOS
application supports execution
of a shortcut by making sure the
user activity's activityType is in your Info.plist's NSUserActivityTypes array.
Or, if it's an intent, that your
intent is listed in your intents
extension.
If instead the only application installed that can handle your shortcut is on iOS, we'll execute it back on the phone, even if the phone isn't near the user.
Now, because we can be executing
content back on the phone, we
want to make sure that users
don't tap on a platter, and then
are immediately told to go find
their phone.
So, we have a couple of rules
about the type of shortcuts that
will execute back on the phone
that we're willing to surface.
The requirements are that they're intent-based, they handle background execution, and they don't require access to protected data.
And, that's the data that you
keep encrypted when the phone is
locked.
The way you can ensure that your
content meets these requirements
is by taking a look at the
parameter combination in your
custom intent, and ensuring that
background execution is
supported.
And, looking at your overall
custom intent and seeing if you
have any authentication
restrictions you've applied.
There are three options here,
and we support the first two.
The first one is that there are
no restrictions.
The second one says that you are
restricted while the device is
locked, which means that at
least the watch has to be
unlocked for us to execute.
And the final one is Restricted
While Locked, or Protected Data
is Unavailable.
Now that you have all this
information you're ready to
build some awesome shortcuts
from both the watch and the
phone.
So, I'd like to hand things over
to Josh to talk a little bit
more about our prediction
engine.
Thank you.
[ Applause ]
>> Thank you, Paul.
Good morning.
My name is Josh, and I am an
engineer on the watchOS team.
And, we're really excited to see
the kinds of relevant shortcuts
that you can surface on the Siri
watch face.
So, Paul just walked you through
some of the API's and different
ways that you can provide your
relevant shortcuts to the
system.
Now, I want to talk about how
the system predicts your
relevant shortcuts, and the
things that you can do to ensure
the system surfaces your content
when it's most appropriate to
the user.
As Paul talked about
earlier, content on the Siri
watch face is ordered by
relevance to the user.
At a wrist raise, we want to be
surfacing the content the user
cares about most.
Whether that be based on the
current time, location, or other
factors from across the system.
And, to figure out what content
the user cares about, we're
looking at how they interact
with the different platters on
the Siri watch face.
What things are they tapping on?
What things are they spending a
lot of time looking at?
And, what things are they
scrolling past to find other
content?
And, we use all this information
to understand what are the
different platters that the user
wants to see right on a wrist
raise.
And so, because we're trying to
surface, again, the content the
user cares about most, you want
to make sure that you're
providing relevant, engaging
content to be surfaced on the
Siri watch face.
So, let's talk a little bit more
about how we actually put your
relevant shortcuts on the Siri
watch face.
So, like Paul mentioned earlier,
the first step is you need to
provide your relevant shortcuts
to the system, into the default relevantShortcutStore.
Once you've provided your
relevant shortcuts, we can run
them through our machine
learning model to figure out
what is the best way to be
surfacing your relevant
shortcuts to the user.
So, let's look at what we take
into account inside of this
model.
So, the first thing that we look
at is your relevance providers.
This is your way to provide
additional context to the system
that we wouldn't otherwise have.
This might be a concrete time,
location, or other context.
We also look at past behavior.
So, how has the user interacted
with this relevant shortcut
before?
Is it something they're spending
a lot of time tapping on and
looking at on the watch face?
Or, something that they're
scrolling past to find something
more interesting?
We also look at a number of
different factors from across
the system, such as, you know,
the current time, current day of
the week, current location,
user's routine, among a number
of other factors.
And, take all this into account
to try to better understand what
is the context under which the
user wants to interact with the
particular shortcut.
And, it's also worth noting that
this model is secure and
personalized to each user.
All of our learning happens
on-device, and we're building a
model for each and every single
user of the Siri watch face.
The way that you interact with
the watch face may be different
from the way that I interact with the watch face.
And, once we have this model
trained we can then take your
relevant shortcuts, and again,
surface them on the Siri watch
face based on their relevance to
the user.
If you attended some of the
previous shortcut talks, you may
have heard about donations.
This is your way to indicate to
the system what the user's doing
within your app.
And, although they don't
actually surface within the UI
of the Siri watch face, we take
them into account when trying to
understand the user's past
behavior.
So, again, donations are your
way to indicate important tasks,
or other information about what
users are doing inside of your
applications, and providing that
to the system.
And, by giving this information
to the system, we can understand
and learn patterns in the user's
behavior.
So, if the user, for example, is
looking or performing the same
action at the same time every
single day, or around similar
locations, the system can pick
up on these patterns, and again,
surface the relevant shortcuts
when they're appropriate to your
users.
So, if you're using
NSUserActivities to represent
your donations, there's a few
things you need to do to be able
to provide those donations to
the system.
The first is you need to set both of these properties to true on your NSUserActivity: isEligibleForPrediction and isEligibleForSearch.
Next, you need to make sure that
your user activity is supported
by an application, by indicating
that within your info.plist.
And, finally you want to make
sure that you're donating these
user activities whenever this
piece of content is visible to
the user, so the system can
start to pick up on the
patterns.
And so, to provide that
donation, there's a method on
NSUserActivity, becomeCurrent,
that allows you to donate to the
system.
New in watchOS 5, though, there is this update(_:) method on WKInterfaceController, where you
can attach your user activity to
an interfaceController, and
whenever this
interfaceController's visible to
the user, the system will
automatically be donating the
user activities on your behalf.
This is similar to the API's
that we have over on iOS, on
UIViewController, and
UIResponder, where you can, kind
of, attach an NSUserActivity to
a piece of your UI.
And so now, this is the
recommended way to be providing
NSUserActivity donations on
watchOS.
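A sketch of that donation path, with a hypothetical interface controller for the meal logging example:

```swift
import WatchKit

class MealInterfaceController: WKInterfaceController { // hypothetical controller
    override func didAppear() {
        super.didAppear()

        let activity = NSUserActivity(activityType: "com.myapp.LogMeal")
        activity.title = "Log Dinner"
        // Both flags must be set for the donation to count.
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true

        // Attach the activity to this controller; the system
        // donates it on our behalf whenever the controller
        // is visible to the user.
        update(activity)
    }
}
```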
If your shortcuts are backed by
an intent, you can provide those
intent donations by using the
INInteraction API.
So, if you attended any of the
previous shortcut talks, this
probably looks familiar to you.
But, to provide this donation,
first you create your intent,
and adjust any parameters as necessary for what the user just did within your app.
Next, you create your INInteraction with that intent, and call the donate method when users perform this particular interaction.
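For instance, an intent donation might be sketched like this, using the hypothetical LogFavoriteMealIntent from earlier:

```swift
import Intents

// LogFavoriteMealIntent is the hypothetical custom intent from
// the earlier example.
let intent = LogFavoriteMealIntent()
intent.favorite = "Cookies"

// Donate every time the user actually performs the action.
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}
```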
The other thing to look out for,
is this primary shortcut type
for when we're making watchOS
predictions.
And so, let's take a look at
this.
Inside of the Xcode intents
editor, you'll notice that
there's a field to select the
primary shortcut type.
So, this allows you to indicate
to the system, like, kind of,
the most critical use cases for
your app.
And so, let's talk more about
what that means.
So, by indicating the primary
shortcut type, you're, kind of,
telling the system what are the
use cases that you think your
users care about most.
And, this helps us expedite our
learning process.
You can indicate one of these
per intent that you've defined.
And, for the best performance, the parameter combination you select should be a subset of the parameters that you're providing in your relevant shortcuts.
And, we'll walk through a couple
of examples to try to better
understand what that means.
But, by giving this information
to the system, we can much more
quickly pick up on patterns in
the user behavior, and much more
quickly understand what are the
relevant shortcuts that the user
cares about most.
So, first up, we have this
Messaging app.
And so, this is something that
Paul and I have been using for a
while now, so every morning
we're sending messages back and
forth.
Sometimes it's in preparation
for this talk, and other times
it's about what cookies we want
to get at lunch.
And so, this app has gone ahead
and adopted Siri shortcuts.
And, they defined a couple
different parameters that their
app supports.
So, the first one is the
recipient.
So, who am I sending a message
to?
And, the second one is the
message.
So, what is the content of the
message I'm trying to send?
So, like I mentioned before,
right, Paul and I are using this
app every single morning, but
the actual content of the
messages that we're sending
varies from day to day.
So, because of this, this may
not make a good candidate for
the primary shortcut type.
It's going to take a lot longer
before the system can, kind of,
understand what are the
shortcuts I'm trying to perform.
Whereas, Paul and I are having
very consistent conversations,
so the system can really quickly
pick up on this pattern, that
every morning, I'm sending
messages to Paul.
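As a sketch of what that donation half might look like, here's the system INSendMessageIntent being donated on each send (a real messaging app would likely define its own custom intent, and the recipient-building shown here is simplified):

```swift
import Intents

// Donate a send-message interaction each time the user sends a
// message, so the system can learn consistent patterns -- like
// messaging Paul every morning -- even when the content varies.
func donateMessage(to recipientName: String, content: String) {
    let recipient = INPerson(personHandle: INPersonHandle(value: recipientName,
                                                          type: .unknown),
                             nameComponents: nil,
                             displayName: recipientName,
                             image: nil,
                             contactIdentifier: nil,
                             customIdentifier: nil)
    let intent = INSendMessageIntent(recipients: [recipient],
                                     content: content,
                                     speakableGroupName: nil,
                                     conversationIdentifier: nil,
                                     serviceName: nil,
                                     sender: nil)
    INInteraction(intent: intent, response: nil).donate { error in
        if let error = error {
            print("Donation failed: \(error.localizedDescription)")
        }
    }
}
```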
A slightly more interesting example is this app that I use to order my morning coffee.
And again, this app has gone and
adopted Siri shortcuts.
So, they support a couple
different parameters.
The first one is the type of
coffee I want to buy.
The next one is the condiments,
so do I want cream or sugar?
And, finally, the location that
I want to pick my coffee up
from.
So, every morning I use this app
and go place an order for my
morning coffee.
But, depending on where I have a
meeting, the location that I
want to pick my coffee up from
will change from day to day.
Maybe I'm going to Infinite
Loop, or Apple Park, or up to
San Francisco, and I want to
pick up my coffee at, you know,
the closest store.
So, again, because there's a lot
of variability in the location,
this may not make a good
candidate for the primary
shortcut type.
It's going to take a lot longer
for the system to, kind of,
understand the patterns in my
behavior.
Whereas, the actual order that I
place is the same thing every
single time.
I always order a latte with the
same amount of cream and same
amount of sugar.
So, this does make a really
great candidate for the primary
shortcut type.
And, coffee on its own might be too generic, right? The coffee plus the condiments might make a better primary shortcut type.
And so, that is how you can
provide information from within
your apps about what users are
doing.
So, now let's talk about how we
predict your relevant shortcuts.
And, to talk about this, we have
a couple different apps that
we're going to be looking at.
The first one is this recipes
app.
And so, this is an app that I
use every single day that
provides me suggestions for
recipes that I might want to
try.
Next, we have this fitness
trainer app that provides me a
nice reminder to go out and do
my evening run.
And finally, there's this travel guide app that tries to surface locations of interest while I'm out and about, so I can make sure to go check them out.
And so, each of these three apps falls into one of the three categories that we have.
The first one is what we call
"Downtime."
And so, this is something that
doesn't have a concrete time or
location associated with it.
When I want to interact with
this recipes app may vary from
when you want to interact with
this particular app.
Whereas, this fitness trainer
app has a very concrete time
associated with it, right?
I've gone and configured that 8
P.M. is the perfect time to be
surfacing this reminder to me.
And, by giving this information
to the system, we can much more
accurately rank this particular
relevant shortcut against
everything else that we have to
consider.
Similarly, this travel guide app
knows where they want to be
surfacing their particular
relevant shortcuts, right?
They know all the locations of
interest, in this case,
Golden Gate Park.
And again, by passing that
information to the system, we
can much more accurately predict
the relevance of this particular
shortcut.
And, to provide this additional
context to the system, whether
that be a concrete time,
location, or other information,
you can provide that through the
relevanceProvider API.
And so, this allows you to give the system hints about when your content is most important to your users.
It also allows you to surface
new content that users may not
have seen before.
Since you're providing these additional hints to the system,
we don't need to have seen as
much of the user's past behavior
and consistency before the
system becomes confident that
this is something the user cares
about.
But, it's also important to keep
in mind that user engagement is
taken into account at every step
of the process.
Whether you provide relevance providers or not, we want to make sure that the content that we're surfacing is stuff that the user cares about.
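As a rough sketch of the shape of this API: you attach relevance providers to an INRelevantShortcut and hand the full set to the store. The `shortcut` parameter here stands in for one built from your own intent or user activity.

```swift
import Intents

// Wrap an existing INShortcut in an INRelevantShortcut, attach
// relevance providers as hints, and hand the whole set to the system.
func provideRelevantShortcuts(for shortcut: INShortcut) {
    let relevantShortcut = INRelevantShortcut(shortcut: shortcut)
    // An empty array is also valid; the system then relies purely
    // on the user's observed behavior.
    relevantShortcut.relevanceProviders = [
        INDateRelevanceProvider(start: Date(), end: nil)
    ]
    // Replaces the complete set of relevant shortcuts for this app.
    INRelevantShortcutStore.default.setRelevantShortcuts([relevantShortcut]) { error in
        if let error = error {
            print("Failed to set relevant shortcuts: \(error.localizedDescription)")
        }
    }
}
```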
So, let's take another look at
this recipes app.
So, like what we talked about
earlier, there's not really a
concrete time or location
associated with it, right?
The time that I want to interact
with this app may be for cooking
dinner, whereas for you it might
be for cooking lunch.
And so, in this case, we may
actually indicate an empty set
of relevanceProviders.
And, in this case, the system
will determine the relevance of
this particular shortcut based
on how the user's interacted
with it before.
So, for me, I always use this
around 7 P.M. because that's
when I go cook dinner, so that's
when the system will start to
surface this particular relevant
shortcut.
If we take a look, however, at
this fitness trainer app, which
does have a concrete time
associated with it, you can
provide that by creating a
dateRelevanceProvider, and in
this case providing the 8 P.M.
start time.
And, by giving this information
to the system, we will surface
this content around a specific
time of day.
And, as you move closer and
closer and closer to 8 P.M.,
this content becomes more and
more relevant to the user.
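For the fitness trainer case, the date hint might be built something like this. This is a sketch; `eveningRun` is an assumed INShortcut created elsewhere in the app.

```swift
import Intents

// Hint that the run reminder is most relevant around 8 P.M. today.
func makeRunReminder(from eveningRun: INShortcut) -> INRelevantShortcut {
    var components = Calendar.current.dateComponents([.year, .month, .day],
                                                     from: Date())
    components.hour = 20  // 8 P.M.
    let eightPM = Calendar.current.date(from: components)!

    let relevantShortcut = INRelevantShortcut(shortcut: eveningRun)
    // A nil end date means "around the start time"; relevance ramps
    // up as the user approaches 8 P.M.
    relevantShortcut.relevanceProviders = [
        INDateRelevanceProvider(start: eightPM, end: nil)
    ]
    return relevantShortcut
}
```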
So, let's look at what that looks like.
So, here's a couple cards that I
might have on my Siri watch
face.
The first is a reminder from
Calendar for a prep meeting for
this talk.
Next, is an app that I use that
gives me interesting tidbits of
space news.
And finally, we have this
fitness trainer app, so let's
focus in on that.
So, I just woke up, I'm getting ready for my day. It's around 8 A.M.
And so, you can see that this
fitness trainer app is actually
already starting to surface on
the Siri watch face.
We want to make sure that users are aware of all the things they have going on during the day, but it's placed much lower on the face.
It's not the most pressing thing
to the user right now.
But, as we start moving
throughout the day, right, that
calendar event that I had is no
longer relevant to me.
And so, this fitness trainer
suggestion will start to bubble
up the face, right?
We're getting closer and closer
to 8 P.M. This content is
becoming more important to me.
And, once we finally reach 8
P.M., right, this is the most
important thing to me right now.
I want to make sure I don't forget to go for my run.
So, it'd be surfaced up at the
top of the Siri watch face.
Now, let's take a look at the travel guide app, which again is trying to surface content around locations of interest to the user.
And, as you may have guessed,
you can provide this information
to the system using a
locationRelevanceProvider.
In this case, near Golden Gate
Park.
And, by providing this
information to the system, we
will automatically be surfacing
this content when the user
starts getting close to Golden
Gate Park.
And, again, as the user gets
closer and closer and closer to
Golden Gate Park, this content
becomes more relevant to them.
So, if I just arrive up in San
Francisco, you can see that
we're already starting to
surface this suggestion to go
check out Golden Gate Park.
It's displayed lower on the
face, because it's not the most
pressing thing to me right now.
But, we want to make sure that the user is aware of this.
And, as I move closer and closer
to Golden Gate Park, this
content becomes more relevant to
me, until I've finally arrived,
and I can really easily tap on
this to dive into more details,
and check out what are the
things I can do while I'm at
Golden Gate Park.
When you're creating your
locationRelevanceProviders, you
provide to the system a
CLRegion.
And so, this allows you to
indicate two important
properties.
The first is the actual coordinates, the latitude and longitude where you're interested in surfacing your content, along with the radius. So, how close does the user need to be before this content is relevant to them? There's also a couple of properties on CLRegion that you can use to adjust how the system interprets your region.
The first one is notifyOnEntry.
So, as the name might suggest,
the system waits until the user
has entered into this region
before this piece of content
becomes relevant to the user.
Similarly, there's a
notifyOnExit property that you
can set, where the system will
wait until the user has left
this particular region before
this content becomes relevant.
We actually use both of these properties with the existing location-based reminders on the Siri watch face, to get a geofencing-like behavior.
By default on CLRegion, both of these properties are set to true.
And, in that case, we'll
smoothly interpolate the
relevance based on how close the
user is to a particular
location.
And so, creating your CLRegion
is really easy.
In this example, we're creating
one around Apple Park, so I've
brought in the coordinates for
Apple Park, along with the
radius.
So, how close do I need to be?
In this case, 2 kilometers.
Next, I will go adjust the
notifyOnEntry, notifyOnExit
properties as they make sense
for my use case.
And, once I have my region fully
configured, I can really easily
create a
locationRelevanceProvider.
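That Apple Park region might look like this in code. The coordinates and radius are just the values from the example above.

```swift
import CoreLocation
import Intents

// A 2-kilometer circular region around Apple Park.
let applePark = CLLocationCoordinate2D(latitude: 37.3349,
                                       longitude: -122.0090)
let region = CLCircularRegion(center: applePark,
                              radius: 2_000,  // meters
                              identifier: "Apple Park")
// Both default to true; adjust them for geofence-style behavior.
region.notifyOnEntry = true
region.notifyOnExit = true

// The finished provider, ready to attach to an INRelevantShortcut.
let locationProvider = INLocationRelevanceProvider(region: region)
```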
A couple of quick notes about
the locationRelevanceProvider.
In order to use it, your app
needs location authorization,
otherwise we're going to ignore
this particular
locationRelevanceProvider.
And, to preserve user battery
life, the number of location
updates we get during the course
of an hour is limited, so keep
that in mind.
So, that's some of the ways that
you can provide concrete times
and locations to the system.
Now, let's talk about what you
can do to try to provide
personalization.
So, we have this
dailyRoutineRelevanceProvider.
And so, like Paul mentioned
earlier, this allows you to
surface content at meaningful
times or locations to the user.
And, these times and locations
are personalized to each and
every single user of the Siri
watch face.
So, the first situation that
we'll talk about is the morning
situation.
This allows you to surface
content right when the user
wakes up in the morning.
We actually already used this
for the existing first-party
weather data source on the Siri
watch face, so we can provide
the user a nice summary of the
forecast at the beginning of the
day, and it quickly dismisses,
making room for other content
they might be interested in.
We also have an evening
situation.
And so, this allows us to
surface content before the user
goes to bed.
So, we use this one for the new
heart rate card in watchOS 5, so
that we can provide the user a
summary of their heart rate
activity throughout the day.
And, creating a
dailyRoutineRelevanceProvider is
really easy.
You've just got to specify which
situation you're interested in.
So, in this case, we can really
easily create a
dailyRoutineRelevanceProvider to
surface content right when the
user wakes up.
And, the behavior for both of these situations is very similar to a dateRelevanceProvider, except that the system is automatically figuring out the dates on which we should be surfacing this content.
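Creating one is a single line; for example, the morning situation that the weather card uses:

```swift
import Intents

// Surface content when this particular user typically wakes up;
// the system figures out the actual time on a per-user basis.
let morningProvider = INDailyRoutineRelevanceProvider(situation: .morning)

// Other situations include .evening, .home, .work, .school, and .gym.
let eveningProvider = INDailyRoutineRelevanceProvider(situation: .evening)
```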
We also have a few situations available to you that allow you to surface content at locations that are meaningful to the user, whether that be home, work, school, or when the user arrives at the gym.
And again, creating one of these
dailyRoutineRelevanceProviders
is really easy.
You just need to specify which
situation you're interested in.
And, for all of these
situations, the behavior of this
relevanceProvider is similar
again to a
locationRelevanceProvider,
except that the system is
automatically figuring out the
locations on your behalf.
And so, as the user gets closer
to these locations, your content
will become more relevant to
your users.
So, that is some of the insight into how we predict your relevant shortcuts on the Siri watch face, and some of the things that you can do to provide additional context to the system so that we can surface your content when it's most appropriate.
Now, let's talk about how you
can build a great experience on
the Siri watch face.
So, we've been working on the
Siri watch face for a while now.
And, we've learned quite a few
things along the way.
The first of those is that there are two high-level categories of content that we think work great on the Siri watch face.
The first one of those is
glanceable information.
So, being able to really easily,
on a wrist raise, get a snippet
of information that I care
about, and being able to tap on
it when it's appropriate to dive
into more details.
The second is tappable actions.
So, being able to really easily,
from my watch face, tap on a
platter, execute a complex
action, and get right back to my
day.
So, let's take a look at the
glanceable information.
So, you can see we have this
recipes app again.
And, throughout the day, when
I'm glancing at my wrist, I can
get a nice snippet of
information, right?
I can see the recipe that I
might be cooking later today.
And, I can really easily decide
is this something I want to
make, or do I want to go out
tonight?
And, when I'm getting ready to
go home in the evening, I can
really easily tap on it to dive
into this app and see, do I need
to stop at the grocery store on
my way home?
So, if you're trying to provide
glanceable information to the
user, there's a few things that
you want to keep in mind.
Make sure that you're surfacing
the most critical information in
your relevant shortcuts.
This is the content that users
are going to see at a wrist
raise throughout the day, and
this is the content that they
really want to see right on
their watch face.
When the user taps on your
relevant shortcut, it should be
opening up into a location
within your app, and providing
them additional details.
Accidental taps on the Siri
watch face do happen, and so,
you want to make sure you're not
kicking off any long-running
background task that the user
isn't aware of.
And, like Paul mentioned
earlier, the system will
automatically be giving you an
additional background refresh
task to go provide new content.
Take advantage of this, so you
make sure you have relevant and
interesting content available on
the Siri watch face.
And, whenever you get new data,
whether that be through this
background refresh task, because
you've gone to the network and
downloaded new data, or because
the user has started using your
app, make sure that you provide new relevant shortcuts, so the system can always be surfacing the most interesting content to the user right on a wrist raise.
And finally, if the information
that you're trying to surface is
timely, indicate that to us
through the relevance providers,
so the system can surface your
content, again, when it's most
appropriate.
And so, that is glanceable
information.
Now, let's talk a little bit
about tappable actions on the
Siri watch face.
So, you can see we have this
fitness trainer app again.
And, I can really easily, right
on my wrist, get this reminder
that I need to go out for my
run.
And tapping on it, I get this
nice confirmation to make sure
that this is the actual action
that I want to execute.
And, in two taps, I can start
this workout.
I don't need to spend time
digging through apps to find
this particular fitness trainer
app, and kick off my workout.
I can really easily in two taps,
right from my watch face, kick
off this workout.
So, if you're providing tappable
actions, a few things to keep in
mind.
Your intents are running within
your Siri extension.
And, this allows the system to
automatically run your intents
in the background, so the user
never has to leave the Siri
watch face to go run your
actions.
We always display the standard
system confirmation UI, so that
users know the action they're
going to run, and we can make
sure this is what they want to
perform.
You also want to make sure that
the relevant shortcuts that
you're creating are fully
specified, so that your intents
extension can handle them
without any additional user
confirmation.
So that when the user taps on one of these relevant shortcuts on the Siri watch face, we don't need to open up your app to continue and get additional information from the user.
We can kick off your SiriKit
extension, run it in the
background, and the user can get
on with their day.
And, you want to be providing
commonly used tasks within your
app as relevant shortcuts.
These are the kind of tasks that
users are using your apps a lot
for, and that they want to have
really quick and easy access to
from the watch face.
And, you also want to make sure
that you're providing relevant
shortcuts often.
You may not know when you last provided relevant shortcuts to the system, and you want to make sure that your users can always access your actions right from the Siri watch face.
If you're going to be providing
these when your app starts up,
do it on a background thread so
you're not slowing down your app
launch.
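A sketch of that launch-time pattern; `buildRelevantShortcuts()` is a hypothetical helper that assembles your app's current set.

```swift
import Intents

// Hypothetical helper that assembles the app's current set of
// relevant shortcuts from its own data.
func buildRelevantShortcuts() -> [INRelevantShortcut] {
    // ...build and return your shortcuts here...
    return []
}

// Refresh relevant shortcuts at launch without blocking startup.
func refreshRelevantShortcutsAtLaunch() {
    DispatchQueue.global(qos: .utility).async {
        let shortcuts = buildRelevantShortcuts()
        INRelevantShortcutStore.default.setRelevantShortcuts(shortcuts) { error in
            if let error = error {
                print("Failed to update relevant shortcuts: \(error.localizedDescription)")
            }
        }
    }
}
```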
So, we've talked a lot about
relevant shortcuts, and the ways
that you can surface your
information and actions right on
the Siri watch face.
You want to make sure that
you're also providing relevant
and engaging content.
This is the kind of content that
users want to see right on a
wrist raise, and the kind of
content that we are surfacing on
the Siri watch face, and trying
to promote to the top.
The richest experience happens
when you have a watchOS app.
This is when the system can give
you additional background
refresh tasks, and there's no
latency between the user tapping
on one of the relevant
shortcuts, and us beginning
execution.
If you have any questions, we
have a lab later today.
We would love to talk to you
guys about relevant shortcuts
and the Siri watch face.
And, we're really excited to see
the kinds of experiences that
you guys can create right on the
watch face.
Thank you.
[ Applause ]