Transcript
>> ANIL KANDANGATH: Good
afternoon and welcome.
My name is Anil Kandangath
and today I'm going to talk
about what is new
in Core Motion.
We have quite a few
things to talk about.
But today we will stick
to a few key points.
We will begin by talking
about the Apple Watch.
We have a new platform
and we will talk about how
to bring your Core Motion
apps to the Apple Watch.
We also have new updates
to the pedometer
including some new APIs.
We will also talk
about the altimeter
which is a sensor we
introduced in the iPhone 6.
Gabrielle will then walk
us through an application
that uses Core Motion in a
really, really interesting way.
I've seen it and
it's pretty awesome.
So let's get started.
Now, this is not an introductory session, but we will cover enough Core Motion concepts for you to be able to follow along even if you are not intimately familiar with Core Motion itself.
But for those of you here
at the conference we do have
a lab immediately following
the session.
So please come by with
all your questions.
We have the engineers
and scientists on hand.
If you have no questions,
congratulations.
[Laughter.]
>> ANIL KANDANGATH:
Still come by
and tell us how you are using
Core Motion in your apps.
We would love to hear from you.
We do have some great past
sessions on Apple developer
where we go into great
detail about motion sensing
and motion activity
and the pedometer.
So do check them out.
Let's get started.
Now, motion sensing
has come a long way
in iOS since the early days.
We will begin with
just a quick overview
of how things work today.
This is a traditional
motion sensing architecture.
You take some sensors, hook them up to the main processor, and the drawback should be obvious.
Any time you want to
access sensor data,
you do have to keep the
main processor active.
This severely limits the kinds
of applications that
you can write.
So we thought we
could do better.
Starting with the iPhone 5S we
introduced a motion coprocessor
that we called the M7.
This is a tiny processor
whose sole job is
to process Motion data.
It can chug along all
day long and only turn
on the main processor
when it really needs to.
This is great for the
battery life of the users.
In the iPhone 6 we introduced the M8 processor and a new sensor, the altimeter.
This is what motion sensing
looks like today on iOS.
So what does it get you?
Well, if you take the sensors
and access data live you do get
not just the raw sensor data
but also features built
on top of the sensors
like device motion, which gives you the attitude of the device,
or the pedometer
or motion activity.
On platforms that have the
altimeter you also get access
to the raw pressure, altitude
changes and flights of stairs.
So that is live data
from the sensors.
But the promise of the
motion coprocessor is
that you can do more
than this, right?
Indeed, you can.
You also get 24/7
access to motion activity
and the pedometer
and on platforms
that have the altimeter you
also get flights of stairs 24/7.
So this is what you can do with
motion sensing on iOS today.
So what does motion sensing
look like on the Apple Watch?
Well, the Apple Watch
also has a coprocessor
and it has an accelerometer
which means you get the same
24/7 access to motion activity
and the pedometer and in terms
of live data you get sensor data
in addition to these two.
So if you are thinking, gee, this all looks so familiar, maybe Core Motion is very similar on the Apple Watch too, you would be right.
Most Core Motion APIs on iOS
are available on watchOS,
and not only that most Core
Motion APIs behave the same
on iOS and watchOS.
So this is a quick run down
of the features we have
on both OSs.
But the Watch is fundamentally
a different platform.
So there are some
things we need to know
as we build our applications
for the Watch
and I'll go through them.
We'll start with
motion activity.
Motion activity is what gives
us contextual information
about what the user is doing.
You can tell if they
are walking, running,
driving, cycling, et cetera.
And the states that you can obtain are very much dependent on the platform and how it is used.
So this is a quick summary of
the states that you can get
on watchOS and you'll notice
that you can get access
to walking, running, cycling,
and stationary states.
And that's motion activity
on the Apple Watch.
Developers have wanted access to the sensors themselves
and we do provide access
to the accelerometer
through the familiar
CMAccelerometer API.
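In Swift, subscribing looks roughly like this. This is a minimal sketch using CMMotionManager; the 50 Hz rate and the process helper are illustrative assumptions, not part of the API:

    import CoreMotion

    let motionManager = CMMotionManager()

    func startStreamingAccelerometer() {
        // Always check hardware availability first.
        guard motionManager.isAccelerometerAvailable else { return }

        // Pick the lowest rate your feature actually needs.
        motionManager.accelerometerUpdateInterval = 1.0 / 50.0
        motionManager.startAccelerometerUpdates(to: .main) { data, error in
            guard let data = data, error == nil else { return }
            // Acceleration is reported in g's, one sample per callback.
            process(data.acceleration)   // hypothetical helper
        }
    }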
And that should be all there
is to the accelerometer
but it is a different platform
and there are some
considerations
that we have to keep in mind.
The first is that your app may get a limited amount of processing time.
There are no real background processing modes on the Watch, so your app pretty much runs only when it is up on the screen.
And the screen may turn
off for many reasons.
The screen may time out.
But the Apple Watch screen is
also designed to turn on only
when you're looking at it.
So if you turn the screen away from you, there is a good likelihood the screen may turn off and your app may not get processing time after that.
So with all these considerations in mind, there are a few best practices I would like to talk about.
The first is to design your
app to only expect data
when the app is on screen.
Now, I know this is easier said than done, but it is something to keep in mind.
The other is that, as you are accessing a stream of sensor data, you have to ensure that you can handle your task being suspended
in a graceful way.
And fortunately we do
have a way to do this
through NSProcessInfo.
If you use this API
performExpiringActivityWithReason, you can
perform your sensor data
processing as a block
in this API
and this will inform your
block that your task is
about to be suspended so
you can do the right thing.
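A minimal sketch of that pattern; the two worker functions are hypothetical placeholders for your own processing:

    import Foundation

    ProcessInfo.processInfo.performExpiringActivity(
        withReason: "Process accelerometer samples") { expired in
        if expired {
            // The system is about to suspend us: checkpoint and bail out.
            saveProgress()              // hypothetical helper
            return
        }
        // Otherwise it is safe to keep crunching sensor data for now.
        processPendingSensorData()      // hypothetical helper
    }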
That is the CMAccelerometer API.
But some of you want more,
a lot more than that.
In fact you wanted to
get access to sensor data
for a long period of time.
So today I'm pleased to say
that for the first time
ever we are providing access
to historical sensor data as
the historical accelerometer.
Now what does this do for you?
It allows you to access
data for long durations.
For really long durations.
And you can access
this data even
when your app is not running.
So this enables you to perform your own custom algorithms on top of these long streams of sensor data.
This is really cool
and we expect you
to make really cool
apps with this.
So how do you do this?
We have a new API called
the CMSensorRecorder,
and this enables you to
initiate historical sensor
data recording.
Now keep in mind, this
is a key difference
between this historical API
and the other historical APIs
we have such as the pedometer
and motion activity where
you do not really need
to initiate any recording.
But for the SensorRecorder
you do need to initiate it,
and the data is recorded
at 50 Hertz,
and you can query it
for up to three days.
So that's how you use
the SensorRecorder.
In terms of the implementation,
here is how we do it.
The first thing you want to do is initiate recording, and you do it by calling recordAccelerometerDataFor and providing a duration.
Once you do this, the
device may go to sleep.
Your app may get suspended.
It's all okay.
Later the user may come
and launch your app.
When they launch your app
you may decide to query
for sensor data and you query it
by saying accelerometerDataFrom
and provide a time range.
The accelerometer data will then be returned for the time range specified.
And it should be obvious that
the time range can be a subset
of the time that you have
been recording data for.
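In Swift, those two calls look roughly like this; the two-hour recording and the 30-minute query window are just illustrative values:

    import CoreMotion

    let recorder = CMSensorRecorder()

    // 1. Kick off recording; here, two hours.
    if CMSensorRecorder.isAccelerometerRecordingAvailable() {
        recorder.recordAccelerometerData(forDuration: 2 * 60 * 60)
    }

    // 2. Later, even after a relaunch, query any sub-range of that window.
    let end = Date()
    let start = end.addingTimeInterval(-30 * 60)
    let list = recorder.accelerometerData(from: start, to: end)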
Now, the data is
returned as a sequence
of CMAccelerometerData objects.
Cast your mind back
to what we said
about handling task suspensions gracefully.
You will encounter the
same problem here, too.
So as you are processing
this big stream
of historical data you do have
to visit our old
friend NSProcessInfo.
Now, this will inform
you when your task is
about to be suspended.
So if you look at the
accelerometer data object,
it contains not just the
acceleration part you're
familiar with, but
also a startDate.
And you can use this
startDate as an anchor
to make sure the next time the
app is launched you can query
from this point onwards.
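Iterating the results and keeping such an anchor might look like this; the Sequence shim is a common workaround for enumerating CMSensorDataList from Swift, and the persistence helpers are hypothetical:

    import CoreMotion

    // CMSensorDataList only exposes NSFastEnumeration, so a small shim
    // lets us iterate it with for-in from Swift.
    extension CMSensorDataList: Sequence {
        public func makeIterator() -> NSFastEnumerationIterator {
            NSFastEnumerationIterator(self)
        }
    }

    let recorder = CMSensorRecorder()
    // loadSavedAnchor/saveAnchor are hypothetical persistence helpers.
    var anchor = loadSavedAnchor() ?? Date().addingTimeInterval(-30 * 60)

    if let list = recorder.accelerometerData(from: anchor, to: Date()) {
        for item in list {
            guard let sample = item as? CMRecordedAccelerometerData else { continue }
            process(sample.acceleration)   // hypothetical helper
            anchor = sample.startDate      // remember how far we got
        }
    }
    saveAnchor(anchor)                     // next launch, query from here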
So that is how SensorRecorder works.
Now, this is great, but as you saw, there is one consideration. Well, two: power and performance.
Now, the SensorRecorder
is a powerful API
but with great power
comes great danger.
And the danger here is that
you may not have enough time
to process large streams of sensor data.
So a few best practices again.
The first thing, and this may be obvious, is to record and query data only for the minimum duration required
by your app.
So if you are a workout app
you probably don't need 12 hours
of data.
You can probably get by with
a couple of hours of data.
The less data you query for, the easier it will be for your app.
You should also know the
sensor data rate requirements.
Now, the data is
available at 50 Hertz,
but unless you are planning to extract features from that data that require a high sensor rate, you are better off just dropping samples and processing less data.
So a few best practices
and you should be good.
And that's the Core Motion
update for Apple Watch.
As you've seen we had a
lot of the familiar APIs
and we also have access
to historical sensor data.
This should be really great, and you should be able to make really cool applications with this.
In terms of writing
Watch applications,
we did have a session
earlier this morning
that you can probably
go watch now.
For the rest of the talk I'll
focus on iOS and I'll begin
with updates to the pedometer.
This is the pedometer that
you are familiar with.
Steps and distance.
A quick recap.
The pedometer is designed
to give you consistent
performance
across body locations.
So whether it's in your hand
or if it's in your pocket,
the performance is
supposed to be consistent,
and it is also consistent
for varying pace.
Now a great feature
of this pedometer is
that it adapts to the user.
So the more the user uses the
pedometer, the better it gets
by calibrating itself
to the user.
In iOS 9 we are making one big improvement to the pedometer, which is that it incorporates GPS when possible.
Why do we do this?
Well, you may have an app
that is already subscribing
to location.
Maybe you are trying to
plot the user's track
as they go on a run.
When you do that,
the pedometer senses
that GPS data is available
and it gets more
accurate by using it.
Well, why is this so critical?
So take a look at this
run that a colleague
of mine did in San Francisco.
San Francisco is an urban canyon.
It has tall buildings and it
can be a challenging environment
for any GPS.
If you look at this
segment here,
GPS looks pretty solid here.
If you were to rely on
GPS we'd get really good
distance estimates.
That's good.
If you look at a
different segment here,
you can see that
there's quite a bit
of wander in the user's track.
Now granted there are quite a few pubs along the way --
[Laughter.]
But that is probably not how this person really ran.
The pedometer is smart
enough to understand
that this is a portion where
it should not rely on the GPS,
and it resorts to
its own stride-based
distance estimation.
The end result is that if you use the pedometer you get consistent performance, and that's why you should use steps
and distance from the pedometer.
We have one big API improvement
in the pedometer this year.
That is -- well,
before we go to that,
let's talk about one
you already know about
and that's floor counting.
So floor counting is available
as floorsAscended
and floorsDescended.
Because this is part
of the pedometer,
there is a pedestrian aspect
to it, which is that you have
to actually take steps.
What it means is that you
really have to earn your floors
to be awarded floors here.
Let's take a deeper look
at how this actually works.
It has a few requirements.
The first is that there is a
minimum ascend rate requirement.
There is also a steps
requirement.
What this all means for us is that if you were to, say, go on a long hike where you walk slowly and eventually accumulate quite a bit of altitude, the likelihood is that you won't hit the ascend rate requirements
and we won't award you floors.
But if you are walking
in San Francisco
where you have some really,
really steep hills
it is possible
that you will get a few floors.
Also because of the step
rate requirements if you were
to use the elevator
or the escalator you are not
likely to be awarded floors.
So that is how floor
counting works in iOS.
Now it's time for the new API.
That is pace.
Pace is something
that developers have
long asked us to provide.
We are happy to provide it now.
When we talk about pace, we
are referring to currentPace.
So this is really the
instantaneous pace
and not the pace as estimated
from the beginning of your run.
And it is provided in units
of time over distance.
If you are wondering why it is provided in this way and not as speed, which is what pace is normally analogous to, it is because for runners, pace is the time taken to cover a known distance.
If you have been following
the Apple Watch updates
on the website, this pace
might be familiar to you.
To me this is something that I'm just envious of.
[Laughter.]
>> ANIL KANDANGATH: The
pedometer has both a live
and historical aspect to it,
but pace is only available
when you do live queries.
So just keep that in mind.
Now why do we provide pace?
After all, we do provide
distance and a notion of time
in the pedometer, and we could just as easily compute pace from there.
The answer in one
word is robustness.
If you were to, say, do some finite differencing and try to estimate pace just from those distance chunks, we would introduce an unacceptable amount of jitter in the pace,
and the pedometer takes
care of this and ensures
that the pace estimation
is smooth.
The other part is you could
try to get a smooth pace
by looking back at a much
bigger chunk of the history
and then try to average
a pace over it.
But the cost of doing that
is you lose the ability
to respond quickly to
changes in the user's pace.
Pace from CMPedometer also responds very quickly to changes in the user's pace.
So smoothness and responsiveness
is why you should use pace
from CMPedometer.
And that's pace.
A close cousin of
pace is cadence.
And we now have cadence
in the pedometer.
Now, what is cadence?
Cadence is the rate of your
steps or in other terms,
it is how often your feet
are landing on the ground.
We know that the cadence is
really important to runners,
so now you can provide cadence
in your apps directly
from the pedometer.
Yeah! [Applause.]
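A minimal sketch of reading pace and cadence from live updates; the display helpers are hypothetical:

    import CoreMotion

    let pedometer = CMPedometer()

    // Pace and cadence are only delivered on live updates, so start from now.
    if CMPedometer.isPaceAvailable() && CMPedometer.isCadenceAvailable() {
        pedometer.startUpdates(from: Date()) { data, error in
            guard let data = data, error == nil else { return }
            if let pace = data.currentPace {
                // Seconds per meter; 0.373 s/m is roughly a 10-minute mile.
                showPace(secondsPerMeter: pace.doubleValue)      // hypothetical
            }
            if let cadence = data.currentCadence {
                // Steps per second.
                showCadence(stepsPerSecond: cadence.doubleValue) // hypothetical
            }
        }
    }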
>> ANIL KANDANGATH: So that is
our update for the pedometer.
Here is a quick run down of
the features across platforms
and note that the pedometer is also available on the Apple Watch.
Let's move on to
pressure sensing.
The pressure is available
through the altimeter sensor.
It is available as part
of the CMAltimeter API.
And it gives you two things. It gives you raw pressure, which is nothing but a filtered version of the pressure from the sensor itself, and it gives you relative altitude.
Let's take a deeper
look at altitude.
When we say relative,
it is relative
to the first sample provided.
What it means is that
the first sample you get
from the altimeter will have
a relative altitude of zero.
And every subsequent sample
will be baselined against that.
What do we need to know about
the altimeter to use it?
Well, the altimeter is really
great for floor level changes.
It is not really great
for human level changes.
So you should probably
not use it to know
if the user is raising
their arm.
The error is not going
to be good for you.
There are challenging
situations for the sensor
that you should keep in mind.
One is the environment.
The environment might change
over time in the same location
and give you a false
sense of altitude changes.
For example, the
recent cold front
in San Francisco would have
changed the pressure enough
for us to think that
your altitude changed
by say 15 meters.
You should probably
not use the sensor
over long durations of time.
The case that your device is in
can also impact the pressure.
If you are using a rigid sealed
waterproof case for your phone,
the pressure sensor
is not your friend.
In terms of using the API, it should be very familiar to you, and it gives you pressure and altitude.
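A minimal sketch of that API in Swift; the UI helper is hypothetical:

    import CoreMotion

    let altimeter = CMAltimeter()

    if CMAltimeter.isRelativeAltitudeAvailable() {
        altimeter.startRelativeAltitudeUpdates(to: .main) { data, error in
            guard let data = data, error == nil else { return }
            // relativeAltitude is meters relative to the first sample,
            // which reads zero; pressure is in kilopascals.
            updateUI(altitudeMeters: data.relativeAltitude.doubleValue,
                     pressureKPa: data.pressure.doubleValue)   // hypothetical
        }
    }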
How fast can you access
data from this sensor?
So the first time you make a
request, the first sample takes
around 2.6 seconds to come.
Every subsequent sample comes
at a cadence of 1.3 seconds.
That is how fast you can
get data from the altimeter.
So to summarize, Core Motion is now available on a multitude of platforms.
It is available on
the iPads, the iPhones
and also the Apple Watch.
On the Apple Watch we give you access to not just the familiar APIs but also historical sensor data.
The pedometer has a bunch of
really cool updates of its own.
It's more accurate than
ever, and it has two new APIs
in the form of pace and cadence.
So it is well-rounded now and should be great for your fitness apps.
Now Core Motion has
traditionally been used
for gaming, for gestures,
for fitness.
There's a bunch of
applications for Core Motion,
but Core Motion can also be
used to enhance the intelligence
of other kinds of apps.
To illustrate this
concept I am going
to invite Gabrielle on stage.
[Applause.]
>> GABRIELLE BADIE:
Hi, everybody.
I'm really excited to be here.
As a developer, I never got to attend WWDC, but every year I spent that first week of June watching all of the sessions on my computer at home, wherever I was working.
I learned a lot of information,
but one of the really
difficult things was
that there was too
much information.
Even in a given session
there is so much to absorb
that when you leave you may
even forget all of the things
that you learned
in the last hour.
What I'm here to do is to
take a few of the great things
that Anil just talked
to you about
and see exactly how you
might want to use them
in your application and
hopefully make them stick.
Now Core Motion is really great because it doesn't have to be used just in a fitness context or gaming context.
What I'm here to show you
today is how you can use it
in your app regardless of
what category it falls in.
So what can Core
Motion do for you?
Core Motion allows you to
detect what a user is doing.
You can see if a
user is in their car,
going on a run, if
they are cycling.
You can see their changes in pace and, as we've seen, we smooth that out for you.
And so by using those updates,
we can tell what context a user
is in without prompting them
and asking them to tell us.
This makes that experience more
magical and our apps smarter.
The next thing we can
do is engage the user.
Because updates come in every
few seconds we can see how their
pace is changing.
We can see how their
altitude is changing.
We can update our app UI
quickly and accordingly.
The last thing we can
do is reflect back.
And I know that sounds
really cheesy,
but one of the great
things about Core Motion is
by running all the time
we can make it look
like your app is
working really hard 24/7
when actually the
coprocessor is.
We can look at your
activity updates
and your pedometer
updates over the last week,
which is pretty fantastic.
So with these three things in
mind, I decided I was going
to make a music player.
Nothing to rival Apple
Music or anything like that.
I wanted to see how Core Motion
could make my app experience a
little bit more engaging
and more magical.
Now a lot of music applications
have this idea of playlists.
I certainly listen to different music: when I'm in my car I want podcasts, or if I'm studying or working hard,
I'm going to want maybe
some classical music,
and I have different playlists for working out as well.
By detecting what
a user is doing,
my music application can change
my playlists accordingly instead
of the user having to select it.
Similarly, by seeing changes
in pace we can say hey,
you're going on a slow jog
or maybe you went really fast
and climbed up a really
big hill and we are going
to play you a really exciting
song to make you feel good
about that, or try to
match your cadence.
Then there's the
reflection piece.
There are two really
great things
about reflection
especially in the context
of my music application.
One, I can look at data to
see how a user has been doing.
Hey, you haven't been
running in a while.
Maybe you want to go on a run
really soon and maybe you want
to use our application
to do that
and listen to your playlist.
At the end of the day I
want to motivate users
to come back to our app.
Also I thought it
would be really great
if, say, the user wanted to know what playlist they were listening to in the car yesterday.
And so by looking at activity
information and pairing it
with my information
about playlists I can
give that info to them.
So with these three
ideas in mind,
let's see what my application
actually looks like.
So when the user opens
the app, we just want
to show them a basic playlist.
In my case I'm pretty much
standing around, right?
We're going to have
low intensity music.
Nothing too tough.
As we detect that the user is speeding up, maybe they are starting a workout, we're going to change
the playlist
and give them a little
more upbeat music.
But this is where the
engagement part comes in.
This is where we look at
maybe changes in cadence
and maybe changes in altitude,
and we really engage with them
and follow along, so we can
give them a more high intensity
playlist when they speed up.
Then we want to context switch.
When the user goes for a
drive, I want to be able
to give them their podcast or
whatever they usually listen
to on their morning commute.
So this is the detection
piece and engagement piece.
Next we have the
reflection piece.
I want to put this information
into interesting pieces
so the user can scroll back and
see what they have been doing.
This is where I might tie
in kind of that playlist
that they were listening to
at different moments in time.
So now that we have an idea of
what the app will look like,
let's see where motion
activity actually fits in
and what APIs we might need to
use for each of these pieces.
First we have the
detection piece.
This is something we are going
to want to do all the time.
We are going to want to
see if a user is driving.
If a user is walking, how
fast they might be walking.
So what are we going
to need for that?
Activity updates and
pedometer updates.
We'll want to be monitoring
those and smoothing those out
and seeing which information
allows us to determine context.
Next we have that
engagement piece.
If the user is walking or
running, in my case I want
to see how fast they are going.
I'll look at their cadence, and I also had that idea that maybe if they climb a big hill I can give them a great cheerful song to celebrate their achievement.
For that I would need
pedometer updates
which includes those
pace and cadence changes
and altitude updates as well.
Then we had the reflection
piece.
As I said, the great thing about Core Motion is that it works really hard for me 24/7.
So I can look back and get those
historical activity queries
and pedometer queries
to put that information
into interesting segments.
So now that I've gone through
exactly what my app might look
like, the detection, engagement,
and reflection pieces,
and which parts of Core Motion's
APIs I would need to use,
I'm going to show you
exactly how I coded it up,
at least for the data model
part of my application.
I am about to open Xcode.
Before I walk to the
computer over here,
be warned that there will be
a lot of text on the screen.
Don't be intimidated.
The sample code is
already available online.
Don't go open it now, please.
[Chuckles.]
>> GABRIELLE BADIE:
Please just focus
on the segments I
highlight for you.
As I said, there's a
lot of text on there.
If you focus on those few things
and maybe go back and look
at the sample code
after the session,
you'll remember the things I
focused on instead of trying
to do too many things at once.
So as I said, there's a lot of text on the screen.
I just want you to
remember the three parts
that would be detect,
engage, and reflect.
As we keep those in
mind I am going to go
through these things here
in this data model
of my application.
This is just the
data model piece.
The rest is for you
to explore later.
So the first thing I'm going
to want to do is detect.
That requires activity
updates and pedometer updates.
Let's look at those
activity updates.
Great. A lot of text, right?
So the first thing I'm
going to want to do is check
if activity is available
on this hardware.
As Anil noted before, activity isn't always available,
so we're going to want
to do those checks before
we query for any updates.
The second thing we are going to want to do is use the startActivityUpdatesToQueue API.
In my case I provide just a simple NSOperationQueue,
and then I want to
handle the data.
Now, activity updates
come in pretty frequently.
It's up to you how you
want to do the smoothing.
In my case when a user is going
on a run and maybe they stop
at a stop light and they are
semi-stationary, I don't want
to keep transitioning
from the running playlist
to the really slow
playlist and back
and forth and back and forth.
So I'm going to want to do
that application specific
smoothing myself and I leave it
up to you to do what's
best for your application.
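Here is roughly what that looks like in Swift; a minimal sketch in which a simple confidence check stands in for the app-specific smoothing, and the playlist helpers are hypothetical:

    import CoreMotion

    let activityManager = CMMotionActivityManager()

    if CMMotionActivityManager.isActivityAvailable() {
        activityManager.startActivityUpdates(to: OperationQueue()) { activity in
            guard let activity = activity else { return }
            // Skip low-confidence updates so a pause at a stoplight
            // doesn't flip the playlist back and forth.
            guard activity.confidence != .low else { return }
            if activity.running || activity.walking {
                switchToWorkoutPlaylist()   // hypothetical helper
            } else if activity.automotive {
                switchToCommutePlaylist()   // hypothetical helper
            }
        }
    }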
Let's look at pedometer
updates next.
Great. As you can see,
the PedometerUpdates
API is very similar.
I'm also going to want to check
if step counting is available.
Even though we assume that
activity might be available,
that doesn't mean that step
counting will be available
as well.
Assuming that that's true, I'm
going to startPedometerUpdates.
This API allows us
to provide a date.
If you are in a more journaling
context you might want
to start these updates at
the beginning of your day.
In my case I want to look
at them during the
lifecycle of my application.
So I start these
updates from right now.
Now, there's a really important
thing I want to point out here.
We may get an error.
One of the great things about Core Motion is that it puts up the "Hey, do you want to let this app use motion data?" prompt for you.
But while a user may first
open your application
and give you access to
motion data, they can go back
at any time in the preferences
and remove that authorization.
In that case we would throw an authorization error when you try to receive updates.
You are going to want
to handle those smoothly
and prompt the user to go
into the settings application
and give your application
authorization again.
I won't go into it here, but I
pop up a simple UI alert view
to prompt the user to go into settings.
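In code, that error handling might look like this; a minimal sketch, with a hypothetical helper for the settings prompt:

    import CoreMotion

    let pedometer = CMPedometer()

    if CMPedometer.isStepCountingAvailable() {
        pedometer.startUpdates(from: Date()) { data, error in
            if let error = error as NSError?,
               error.domain == CMErrorDomain,
               error.code == Int(CMErrorMotionActivityNotAuthorized.rawValue) {
                // The user revoked motion access in Settings.
                promptToReauthorizeMotion()   // hypothetical helper
                return
            }
            guard let data = data else { return }
            handleSteps(data.numberOfSteps.intValue)   // hypothetical helper
        }
    }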
So that is the detection piece.
The next thing I'm going to want
to do is the engagement piece.
In my case when I'm looking
at activities updates,
I also want to start and
stop altimeter updates,
so I'm going to do that here.
I smooth the data to see when the user is running or walking; in that case I'm interested in altitude updates.
Great. Now you're starting
to see a theme here.
First thing I'm going to do,
check if altitude
updates are available.
Assuming that's true,
I startRelativeAltitudeUpdatesToQueue.
I provide a queue.
And then I handle
the data accordingly.
I do check again if
there is an error.
And if not, I'm going to
want to handle that data
as best serves my application,
and I leave that to you
to do as best serves yours.
The next thing I'm going to want
to do because I'm not interested
in altitude updates all the time
is make sure that I stop them
when I'm not running or walking.
Hmm, typo!
Great. And here, the API
is pretty straightforward.
I check for availability.
And I
stopRelativeAltitudeUpdates.
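Putting the start and stop together might look like this; a minimal sketch driven by an on-foot flag that your own smoothing would derive from activity updates:

    import CoreMotion

    let altimeter = CMAltimeter()

    func onFootStateChanged(isOnFoot: Bool) {
        guard CMAltimeter.isRelativeAltitudeAvailable() else { return }
        if isOnFoot {
            // Watch for hill climbs while walking or running.
            altimeter.startRelativeAltitudeUpdates(to: .main) { data, error in
                guard let data = data, error == nil else { return }
                reactToClimb(meters: data.relativeAltitude.doubleValue)  // hypothetical
            }
        } else {
            // Not interested in altitude the rest of the time.
            altimeter.stopRelativeAltitudeUpdates()
        }
    }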
Now, I know that
when I said we want
to engage the user
we are going to look
at pedometer updates as well.
As you may recall I was already
looking at pedometer updates
for the lifecycle of my
application, and so I don't need
to start and stop those again.
I can use those in fusion
with the altitude updates
to give me what I want
for my application,
which is to motivate
the user through music.
Now, we have the
reflection piece.
So here I'm going to want
to look at activity updates
so I can provide these
in interesting segments
as I showed you earlier.
Great. So the first thing I want you to look at is that we can query activity updates starting from a date.
In my case I wanted to look at the last week, so that even if the user was not using the application during that week, we can get all of that data and put it together.
In my case I handle
that data and I put it
into interesting
segments for me.
I can't wait to see
what you guys want
to do with it yourselves.
Again, I know, broken
record, right?
I want to handle the
error accordingly.
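Sketched in Swift, the weekly query looks roughly like this; the segmenting, display, and error helpers are hypothetical:

    import CoreMotion

    let activityManager = CMMotionActivityManager()
    let now = Date()
    let oneWeekAgo = now.addingTimeInterval(-7 * 24 * 60 * 60)

    activityManager.queryActivityStarting(from: oneWeekAgo, to: now,
                                          to: .main) { activities, error in
        if let error = error {
            handleActivityError(error)      // hypothetical helper
            return
        }
        // Collapse the raw activity timeline into displayable segments.
        let segments = makeSegments(from: activities ?? [])   // hypothetical
        display(segments)                   // hypothetical helper
    }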
Now, in my case I also wanted to add all that extra historical pedometer information once I found interesting walking segments.
So what I'm going to do
is down here I am going
to request pedometer updates
for a given activity segment.
The first thing I do is
queryPedometerDataFromDate.
And I provide a start
date and an end date.
In my case I've already
put the data into segments.
So I look at the start
and end of those segments.
The second thing I want to
point out here is that if I want
to make any UI changes I'm
going to want to dispatch those
on the main queue, and
you'll start to find issues
if you don't do that in
your own applications.
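Those two points together; a minimal sketch with a hypothetical cell-update helper:

    import CoreMotion

    let pedometer = CMPedometer()

    func annotate(segmentStart: Date, segmentEnd: Date) {
        pedometer.queryPedometerData(from: segmentStart, to: segmentEnd) { data, error in
            guard let data = data, error == nil else { return }
            // The handler arrives on a private queue; hop to the main
            // queue before touching any UI.
            DispatchQueue.main.async {
                updateCell(steps: data.numberOfSteps.intValue,
                           distanceMeters: data.distance?.doubleValue,
                           floorsAscended: data.floorsAscended?.intValue)  // hypothetical
            }
        }
    }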
So that's it.
That's all I really
have to show you
with adding motion
to your application.
And it took me like ten minutes or something, so it is actually really easy for you to go back and do in your own applications.
Now, I really want to show you
this app but I'm not going to go
for a run or drive
on stage here.
So I will show you what I have been up to in the last couple of days and how that reflection piece organizes the data into interesting segments.
Hopefully this will
be an iPhone.
Great! So here is my
music motion application.
As you can see, I'm
not doing very much.
So I'm sitting here
in low intensity mode.
Now, let me look at
my historical updates.
I'm actually going to
start at yesterday morning
and I'll take you through today,
but all I've really done
is walk around Moscone.
Yesterday I woke up and I
decided to go for a quick cycle.
What I've done, I've -- the app
has already figured that out
and put that into an
interesting segment for me.
Then I walked to my bus,
realized I was a little late
and started running, and then
finally made it on the bus.
As you can see, the walking segments have the mileage information, the pace information, and floors ascended and descended, all put into interesting chunks for me.
Now if I were actually developing a music application, in one of these cells I might attach the playlist that I was listening to for that segment.
One thing I want to point out as I scroll to today, when I have been walking around Moscone, is that I'm not getting any floors ascended or descended.
That's because I've been lazy
and taking the escalator
everywhere,
and so I just wanted you
guys to notice that as well.
So that's my basic
music application.
I'm just going to wrap this up.
So I really hope you
guys enjoyed the demo.
I tried to focus on
just a few things
so that you can remember
them for your application.
Remember, there's the
detection piece, using motion
to detect what a user is
doing instead of a user having
to tell you what they are doing.
Engaging with them with the
push updates from activity,
pedometer, and altimeter,
and as well reflecting back
and really using those
historical queries wisely.
If you have any more
questions, feel free to look
at the Developer Forums.
Please do check out MusicMotion.
It should be on the
developer portal.
And for any general inquiries,
contact our evangelist.
There are also some
related sessions
that are really worth
checking out.
If you are here at WWDC, you may
or may not have seen the
HealthKit presentation
or the Cocoa Touch presentation.
If you missed them or
are sitting at home,
please go check those
out online.
I also encourage you to check
out the Core Location
presentation as well.
If you really combine
motion data with health data
and location data, you can
create some incredible contexts
to really enhance
your applications.
I really can't wait to
see what you guys put
out on the App Store.
[Applause.]
>> GABRIELLE BADIE: Thanks.
Thank you very much.
[Applause.]