Transcript
>> Ladies and gentlemen,
please welcome Vice President
OS X Platform, Andreas Wendker.
[ Cheers and applause ]
>> ANDREAS WENDKER:
Good afternoon.
Welcome to WWDC 2015.
This is another exciting year
to be an Apple developer.
With the addition of
the new watchOS SDK,
there are now three major
platforms for your apps.
Our approach with these
platforms is integrating great
products with unique
user experiences,
while leveraging the same
underlying OS technologies,
APIs, and programming
concepts so it's easy to move
between these platforms
to share code
and make these apps work
together across the platforms.
But we are preserving a unique
flavor for each of them.
The SDKs with these
three platforms are going
to ship later this year with a
new version of our Xcode IDE.
It's going to be version 7.
It contains a number
of great new features.
I just want to highlight
one here at the beginning
of the session, that is that
we are going to allow anyone
with an Apple ID
to download Xcode
and run their apps
on their own devices.
[ Applause ]
We really want to make
becoming a developer a lot more
approachable, especially
for our younger students,
and this will work with
all three platforms.
Once you have worked on your
app and you want to deploy it,
we have another bit
of good news for you.
As of today, a single paid
developer program membership
will be enough to deploy your
apps in all our App Stores.
You don't have to
sign up and pay
for multiple memberships
anymore.
[ Applause ]
Of course, with that
single paid membership,
you get many additional
benefits,
like access to pre-release
software,
or our powerful store analytics
and crash reporting tools.
Now, let me talk about
software updates a little bit.
The iOS adoption
curve is, of course,
the envy of the entire industry,
and it's a huge advantage
to you as developers because
you don't have to deal
with the same fragmentation
that you find
in other app ecosystems.
However, with iOS 8,
we found that a number
of users had a difficult
time upgrading
because of the large amount
of free space required
for the installation.
We've been bringing that number
down with our subsequent
iOS 8 updates,
and we are continuing
to do so with iOS 9.
As you already heard this
morning in the keynote,
iOS 9 will only require 1.3
gigabytes of free space.
We are also changing the way
software updates are presented
to the user.
Users will be given a choice
to install right now or later
at night, when they might not
need access to their devices.
So we think this is going
to keep pushing users
to update quickly and allow
you to focus your energies
on the latest version of iOS.
Now, we are also working
on a number of technologies
that we call App Thinning
that will enable you
to return some space
to your users.
App Thinning consists of three
technologies: App Slicing,
On Demand Resources,
and Bitcode.
Let's go over these
one after the other,
starting with Slicing.
Developing an app for iOS
actually means developing
multiple variants
of the same app
and then packaging
them all together
into a single app bundle.
So if you look inside an
app, you find a number
of redundant components
to cover the full breadth
of Apple devices.
There are binaries for 32- and
64-bit processor architectures;
images for different screen
sizes and resolutions;
and resources like
shaders, potentially written
in different languages for
the various kinds of GPUs.
But to run an app on each
given kind of device,
you only need a single
slice of these components.
So to make the most of the
available storage space,
we are going to strip
away all the components
that are not actually needed
on the device the
app is running on.
So you would still develop and
submit the same universal app
that you are used to, but the
store will only deliver a thin
variant of the app to
the user at install time.
[ Applause ]
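The slicing idea can be sketched as a simple trait-matching filter. This is a toy model, not the Store's actual mechanism; the component names and traits below are invented for illustration.

```python
# Toy model of App Slicing: each component of the universal bundle
# declares which device traits it needs, and the thin variant keeps only
# the components whose traits the target device satisfies.

def thin_bundle(components, device_traits):
    """Keep the components whose required traits the device satisfies."""
    return [name for name, required in components.items()
            if required <= device_traits]

universal = {
    "binary-armv7": {"armv7"},          # 32-bit slice
    "binary-arm64": {"arm64"},          # 64-bit slice
    "icon@2x.png":  {"2x"},             # image for 2x screens
    "icon@3x.png":  {"3x"},             # image for 3x screens
    "shader-metal": {"arm64", "metal"}, # GPU-specific resource
}

# A hypothetical 64-bit, 2x, Metal-capable device:
thin = thin_bundle(universal, {"arm64", "2x", "metal"})
# keeps binary-arm64, icon@2x.png, and shader-metal
```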
So this leads to quite
impressive size reductions.
As you can see on the chart,
typical apps will get savings
in the range of 20
percent to 40 percent.
And leveraging App Slicing
is particularly interesting
for apps like games
that often bump
up against the download size
limit for cellular app installs.
The thin variants will have
a much easier time staying
under that size limit,
so you can now pack more
device-specific resources
into your apps and
provide the users
with a more refined experience.
And best of all, assuming
you are using Xcode's asset
catalogs, you don't have
to change a single
thing in your projects.
The Store will simply
do this automatically
for you the next time
you submit your app.
If you're using additional
custom data formats,
we ask that you opt into slicing
by creating an Asset Catalog
and using the new asset
categories we added to declare
for what kind of devices
your resources are needed.
Now, some apps benefit from
using even more resources,
though they usually don't
need them all at the same time.
For those kind of
situations, we are introducing
On Demand Resources,
or in short, ODR.
With ODR, the store will
separate your resources
into the appropriate
device slices
and then host them
on Apple servers.
But downloading gets delayed
until your app explicitly
requests them,
and then later iOS
will reclaim that space
when your app doesn't need
the resources anymore.
Typical candidates for
ODR are level-based games,
but many other apps can
benefit from ODR as well.
For example, you might want to
consider offloading tutorials
into ODR assets so that
they only get downloaded
when the user actually
wants to watch them.
Using ODR requires a little
more planning on your part,
but the important point
is that you can use more
than 4 gigabytes of resources,
just not all at the same time.
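The ODR lifecycle described above can be modeled in a few lines. The class and method names here are invented for illustration (the real API revolves around tagged resource requests); the eviction policy is a deliberately simple least-recently-used sketch.

```python
# Toy model of On Demand Resources: the Store hosts resources by tag,
# downloading is delayed until the app requests a tag, and the system
# reclaims space from resources the app isn't using anymore.

class OnDemandResources:
    def __init__(self, budget):
        self.budget = budget   # bytes of device storage we allow ODR to use
        self.sizes = {}        # tag -> size of the resource hosted server-side
        self.downloaded = []   # tags currently on the device, oldest first

    def host(self, tag, size):
        # The Store separates these out of the app bundle and hosts them.
        self.sizes[tag] = size

    def request(self, tag):
        # Download happens only when the app explicitly asks for a tag.
        if tag in self.downloaded:
            self.downloaded.remove(tag)
        self.downloaded.append(tag)    # most recently used goes last
        self._reclaim()
        return tag in self.downloaded

    def used(self):
        return sum(self.sizes[t] for t in self.downloaded)

    def _reclaim(self):
        # Evict least-recently-used tags until we fit in the budget.
        while self.used() > self.budget and len(self.downloaded) > 1:
            self.downloaded.pop(0)

odr = OnDemandResources(budget=100)
odr.host("level-1", 60)
odr.host("level-2", 60)
odr.host("tutorial", 30)
odr.request("level-1")
odr.request("level-2")   # "level-1" gets reclaimed to stay under budget
```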
So App Slicing and On
Demand Resources are going
to help greatly with the
storage space for an app.
But we are also working
on a new technology
that is a little
more forward-looking.
We are introducing Bitcode
into the App Store
submission process.
Bitcode is an LLVM Intermediate
Representation of your binary,
which allows the store
to reoptimize your apps
for each kind of device before
they get delivered to the user.
You develop and debug the
app the normal way in Xcode,
but the store will
be in a position
to constantly reapply the latest
compiler optimizations we have
been working on so that
your apps run great
on all kinds of devices.
It also future-proofs your apps
because it will allow the store
to let your apps take advantage
of new processor capabilities we
might be adding in the future,
and all that without you having
to resubmit your
app to the store.
[ Applause ]
So generating Bitcode
is controlled
with a simple setting in Xcode.
It will be mandatory for
all watchOS apps right
from the beginning, and for iOS
it will be optional for now,
though we feel that the
benefits are so strong
that we are going
to opt all your projects
into it by default.
So that's App Thinning.
It consists of App Slicing, On
Demand Resources, and Bitcode.
These will greatly
optimize your apps,
especially for storage space.
Now let's take another
quick look
at the binaries in your apps.
The store currently requires
you to submit both 32-
and 64-bit versions
of your apps.
Over the last few years,
we've seen an explosion
in processor capabilities
on our iOS devices.
That's true for our CPUs and even
more so for our GPUs,
especially since
we introduced our 64-bit
processor architectures.
So there's now a category of apps
that are really only possible
when they target these
64-bit architectures.
So going forward, we will allow
you to submit 64-bit only apps
to the store, starting
with iOS 9 later this year.
[ Applause ]
So next we want to talk
about the watchOS SDK,
and to talk more about
that, I am going to hand it
over to my colleague,
Eliza Block.
[ Applause ]
>> ELIZA BLOCK: Thank
you, Andreas.
We're delighted by the reception
that the Watch has gotten
in the developer community.
There are already thousands
of your WatchKit applications
available in the Store,
and today I am excited to
get to tell you a little more
about some of the great new
features we're introducing
in the watchOS 2 SDK.
First, let's talk
about the architecture
of an existing watchOS
application.
You have a user interface,
which you've constructed
as a storyboard, and this
is installed on the watch.
Powering this user interface
is your app extension,
which runs on the iPhone.
In watchOS 2, we are
making a significant change
to this architecture.
The user interface remains the
same, but the extension moves
over to run natively
on the Watch,
and this has a number
of advantages.
Because interactions with your
Watch application no longer
require a round-trip to the
iPhone, your users are going
to notice a substantial
improvement to the speed
and responsiveness
of your application.
And of course, for
the same reason,
your Watch application
can now work even
when the iPhone isn't present.
And since the extension is
running natively on the Watch,
it gets access to
the Watch hardware,
which opens up tons
of possibilities.
Okay. So now that you're writing
a native watchOS application,
what APIs are you going to use?
The good news is it's many
of the same frameworks
that you're already familiar
with from developing for iOS.
There's one important addition.
The watch connectivity framework
allows your extension to talk
to the iPhone app, which,
of course, is now running
on a different device.
In addition, your
extension can talk directly
to web services using
the NSURLSession API.
And this works even when
your iPhone is out of range.
[ Applause ]
There are three ways
of surfacing your data
in an existing watchOS
application --
Glances, Notifications,
and the app itself.
But wouldn't it be cool
if you could see your
app's content just
by raising your wrist without
even having to touch the screen?
In watchOS 2, this is possible
because now you can create a
Complication for the Watch face.
We designed the Watch
for brief interactions,
and the quickest way
to see your content is
through Complications,
Glances, and Notifications.
So let's take a closer
look at these.
Despite their name,
Complications are
actually quite simple.
They're snippets of
information that appear
on your Watch face
alongside the time.
So if we take these Watch
faces and we remove the time,
everything left is
a Complication.
Now, the ones you see here are
all built right into the OS,
but now you can create your own.
Now, you might not
all be aware of this,
but the fictional San Francisco
Soccer Club is currently holding
its 2015 tournament.
So we could create
a Complication
for the modular face to
show what match is coming
up next in the tournament.
If I were to install
this on my Watch,
I am going to see it every time
I raise my wrist right away,
and that's great, but for
it to be a good experience,
it's important that as time
goes by and the content
that the Complication
needs to show changes,
this has already happened by
the time the screen turns on.
And to make that possible, we're
going to be collecting the data
for your Complications in
the form of a timeline.
That way, as I glance at my
watch throughout the day,
the Complication will always
be showing the information
that makes sense at that moment.
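The timeline behavior amounts to a "latest entry at or before now" lookup. Here is a minimal sketch: entries are (start time, text) pairs sorted by time, and the face shows whichever entry most recently took effect. The first two fixtures echo the matchups mentioned in this talk; the third is invented.

```python
import bisect
from datetime import datetime

def entry_for(timeline, when):
    """Return the entry in effect at `when`, or None before the first one."""
    starts = [start for start, _ in timeline]
    i = bisect.bisect_right(starts, when) - 1
    return timeline[i][1] if i >= 0 else None

schedule = [
    (datetime(2015, 6, 8, 10), "Up next: Marina vs. Noe Valley"),
    (datetime(2015, 6, 8, 13), "Up next: SoMa vs. Cow Hollow"),
    (datetime(2015, 6, 8, 16), "Up next: Mission vs. Richmond"),
]

entry_for(schedule, datetime(2015, 6, 8, 14))
# -> "Up next: SoMa vs. Cow Hollow"
```

Because the lookup works for any `when`, past or future, the same timeline can answer queries for other times of day, not just the present moment.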
So creating a timeline
for the schedule
of a fictional soccer
tournament is pretty easy.
For one thing, I
made up the schedule,
and for a second thing, it's
pretty unlikely to ever change.
But that might not be the
case for your Complications.
As things change in the
world, you're likely to need
to update your timelines
accordingly.
And we've provided a couple
ways to go about this.
The first is Scheduled Updates.
Scheduled updates are perfect
for Complications
whose data changes
on a predictable schedule,
like a weather forecast.
If you've written a weather
forecast Complication,
you may know that
your server has access
to more accurate weather
data maybe once an hour.
So when you load your timeline,
you can tell us a good time
to next wake you up
to refresh your data.
Your extension will be
given the opportunity
to run in the background.
You can hit your server,
pull down new forecast data,
and reload your timeline.
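Picking the wake-up time for a Scheduled Update boils down to rounding the current time up to the next refresh boundary. A small sketch, assuming an hourly server refresh (the function name is invented):

```python
from datetime import datetime, timedelta

def next_scheduled_update(now, interval_minutes=60):
    """Round `now` up to the next whole `interval_minutes` boundary."""
    minutes = (now.minute // interval_minutes + 1) * interval_minutes
    return (now.replace(minute=0, second=0, microsecond=0)
            + timedelta(minutes=minutes))

next_scheduled_update(datetime(2015, 6, 8, 10, 37))      # -> 11:00
next_scheduled_update(datetime(2015, 6, 8, 10, 37), 15)  # -> 10:45
```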
But not all data
can be refreshed
at predictable intervals.
Suppose I wanted
to add live scores
to my soccer Complication.
It's not going to be
good enough for me
to hit my server every hour
or so or even every 15 minutes
to pull down the current score
because when a goal is scored,
that needs to show up right
away in my Complication.
To support this kind of case,
we're introducing a new kind
of high-priority
push notification.
If you are providing data that
is needed for a Complication,
you can send this push
notification to your iPhone,
and it will be delivered
immediately
to your extension on the Watch.
So timelines are great for
making sure the content
in your Complications
is always up to date,
but they also serve an
important additional purpose.
If you went to the Keynote this
morning, you will have heard
about the Time Travel feature
that we are introducing
in watchOS 2, which allows
you to turn the Digital Crown
to peek forwards and backwards
and see what your
Complications will be showing
at different times of day.
For example, here it looks
like the Marina Noe Valley
game has already ended
and I missed it because I was
on stage talking about WatchKit.
Luckily, I can Time Travel
backwards to see what happened.
It looks like Noe Valley
won in stoppage time.
That must have been
really exciting.
So that's Complications.
They are a really quick way to
access data that's important.
Small amounts of data.
But what if you wanted to
see a little more detail?
That's when you might
use a Glance.
You get to your Glances by
swiping up on the clock face,
and Glances give
you the opportunity
to display an entire screen
full of data to your user.
Here, for example, my soccer
club Glance is showing me the
standings for Group
A in my tournament.
Complications and
Glances allow me
to access your app's
data on my own schedule.
But sometimes you need to
get information in front
of your user right
when it matters,
and for that you would
use a Notification.
Notifications on the Watch
are incredibly powerful.
You can take full
advantage of the screen
to show an entirely custom UI.
For example, here I am being
alerted that it's my last chance
to pick the winners in
this afternoon's SoMa Cow
Hollow game.
In addition to the custom UI,
you can also provide
custom actions.
Pressing either of these
buttons would register my pick
with the application.
But that's not all.
Because in watchOS 2,
we are introducing
Notification in-line text reply,
so I can add a button to
my Notification allowing me
to compose a message to send.
I am a really big SoMa fan, and
I want the world to know it,
so I am going to
choose that option.
When I pick the Reply option, I
am given the Quick Reply sheet,
and I can even use dictation
to compose a message,
all from inside the Notification
without switching context.
Thank you.
You are very kind.
[ Applause ]
So that's Notifications.
So Complications, Glances,
and Notifications are so easy
to access, they are
probably the way
that your users will
most commonly interact
with your app's content.
But sometimes you have
a little more time
and you want a more interactive,
immersive experience,
and that's when you would
launch the full application.
And there is so much you can do
with applications in watchOS 2.
To give one example here, now
that you have access to input
from the Digital Crown, you
can use it to scroll quickly
through the groups
of your tournament
to quickly find your favorite
team and get more information.
But that's just one of
tons of new opportunities
that you have with watchOS 2.
You can also add
animations to your UI now.
You can access the
Taptic Engine.
You can do audio recording
right from the Watch.
And you can embed audio
and video playback.
You can make a phone call
from your application,
and you get access
-- live access --
to the data from the health
sensors and the Accelerometer.
We are so excited to see
what you guys create with all
of these possibilities, and
now to show you how easy it is
to bring your app to watchOS 2,
I'd like to invite
up Josh Shaffer.
[ Applause ]
>> JOSH SHAFFER: Thanks, Eliza.
So I am really excited to
show you how easy it can be
to take your existing watchOS 1
app and upgrade it to watchOS 2
to run natively on the Watch.
If you have downloaded
the WWDC app this year,
you may have noticed that it now
includes a WatchKit extension
to install and run
on your Watch.
So what we are going to do is
take a look at how we can update
that application and use it as
a native watchOS 2 application,
and we'll add a few
features to it using some
of the things Eliza showed us.
So over here, you can see that
I have the Xcode project open
for the WWDC app,
and Xcode has noticed
that I have a watchOS 1
extension and is offering
to upgrade it for me to
a watchOS 2 extension.
I am just going to click here
and let it perform
those changes.
It's going to take all
my existing targets
and update them, leaving my
source code and storyboards
in place so I can reuse all
of what I've already done.
We can go over here and check
out our controller context,
and what we have to do first of
all is make a few code changes
to the way we access our data.
Now that we are running
on a different device,
we need to be able to move
the data from our iPhone
over to the Watch, and we can
use the new Watch Connectivity
frameworks to do that.
So I will start adding the
Watch Connectivity framework,
and we will replace the open
parent application call,
which is what I used in
the watchOS 1 version,
with the new Watch Connectivity
code that will ask for the data
to be copied from the
phone to the Watch.
With that changed, now we
can add some additional
functionality, and one really
cool thing to take advantage
of is the ability to interact
with the Digital Crown.
Now, the way that you
interact with the Digital Crown
in a watchOS app is using a
new interface picker control.
We will drag that out in our
Storyboard in just a minute,
but first we are going to add
some code to hook up to it.
We will make some
references to be able to hook
up to the Storyboard
object once we drag it out,
and you add the items
that you want to choose
from the picker
programmatically.
So we are going to loop through
all of the session tracks
that exist throughout
the week and add an entry
to our picker for each track.
This way we will be able to
sort the list of sessions
and view just the ones for the
track that we are interested in.
Then finally, we have to add
an IBAction that will hook
up to our control that will get
called every time it changes.
So we will add that code there.
Now we can switch over to our
Storyboard and figure out where
to put this in our app.
So I will come over here and
find the new picker control,
and we will drag it out and put
it in our session controller.
The session controller is the
view that displays all the list
of sessions, so by adding at
the top, it will provide a way
to filter that session list.
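The filtering logic in the demo can be sketched as follows: the picker lists each track once, and its action receives the selected index. The session/track pairings here are partly invented for illustration.

```python
# Sketch of the demo's track filter (hypothetical data, not the WWDC
# app's real model).

sessions = [
    ("Intro to WatchKit for watchOS 2", "Frameworks"),
    ("Platforms State of the Union",    "Featured"),
    ("What's New in Xcode",             "Developer Tools"),
    ("Debugging in Xcode",              "Developer Tools"),
]

tracks = sorted({track for _, track in sessions})   # the picker's items

def sessions_for(track_index):
    """Body of the picker's action: narrow the list to the chosen track."""
    track = tracks[track_index]
    return [name for name, t in sessions if t == track]
```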
Just make it a little shorter
so it's not quite so tall.
Now, the picker control
is really flexible,
and there's three
different appearances
that you can choose --
list, stack, and sequence --
that you can learn about
throughout the week.
The list one is a
really good choice
for what we are doing here,
so we are going to leave that.
And we are going to
turn on a focus outline
so that we can make
sure the user can tell
when they turn the Digital
Crown what's going to happen.
So with those changes made,
we'll then hook up our picker
to the code that we
pasted in a minute ago.
We'll drag out the
connection to our IBOutlet
and drag a connection
into our IBAction
so that it gets called every
time the picker changes.
And that's it.
So with that, we are going
to hit Build and Run,
and we can switch over
to the Watch simulator
and see how this works.
The great thing about the
watch simulator in watchOS 2 is
that it's a full
watchOS simulator,
so you can run your entire
watch app side by side
with your iPhone app,
debug at the same time,
and test the connectivity
code between them.
[ Applause ]
So now we can jump over
here, and you can see
in our session list, we
can scroll through the list
of sessions, and we
can also filter based
on what track we want to see,
and it all updates
as we change it.
Now, one last feature
that I would really
like to add is the ability
to display the sessions right
on our Watch face, and
to do that, we are going
to add a Complication
to our app.
Now, I wrote most of the
code for this earlier,
so I am just going to drag
the file out and add it
into my project right here.
The one thing that I haven't
done yet is added the code
that will iterate through
and build the timeline.
Now, Eliza mentioned that we are
going to be providing the data
in the form of a timeline, so
we want to iterate through all
of the sessions that
I favorited so that
on the Watch face I can see my
favorite sessions all week long.
Now, to save us some time,
I built and installed this
on a Watch earlier today,
so I am just going to switch
over here and take a
look at how we can add
that to our Watch face.
So right at the beginning, I
had a Watch face configured
with Calendar in the middle,
but because I've got all this
on the WWDC app, we will
just switch over here
and customize it, and then we
can scroll down to the bottom
of the list of Complications,
and you will see
that now the WWDC app has an
entry all the way at the bottom.
We can click there to
turn that on and go back
and view our session list.
Now, obviously, the
Platforms State
of the Union is the session
I favorited for right now,
but with no additional work,
we can Time Travel forward
through that session
list and see
that up next is the
Apple Design Awards,
which I definitely don't want to
miss, and then looking forward
to tomorrow we've got the
Intro to WatchKit for watchOS 2
in the morning, which is
definitely something I want
to see.
So that's how easy it can be to
update your existing Watch app
to run natively on the watch
as a watchOS 2 application,
and add some support for
some great new features.
So next up --
[ Applause ]
Next up, Sebastien
Marineau-Mes is going to tell us
about some great new
foundation technologies.
>> SEBASTIEN MARINEAU-MES:
Thank you, Josh.
That was great.
Let me now talk about a number
of foundation technologies
that apply across our platforms,
and I am going to start
with the first one that's
in the theme of performance,
and it is compression.
We are going to make
compression exciting.
[Laughter] So Apple's
always delivered a number
of compression algorithms as
part of our core frameworks:
LZ4, which is optimized
for speed; LZMA,
which has high compression;
and zlib, which many of you use,
which hits the sweet spot
between compression and speed.
We set out to build a
better compression algorithm
that improves on zlib.
As you may know, zlib
is over 20 years old.
It was built when processor
architectures were very
different, so we thought let's
build something optimized
for today's processors
and microarchitectures.
Now, we call this new algorithm
Lempel Ziv Finite State Entropy.
You may know of them.
They are also a great
Indie band.
You should go check
them out on Connect.
To avoid any confusion, we are
going to simply call this LZFSE.
[Laughter] So now
LZFSE -- thank you --
improves on zlib on compression,
but more importantly,
it actually is three
times faster than zlib.
This is a great improvement
in terms
of making your apps
more responsive.
On top of that, it
helps with battery life.
Compared to zlib, it gives you
a 60 percent reduction
in energy use alongside
that 3x speed improvement.
Finally, we've made it
super easy to adopt.
We've added it to our
standard framework.
All you need to do is switch
your algorithm to use LZFSE.
We do all the hard
work for you, and you
and your users can
reap the benefits.
So this is LZFSE, really
redefining the sweet spot
in mainstream compression.
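LZFSE ships in Apple's frameworks rather than here, so as a stand-in this sketch uses the two stdlib neighbors the talk compares, zlib (the traditional sweet spot) and LZMA (maximum compression), to show the lossless round trip and the size tradeoff under discussion:

```python
import lzma
import zlib

data = b"the quick brown fox jumps over the lazy dog " * 200

zlib_out = zlib.compress(data, 6)   # balanced ratio and speed
lzma_out = lzma.compress(data)      # higher compression, more CPU

assert zlib.decompress(zlib_out) == data   # round-trips losslessly
assert lzma.decompress(lzma_out) == data
assert len(zlib_out) < len(data) and len(lzma_out) < len(data)
```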
Next up, I want to talk
about battery life.
Craig mentioned battery
life this morning.
Now, in our industry,
when we test battery life,
we often do so using
repetitive tests.
For example, we'll
do web browsing over
and over again as our test.
And we know that
in the real world,
the way that we use our
devices is very different
from these synthetic
lab tests; right?
We take the device in
and out of our pocket,
we receive notifications and
messages, we may use a variety
of applications, we may go in
and out of network coverage.
So what we've done in iOS 9 is
really focused on a broad set
of optimizations that apply
across all the most popular
apps on our platform.
That's step one.
Step two, we focused on a number
of infrastructure improvements.
For example, the algorithms that
drive the backlight intensity,
facedown detection so that
if you receive a notification
and are not able to see it
because the phone is face down,
we will simply not
light up the screen.
A number of other optimizations
in how the system, for example,
sleeps, and optimizing
its lowest power states.
All together, these
optimizations add up to one hour
of extra typical use
for average users.
Great improvement.
Craig also talked about Low
Power Mode this morning.
Well, we've made it easy
to turn on Low Power Mode.
Simply go into Settings
and turn it on.
It activates a number
of internal levers.
For example, preventing
the CPU and GPU
from entering the
highest performance,
but most power-hungry,
states; preventing applications
from doing too much work in
the background; preventing them
from doing unnecessary
network activity; and finally,
turning off a number of
power-hungry animations.
And altogether, Low Power
Mode, when you start
from a full charge, will
give you an extra three hours
of usage.
So again, great improvement.
Let me now turn to
protecting customer data.
So from the start,
iOS has really focused
on protecting customer
data, and with iOS 9,
we are taking a number of
important steps to protect data,
both on device and in the cloud.
So let's start with the cloud.
Of course, you use your Apple ID
to access a number
of services in the cloud:
your photos, documents that
you might have on iCloud,
your purchases.
You use it for messaging with
iMessage and FaceTime.
And the biggest threat to
your data in the cloud is
that somebody gains
access to your password,
either because you've
shared that password
on another website
that's been compromised
or perhaps an attacker,
through a phishing attack,
is able to gain that password.
The solution to this is
two-factor authentication,
and we are making it easy
for everyone to adopt.
So let me show you how it works.
So with iOS 9, when you get
a new device and you want
to sign it up to your existing
iCloud account, you will be,
of course, prompted for your
password, but on top of that,
you will have to enter
in a verification code.
Simultaneously, we send a
notification to your existing,
trusted devices, and that
notification includes the
location of this new
device that's signing in.
So if that's you
that's signing in,
you can simply enter
the verification code.
But if it's an attacker
that's trying to sign
in to your account, you can
stop them in their tracks.
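The flow above can be sketched as a toy verification step. This is an illustration of the idea, not Apple's actual protocol; the function names are invented.

```python
import hmac
import secrets

# Toy model of two-factor sign-in: the account accepts a sign-in only
# when both the password and the short code shown on a trusted device
# match.

def start_sign_in():
    """Generate the six-digit code pushed to the trusted devices."""
    return f"{secrets.randbelow(10**6):06d}"

def verify(password, code, expected_password, expected_code):
    # compare_digest avoids leaking information through timing
    return (hmac.compare_digest(password, expected_password)
            and hmac.compare_digest(code, expected_code))

code = start_sign_in()
assert verify("hunter2", code, "hunter2", code)           # the real user
assert not verify("stolen-guess", code, "hunter2", code)  # wrong password
```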
That's the cloud side.
Let's now talk about the device
where the passcode
is your last line
of defense protecting
your personal data.
So historically, we've kept
that passcode at four digits
because you had to enter
it every time you wanted
to use your phone.
But of course, these days
we don't enter the passcode
very often.
We use our fingerprint
with Touch ID to sign
in seamlessly into our phones.
So now with iOS 9, we are
able to extend the length
of the passcode from four digits
to six, increasing the strength
of the passcode by a
factor of a hundred,
without compromising ease
of use for our devices.
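The "factor of a hundred" is simple counting; each extra digit multiplies the number of possible passcodes by ten:

```python
# Possible passcodes at each length: one combination per digit string.
four_digit = 10 ** 4   # 10,000 possible passcodes
six_digit = 10 ** 6    # 1,000,000 possible passcodes

assert six_digit // four_digit == 100   # a hundred times stronger
```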
Now that we've talked
about the device,
let me introduce a
new technology on all
of our platforms, which we
call App Transport Security.
App Transport Security is
really all about securing data
as it transits over the
Internet from a device
or from your applications
to your backend servers.
So of course we want that
information as it transits
over the network to be secure.
But it turns out that
today it's actually hard
for you to do that.
It's hard to get it right.
You can't use HTTP.
You have to use a secure
protocol, but then you have
to worry about protocol
versions, downgrade attacks,
encryption, vulnerabilities.
Keeping it straight and
doing what is best practice
is difficult.
With App Transport Security,
we are building this right
in to our core frameworks.
And so now when you use our
standard networking frameworks,
we will enforce a secure
best practice connection
between your application
and your backend servers.
Today that best practice is
TLS 1.2 and Forward secrecy,
but as the standards and the
state of the art evolves,
of course the framework will
implement that new state
of the art and will enforce
it for you automatically.
[ Applause ]
So where do you go from here?
If you use our standard
networking frameworks,
for example, NSURLSession,
it is built right in.
It is on by default with iOS 9,
so when you download
the developer beta --
which I am sure many
of you already did --
and you recompile your app,
we will enforce best
practices secure connections
to your backend.
It is possible that you
haven't updated your backend
yet to support best practices
or perhaps you are using
a third-party library.
If that's the case, we have
provided an exception mechanism
to give you time to update your
backends through your Info.plist.
So that's App Transport
Security.
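For reference, the exception mechanism mentioned above lives in your app's Info.plist. A minimal sketch using the shipping ATS key names; the domain is a placeholder for a backend you haven't yet updated:

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <key>legacy.example.com</key>
        <dict>
            <!-- Temporarily allow plain HTTP to this one domain
                 while you bring the backend up to best practices. -->
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>
```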
Let me now switch and
talk about the Mac
and introduce a new technology
which we call system
integrity protection.
Now, on the Mac, user accounts
typically have administrative
privilege, and that
administrative privilege is
really equal to kernel-level
access,
and it makes it difficult
to protect the integrity
of the system.
With system integrity
protection,
we break that equivalency
and administrative-level access
is no longer sufficient in order
to do a number of
operations in the system.
Let me show you some
examples of this.
So for example, even if you have
administrative-level privilege,
you cannot modify system files.
You can't install
to system locations.
You can't attach to system
processes and, for example,
introspect the memory or
change the control flow.
So with the beta that's
out today, we encourage you
to download it, test your apps
to make sure that you are able
to basically adhere
to all of these rules.
Now, you might also be wondering
how will this affect the way
that I develop?
Well, the good news
is we've updated Xcode
and the development tool chain,
and for the vast majority
of you, you will
see no difference.
For those that have a specific
development requirement --
for example, you are developing
kernel extensions on OS X --
we do provide a utility that's
part of the recovery partition
that allows you to disable
system integrity protection.
Which takes me now to the last
technology that I want to cover,
one that underpins the
modern Internet, IPv6.
Now, why is IPv6 important?
We've all heard about the
shortage of IPv4 addresses.
Well, guess what.
It's finally here.
In fact, in China and Asia, they
ran out of IPv4 addresses back
in 2011, and in the
U.S., we will be running
out in the next couple
of months.
What it means in
practice is that a number
of carriers are now
deploying IPv6-only networks.
And if your application
doesn't work properly with IPv6,
it will simply not function on
those carriers, those networks,
and for those customers.
So really important
to support IPv6.
Now, the good news is we've
had IPv6 support as part
of the platform for a long
time, well over a decade on Mac.
It is field proven,
and many of you
that are using our standard
networking frameworks are making
use of IPv6.
But we want to go further.
We want to make sure that every
one of you is building an app
that works on IPv6 networks,
so we've got a simple
recipe for you.
Use the standard
networking frameworks.
We've mentioned NSURLSession.
It really takes care
of abstracting away the
complexities of the network.
Avoid the use of
IPv4-specific APIs,
many of which were developed
before IPv6 was even conceived.
And finally, don't
hard-code addresses.
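The recipe above boils down to letting the frameworks resolve host names for you. A minimal sketch in Swift 2-era syntax; the URL is a placeholder:

```swift
import Foundation

// Connect by host name, never by a literal IPv4 address such as "192.0.2.1".
// NSURLSession resolves the name and uses IPv4 or IPv6 as the network dictates.
let url = NSURL(string: "https://api.example.com/items")!
let task = NSURLSession.sharedSession().dataTaskWithURL(url) { data, response, error in
    if let error = error {
        print("Request failed: \(error.localizedDescription)")
    } else if let data = data {
        print("Received \(data.length) bytes")
    }
}
task.resume()
```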
Once you follow the recipe,
you might say, how do I test
that it works properly?
Because not all of us have
access to an IPv6 network.
The good news is again you
all use Macs for development,
and we are turning your Mac
into an IPv6 network
emulator with the latest beta.
So all you need to do is set
up through a new connection
sharing mode a personal hot
spot, you check the
IPv6 only box,
then you can connect your test
device, test your applications,
and make sure that
they work on IPv6.
It's as easy as that.
[ Applause ]
So finally, because IPv6
support is so critical
to ensuring your applications
work across the world
for every customer,
we are making it an App Store
submission requirement starting
with iOS 9.
So that takes me to the end
of my foundation technologies.
Let me now hand it over to my
fellow Canadian, Toby Paterson,
who will be talking
about higher-level APIs.
Toby?
[ Applause ]
>> TOBY PATERSON: Well, I know
many of you have come from all
over the world to be here
today, some of you from even
as far away as Canada.
[Laughter] Well, your customers
are just as diverse as you are.
Here is an interesting fact
you may not have realized
about them.
This chart shows the worldwide
App Store sales broken
down by country.
One of the interesting
things it tells you is
that if your app is
not localized properly,
you are not going to reach
potentially the vast majority
of your customers out there.
Now, the good news is it's
not hard for you to do.
We have an internationalization
and localization guide
that tells you everything you
need to know, and of course,
we've got some great support
in our frameworks
and SDKs for this.
We have formatters
for dates and numbers.
And this year we are
introducing a new formatter
to help you display
people's names properly.
Now, names are a tricky thing.
Everybody has one, of course,
but different cultures
represent them differently
and have different
conventions around their use.
In English, we typically write a
name as first, middle, and last.
Chinese, on the other hand,
leads with the family name
and doesn't even have the
notion of a middle name at all.
The NSPersonNameComponentsFormatter --
which I promise looks better
in code than it sounds
when you say it out loud --
[laughter and applause]
-- thank you.
This takes care of all
of the details for you.
And it even has some
cultural smarts.
So if you ask it for the short
version of a person's name,
it knows when it might
be inappropriate to use
that person's first
name all by itself.
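In code, the new formatter looks like this (Swift 2-era syntax; the name values are illustrative):

```swift
import Foundation

// NSPersonNameComponentsFormatter, new in iOS 9 and OS X 10.11.
let components = NSPersonNameComponents()
components.givenName = "Johnny"
components.familyName = "Appleseed"

let formatter = NSPersonNameComponentsFormatter()
formatter.style = .Default
// Orders the parts correctly for the user's locale,
// e.g. family name first for Chinese.
print(formatter.stringFromPersonNameComponents(components))

// The .Short style applies the cultural rules mentioned above,
// falling back from the given name where that would be inappropriate.
formatter.style = .Short
print(formatter.stringFromPersonNameComponents(components))
```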
Now, we have some really
exciting news for Arabic
and Hebrew speakers out there.
We've had rudimentary support
-- welcome -- [laughter] --
we've had some rudimentary
support
for these languages for years.
On iOS it was limited
strictly to text; whereas,
the Mac had some more
sophisticated layout
of your UI elements.
Well, I am really pleased
to tell you that in iOS 9,
we have full mirrored UI support
for right-to-left languages.
[ Applause ]
Here's what it looks
like in Springboard.
Now, we haven't just
reversed some
of the UI elements
on screen here.
This is a much deeper
conversion,
where we've reversed the flow
between your view controllers,
your user interactions,
and the system gestures.
This is what the UI would look
like if it had been designed
by a native Hebrew
or Arabic speaker.
Let's take a look at what
this looks like on the phone.
I'd like to bring
Sara Radi up on stage
to give you a quick tour.
[ Applause ]
>> SARA RADI: So
let's take a look
at the new right-to-left
languages support on iOS 9.
So here I am running my
iPhone in Arabic, and starting
from the lock screen,
I slide from right
to left to unlock my device.
Here, my app icons are also
laid out from right to left,
and my page view flows naturally
from the first page
to the second.
So the whole system and all our
native apps fully support user
interface mirroring when running
in right-to-left localizations.
So let's take a look at the
Photos app, for example.
Here, all of my photos are laid
out the way I expect them to be.
The entire UI feels just
right from the navigation bar
at the top to the
bar at the bottom.
Now let me show you Mail.
In addition to the standard
layout, navigation, gestures,
and animations also
flow as expected.
So here in Mail, to trigger
the quick message actions,
I am going to swipe
from the left edge
of the screen to the right.
And that complements the layout
flow of the table view cell.
Also, the navigation
gesture works as expected
from the right edge
of the screen.
Now rotating my device
to landscape reveals
the Message view.
As a native speaker, this
feels so natural to me
since all my emails
are on the right side
and the details are on the left.
And since we've built
all of this into UIKit,
your apps will get the
exact same behavior for free
for running in right-to-left
localizations.
Now let me show you
a third-party app
from the App Store.
And just in case
you are wondering,
we didn't choose Kayak because
the name is a palindrome
and you can also read
this from right to left.
[ Applause ]
So since it is using
Auto Layout and our new APIs
with very minimal work,
they just add translations,
and their app just
works as expected.
So after WWDC, I am planning to
go on vacation, so I am going
to explore some views.
So the first slider lets
me pick the number of days
of my vacation, so I am
going to drag that slider
to the right side to
decrease the number of days.
Since they are using
a stock UISlider,
they got this behavior for free.
I also prefer nonstop flight
only, so I am going to turn
on that switch at the
bottom, and as you can see,
it also flows from
right to left.
So the entire UI feels just
right and feels very intuitive
for natives of these languages.
And that's how easy it can be to
add right-to-left localizations
to your apps on iOS 9.
Thank you so much.
[ Applause ]
Now back to Toby.
>> TOBY PATERSON:
Thank you, Sara.
Well, as Sara mentioned, we have
full support for this in UIKit
and our other system frameworks.
So you can get a lot
of this just for free
in your own applications.
If you have custom views
or gesture recognizers,
you do need to think
about what they mean
in a right-to-left language.
We have this API here, which I
am not even going to try and say
out loud, to tell you
which way the UI is flowing
so you can make the appropriate
decision for your UI.
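The API Toby alluded to is `UIView.userInterfaceLayoutDirectionForSemanticContentAttribute(_:)`, new in iOS 9. A sketch of how a custom view might use it; the drawing logic is a placeholder:

```swift
import UIKit

class ArrowView: UIView {
    override func drawRect(rect: CGRect) {
        // Ask UIKit which way the UI is flowing for this view.
        let direction = UIView.userInterfaceLayoutDirectionForSemanticContentAttribute(
            semanticContentAttribute)
        if direction == .RightToLeft {
            // Mirror the custom drawing for Arabic, Hebrew, etc.
        } else {
            // Standard left-to-right drawing.
        }
    }
}
```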
Next I'd like to
talk about links,
and by these I mean the kind of
links that you get from a friend
in a message or an email.
You tap on it, and invariably
it opens up in a web page.
Well, this is exactly
what you want on the Mac
because Safari is a natural
home for web applications.
iOS, on the other hand, is
all about the native app.
Wouldn't it be great if that
same link when you tapped
on it could open up right in
the associated application?
Well, that's exactly what
we've made possible in iOS 9.
[ Applause ]
This is a kind of universal app
linking: the same link can take
you to the natural destination
for the platform
that you are on.
Safari on the Mac or
native apps on iOS.
Here's how it works.
You host a file on your website
that lists the kind of URLs
that your application
can handle natively.
When the user taps
on one of those URLs,
we wrap it up in an
NSUserActivity and hand it off
to your application the same way
that Handoff does,
and that's it.
It's that easy.
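On the app side, the link arrives wrapped in an NSUserActivity, exactly as with Handoff. A minimal sketch of the delegate callback in Swift 2-era syntax; the URL handling is a placeholder:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(application: UIApplication,
                     continueUserActivity userActivity: NSUserActivity,
                     restorationHandler: ([AnyObject]?) -> Void) -> Bool {
        // Universal links arrive with the browsing-web activity type.
        if let url = userActivity.webpageURL
            where userActivity.activityType == NSUserActivityTypeBrowsingWeb {
            // Navigate to the view in your app that corresponds to `url`.
            print("Opening natively: \(url)")
            return true
        }
        return false
    }
}
```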
Now I'd like to talk about some
of the new things in iOS 9,
starting with HomeKit.
So this morning we announced
a number of great new features
for the HomeKit platform, and
of course, we've been continuing
to enhance the HomeKit
framework.
Certain classes of accessories
can now post push notifications
right to your devices: doors,
windows, and alarm systems.
And we have a new
system UI for managing
who you are sharing your
HomeKit network with.
This is available in Settings,
and we also have an API
so you can bring it up right
in your own application.
But I think the really
interesting things are
event triggers.
These let you set up "if this,
then that" sequence of actions
so that you can turn
the lights off
when you activate the alarm
system or, more importantly,
turn the coffee machine on
first thing in the morning.
I think these are going
to be really powerful.
Now we have some
new things in Search
that I think you are
probably all interested in.
The big news for iOS
Search, of course,
is that now it can search
your application content.
We can show rich results, and
when the user taps on them,
take them straight
to the appropriate spot
in your application.
There are three ways that you
can index your application data.
Core Spotlight lets you
explicitly index all
of the application
content that you have.
Now, some of that data may only
be valid for a limited time,
and so you can optionally
provide an app indexing
extension, which Core
Spotlight will call out to
at the appropriate time to
make sure your indices are
up to date.
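The Core Spotlight path can be sketched like this in Swift 2-era syntax; the identifiers and recipe content are illustrative, not from the talk:

```swift
import CoreSpotlight
import MobileCoreServices

// Describe one piece of app content for the on-device index.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
attributes.title = "Chocolate Chip Cookies"
attributes.contentDescription = "A recipe from my cookbook."

let item = CSSearchableItem(uniqueIdentifier: "recipe-42",
                            domainIdentifier: "recipes",
                            attributeSet: attributes)

// Hand it to the default index; the completion block reports any failure.
CSSearchableIndex.defaultSearchableIndex().indexSearchableItems([item]) { error in
    if let error = error {
        print("Indexing failed: \(error.localizedDescription)")
    }
}
```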
Another way of indexing
data is via NSUserActivity.
This lets you index things that
the user is seeing or doing
in your application so that
they can easily find them
and get back to your
application.
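Indexing what the user is doing takes only a couple of properties on the activity. A sketch; the activity type and title are placeholders:

```swift
import Foundation

let activity = NSUserActivity(activityType: "com.example.app.viewRecipe")
activity.title = "Chocolate Chip Cookies"
activity.eligibleForSearch = true           // surface in Spotlight results
activity.eligibleForPublicIndexing = false  // keep this one on-device only
activity.becomeCurrent()                    // mark as what the user is doing now
```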
And finally, if your
app is mirroring content
from your website, you can
mark up that website content
so that our web crawler
will find it, index it,
and provide results when
searching locally on the device.
Now, you saw this morning
that when you swipe
into the new Search UI,
we already have a series
of suggestions ready
and waiting for you
without you having
to type anything.
Well, Siri can even
suggest results from deep
within your application.
If you are using
NSUserActivity to make a note
of what the user is doing in
your application, Siri can take
that into account when coming
up with these proactive
suggestions.
You've probably noticed we
are building a lot on top
of NSUserActivity here.
There's Handoff, of course,
universal links,
Search, Suggestions.
I think this is a great
example of how we are building
on our existing foundations
to make your apps
and the whole ecosystem
much richer.
Now, the big news this morning,
of course, was Multitasking,
and we are really excited
to bring this to you today.
There's Slide Over
and Split View,
which lets you pin apps
side by side so you can work
on them at the same time.
I could almost hear many of you
wondering out loud this morning,
oh, gosh, what do I have
to do on my application now
to take advantage of this?
Well, the answer hopefully
is not very much at all.
You may remember last year
we introduced this notion
of Adaptive UI, which is
a really simple concept.
It says that instead
of designing your UI
for fixed screen sizes, instead
pick a layout and then adapt it
to changes in your
window bounds.
And we provide a number of
tools to help you with this.
There's Dynamic Type
for laying out your text
and Auto Layout constraints
for adapting your UI
to changes in bounds.
Now, we recognize, of course,
that a single layout is
not going to make sense
across all these
different screen sizes,
so we introduced this
abstraction called Size Classes,
which try to allow you to
choose the appropriate layout
without having to resort
to a device-specific check.
This works by broadly
categorizing sizes,
screen sizes, into two
buckets, regular and compact.
You can see here that the iPad
has a regular horizontal size
class, and the iPhone has a
compact horizontal size class.
We are using these
exact same mechanisms
for the Multitasking UI.
You can see here that the
Slide Over view has a compact
horizontal size class.
And when I tap to pin those
two apps in split view,
we adjust the bounds of the
main app there on the left
but keep it with a regular
horizontal size class.
Now, the interesting thing,
when you resize that split view
to 50/50, is not only do we
adjust the window bounds there,
but we also change the
size class of the main app
on the left from
regular to compact.
We found in our own applications
that this is the best layout
for these window dimensions.
So if you have already adopted
Adaptive UI in your app,
there's actually very
little left for you to do.
You need to use a
launch Storyboard,
which is a more flexible
replacement for Default.png
launch images,
and then declare that you
support all orientations,
and that's it.
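Reacting to the size-class change described above can be sketched in a view controller like this (Swift 2-era syntax; the layout switching is a placeholder):

```swift
import UIKit

class ListViewController: UIViewController {
    override func traitCollectionDidChange(previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Fires when Split View resizing moves us between regular and compact.
        if traitCollection.horizontalSizeClass == .Compact {
            // Switch to the single-column layout.
        } else {
            // Regular width: show the two-column layout.
        }
    }
}
```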
As Craig mentioned this morning,
we have literally converted apps
in a matter of minutes and
had them up and running.
It's really great.
Now, this morning we also
announced Picture in Picture,
which lets you continue
to watch videos over top
of your other applications.
And if you are a
video application,
it's really straightforward
for you
to take advantage
of this yourself.
The first thing you
need to do, of course,
is just support background media
playback, and then you need
to enable Picture in Picture
support in your controller.
We have this built into
our standard media playback
controls, and we provide an
AVPictureInPictureController
to give you very
fine-grain control
over entering Picture
in Picture.
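A minimal sketch of that controller in Swift 2-era syntax. It assumes your app already has the audio background mode and an active playback audio session, and the stream URL is a placeholder:

```swift
import AVFoundation
import AVKit

let player = AVPlayer(URL: NSURL(string: "https://example.com/movie.m3u8")!)
let playerLayer = AVPlayerLayer(player: player)

// Not every device supports PiP, so check first.
if AVPictureInPictureController.isPictureInPictureSupported() {
    // The initializer is failable; nil means PiP can't be set up for this layer.
    if let pip = AVPictureInPictureController(playerLayer: playerLayer) {
        pip.startPictureInPicture()
    }
}
```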
Next I'd like to
talk about the Mac.
So OS X, of course,
has had Multitasking,
multiple windows forever,
and this year we announced some
new window management features
that make that even
more compelling.
The key element of
this is the ability
to tile two applications
side by side in a split view,
as you can see in
this screenshot here.
Now, adoption of
this is really easy.
Any resizable window can partake
in this, and the key thing
for you to do there is to make
sure that your window lays
out nicely in both
narrow and wide geometries
so that it can play
along well with just
about any other window
in the system there.
And of course, we've added some
new APIs and behaviors to AppKit
to make this really
easy for you,
such as automatically
hiding the sidebar
in NSSplitViewController.
Another key development on
the Mac is the introduction
of Force Touch, which we
released a few months ago.
So Force Touch opens up a
whole new dimension of Touch,
with pressure sensitivity and
haptic feedback on the Trackpad.
We use this in our own apps
for things like text lookup,
pressure-sensitive playback
controls, and finer control
over drawing, preview,
and markup.
Naturally, we've added some
API for you to take advantage
of this in your own application.
You can set the pressure
configuration for the kind
of Force Touch that you want,
configure the default haptic
feedback on the Trackpad,
and have pressure changed
events delivered to your views
and gesture recognizers.
We think Force Touch is a
really exciting new capability
for the Mac.
We can't wait to
see what you guys do
with it in your own apps.
Now, I have some updates
on iCloud for you.
Last year we launched iCloud
Drive, and this is a great way
for accessing all
of your documents
across your mobile devices,
your computers, and the web.
Now, on iOS, apps provide
a great in-app experience
for managing your documents,
and we think this simple
app-centric model is great
for many people out there.
But Mac users, of course,
are accustomed to working
with their documents
directly in iCloud Drive.
So for those folks, we are
adding a new application
to iOS, the iCloud Drive app.
[ Applause ]
This is what it looks like.
You can see all of
your documents
in iCloud Drive organized
just as you would expect.
You can preview many document
types right in the application.
And naturally, you can open a
document in its own application.
Now, prior to iOS 9, the only
way of accomplishing this was
to copy the document into
the application container,
and of course, that's
exactly what you don't want
for documents in iCloud Drive
or any other document
provider for that matter.
What you really want is to be
able to edit those documents
in place without
moving or copying them.
So if you are a document-based
application,
here's what you need
to do to enable that.
You need to support
file coordination
since there may be multiple
processes trying to access
that document simultaneously.
And then you just add this
key to your Info.plist
and handle the application
open callback.
And that's it.
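Concretely, the Info.plist key is `LSSupportsOpeningDocumentsInPlace`, and the open callback tells you whether you got the document in place. A sketch in Swift 2-era syntax; the document handling is a placeholder:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    // iOS 9 variant of the open-URL callback, with an options dictionary.
    func application(app: UIApplication, openURL url: NSURL,
                     options: [String: AnyObject]) -> Bool {
        let inPlace = options[UIApplicationOpenURLOptionsOpenInPlaceKey] as? Bool ?? false
        if inPlace {
            // Edit the document where it lives; coordinate all reads and
            // writes through NSFileCoordinator, since other processes may
            // be accessing it simultaneously.
        } else {
            // Legacy behavior: the file was copied into our container.
        }
        return true
    }
}
```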
iCloud Drive is built on top
of CloudKit as are, in fact,
many of our iCloud services.
CloudKit is a public API,
and when we launched it,
we said it was going
to be basically free
for probably most of you.
Well, what does that
actually mean in practice?
Here's what you can store in
CloudKit for free right
from the get-go with no users.
As you add more users
to your application,
these grow commensurately up
to these maximum data limits.
Now, we've built a
CloudKit Dashboard for you
that shows you all of the
key metrics at a glance
and includes a line below which
everything is free for you.
We project that line a short
ways out into the future
so you can have some advance
notice before crossing it.
And if you do cross that
line, we've published a clear
and simple pricing guide here.
You can also find tons of
other interesting information
about CloudKit here, such as
the details of a new feature
that we are launching this
year, CloudKit Web Services.
This is basically a full-on
CloudKit implementation
for web apps.
Anything that you can do
with the native CloudKit
API you can now do via JSON.
We provide a JavaScript
library that mimics as closely
as possible the native
CloudKit API.
And we provide a web login flow
so that you can authenticate
your users securely.
We think this is going
to make it really easy
to write a CloudKit-based
web application
to run alongside your
iPad, iPhone, and Mac apps.
And that's what I
have for CloudKit.
Thank you.
[ Applause ]
Now I'd like to hand off to
Chris Lattner, who is going
to tell you something
about Swift.
[ Cheers and applause ]
>> CHRIS LATTNER: All right.
Thank you.
Thank you, Toby.
The response to Swift
has been amazing,
and it's really actually
hard to believe
that it was unveiled
just one year ago.
When we talked about Swift
back then, we talked about some
of its major features
like performance, safety,
and also the interactivity
of Playgrounds.
But we want Swift to be
pervasively available
to everybody, and so we are
open sourcing the compiler
and the standard library
and even kicking things off
with a Linux port as well.
[ Applause ]
This project is going to be run
under an OSI-approved permissive
license starting later this year
after we wrap up
work on Swift 2.
We look forward to working
even more closely with you all,
incorporating both your ideas
and your code contributions
as well.
So there's been a lot written
about Swift, and we're not going
to go through all the quotes,
but one of our favorite was
when Swift topped the list
of Stack Overflow's most-loved
programming languages.
And there's a lot of
reasons to love Swift; right?
One of which is it's built
to run your applications
at top speed, and
over the last year,
performance of Swift code
has grown tremendously
as we've implemented new
optimizations in the compiler.
And there is a ton
new in Swift 2
with improvements
across the board.
But let's talk about a few of
these, and maybe we will start
with one of the most
requested features,
a new error handling model.
So error handling is a
very well-known field.
There's a lot of
known approaches.
But all these approaches
have very well-known problems
as well.
So we weren't satisfied with
any of the existing approaches,
so we came up with something
we think will feel familiar
but will solve these
kinds of problems.
So let's talk about it now.
It starts out simple.
A method or function in
Swift can now be marked
as being able to
produce an error.
This greatly simplifies
many common Cocoa APIs
and also allows the
compiler to verify
that you are handling
your errors properly.
Swift now has familiar catch
syntax to handle errors
and uses powerful pattern
matching to enable you
to express rich catch
conditions.
Swift uses the try keyword
but uses it in a different way.
It uses it to mark
calls to methods
that can produce an error.
This defines away an entire
class of errors that happen
when you have unanticipated
control flow by making
that control flow
explicit in the code
so you can reason about it.
Of course, it's easy to throw
an error, like you'd expect,
and Swift enums are the perfect way
to define your own categories
and families of your own
custom error conditions.
It works really great.
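Putting those pieces together, a minimal Swift 2 sketch; the vending-machine error family is illustrative:

```swift
// Enums are a natural fit for a family of custom error conditions.
enum VendingMachineError: ErrorType {
    case OutOfStock
    case InsufficientFunds(required: Int)
}

// Marking the function `throws` lets the compiler verify
// that callers handle the error.
func vend(coins: Int) throws -> String {
    guard coins >= 2 else {
        throw VendingMachineError.InsufficientFunds(required: 2)
    }
    return "Candy Bar"
}

do {
    let snack = try vend(1)          // `try` marks the call that can throw
    print("Got a \(snack)")
} catch VendingMachineError.InsufficientFunds(let required) {
    print("Need \(required) coins")  // pattern matching on the error
} catch {
    print("Unexpected error: \(error)")
}
```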
Now, we think that Swift --
error handling in Swift
will be super familiar
and will feel really natural,
but it will also greatly
increase the quality
of Swift code across the board.
Let's move on now and talk
about our next big feature,
availability checking.
So we introduce great
new APIs all the time.
Often you want to adopt
some of these new APIs
to get new capabilities in
your applications as well,
but you can't always
drop support
for the previous OS
X or iOS release.
Now, this brings a challenge
because some symbols you want
to use are unconditionally
available,
where other symbols are
conditionally available
depending on the OS that your
app is running on in the field.
With Swift 2, handling
this is a breeze.
Now if you try to use a symbol
that's conditionally available
without checking for it,
the compiler will
produce an error message,
so you can't mess it up.
And Xcode goes even farther
by giving you several
great ways to handle this.
The first is you can use the
new "if available" statement
to add a fine-grain check
right where you need it.
[ Applause ]
But I think even
better than that is
that you can also mark an entire
method or even an entire class
as dependent on new OS features.
This eliminates the need
to do fine-grain checks
and directly expresses
many common situations
that you will have in your code.
It's a great solution that
works really well together.
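Both forms of availability checking look like this in Swift 2; the method bodies are placeholders:

```swift
import UIKit

// Fine-grained check right where the new API is used:
func updateSearchIndex() {
    if #available(iOS 9.0, *) {
        // Call iOS 9-only Core Spotlight APIs here.
    } else {
        // Fall back for earlier releases.
    }
}

// Or mark a whole method (or class) as requiring the new OS:
@available(iOS 9.0, *)
func configurePictureInPicture() {
    // Free to use iOS 9-only APIs unconditionally in here.
}
```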
Now, the error handling features
and the availability
checking features
of Swift 2 are two great
ways to make it easier
to write correct code.
But, of course, we want your
code to be beautiful as well.
So we've done several great
changes in Swift 2 to help this.
Swift 1 introduced a number
of really powerful
global generic functions
to do powerful things
with algorithms.
[Laughter]
Powerful and wonderful.
The problem is these were
not always beautiful to use.
Let's say I have a collection of
numbers, I want to scale them up
and drop some of the results.
With Swift 1, this is easy,
but writing it requires
rearranging a lot of code,
and when you look at it, you
have to read it inside out,
which makes it difficult
to reason about.
Swift 2 introduces a new
language feature called
protocol extensions.
With protocol extensions,
we can now recast these
global functions as methods,
the way they should
have been all along.
This means that the new code
that you write is beautiful.
It's simple to write,
and it's simple to read.
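The scale-and-drop example reads roughly like this; the numbers and thresholds are illustrative:

```swift
let numbers = [1, 2, 3, 4, 5]

// Swift 1 style: global functions read inside out.
// let scaled = filter(map(numbers, { $0 * 3 }), { $0 < 10 })

// Swift 2: protocol extensions recast them as methods,
// so the chain reads left to right.
let scaled = numbers.map { $0 * 3 }.filter { $0 < 10 }
print(scaled) // [3, 6, 9]

// You can extend a protocol yourself, and every conforming type
// picks up the method.
extension CollectionType where Generator.Element == Int {
    func total() -> Int { return reduce(0, combine: +) }
}
print(numbers.total()) // 15
```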
[ Applause ]
Let's talk about early exits.
It's very common to want to exit
a scope early for some reason,
so maybe you have a method
that takes a parameter
that could be nil, and if
it's nil it doesn't want
to do anything.
Well, the "if let" statement
gives you a very familiar,
comfortable, and great way
to check these conditions,
but it requires you to indent
all your code an extra level.
To solve this problem, we have
introduced a new guard statement
that allows you to
check a condition
and then bail out early.
It allows you to write
nice straight-line code
without the excess indentation.
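A minimal guard example; the greeting logic is illustrative:

```swift
func greet(name: String?) {
    // Bail out early; after the guard, `name` is unwrapped
    // and usable for the rest of the scope.
    guard let name = name else {
        print("No name given")
        return
    }
    // Straight-line code, no extra indentation.
    print("Hello, \(name)!")
}

greet("Anna")   // Hello, Anna!
greet(nil)      // No name given
```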
[ Applause ]
Next, a huge component
of how Swift feels is how
well it works with Cocoa.
But with a plain
Objective-C API,
the Swift compiler
has no idea whether
or not a pointer is nullable
or what the element types
of a collection are.
We solve this by introducing
new features to Objective-C,
including the ability to express
nullability for pointer types,
and in Xcode 7, we introduced an
entirely new first-class generics
system to Objective-C
that allows you
to express element types
and many other things
right in Objective-C.
[ Applause ]
Through the adoption of this and
a whole bunch of other features,
Cocoa is feeling quite swift.
Finally, let's talk about Xcode.
Xcode 7 introduces a
revamped Swift migrator.
It automatically will move your
Swift 1 code to Swift 2 syntax,
including adoption of the
new error handling model.
As we continue to evolve the
Swift language going forward,
we will keep moving the
migrator forward to match.
Next, let's talk
about header files.
But wait. Header files and not
having header files are a huge
feature of Swift; right?
The problem is that
sometimes you actually do want
to skim a bunch of code
to understand what it does
at a glance, and all the
implementation details get
in the way.
Well, Xcode has solved
this problem
by introducing a new
Assistant editor,
which gives you a
header file-like view
of an arbitrary Swift
source file.
This gives you all the
skimmability advantages
of a header file without the
maintenance burden of having
to write, maintain,
and edit it yourself.
That's great.
[ Applause ]
Next, rich comments.
Xcode allows you to write
beautiful Swift Playgrounds
by writing rich comments
directly into the Xcode editor.
These rich comments use the
very popular Markdown syntax,
and now we've brought
that same syntax
to documentation
comments as well.
We've even added support for new
features, like in-line images
and links, giving you a
great, consistent experience
between Playgrounds
and doc comments.
Finally --
[ Applause ]
-- finally, let's talk about
Playgrounds themselves.
Playgrounds are very
important to us.
We know that they are a great
way to experiment with API
and with the Swift
program language itself.
But we think that Playgrounds
can be a great way for teaching
and learning programming
as well.
And so we have been
adding a number of features
to Playgrounds, including
most recently support
for multiple pages
within a Playground.
To show you this and more, I'd
like to invite up Max Drukman.
[ Applause ]
>> MAX DRUKMAN: Thanks, Chris.
Today I'd like to show you
how easy it is to learn
and to teach using
Playgrounds in Xcode 7.
Let's start off with a
playground that I am developing
to teach students about some
of Apple's great graphics APIs,
and in this section,
I am introducing them
to a brand-new one, GameplayKit.
As the Playground says,
GameplayKit is used
to develop the mechanics
that drive your gameplay.
We will see a little bit
more of that in a minute.
This Playground uses a
car chase visualization
to demonstrate some of the
GameplayKit capabilities,
and I have created
this Playground
so that my students have the
ability to edit the bad guy cars
by tweaking these three
Sprite Node variables.
Now, Playgrounds have always had
the ability to show you results
for each line of code.
This year, we added the
ability to add them in line.
So now I can start by editing my
Playground the way my students
will, by changing the values
to get a different
look to the bad guy.
And you can see that
as I make my edits,
the changes are updated live.
Now that's a bad guy car.
[Laughter] Down here,
I am asking my students
to write a little bit of
code to put the pieces
of the bad guy together
into one Sprite.
So now as I write that code,
it's going to assemble my bad
guy, and now I am ready to go.
That's it for the setup.
Now it's time to move
on to the main event,
which is making stuff move.
Which I have on another page.
Now, pages are a great way
to divide up your Playgrounds
into subtopics, kind
of like a book.
You can navigate to pages using
the handy navigation links
at the bottom of the page.
You can use the jump bar.
And of course, the navigator.
Here are all the pages
in my Playground.
Now, each page can have its
own sources and resources,
so you can factor your
Playgrounds exactly the way
you want.
Let's check out the next page.
I am going to put away
the navigator for now.
So this page talks
about GameplayKit's
flocking behaviors.
Now, flocking is what's
going to take the bad guys
and make them move as a unit.
Now, without any further ado,
I am going to open
Xcode's timeline Assistant
and let's meet the flockers.
Okay. Here are all my evil
but colorful bad guys tracing
our intrepid truck-driving hero.
But you can see they are kind
of driving all over the place.
They are a flocking disaster.
[Laughter] Let's see
if we can fix that.
So there are several parameters
that you can adjust in order
to govern the flocking behavior.
Here is an interesting
one, cohesion,
which governs how much the bad
guys want to stick together.
I can just play around
with that value
and see immediately
what the effect is.
That's a little bit
too much cohesion,
so I think I will
back that one off.
That's a little better
in terms of spacing,
but they're still not quite
as goal-oriented as I'd like,
so I am going to play
with the seek parameter,
and give that a different
value and see
if I can get a good
flock going here.
All right.
Now they're flocked.
So now I know what kind
of range of values I want
to tell my students
to play with.
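The cohesion and seek goals being tuned here are classic flocking ("boids") steering terms. This toy sketch shows the idea behind them; it is not the GameplayKit API (GameplayKit expresses these as weighted goals on an agent's behavior), and all names here are invented:

```swift
// Toy flocking model: cohesion steers an agent toward the average
// position of its neighbors; seek steers it toward a target. Each
// goal gets a tunable weight, like the values adjusted in the demo.
struct Vector { var x, y: Double }

func cohesion(agent: Vector, neighbors: [Vector]) -> Vector {
    guard !neighbors.isEmpty else { return Vector(x: 0, y: 0) }
    let cx = neighbors.map { $0.x }.reduce(0, +) / Double(neighbors.count)
    let cy = neighbors.map { $0.y }.reduce(0, +) / Double(neighbors.count)
    return Vector(x: cx - agent.x, y: cy - agent.y)
}

func seek(agent: Vector, target: Vector) -> Vector {
    return Vector(x: target.x - agent.x, y: target.y - agent.y)
}

// Combining weighted goals: too much cohesion bunches the flock up,
// too little seek leaves it aimless -- exactly the tuning on stage.
func steering(agent: Vector, neighbors: [Vector], target: Vector,
              cohesionWeight: Double, seekWeight: Double) -> Vector {
    let c = cohesion(agent: agent, neighbors: neighbors)
    let s = seek(agent: agent, target: target)
    return Vector(x: c.x * cohesionWeight + s.x * seekWeight,
                  y: c.y * cohesionWeight + s.y * seekWeight)
}
```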
I am going to put away the
assistant, and I am going
to switch the editor
to be in raw markup.
So now I can edit the prose
of my Playground using
familiar Markdown syntax.
And I can go back down
here to the seek parameter,
and I can add a little bit of
extra prose to tell my students
to play around with
these values.
I can even come up here
and add some branding
in the form of an icon.
And now whenever anyone asks
me if I know about flocking,
I say I wrote the flocking book.
[Laughter]
Thank you.
You are very kind.
[ Applause ]
These are just a few of the
authoring features we've added
to Playgrounds in Xcode 7.
Can't wait to see what
you build with them.
And now, to tell you about
some more great Xcode features,
I'd like to invite
up Matthew Firlik.
[ Applause ]
>> MATTHEW FIRLIK:
Thank you, Max.
Alongside the new releases
of watchOS, iOS, and OS X,
we are excited to
bring you Xcode 7.
This new release includes
features and support for all
of our platforms, and
the unique experience
of each platform has been
integrated into the tools
to allow you to target and
deliver your applications
to all of our devices.
That unique experience is
where I'd like to start today,
with our design tool
Interface Builder.
And first up is a new
way for you to lay
out your applications
called stack view.
[ Applause ]
With stack view, you can
design your applications using
horizontal and vertical stacks.
You start by selecting
an orientation
and add your controls.
With each control that you add,
stack view automatically
adjusts sizing and positioning
to give you just the right look.
Stack views nest
beautifully, and they allow you
to change orientations,
so you can get the
precise layout you want.
Now, we've built stack
view on top of autolayout,
and it handles all the
constraints for you.
It gives you control over
alignment, spacing, and
distribution in the stack.
And we've made stack view
flexible as well, allowing you
to play with your interface
in Interface Builder
and at runtime.
When you add controls into a
stack view, you can reorder them
to try out different layouts
and, perhaps best of all,
when you hide views at runtime,
stack view automatically
adjusts.
[ Applause ]
So stack view, a new way in
Interface Builder for you
to get the precise
layout you want.
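The bookkeeping a vertical stack automates can be pictured with this toy sketch (invented names, not the real stack view API): positions follow from sizes and spacing, and hidden views drop out of the arrangement, which is why hiding a view at runtime "just adjusts":

```swift
// Toy model of a vertical stack: compute each visible view's y-offset
// from its height and the stack's spacing. Hidden views return nil
// and take up no room, so the views after them slide up automatically.
func stackOffsets(heights: [Double], hidden: [Bool],
                  spacing: Double) -> [Double?] {
    var y = 0.0
    var offsets: [Double?] = []
    for (height, isHidden) in zip(heights, hidden) {
        if isHidden {
            offsets.append(nil)  // removed from the arrangement
        } else {
            offsets.append(y)
            y += height + spacing
        }
    }
    return offsets
}
```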
Next up are Storyboard
references.
With storyboards, you can
create and design scenes
and connect them
together with zero code
to make complete
user interfaces.
Now, knowing that your scenes
and the interfaces evolve
and become a little
bit more complex,
we wanted to make
this easy to manage.
Storyboard references
allow you to take a section
of your interface and move it
to a separate storyboard file,
continuing to make connections
to and from the scene.
[ Applause ]
With storyboard references, you
can keep your interfaces focused
and modular, the same way you
do your application code,
and still easily design
your application flow.
Now, equally important
to the flow and layout
of your application is the look.
And in this release,
we are making Interface
Builder something to see.
In an upcoming seed,
you will see more of iOS
and OS X's visualizations
brought right
into Interface Builder.
Blurs and shadows will
render inside of the canvas.
Vibrancy will become a standard
part of the presentation.
And your own designable controls
can present masking and shadows
that compose with other
views in your scene.
And the combination of these
visualizations alongside our
preview editors, which allow
you to further refine the look
and layout of your interface
for specific devices,
means that now, more than ever,
you can design your interfaces
right in Interface Builder
and see what your users
will see on their devices.
[ Applause ]
Another aspect of building your
applications is managing assets,
and Xcode includes tools
to work with the new
On Demand Resource APIs.
With Xcode 7, you can tag assets
and files throughout
your project
to be downloaded
and used on demand.
Each tag represents a
collection of resources,
and we call these a pack.
And you can apply multiple tags
to any individual resource,
allowing you to use
that resource
in different situations.
To help you manage
your tagged resources,
Xcode includes an overview of
your tags in the Project Editor.
Here you can easily add,
remove, and rename tags.
You can change the
pack contents.
And you can also
configure various aspects
of On Demand Resources,
such as the download order
and the download priority.
When you build your
applications,
your tagged assets are
automatically assembled
into resource packs.
You just need to tag your
assets and you are ready to go.
When you deploy your
applications,
your On Demand Resources will
be hosted by the App Store.
While you are debugging,
Xcode will host your
On Demand Resources on
your Mac and stream them
down to your devices on demand
to simulate the Store
download behavior.
And for those of you using
Xcode's continuous integration,
your bots will build and host
your On Demand Resources,
so you can easily test your apps
over the air amongst your team.
And finally, to help
you track your packs,
the debug gauges will show
you the progress and status
of all your On Demand Resources
while you are debugging.
So the combination of all these
tools makes it really easy
for you to adopt On Demand
Resources in your apps.
Xcode 7 also has some great new
debugging and profiling tools
to help you further enhance the
quality of your applications.
Sebastien mentioned earlier
our focus on battery life.
Getting the most out of
your devices and getting
through the day on a
charge is very important.
So we are making it easy for you
to track the energy use
of your application.
The new Energy gauge for
iOS will show you CPU usage,
calls to networking APIs
and location services,
and also show you
your apps' transitions
from foreground to
background states.
And as with our other debugging
gauges, you can gain insight
into the behavior of
your application and,
when you need more information,
dive into Instruments
for all the details.
And speaking of which,
Instruments has been updated
to be better than ever.
The track view is now more
compact and makes it easier
to visualize your data
with new fluid gestures.
We also have a number of updated
instruments and new instruments
for you to use, such as
Core Location profiling
and a new GPU system trace.
And in this release,
we are integrating
Clang's Address Sanitizer
into our debugging workflow.
Address Sanitizer is a
memory error detection system
for C-based languages.
When enabled, Xcode will monitor
memory used by your application
and can detect common issues,
such as buffer overruns.
When these are detected,
Xcode alerts you
and provides essential
details to help you diagnose
and debug the problem.
And unlike other similar tools,
Address Sanitizer is fast,
so fast you can use it with all
of your interactive
applications.
But even with great
debugging and profiling tools,
sometimes -- sometimes --
bugs get out and will cause
crashes for your users.
To help you get these
fixed quickly,
Xcode has integrated
support for crash logs.
For your apps submitted to --
[ Applause ]
-- for your apps submitted to
TestFlight and the App Store,
Xcode will provide symbolicated
crashes for each submission.
The integrated view will
provide an updated list
of the top crashes for your
apps, showing you the backtrace,
details about the crash, and
charts of recent occurrences.
You can retitle the
crashes for easier tracking,
you can add notes to yourself,
and you can mark the crash
as resolved once
you fix the issue.
Now, when reviewing a backtrace,
you want to make it easy,
so Xcode has this
Open In Project button
that will load the backtrace
into the debug navigator
in your project, giving you
the very familiar workflow
to navigate your source
code for the crashing frames
and find and fix the problem.
[ Applause ]
Earlier this spring, we
brought you crash logs
for your iOS applications.
Starting today with Xcode
7, you can get crash logs
for your OS X applications,
and crash logs
for your watchOS applications
will be coming soon.
And in this release, we will
also be bringing you crash logs
for all app extensions on all
platforms, so you can track
and improve those as well.
[ Applause ]
Now, another way to improve your
applications is with testing.
And we all love testing; right?
I am going to try that again.
We all love testing; right?
[ Cheers and applause ]
That's good because
testing is an essential part
to delivering a great
application,
and we have two big new
additions for you this year.
The core of our testing solution
is the XCTest framework
and our test navigator, which
combine to provide great ways
for you to design and
organize your tests.
Building on top of these, Xcode
has provided a number of ways
for you to test your
application.
You can test the correctness
of your APIs and measure
and track their performance
over time.
You can evaluate synchronous
and asynchronous behaviors
in your code, and you can run
your tests locally in Xcode
and continuously on
your integration bots.
And all of this combines
together
to make it a really powerful
unit testing solution.
But we wanted to
kick it up a notch.
So this year we are adding
in user interface testing.
[ Applause ]
This is built on the
same testing foundation,
so now you can develop
correctness
and performance tests that work
through your user interface
to further extend
your testing coverage.
And speaking of coverage,
we are adding that as well.
[ Applause ]
Code coverage is a
great new way for you
to evaluate your
testing progress
and ensure you have
all the tests you need.
So let's take a quick
look at a demonstration
of the new UI testing and code
coverage features in Xcode 7.
So here I have an application my
team and I have been working on.
It's a SpriteKit game for iOS
and OS X called DemoBots.
As we have been developing
the application,
we have been working on
tests, and you can see here
in the test navigator we have
quite a few of our tests.
It's helpful to know which
area of our code has not
yet been tested either
because we have not
yet written the tests or
because our current tests are
not complete.
With Xcode 7, this is exactly
the kind of insight you can gain
about testing with
code coverage.
Xcode collects coverage
data each time you test,
so I am going to jump
to the testing report,
and we'll see there's
a new coverage section.
Inside of the coverage
section, all of the files
in my application are
listed with an indicator
on the right showing
the percentage
of coverage each file has.
Now, in this list, I've
sorted it from highest
to lowest coverage, so
at the top this all looks good.
As we scroll down, we see I
have some work to do here still.
For each file in the list,
you can disclose the contents
and see the coverage
of the methods
and functions contained therein.
For the file I've disclosed,
the first two functions
have 100 percent coverage,
but the other three
have no coverage at all.
The report gives you a
good high-level overview
of the coverage for your
project, but we wanted
to bring this information
directly to you.
So we'll see here
when I navigate to one
of the source files, we bring
the code coverage information
right into the source editor.
The darkened areas are
the parts of my code
that have not yet been tested.
And this is great.
It allows me to see
where I need to focus,
and it also works great
with our assistant editor
because you can have your source
code and the tests side by side,
and as you continually test,
get updated information.
If we go back to the report
and look at the classes
that were not covered, many of
them are user interface classes.
They are view controllers
and the like.
So to help me with that,
I am going to create
a user interface test.
I've already set up a user
interface testing bundle
and class, and so I
will select that here.
The test I want to write is
testing the options panel
of our game.
Now, the new APIs in XCTest
give you the ability to interact
with user interface elements,
accessing their properties
and triggering actions, and
it's really easy to write.
But you know what's actually
easier than writing a UI test?
Recording one.
Let me show you how that works.
I am going to collapse
the project navigator
to give myself a
little bit more space,
and with my insertion point in
the test I want to implement,
I am going to click this
little Record button down here
at the bottom of the editor.
You will see when I do this,
Xcode launches my application.
I want you to pay
attention in the test
in the upper left-hand corner.
The first thing I want to do
is click the Options button
in my application,
and you can see
that Xcode's recording
inserted the corresponding line
of testing code.
[ Applause ]
You can see it's
pretty simple API.
It's asking the application
for the window and the button
and telling it to click.
For the next piece of my
test here, I want to fill
out my player name here,
so I will start typing
in my standard gamer
handle, and you will see here
that two cool things
are happening.
One, while I am typing
in the text field,
the editor is updating
live to show me the values.
That's pretty cool.
The other thing that happened is
that UI recording
noticed I was interacting
with the same user interface
element more than once,
and it refactored my test to
create a local variable for it.
This keeps the tests readable
and means that refactoring
and reusing this code
later is really easy.
Let's continue selecting
some options here.
You'll see that the
test updates.
For the last element, I want to
select a different robot here,
so I will click this button
and select the blue robot.
I will click Done, and go
back and look at my test.
Just like that recording
was easily able
to track all of my actions.
Now a test should probably do
more than just poke things.
We should probably
validate some values here.
So before we click the Done
button, let's insert some code
to check that I have
the right settings.
I am going to insert a little
bit of code here that gets
that value of the active
robot from the interface.
It's in a text field.
I will compare it to the
value I expected, blue bot.
With no further ado,
let's run our test.
I will stand back.
Here we see it's
updating the values,
selecting the right robot,
validating our test,
and our test passes.
[ Applause ]
So recording makes writing
UI tests really easy.
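A recorded test like the one in the demo reads roughly like this sketch. The element identifiers ("Options", "Player Name", "Active Robot", "blue bot") are assumptions reconstructed from the narration, not a verbatim capture of the on-stage code; this cannot run outside an Xcode UI-test target:

```swift
import XCTest

class OptionsPanelTests: XCTestCase {
    func testOptionsPanel() {
        let app = XCUIApplication()
        app.launch()

        // "Asking the application for the window and the button
        // and telling it to click."
        app.buttons["Options"].click()

        // Repeated access to the same element is refactored into
        // a local variable, as the recorder did on stage.
        let nameField = app.textFields["Player Name"]
        nameField.click()
        nameField.typeText("MaxD")

        // Hand-inserted validation before clicking Done.
        XCTAssertEqual(app.textFields["Active Robot"].value as? String,
                       "blue bot")
        app.buttons["Done"].click()
    }
}
```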
We can look at our tests inside
of Xcode and run them locally,
and we can also have them run on
our continuous integration bots.
I have bots set up for both
the iOS and the OS X version
of the application, and if I
select my iOS version and look
at my tests, when I filter
down to the failed ones,
I see something interesting.
I have four user interface tests
that are passing on an iPhone
but they are failing on an iPad.
To make it easy to
diagnose your test failures,
each test run includes
a transcript
of all the testing actions that
took place, and I can see these
by expanding the test.
Here's a list of all the
actions that took place
in this test that's
been running.
Because user interface
tests are visual, though,
we wanted to go a step further.
So each one of these actions
includes a screenshot.
So if I look over at
the passing iPhone test,
I can get the screenshot
for how the test starts,
there's the start of our game.
I can look partway
through at the test.
Here we've typed a value
into the text field
with the keyboard up.
And I can also look at
the very end of the test.
Here's the state of the UI just
before we click the Done button.
If I flip over and look
at the tests on the iPad
and jump all the way
to that last action,
we can see here's our iPad UI
with all the right settings,
but there's no Done button.
Clearly we misconfigured
something in our UI,
but the screenshot was
able to help us narrow
that down really quickly.
So code coverage showed
me where to write tests.
UI testing and recording
made it easy
for me to get more coverage.
And the new testing
reports helped me narrow
in on my problems.
That's just a little
bit of code coverage
and UI testing in Xcode 7.
[ Applause ]
User interface testing makes use
of our systems' accessibility
APIs
to interact with UI elements.
This means if you've
already invested
in making your app accessible,
you've also already invested
in making it testable.
UI testing also works great with
UIs designed with size classes
and localizations, including
right-to-left support,
which means your
UI tests can scale
to all presentations of your UI.
The new APIs -- yes,
you can clap for that.
[ Applause ]
The new APIs in XCTest
are available in Swift
and Objective-C,
and UI recording will generate
whichever one you are using.
So these have been some of the
many new features you will find
in Xcode 7 as you build your
apps for all of our platforms.
[ Applause ]
Now, we also have some exciting
new technologies to share
with you for games and
graphics, and for that I'd
like to invite my
colleague up, Geoff Stahl.
[ Applause ]
>> GEOFF STAHL: Thank
you, Matthew.
Xcode 7 looks amazing, and I
am really excited to be here
to talk about games
and graphics.
It does not seem
like so long ago
that we were here introducing
our first game-related API, Game
Center, our social gaming network.
Since then, we've been busy
with APIs for key input
and graphics technologies, like
game controllers, SceneKit,
SpriteKit, and most recently Metal.
This year, we are introducing a
number of new game technologies,
including things like Model I/O
for modernizing your graphics;
ReplayKit for extending
your game's social reach;
and GameplayKit for
adding super-smart brains
to your games.
Also, we're packaging all of
these together into GameKit,
so with one include, you
get all this functionality
at your fingertips,
a great solution,
great complete solution
for game development
across our platforms.
So let's take a look at some
of the new cool features
in GameKit, starting with Metal.
So as you know, Metal is a 3D
API we introduced last year
for iOS.
We are really excited that this
year we are bringing it to OS X.
And let's, as a recap,
look at an example we ran
into just recently of
game development moving
from OpenGL to Metal.
So here you have a game that
started out running on OpenGL,
and what you notice is
that OpenGL command processing
is keeping one CPU completely
saturated, which has pushed the
game play to the second CPU,
and the GPU is not very busy.
In fact, in this example, we
saw the GPU was 40 percent idle.
Moving to Metal, we
drastically reduce the command
processing time.
We have an API that you
can program more directly
to the GPU.
And if you choose, you can
split your command processing
across multiple cores.
In this case, we've unlocked
the full performance of the GPU.
We've been working with some
third-party developers on Metal
for OS X, and let's
take a minute
to see what they've
done on the Mac.
So it's my pleasure to
introduce David McGavran
from Adobe Systems up to
show what they've done
with OS X in Metal.
David, welcome.
[ Applause ]
>> DAVID McGAVRAN:
Thank you very much.
Thank you.
Adobe was very excited
when Apple brought Metal
to iOS last spring.
In fact, we have already been
using it to optimize some
of our popular iOS apps like
Premiere Clip and Photoshop Mix
and seen great results.
So we were thrilled when Apple
told us they were bringing Metal
to OS X.
This way we can share our code
on both of those platforms.
So today I would like to
show you some of our progress
with two of our flagship
Mac OS products,
Illustrator and After Effects.
Let's take a look.
So here we are in the current
version of Adobe Illustrator
with a massive piece of artwork
with over 300,000 points,
gradients, and blend modes.
I want to go ahead and do a
zoom on this piece of artwork.
You are going to notice a bit
of lag as it works its way
into that complex piece of art.
We really wanted to see if
we could do something better.
So with this version, we are
demonstrating what we can do
with Metal when we take
the entire rendering engine
and put it on Metal.
In fact, it was so
performant, we were able
to implement a brand-new
feature: continuous zoom.
So here you can see the
results are amazingly different
when you are looking at
what we can do with Metal.
In fact, I can zoom all the
way in here extremely far
until you can actually
read the text.
[ Applause ]
So this just drastically
changed the way artists can work
with our products.
Now we want to talk
about After Effects.
So here is a video
from extreme sports
videographer Devinsupertramp.
Let's take a quick look at this.
[ Music ]
That's pretty fun.
What we want to do is
take one of those shots
into After Effects
and add some effects
into it to make a promo spot.
So here we are.
I am going to turn on some
color correction, some ripples,
and some lens flares,
and go ahead and play
that back without Metal.
And you can see while it's
working quite hard on the CPU,
it's not really giving
that interactive performance
we'd love to be able
to give our compositors.
So we really challenged our
programmers to take a look
at this and see what
they can do with Metal.
So in a very short
time, they were able
to port those three effects
to Metal, and they are going
to show you a pretty
astounding difference.
So now I am going to switch
to the same composition,
but this time those
effects have been replaced
with the Metal versions, and
I am going to play that back.
Immediately you are going to
see it's playing in real-time
without dropping any
frames, and we are seeing
that in these effects up to
an 8x performance improvement
and drastically reduced
CPU usage.
So Adobe is committed
to bringing Metal to all
of its Mac OS Creative
Cloud applications,
such as the Illustrator and After
Effects I showed you today,
as well as Photoshop
and Premiere Pro.
We are very excited to
see what Metal can do
for our Creative Cloud users.
Thank you very much.
[ Applause ]
>> GEOFF STAHL: David,
thank you.
That is amazing.
I agree with Craig.
It's really interesting to see
when you can take an interaction
that's not real-time and move it
to a silky smooth
user interaction.
But Metal is not just
for applications.
As we spoke about this morning,
we are bringing key graphics
technologies for both OS X
and iOS onto the Metal API.
Up to now, technologies
like Core Animation
and our hardware-accelerated
Core Graphics, PDF,
and Safari page rendering
have been layered on top of OpenGL.
That meant the CPU was very
busy feeding the GPU commands.
With Metal, we get improved
efficiency and performance
for critical user-level tasks.
Also, we didn't want to just
increase the system performance;
we wanted to actually make
it really easy for all
of you to adopt Metal.
So we are introducing MetalKit.
So MetalKit is your
gateway to Metal.
It does the heavy
lifting for you.
MetalKit can do everything,
from setting up your rendering loop
to loading 3D models
from your artists
via its integration with Model I/O.
In fact, MetalKit can
load meshes directly
into Metal buffers and set
up pretty much everything you
need to render your scene.
We didn't stop there.
Metal Performance Shaders
are high-performance
image-processing shaders for
Metal applications.
These are GPU-based parallel
compute shaders individually
tuned for all of our GPUs,
again, making it really easy
for all of you to adopt Metal.
So now you have a great
rendering pipeline,
but you need great content.
So we have Model I/O.
Sometimes loading and working
with 3D models can be tricky.
So that's where Model
I/O comes in.
It makes it simple
to load content
from all the file formats you
expect directly into SceneKit
or Metal through its
integration with MetalKit.
But where it sets itself apart
is the way it handles lighting.
Model I/O offers a
state-of-the-art, raytraced,
materials-based lighting
solution for you.
What this really means
is it offers amazingly
realistic graphics.
Let's take a look.
So let's start with a baseline.
This is just a model
with textures loaded.
This is not really interesting.
It's actually pretty flat,
and I think the '90s are
calling this model back.
If we strip away the textures
and apply a global
illumination solution,
what we see is something
that immediately
looks more realistic.
We add the textures back
in, and we get something
that looks really good.
Finally, of course,
even with Model I/O
and its lighting solution, you
can add your own lighting
and shadows to get something
that looks amazingly realistic.
From ordinary to extraordinary,
Model I/O provides a great solution
for your models and
your lighting.
So now you have a great
rendering pipeline,
you have great models,
great content,
and you need great gameplay.
So we have GameplayKit.
Gameplay really divides
into two areas.
We have navigation, or how you
move your objects in your game,
and strategy,
or how your game thinks.
GameplayKit has solutions
for both of these.
GameplayKit is a component-based
API that provides a full range
of solutions for gameplay.
Whether you need to track the
gold from each of your players
or need to navigate your bad
guys from point A to point B
and avoid the obstacles
in between,
or you need to generate
deterministic random numbers
for your next online
multiplayer game,
GameplayKit has solutions for you.
From very simple
games to very complex,
we think GameplayKit will do a
very good job helping you
develop your games.
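The deterministic random numbers mentioned here matter for multiplayer games: when every player seeds the same generator the same way, every device computes the same "random" rolls without sending them over the network. This toy sketch illustrates the property with a simple linear congruential generator; it is not GameplayKit's API (GameplayKit provides `GKRandomSource` for this), and `SeededRandom` is an invented name:

```swift
// Toy deterministic random source: same seed, same sequence.
struct SeededRandom {
    private var state: UInt64
    init(seed: UInt64) { self.state = seed }
    mutating func next() -> UInt64 {
        // Constants from Knuth's MMIX linear congruential generator.
        state = state &* 6364136223846793005 &+ 1442695040888963407
        return state
    }
}

var playerA = SeededRandom(seed: 42)
var playerB = SeededRandom(seed: 42)
// Two players seeded identically see identical rolls.
print(playerA.next() == playerB.next())  // true
```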
We started out with a
social piece, Game Center.
ReplayKit can extend
the social interaction.
ReplayKit allows your players
to record video replays
of their gaming exploits.
They can save them or
they can share them.
It's hardware accelerated,
it's high performance,
and it's absolutely brilliantly
easy for you guys to adopt,
really easy to add it to your
game, and maybe it's the thing
that kicks off that next
viral marketing campaign
for your game project.
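The recording flow described above is roughly this simple in code. A hedged sketch using ReplayKit's screen recorder with current Swift naming; error handling and UI details are simplified:

```swift
import UIKit
import ReplayKit

// Minimal sketch: start a recording, stop it, and hand the player
// ReplayKit's built-in preview UI for trimming, saving, or sharing.
class ReplayController: NSObject, RPPreviewViewControllerDelegate {
    let recorder = RPScreenRecorder.shared()

    func startCapture() {
        guard recorder.isAvailable else { return }
        recorder.startRecording { error in
            if let error = error {
                print("Could not start recording: \(error)")
            }
        }
    }

    func stopCapture(presentingFrom viewController: UIViewController) {
        recorder.stopRecording { previewController, error in
            if let preview = previewController {
                preview.previewControllerDelegate = self
                viewController.present(preview, animated: true)
            }
        }
    }

    func previewControllerDidFinish(_ previewController: RPPreviewViewController) {
        previewController.dismiss(animated: true)
    }
}
```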
Finally, SceneKit and SpriteKit.
We have some great updates
for SceneKit and SpriteKit,
but I think the most
interesting,
maybe the most compelling,
is we have built tools
for both SceneKit and
SpriteKit directly into Xcode 7,
and I'd like to hand
it off to my colleague,
Jacques Gasselin de Richebourg,
to show you these tools
inside of Xcode 7.
[ Applause ]
>> JACQUES GASSELIN
DE RICHEBOURG: Thanks.
So last year with Xcode 6, we
introduced visual scene editing
for SpriteKit right
inside Xcode.
This year we are
kicking it up a notch
and have added animation
editing right inside Xcode.
So here we have a little
cut scene I am making.
It's got this cute red panda
and he is about to make his way
across the two chasms
to the other side.
These are the base
animations I have on there.
I want to layer these into
the animation we expect.
First off, obviously,
you need to make him run.
So let's go down
and grab an action.
Now, here we have a
really interesting action.
It's a reference action,
which means it actually
lives in a different file.
It lives in this
actions.sks file.
I am referencing here so I
can make composed actions.
So I am going to grab that.
Of course, it's just as
easy to just copy and paste.
I am going to do that a couple
times, and naturally it appends
after on the same track.
Let's have a look at
that, and we are going
to use the scrubber this
time, so it's a timeline
with a scrubber, and you can
basically preview what the
animation is going to
look like at any time.
That's not just a
great preview tool;
it's actually great
for editing as well.
So you noticed he didn't
jump over the chasms,
and that's a problem and
something we are going to fix.
So let's figure out a
good spot for him to jump.
Looks like here is
about perfect.
And then we are going to grab
a move action, and we are going
to build the jump right here.
Now, notice how it's
snapping to the timeline.
Okay. I am going to
make it go up 100 units,
and then I am going to
use the timeline yet again
to figure out the apex.
I want it to be about there.
Now I am going to
snap the duration.
Okay? It's that easy.
Obviously, when he jumps,
I want him to ease out.
That means he is going to have
a lot of velocity at the start
and very little at the apex.
Going to copy and paste
that and reverse it.
Minus 100.
That's going to ease in so
that he has some hang
time up at the apex.
Okay. Let's scrub across that.
That's a parabolic jump.
Perfect. Now, obviously,
it looks a bit weird
if he just runs in the air,
so let's add an animation.
You notice here I am layering
animations on top of each other,
and SpriteKit handles
that for you.
Here I am going to snap
the animation to the length
of the jump, and I am going
to hop into the media library
and fetch my jump frames.
So I am filtering jump, select
that one, select all of them,
drag and drop, perfect.
All right.
Now scrub.
Okay. That's a cute jump.
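The jump assembled in the editor corresponds roughly to the following SpriteKit actions. This is a sketch, not the demo's actual code; the durations and the texture parameter are placeholders:

```swift
import SpriteKit

// Rough code equivalent of the editor-built jump: a 100-unit rise that
// eases out, a mirrored fall that eases in, and jump frames layered on top.
func makeJumpAction(with jumpTextures: [SKTexture]) -> SKAction {
    let rise = SKAction.moveBy(x: 0, y: 100, duration: 0.3)
    rise.timingMode = .easeOut   // fast at takeoff, slow near the apex

    let fall = SKAction.moveBy(x: 0, y: -100, duration: 0.3)
    fall.timingMode = .easeIn    // slow at the apex, fast on landing

    let arc = SKAction.sequence([rise, fall])
    let frames = SKAction.animate(with: jumpTextures,
                                  timePerFrame: 0.6 / Double(jumpTextures.count))

    // Running the arc and the frame animation as a group layers them,
    // just as stacking tracks does in the editor timeline.
    return SKAction.group([arc, frames])
}
```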
Now, there are two
chasms, so I am going
to have to do this twice.
The easiest thing to do here is
to turn this into a reference,
so I am going to select
all those, right-click,
convert to reference, and I
am going to call that Jump.
That's going to store
in the actions.sks file
that I used the run from before.
All right.
Great. That's a jump.
I'm going to use the scrubber
again to lay down another jump.
The excellent thing about
references is they turn up here
in the object library
straightaway.
So I am just going to
drag and drop that in.
And here we should have
our final cut scene.
Okay. Jump and jump.
Great.
[Applause] Okay.
Thanks.
The great thing
about references is they also
help you structure not just your
other scenes, but
your code as well.
So here, using SKAction(named:),
I am going to load
up those very same actions,
and I am going to hook them
up to the left and right keys
and the space bar for jump.
Let's run that and
have a look at it.
So here we have the panda
doing the cut scene.
Excellent.
Now I am going to control it.
Look at that.
Left and right.
Jump, jump.
It's that easy.
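Hooking those editor-built reference actions to the keys looks roughly like this. A sketch assuming a macOS SpriteKit scene, a node named "panda", and actions saved as "Run" and "Jump" in an actions.sks file in the bundle, which `SKAction(named:)` looks up:

```swift
import SpriteKit

class GameScene: SKScene {
    // Assumes a child node named "panda" exists in the scene file.
    var panda: SKNode { childNode(withName: "panda")! }

    override func keyDown(with event: NSEvent) {
        switch event.keyCode {
        case 49:   // space bar
            if let jump = SKAction(named: "Jump") {
                panda.run(jump)
            }
        case 123:  // left arrow
            if let run = SKAction(named: "Run") {
                panda.xScale = -abs(panda.xScale)  // face left
                panda.run(run)
            }
        case 124:  // right arrow
            if let run = SKAction(named: "Run") {
                panda.xScale = abs(panda.xScale)   // face right
                panda.run(run)
            }
        default:
            break
        }
    }
}
```

Because the actions live in one shared .sks file, tweaking the jump in the editor changes it everywhere it is referenced, in scenes and in code alike.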
[ Applause ]
Now, we've been busy,
so we took this
to the next dimension as well.
So new in Xcode 7, we have
full visual scene editing
of 3D content using
SceneKit as well.
This editor is very much like
the 2D editor for SpriteKit.
Same things work.
Drag and drop to
add new content.
You can use the manipulators
in 3D to drag it around.
You can snap to other objects.
You can even duplicate.
And we have physics
and animations live
inside the editor.
So we are going to now animate.
Perfect. And let's
interact with this as well.
So I am going to play.
And we got the same cute
panda from the previous demo.
Here we go.
[Music] This is a red panda,
not a fox; a red panda.
I am going to walk up,
and I am going to interact
with that little
block I had before.
Oh, yeah, kick you down.
Great. So in this game, you
are playing a little red panda
that collects flowers
and pearls and -- oh.
Sorry about that.
I promise, no pandas
were hurt in the making.
Okay. This is not just
a demo for stage here.
We are actually handing this
out to you, and this is a sample
that you can use to build your
next 3D SceneKit game using this
editor, SceneKit, and Xcode 7.
Thank you.
I am going to hand
you back to Andreas.
[ Applause ]
>> ANDREAS WENDKER: All right.
Let's quickly review the
technologies we discussed
in this session.
We announced new versions of
our OSs today, and of course,
the new native watchOS SDK.
The SDKs contain many new APIs
you can take advantage of,
particularly for gaming.
The new optimization
techniques with App Thinning.
A new version of Swift that
makes your apps run faster
and at the same time lets you
write more expressive code.
And of course, this new
version of our Xcode now
with user interface
recording and testing.
And all these technologies
will be available for download
from the WWDC attendee
portal this afternoon.
We hope that you will
quickly adopt them all
and create even more
innovative apps for users.
Please go ahead and
install all this stuff
and let us know what
you think about it.
Also we are doing something new
with our TestFlight
service for you this year.
Starting today, you will be
able to deploy apps written
to the iOS 9 SDK in TestFlight,
and over the next several weeks,
it's even going to add
support for App Slicing,
on-demand resources, and Bitcode
so you can test your apps even
with those new technologies
before we launch iOS 9 later
this year.
[ Applause ]
So there are, of course, many
opportunities to learn more
about these technologies
here at the conference.
There's more than a hundred
sessions you can attend,
numerous labs where you can go
and get help with your projects.
In fact, there's more than a
thousand Apple engineers here
on site to answer
your questions.
So I hope you enjoyed
this session.
I hope to hear from
you later this week.
[ Applause ]