WWDC 2015 Session 107: What's New in Cocoa Touch

Transcript

>> Jason Beaver: Good afternoon.
Welcome to What's
New in Cocoa Touch.
I'm Jason Beaver, I'm one
of the senior engineering
managers in the iOS group.
There is a lot that's
new to discuss in iOS 9,
but first I want to take a look
back at some changes we've made
over the past few years.
Since iOS 6 we've been laying
the groundwork for a major change
in how applications are built
for iOS, and we've been rolling
out the changes in
each major release.
In iOS 6 we added Auto Layout.
This allows you to
easily build a dynamic
and versatile interface
that responds to changes
in available size and
screen orientation,
as well as localization.
In iOS 7 we introduced
Dynamic Type,
which lets you achieve
beautiful typography
in your application at a variety
of user-selectable text sizes.
Coupled with Auto Layout,
your interface will adapt
automatically to changes
in the user's selected text size,
and by adopting Dynamic Type,
the system can make improvements
to the text rendering and layout
that your application
will inherit for free.
In iOS 8 we made what is
perhaps the biggest change.
We introduced the
concept of adaptivity.
This is the concept that
your application should adapt
to changes in the
running environment.
Fundamental to this
concept is size classes.
Size classes drive
structural changes
to your application's
user interface based
on whether space is constrained
in a particular dimension
or not.
In the Settings application
shown here, this is identical
code running on both
an iPad and an iPhone;
the difference in size classes
has driven structural changes
to the user interface.
Size classes drive the
adaptivity of many parts
of UIKit: view controllers,
view controller presentations,
search results, action
sheets, and many more.
All of these adapt to the size
class and to the available space
they have to work with.
When we introduced the concept
of adaptivity last year it
wasn't clear why we were moving
in this direction.
Applications were
always full screen
and they generally did not need
to change their user interface
structure while running.
With the introduction
of multitasking,
the reason for the
changes is now clear.
You're no longer building two
distinct experiences tailored
to specific pieces of hardware.
You're providing a continuum of
experiences tailored to the size
that your user is
running the app at.
By adopting these foundational
technologies,
not only is your app ready for
the entire range of hardware
that we ship, it is also ready
to support these new
multitasking features.
Developers we worked with who
had already adopted all of these
changes in some cases literally
needed only a matter of minutes
to get their apps
running seamlessly
alongside our own apps.
One special case to deal
with is picture-in-picture.
If your application supports
background media playback, you
can use new API in iOS
9 to enable your player
to support picture-in-picture.
Keep in mind that with
picture-in-picture as well
as all of multitasking,
your application needs
to be a good citizen to make
sure it is not interfering
with the overall
user experience.
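
Here's a minimal sketch in Swift of how that player hookup might look, assuming your app already plays media with an AVPlayerLayer and has configured an audio session for background playback:

import AVFoundation
import AVKit

// A minimal sketch: vend a picture-in-picture controller for an existing
// AVPlayerLayer. Assumes background audio playback is already configured.
func makePiPController(for playerLayer: AVPlayerLayer) -> AVPictureInPictureController? {
    // Not every device supports picture-in-picture, so check first.
    guard AVPictureInPictureController.isPictureInPictureSupported() else {
        return nil
    }
    return AVPictureInPictureController(playerLayer: playerLayer)
}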
We have a number of sessions
on multitasking this week:
Getting Started with
Multitasking on iPad,
Multitasking Essentials
for Media-based Apps,
and Optimizing your
App for Multitasking.
Let's move on and talk about
things that are new in iOS 9.
The first is Auto Layout.
We're introducing a new
UILayoutGuide class: in cases
where you would have
used extra views
in your Auto Layout constraints,
you can now use layout guides
to avoid cluttering up
your view hierarchy.
There are two layout guides
we're introducing by default:
the layout margins guide and
the readable content guide.
You can use these
properties on UIView
to be more expressive
in defining
where within a view
content should be drawn.
These supersede the layout
margins we introduced in iOS 8;
you should use layout
guides in any new code you write,
and as you revisit old code
you should replace margins
with layout guides.
I want to take a moment though
to talk about the second case,
the readable content guide.
When you have large
bodies of text,
the length of the line
has a great deal to do
with the readability and
legibility of that text.
If the lines are too wide, as
we see here, it's harder to see
where a line begins and ends and
it can be difficult to continue
onto the next line
when you're reading.
Conversely, if the lines
are too narrow your eye has
to travel back too frequently
which interrupts
the flow of reading.
The optimal line length is
dependent on many factors,
including the font, the
size, and line spacing,
but by using the
readable content guide,
you can be assured that your
text will be readable regardless
of the user's text size
or the available space
that your application
has to render the text.
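
As a rough illustration, pinning a label to the readable content guide might look like this in Swift; the function name is illustrative:

import UIKit

// A minimal sketch: constrain a multi-line label to the readable content
// guide so its line length stays comfortable at any text size or width.
func layoutBody(_ label: UILabel, in view: UIView) {
    label.numberOfLines = 0
    label.translatesAutoresizingMaskIntoConstraints = false
    view.addSubview(label)

    let readable = view.readableContentGuide
    NSLayoutConstraint.activate([
        label.leadingAnchor.constraint(equalTo: readable.leadingAnchor),
        label.trailingAnchor.constraint(equalTo: readable.trailingAnchor),
        label.topAnchor.constraint(equalTo: readable.topAnchor)
    ])
}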
Let's talk about how
layout guides can be used
to reduce the number of
views in your application.
Let's say we have a view that
shows a number of planets
and we want to show them all
centered in their parent view
or show them spread
out along the x axis.
Before, using Auto Layout
constraints, we would have had
to introduce a number
of extra views
into the view hierarchy
to do that.
We can now simply
define layout guides
that determine the proportion
of the space
each view uses
when laid out.
We also create an
additional set of constraints
that centers the views,
and then simply by changing
which set of constraints is
active we get a nice animation
like this.
In this case we're changing
all of the constraints
at the same time but if
we instead change the set
of constraints at different
times using a keyframe animation
we can achieve a number of
very interesting effects.
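
Sketched in Swift, with hypothetical names, the technique looks something like this: spacer layout guides stand in for invisible spacer views, and the two constraint sets are swapped by activation:

import UIKit

// A sketch of the planets example. Spacer layout guides take the place of
// invisible spacer views; two constraint sets are toggled by activation.
class PlanetsViewController: UIViewController {
    var spreadConstraints: [NSLayoutConstraint] = []  // built elsewhere
    var centerConstraints: [NSLayoutConstraint] = []  // built elsewhere

    func makeSpacer() -> UILayoutGuide {
        let spacer = UILayoutGuide()
        view.addLayoutGuide(spacer)  // guides never clutter the view hierarchy
        return spacer
    }

    func setCentered(_ centered: Bool) {
        NSLayoutConstraint.deactivate(centered ? spreadConstraints : centerConstraints)
        NSLayoutConstraint.activate(centered ? centerConstraints : spreadConstraints)
        // Animating the layout pass gives the transition shown in the session.
        UIView.animate(withDuration: 0.5) { self.view.layoutIfNeeded() }
    }
}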
We're also bringing the
stack view over to iOS in iOS 9.
This allows you to manage
a set of subviews as a stack
that can be arranged either
horizontally or vertically.
The stack view uses Auto Layout
under the covers but manages all
of the constraints for you.
It allows you to adjust
the spacing and alignment,
and to adjust
how the views are distributed
within the available
space, either equally
or proportionally.
Stack views can also be nested:
you can have a stack view
as an element
in another stack view, allowing
you to create very rich layouts.
These layouts will
automatically adjust
to the content being
displayed as well
as to the available
space that's there.
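
A minimal Swift sketch, with hypothetical subviews, might look like this:

import UIKit

// A minimal sketch: a horizontal stack that spaces its subviews equally.
// The stack view creates and owns all of the underlying constraints.
let icon = UIImageView()
let title = UILabel()
let detail = UILabel()

let stack = UIStackView(arrangedSubviews: [icon, title, detail])
stack.axis = .horizontal
stack.alignment = .center
stack.distribution = .equalSpacing
stack.spacing = 8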
We've also introduced
the shortcuts bar.
This appears over the software
keyboard, or at the bottom
of the screen when a
hardware keyboard is attached.
This bar contains stock controls
for things like bold, italic,
and underline, and cut,
copy, and paste.
It can also be customized
in your application.
Anything that conforms to the
UITextInput protocol can return
an input assistant
item using this method.
UITextInputAssistantItem
is a new class in iOS 9
and has two properties
for specifying the leading
and trailing bar button groups,
so you can easily add your
own elements there.
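
A small Swift sketch of that customization, with a hypothetical insertPhoto action, might look like this:

import UIKit

// A minimal sketch: add a custom button group to a text view's shortcuts bar.
class EditorViewController: UIViewController {
    let textView = UITextView()

    @objc func insertPhoto() { /* hypothetical action */ }

    func customizeShortcutsBar() {
        let item = UIBarButtonItem(title: "Insert Photo", style: .plain,
                                   target: self, action: #selector(insertPhoto))
        let group = UIBarButtonItemGroup(barButtonItems: [item],
                                         representativeItem: nil)
        textView.inputAssistantItem.trailingBarButtonGroups.append(group)
    }
}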
Now, as you saw in the State
of the Union, you no longer have
to have the entire
interface for your application
in a single storyboard.
You can now link from one
storyboard to another,
and we also now offer
a way to unwind segues.
[ Applause ]
Also as you saw, we now have
full right-to-left support
in the operating system.
If you have a localization
for a right-to-left language
and have linked your app
against iOS 9, this will
happen automatically
in your app as well.
All UIKit controls will
automatically reverse.
Notice that navigation
is reversed,
all of the animations
are reversed,
layout of table view cells,
even controls like sliders
and switches are all reversed.
Collection view will
automatically reverse its layout
and will flow from
right-to-left.
Here we see that marking as
read requires a swipe
to the right, and marking
as unread requires
a swipe to the left.
There are properties
on both view controller
and view called
semanticContentAttribute
that let you specify how
content should lay out.
By default, everything
will flip automatically,
but you can override the
properties to customize
that if it makes sense
in your application.
There is also a method to
get the layout direction
for a particular
content attribute.
The semantic content attribute
has one of five values,
the first is unspecified.
This is the default.
Unless you have a reason to
change what UIKit is doing,
this is the value you
should leave it at.
If you have a set of controls
that represent playback controls
like play, fast forward
and rewind,
you can use the playback
content attribute.
If you have a set of controls
that represent or result
in some sort of directional
change in your UI,
so let's say text alignment
controls, left, center,
right in a text editor,
you can use the spatial
content attribute.
Finally, there are
two additional values
that let you force
either left-to-right
or right-to-left layout.
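
In Swift, that might look something like this sketch:

import UIKit

// A minimal sketch: playback controls keep their spatial order under
// right-to-left layout, while everything left unspecified flips as usual.
let transport = UIStackView()
transport.semanticContentAttribute = .playback

// Asking UIKit which direction a given attribute resolves to:
let direction = UIView.userInterfaceLayoutDirection(
    for: transport.semanticContentAttribute)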
Now, for images, UIKit can't
know whether an image should be
flipped by default in all cases.
Things that represent user
content, like photographs,
shouldn't be flipped, but
things that represent controls
in your application
you may want to flip.
So we have a method on UIImage,
imageFlippedForRightToLeftLayoutDirection,
to let you tell us that you want
to use a flipped image,
and also a property to ask
whether that image flips.
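
A brief Swift sketch, with a hypothetical asset name:

import UIKit

// A minimal sketch: opt a directional control image into automatic flipping
// for right-to-left layouts. "back-chevron" is a hypothetical asset.
if let chevron = UIImage(named: "back-chevron") {
    let adaptive = chevron.imageFlippedForRightToLeftLayoutDirection()
    // You can also ask whether an image participates in flipping:
    print(adaptive.flipsForRightToLeftLayoutDirection)
}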
I encourage you all to come
to New UIKit Support
for International User
Interfaces and find
out how you can make your app
support right-to-left languages.
For accessibility, you
now have the ability
to use more speech
voices, like Alex, as well
as high-quality voices
in AVSpeechSynthesizer.
I encourage you to come
to iOS Accessibility to find
out more about how to
make your app accessible.
You also saw that we're
introducing a number
of new text editing
gestures in iOS 9.
These gestures can be
performed either on the keyboard
at the bottom of the screen
or in the text area itself.
They allow you to move the
insertion point indirectly,
they allow you to
select a word, sentence,
or paragraph very easily just by
multiple taps and also allow you
to easily extend an
existing selection.
Coupled with the shortcut bar
we introduced, they allow you
to easily access cut, copy,
and paste without moving
up to the text.
Text manipulation has never
been easier or faster.
Now, there's nothing you need
to do in your application
to adopt these text
editing gestures,
but if you have customized
the gestures in a text view
within your application,
you should make sure
they're not conflicting
with the new system
text gestures.
In iOS 7 we added support for
applications to define
hardware keyboard commands.
Apps could have their own
commands, like Command-N
to create a new document.
This was a benefit
to power users
who already knew these
commands, but there was no way
to discover what
the commands were.
In iOS 9 we're introducing a new
keyboard command discoverability
HUD and if you press
and hold the command key
on the hardware keyboard
for just a moment,
the HUD will appear and
show you the set of commands
that are available at
that particular moment.
Notice that this is
context dependent.
Depending on the state
of your application
at that time you'll get a
different set of commands.
So, instead of seeing the
entire set with some commands
enabled and some disabled, you
only see the ones that apply.
All you need to do to adopt
this in your application is
to set the discoverability title
for each of your key commands.
UIKit will automatically
figure out which commands
are applicable
at that moment in time.
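
A minimal Swift sketch, with a hypothetical newDocument action:

import UIKit

// A minimal sketch: a Command-N key command with a discoverability title
// so it appears in the keyboard HUD.
class DocumentListViewController: UIViewController {
    @objc func newDocument() { /* hypothetical: create a document */ }

    override var keyCommands: [UIKeyCommand]? {
        let newCommand = UIKeyCommand(input: "n", modifierFlags: .command,
                                      action: #selector(newDocument))
        newCommand.discoverabilityTitle = "New Document"
        return [newCommand]
    }
}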
Let's take a moment to
talk about touch events.
When a touch is tracking
on the screen there
is some inherent lag
between where the finger is
and any resulting drawing
that you see on the screen.
It simply takes time to
scan the hardware to do all
of the intermediate
processing and drawing
and then flush the
results to that display.
This is called the
touch-to-display latency.
Most users will never
notice this latency
under most circumstances;
it's perceptible only when the
user moves their finger
fast enough.
Despite the fact that iOS
already has industry-leading
touch-to-display latency,
in iOS 9 we set
out to dramatically improve it.
When a touch is tracking
on the screen,
UIKit is notifying
the application
with each screen refresh so
that the application can update
its state.
In some of our more
recent hardware,
the touches are updated more
frequently than the display is.
So there is now a new
method to get access
to any intermediate touches
since the last display refresh.
For drawing applications this
can result in much smoother
and more accurate drawing.
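
A rough Swift sketch of that in a drawing view, with a hypothetical stroke method:

import UIKit

// A minimal sketch: coalesced touches recover every intermediate sample
// delivered since the last frame, for a smoother stroke.
class CanvasView: UIView {
    func addPoint(_ point: CGPoint) { /* hypothetical: extend the stroke */ }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }
        for sample in event?.coalescedTouches(for: touch) ?? [touch] {
            addPoint(sample.location(in: self))
        }
    }
}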
As a first step to improving
that touch-to-display latency
we now offer touch prediction
as well.
This uses a sophisticated
algorithm that looks
at the touch's velocity,
acceleration, and curvature
to predict where
the touch is headed.
That lets you reduce the
apparent latency by drawing
where the touch will be rather
than where the touch is.
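
Sketched in Swift, continuing the canvas example with a hypothetical provisional-drawing callback:

import UIKit

// A minimal sketch: draw predicted touches provisionally, then replace them
// with real samples on the next frame.
func drawPredictedStroke(for touch: UITouch, with event: UIEvent?, in view: UIView,
                         addProvisionalPoint: (CGPoint) -> Void) {
    for predicted in event?.predictedTouches(for: touch) ?? [] {
        addProvisionalPoint(predicted.location(in: view))
    }
}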
[ Applause ]
We went much further than that.
In addition to offering
touch prediction,
we have made a number of changes
throughout the entire software
stack to reduce the latency
from over 60 milliseconds which,
as I said, is already
industry leading,
to less than 30 milliseconds.
[ Applause ]
We have also introduced a number
of changes in UIKit Dynamics.
The first of these is that
we now support non-rectangular
collision bounds.
[ Applause ]
In addition to rectangular
we now support elliptical
and path-based collisions
and this results
in much more realistic
collision interactions.
We also support a wide
variety of field behaviors.
We now have linear and radial
gravity like you see here,
spring, drag, and velocity,
noise and turbulence fields
like you see here, and electric
and magnetic fields as well.
We also support the
ability for you
to define your own
field effects.
Finally, we have added a number
of new attachment types.
Before, we had a very simple
way to attach two objects,
and if you wanted to
constrain their movement
in some way you had to set up a
number of external constraints.
Now, these additional
attachment types
can dramatically simplify
how you build your UIKit
Dynamics model.
We're also adding the ability
to animate the blur radius.
[ Applause ]
This lets you achieve beautiful
effects like you see here,
moving into and out
of a spotlight.
We've also made a number
of API changes in iOS 9
to optimize their
use with Swift,
taking advantage of
the better expressiveness
that Swift offers and improving
compile-time type checking.
For nullability, you can now
specify whether properties,
arguments, and return
values can be nil or not.
We have gone through our entire
API and defined where it's valid
to have nil arguments
or nil return values,
and where they are
guaranteed not to be nil.
We also have lightweight
generics.
This is a lightweight form
of type parameterization
that lets us better express
Cocoa and Cocoa Touch APIs.
For example, the subviews
property on UIView can now
be typed to return an array
of UIViews instead of an array
of plain ids.
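
A small Swift illustration of what that buys you:

import UIKit

// A minimal sketch: with lightweight generics, subviews imports into Swift
// as [UIView] rather than [AnyObject], so element types are checked at
// compile time and no casting from id is needed.
func firstLabel(in view: UIView) -> UILabel? {
    return view.subviews.compactMap { $0 as? UILabel }.first
}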
You can find out more about
this in What's New in Swift.
New in iOS 9 is the
ability for notifications
to allow text input from the
user, in the same way
as quick reply for
text messages.
[ Applause ]
The UIUserNotificationAction
class has a new behavior
property, and if it's set to
the text-input behavior,
your notification gets
a text field allowing
for a quick reply.
[ Applause ]
There is also an additional
action parameters dictionary
that allows you to customize
the title of the Send button.
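
A minimal Swift sketch using the iOS 9 notification classes (these were superseded by the UserNotifications framework in iOS 10); the identifier is illustrative:

import UIKit

// A minimal sketch: an action whose behavior is text input, with a
// customized Send button title.
let reply = UIMutableUserNotificationAction()
reply.identifier = "reply"            // hypothetical identifier
reply.title = "Reply"
reply.activationMode = .background
reply.behavior = .textInput
reply.parameters = [UIUserNotificationTextInputActionButtonTitleKey: "Send"]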
You can find more at
What's New in Notifications.
We're also introducing the new
SFSafariViewController,
which allows you
to display web content
in a native app surrounded by
all the key Safari UI elements
that are already
familiar to your users.
It even supports
advanced features
like Reader and autofill.
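
Presenting one takes just a few lines; here's a minimal Swift sketch with an illustrative URL:

import SafariServices
import UIKit

// A minimal sketch: present web content with Safari's own UI.
func showHelp(from presenter: UIViewController) {
    guard let url = URL(string: "https://www.example.com/help") else { return }
    let safari = SFSafariViewController(url: url)
    presenter.present(safari, animated: true)
}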
You can find more at Introducing
Safari View Controller.
For iOS 9, we're introducing a
number of new extension points
that allow you to extend other
applications in the system.
For VPN we have three
new extension points.
There is a packet-tunnel
provider which allows you
to build the client side of
your own VPN-tunneling protocol.
There is an app-proxy
provider that allows you
to implement the client side
of your own custom transparent
network proxy protocol,
and filter-control
and filter-data providers
that allow you to
perform dynamic on-device
content filtering.
For Safari there are two
new extension points.
Shared Links lets your
application specify content
that should appear in the
user's Shared Links feed.
The content blocking extension
allows you to identify subsets
of content or resources
on a page
and prevent them from showing.
For Spotlight, there's now an
extension that allows the system
to index your application
data, and it can do this
in the background so you don't
have to fire up your application
to re-index your app's content.
You can find out more at
Introducing App Search.
Finally, Audio Units are Core
Audio plug-ins that can act
as musical instruments, audio
effects, or audio generators.
Until now, iOS
users were limited
to the built-in audio
units provided by Apple.
In iOS 9 we added a
new extension point
and you can bring your
audio units to iOS.
[ Applause ]
You can find more at
Audio Units Extensions.
Many of you have asked, and
I'm now pleased to announce
that we've delivered an all-new
Swift and Objective-C API
for interacting with contacts.
[ Applause ]
Thank you.
You can find out more
at Introducing the Contacts
Framework for iOS and OS X.
For Wallet and PassKit, you can
now provision cards from scratch
in bank and merchant
applications
if you have a special
entitlement.
You can also suppress Apple Pay
from coming up in situations
where it would interfere
with your application.
So say your application
needs to display a barcode
in an environment where
there's an NFC terminal present.
Normally that would
make Apple Pay appear.
You can suppress that so that
your barcode can be scanned.
In Core Location, for
apps linked against iOS 9
or later, there are some changes
to how background
location tracking works.
There is also a new bit of
API on CLLocationManager
to request a one-time
location update.
If you don't need ongoing
location updates,
this is a much more
efficient way
to get the user's
current location.
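
A minimal Swift sketch of a one-shot request:

import CoreLocation

// A minimal sketch: request a single location fix instead of ongoing updates.
// requestLocation() requires both delegate methods below.
class OneShotLocator: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.requestLocation()  // delivers one update, then stops
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        print("Current location:", locations.last as Any)
    }

    func locationManager(_ manager: CLLocationManager,
                         didFailWithError error: Error) {
        print("Location failed:", error)
    }
}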
[ Applause ]
For MapKit there are several
new features in the map view;
the first is access
to a 3D flyover view.
[ Applause ]
You can now show traffic,
as well as the compass and
scale, and create your own
custom callouts.
[ Applause ]
In HealthKit you have direct
real-time access to sensor data
in watchOS 2, and there are
a number of new data types:
tracking users' intake of
water, their exposure
to sunlight, and reproductive
health.
There are new APIs for
better device tracking
of HealthKit data, better
support for deleting data,
and a new workout session
API to track exercise.
We announced ResearchKit a
few months ago to make it easy
for developers and researchers
to create apps for
medical research.
ResearchKit now includes iPad
support, a new active task
for pure-tone audiometry,
which is used
to identify hearing
threshold levels,
and also a simple
reaction-time task
to measure a user's
reaction time to an event.
There is also a new image
capture step to use.
As you saw yesterday, we made
a number of changes to HomeKit.
We now support detailed
change notifications.
When a light gets turned on,
you now get a specific
delegate message identifying the
characteristic and the
accessory that changed.
So, you may get the particular
light turning on, instead
of just a generic
something-in-your-house-changed
notification.
HomeKit also has four predefined
action sets: when you wake
up in the morning, when you
leave, when you arrive home,
and when you go to bed.
You can implement standard
actions with these,
things like turning off the
lights, locking the doors,
and making sure the garage
is closed, all
without requiring any
user configuration.
In iOS 8 you could create a
timer trigger to execute
a scene on a schedule.
In iOS 9 you can now create
much more complicated triggers.
For example: when this door
is unlocked, and this motion
sensor is detecting motion,
and it is 30 minutes
after sunset, do something.
Of course HomeKit is available
on the Watch and you can use it
to control devices within your
home directly from your watch.
Wi-Fi-enabled accessories can
now be accessed remotely even
if you don't have an Apple TV.
You will be able to
communicate with them directly
through iCloud,
and Bluetooth-enabled
accessories can be automatically
bridged over Wi-Fi.
Even if you are out of Bluetooth
range with an accessory,
you can still control it.
Finally, we've included
standard definitions of a number
of new items within your house.
For CloudKit, we have updated
the limits and the pricing
for public databases, but,
much more exciting, we now
offer CloudKit Web
Services to integrate
with your web applications.
[ Applause ]
There are two sessions
discussing CloudKit this week:
What's New in CloudKit and
CloudKit Tips and Tricks.
So until now, the
UIDocumentInteractionController
would create copies
of every document
sent to another app.
Now, with open-in-place,
apps can request
to open documents directly
if they're stored in iCloud.
Adoption consists of adding
a key to your Info.plist
to declare support
and implementing a
new delegate method
to open the document
directly.
We'll discuss this at
Building Document-Based Apps.
Now, many applications contain
resources that are not needed
when a user first
starts your application.
Including every resource your
app might ever need makes your
app bundle significantly larger,
slows down installation,
and can even push you
over the over-the-air
download limit,
which can impact your sales.
So to address this we're
introducing on-demand resources.
Your application is uploaded
as a single package exactly
like you do today, but content
can be downloaded dynamically
when the application needs it.
These assets are intelligently
cached using heuristics based
on your input as well
as user behavior.
In Xcode, assets are grouped
together using simple tags,
and we're introducing a new
class called
NSBundleResourceRequest
that allows
you to request all
of the resources
with a given tag.
Once those resources have
downloaded, you can use familiar
API to access them, things
like UIImage's imageNamed: method.
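
A minimal Swift sketch, with a hypothetical tag and asset name:

import Foundation
import UIKit

// A minimal sketch: fetch on-demand resources by tag. "level2" is a
// hypothetical tag assigned to assets in Xcode.
let request = NSBundleResourceRequest(tags: ["level2"])
request.beginAccessingResources { error in
    if let error = error {
        print("Resources unavailable:", error)
        return
    }
    // Tagged assets are now local; familiar APIs find them.
    _ = UIImage(named: "level2-background")  // hypothetical asset
    // Call request.endAccessingResources() when the assets are no longer needed.
}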
There is even a way
to test this in Xcode,
to simulate not having the
resources and requesting them
from the server; Xcode
will act as the server
and deliver those resources
to your application.
Now another new technology
we're introducing
to address your application
size is app slicing.
Now, applications often have
assets and executable content
that aren't relevant
to the hardware
the user has
installed it on.
App slicing takes care of
delivering just the resources
and executable slices that
the user's device needs.
Using app slicing, the App Store
will automatically generate
and deliver tailored
variants of your application
to devices running iOS 9,
and these will only
contain the resources
and executable slices
pertinent to that device.
This happens on the server
so the extraneous content
isn't even downloaded.
This saves installation time
and can hopefully even bring
the app under the OTA limit.
We're also introducing a
new NSDataAsset class
which allows you to easily fetch
content tailored to the memory
and graphics capability
of your device.
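
A brief Swift sketch, with a hypothetical asset name:

import UIKit

// A minimal sketch: NSDataAsset pulls data from the asset catalog, which
// the store can tailor per device.
if let asset = NSDataAsset(name: "terrain") {
    print("Loaded \(asset.data.count) bytes tailored to this device")
}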
Now for Game Center we have
added guest players which allows
for new configurations in
Game Center multiplayer.
We've also unified the Game
Center server environment
to streamline development
and testing.
Finally we're introducing
ReplayKit
which provides a new way
to share game experiences.
It allows your apps to easily
record audio and visuals
and share them with other users.
In iOS 9, SpriteKit
is now Metal-backed
on systems that support it;
it uses OpenGL on systems
that don't support Metal.
This happens automatically
without your involvement.
There is also an all
new action editor
and of course it is
integrated tightly
with on demand resources.
You can find out more at
What's New in SpriteKit.
For SceneKit there is an
all-new scene editor available
in Xcode 7 that supports
particles, physics,
actions, and much more.
There is also a ton of new
features in SceneKit itself.
Things like scene transitions,
audio nodes, Model I/O,
ambient occlusion and light
maps and many other features.
Now while SpriteKit and SceneKit
are fantastic frameworks
for building the graphical
interface for a game,
there is far more to a great
game than just the graphics.
Games have entities and
components, they have agents
which are autonomous
objects within your game
that have goals and behaviors.
You need path-finding algorithms
to allow your agents to navigate
around your game and you need
AI for allowing the agents
to decide what moves
they're going to make.
All of the elements you need
to build really engaging
gameplay can be found now
in GameplayKit and
you can find out more
at Introducing GameplayKit.
Finally, of course, we're
introducing watchOS 2.
In addition to direct access
to existing frameworks,
there are now new frameworks,
such as Watch Connectivity
for connecting to your app
on the phone and ClockKit
for building watch
face complications.
You can find out more
at Introducing WatchKit
for watchOS 2, and for an
overview of all of the sessions
this week, see the talk
about watchOS 2.
For more information, there is,
of course, the documentation,
the online forums, and
developer technical support.
Thanks. I hope you
have a great week.
[ Applause ]