Transcript
[ Music ]
>> Hi everyone.
I'm Conor Hughes, an engineer
on the Accessibility
Team here at Apple.
And welcome to What's
New in Accessibility.
Today we're going to learn about
some cool new assistive features
in our operating systems and
learn about what you need
to know to make sure your
apps are usable by everyone.
So let's get started.
So accessibility is a big word
and many are not
familiar with it.
What does it mean?
To us, accessibility means
making a system usable
for everyone, with all
of their unique needs.
This means both designing
in accommodations for people
with different needs
and being careful
about your basic
design to avoid issues.
This being WWDC, we're going
to focus on accessibility
on Apple platforms
where we're concerned
with four major classes
of ability: Motor,
vision, hearing and learning.
Motor encompasses
things like Parkinson's
and Multiple Sclerosis and
affects things like the ability
to push with force on the screen
or accurately tap targets.
Vision accommodations
encompass both blindness
and partial vision loss.
Hearing covers hearing
loss of various gradations.
And learning, which includes
things like dyslexia and autism.
Our operating systems can do a
lot to provide accommodations
in all of these areas.
But apps are a core part of
our platform experiences.
So for our platforms to be
truly accessible environments,
we need you to make sure your
apps are accessible as well,
which is why we're so happy
that you're here to learn
about how to do just that.
So let's go over what we're
going to talk about today.
First we'll go over some of
the new assistive features
in macOS, iOS, watchOS and tvOS
so you can see some
concrete examples
of how software can
provide accommodations
for people with different needs.
Then we'll take an iOS app
as our example and audit it
for accessibility issues.
Then dig into the
APIs that you can use
to make sure your
apps are accessible.
And finally, we'll
use those APIs
to fix the issues we
identified in our audit,
including using some new API
we think you're going to love.
So what new assistive features
do we have to show this year?
Well, there are quite a few, so
let's go through and take a look
at what's new in
each major area.
First off, motor.
So some people may only be able
to interact with their device
through pressing a
single physical button,
for instance a switch on
a wheelchair headrest.
They can interact with the
rich touch interface of iOS,
for instance, through Switch
Control, which presents a cursor
which moves through the
interface elements on screen.
When someone wants to
interact with the element
under the cursor, they
press their switch.
Switch Control lets people with
motor disabilities do everything
from simple taps to
complex gestures.
iOS and OS X have had Switch
Control for several years.
So what's new?
Well, we're excited to announce
that we're bringing
Switch Control to tvOS.
Now, people who use
Switch Control can use it
to directly interact
with the tvOS interface.
The familiar Switch interface
is available with that cursor
that highlights elements
onscreen.
Or an alternative
interface is available
with an onscreen remote.
And what's more,
people can do this
without re-pairing their switch.
Someone using Switch
Control on their iOS device
or their Mac can immediately
start controlling their TV
totally autonomously
without any external help.
We think this is a
huge improvement.
And we can't wait for
people to try it out.
Next up, Dwell Control.
There are a number of
assistive technologies
that track someone's focus when
they're using their computer.
So they may take the form
of tracking a reflective dot
on a headband, or actually
tracking eye movements.
Dwell Control is a
new feature in macOS,
which integrates
support for these devices
by allowing them to
control the mouse.
When the mouse dwells
on a certain location,
Dwell Control presents
a timer [inaudible].
And when a timer expires,
it invokes an action,
for instance a mouse click.
This allows someone to control
their Mac without having
to manipulate the mouse.
And of course, it's customizable
with custom actions,
just like Switch Control.
So next, let's look at vision.
So, for many years macOS and
iOS have supported features
like invert colors
and grayscale
to help people increase
the contrast of the content
on their device, deal
with light sensitivity
or work around color issues.
This year we're expanding
that support
across macOS, iOS and tvOS.
We've added color
adjustments to help people
with color blindness, which
can help something that appears
like this, appear
more like this.
And we've also added the ability
to tint the entire
display a certain color,
which can significantly improve readability for people who find reading black-on-white text difficult.
So in watchOS, someone using
VoiceOver can have the time
spoken aloud to them
by raising their wrist
or tapping on the watch face.
But sometimes they may want
to discreetly check the time
without disturbing
those around them.
So in watchOS 3 we're
introducing Taptic Time.
Taptic Time is a VoiceOver
feature that uses a series
of distinct taps from
the Taptic engine
to help someone tell time
silently and discreetly.
So iOS has a lot of
features that help people
with visual disabilities
discover and interact
with the content
on their device.
But what about their
physical environment?
In iOS 10 we're introducing
a feature called Magnifier.
Magnifier is available
from anywhere in iOS.
And it lets someone use
their device's camera
to magnify objects in
their physical environment.
And I'd love to show
that to you now.
Okay, so I'm going to
turn on Magnifier by going
into Settings, General,
Accessibility, Magnifier
and flipping the switch.
Now I can turn on Magnifier
by just triple clicking
the Home button.
So here we are.
So, if I want to read what's
on this bottle of medicine,
for instance, I can
just zoom in quite far.
I have access to things
like the camera torch
for a low light environment.
I can lock focus on a
certain focal length.
And in case I have trouble
keeping the device steady,
or I'm pointing at
something far away,
I can take a freeze frame
here with the Center button.
And we captured this image
at the device's native camera
resolution, so we can zoom
out and in, move around.
And Magnifier also supports
various color filters
to help someone increase
the contrast.
So for instance, maybe
I find this grayscale,
maybe inverted grayscale,
increase the contrast a bit.
So that's pretty
easy for me to read.
Okay, so that's an example
of Magnifier in iOS 10.
Okay, so what's new in hearing?
iOS has supported hardware
TTY devices for some time.
TTY technology allows
someone with hearing loss
to hold text conversations
over standard telephone calls.
Here is what an existing
TTY machine looks like.
As you can see, it's
fairly large
and probably difficult
to carry around.
But it's incredibly important
for someone who's deaf,
especially when you factor in
relay operators who translate
from TTY to voice, allowing
someone using this machine
to contact businesses,
services and family and friends
who may not be using one.
We thought it would be
great if the functionality
of this device was
available to everyone
without any additional hardware.
So, new in iOS 10, anyone can place software TTY phone calls
in a familiar interface
without having
to use any additional hardware.
These calls work with legacy
TTY technology and make it easy
to dial a non-TTY number through
your carrier's relay service.
Plus, it has built-in TTY-specific QuickType
predictions so it fits
in with existing
TTY user culture.
And finally, learning.
Dyslexia is an incredibly common
learning disability affecting
the ability to read
and write of millions
of people around the world.
In iOS 10 we're excited
to bring a number
of enhancements designed to
help people with dyslexia.
We've implemented
improvements to Speak Selection
and Speak Screen to help people
better understand the text
that's already been entered.
And we've implemented new
audio feedback for typing
to help people immediately
catch mistakes.
And I'd love to show that to you now.
Okay, so I've turned on typing
feedback features already
in Accessibility
Speech Settings.
So I'll just fire up Notes.
And what I'm going to
do here is I'm going
to type the word
Welcome and I'm going
to pause after typing the O.
And I've turned on
Character Feedback
where iOS will read the last
character that I typed back
to me after a pause
so I can be sure
that I didn't enter a visually
similar character like an E.
>> E.
>> So, it may be difficult
to hear a little bit,
but iOS just read
back an E to me.
So that wasn't correct.
>> O.
>> Great. Now, I'm going
to keep typing, excuse me,
and iOS is going to read
back each word to me,
because I've turned
on Word Feedback.
So I can get immediate feedback
as to whether the word I've
entered is the one I wanted.
>> Welcome to our session.
>> Okay. So that's
a quick example
of enhanced typing
feedback in iOS 10.
So, hopefully I've whetted
your appetite as to the kinds
of accommodations
software can provide
to make sure the entire
experience is accessible
to everyone.
So if you're new
to accessibility,
you may be wondering about
how you can put similar
accommodations into your apps.
And if you've used the
accessibility APIs before,
maybe it's time for a quick
refresher before we jump
onto some new API.
So, how do we use the
accessibility APIs
to make sure our
apps are accessible?
Well, the first step is to
audit our app for accessibility.
Traditionally, the best way
to do this was just to fire
up an assistive app like
VoiceOver or Switch Control
and see what happened.
You'll want to check that
all your interface items
are exposed.
That everything that is
exposed has a good label
so someone using
VoiceOver can identify it.
You'll want to check that
someone using your app
with VoiceOver can do everything
that any other user can do.
And finally, you're designing
the accessible experience
for your app.
So you'll want to make sure
that someone using your app
with VoiceOver, for
instance, has just as great
and smooth an experience
as anyone else.
So this year we're also excited
that there's a brand new
accessibility inspector that's a
lot more powerful and
can help you immediately
catch mistakes.
If you want to learn more about
this inspector I encourage you
to check out the
session Auditing your App
for Accessibility on Wednesday
to learn more about it.
For this session we're going to
focus on auditing with VoiceOver
because it's still the best
way to get intimately familiar
with the accessibility
of your app's flows.
Okay, so let's do it.
Let's take a look at a live app
and find its accessibility
issues.
So, for this presentation
I'm going to be using iOS,
but the same basic flow
applies to all of our platforms.
Okay, so the first thing I'm
going to do is add VoiceOver
to my accessibility
triple-click shortcut by going
into Accessibility Settings.
And all the way at the bottom,
Accessibility Shortcut,
turn on VoiceOver.
Okay, so let's fire up
our app now, DogRoutePro,
a pro app for selecting
routes to walk your dog.
And turn on VoiceOver
by triple-clicking the Home
button and Select again.
>> VoiceOver on.
DogRoutePro.
Routes.
>> So, there are two, excuse
me, there are two ways
to navigate your app
when VoiceOver's on.
You can take one finger
and pan around on screen
and VoiceOver will select
and read what's underneath
your finger.
Or you can swipe with one
finger, right or left,
to move forward or back
through the list of elements.
So let's see how accessible
these table cells are.
>> Avenue Loop.
0.8 miles.
>> So, immediately I notice something: I know that if you activate this cell, we go somewhere in our app. But nothing conveys that to a VoiceOver user.
Let's keep going.
>> [Inaudible] button.
>> So this button has a label
that's derived from the name
of the image I used
to initialize it.
That's the framework's
falling back
on the only information
I gave it.
So we'll have to give
it a better label.
All right.
So let's activate this cell.
To activate an item
with VoiceOver,
when the item is
selected you'll want
to double-tap onscreen
with one finger.
>> Avenue Loop.
0 -- Avenue Loop.
Routes. Back button.
>> Okay, let's go
through this UI.
>> Avenue Loop.
Avenue Loop ratings.
Heading. Distractions.
Smells. Greenery.
Friends.
>> So for this ratings graph,
the graph values are
totally inaccessible
to someone using VoiceOver.
Let's keep going.
>> Avenue Loop route.
Heading. Broadway and Ridgeway.
Hazard. Boys' Choir on Ridgeway.
41st and Broadway.
>> So, because I gave these
annotations good titles,
the frameworks were able to use
that as the accessibility label
for the map annotations.
So that's good.
Let's move on to this Steps tab.
>> Steps. Text selected.
Step 41st.
-- Broadway and Ridgeway.
Turn right at Broadway
and Ridgeway.
Boys' Choir on Ridgeway.
Boys' Choir between Gilbert
and Montgomery on Ridgeway.
Continue along Ridgeway.
Be careful of cars and children.
>> So, VoiceOver was
able to read all the text
in these table view cells.
But you'll notice that these
cells that represent hazards
on the route are not
differentiated at all
for someone using VoiceOver even
though a sighted user can see
that there are hazards from the
red text and distinctive icon.
Okay, let's jump
out to the slides
and discuss how we
can fix these issues.
So, what did we learn
about our app?
First, those route list cells.
The Favorite button has that
odd label that's derived
from the name of the
image we're using.
In addition, it isn't clear
that the table cell itself can
be activated to go somewhere.
Now, for the ratings graph,
those graph values were totally
inaccessible to VoiceOver.
And finally, in the route step
list the cells representing
waypoints that were hazards
are not differentiated
from the other steps
on the route.
So how can we fix
these problems?
Well, to understand how, we need
to understand how VoiceOver
is able to access information
from our interface and drive it.
VoiceOver interacts with our app
through the UIAccessibility
protocol.
So let's take, as an
example, that Favorite button
in the app we just saw.
Assistive apps like
VoiceOver need
to ask your app's user
interface items questions.
So for instance, when VoiceOver
is on and someone touches
that button, VoiceOver
asks the button, "Hey,
what kind of thing are you?"
And the button says,
"Well, I'm a button."
VoiceOver then asks,
"Who are you?
What's your name?"
That button should respond
with something like "Favorite."
Finally VoiceOver asks,
"Where are you on screen?"
And the button replies with its screen-space frame.
This is the information
VoiceOver needs to announce,
"Favorite, Button" and draw
a cursor around the element.
"Favorite, Button" and draw
a cursor around the element.
So these messages from
assistive technologies
to your app take the form
of method invocations.
The methods are all part of
the UIAccessibility protocol.
By implementing the methods
of this protocol you
make your custom view
hierarchies accessible.
So standard UI controls
have accessibility baked in.
But depending on how you
configure them, you may need
to set or override
certain properties.
So what properties?
Let's take a look.
Now, there are a lot of methods
on the UIAccessibility protocol
that you can use for
fine-grained control
over how your interface
items expose themselves
to accessibility.
But for most work you need
to only concern yourself
with a few.
Let's go through the basic
properties one by one.
First, there's
isAccessibilityElement.
This property determines if
an interface item is serviced
by an assistive technology
at all.
So for instance, if you have
an image view that's background
decoration, you can
leave this as false.
However, if it represents a
photo that someone uploaded
to your service, you'll
want to set this to true
so someone using VoiceOver
can discover the image,
focus on it and learn about it.
So now your interface element
is surfaced by VoiceOver.
You'll want to give it a name
so that users can identify it.
That's what the accessibility
label is for.
The label is a concise
identifier of the element.
For instance Send
Message or New Alarm.
So how does someone know that
they can activate your element?
New Alarm doesn't convey
that the item is a button
in the same way that its
visual styling might.
That's where the
accessibility traits come in.
Accessibility traits declare
to the assistive technology
what your element is
and how to interact with it.
So for instance, a button
should have the trait
UIAccessibilityTraitButton.
An adjustable control
like a slider would
have the trait
UIAccessibilityTraitAdjustable.
The traits are a bitmask,
so an element could
have more than one.
For instance, a selected
button should have both
UIAccessibilityTraitButton and
UIAccessibilityTraitSelected.
The Button trait is what
makes VoiceOver speak "Button"
so someone knows they can
activate your element.
Next there's the
accessibility frame.
This determines where
your element is onscreen
and is used both for hit testing
and for drawing a cursor
around the element.
Now, for UIViews you usually do
not have to set this explicitly
because it's computed for you
based on where the view is
in the view hierarchy.
But you can modify it if needed.
Just remember that it is
in screen coordinates.
And finally, there's
the accessibility value.
This is useful for
elements that have some sort
of conceptual value
associated with them.
So for instance, a slider
returns its current value
in percent through the
accessibility value.
An on-off switch
returns on or off.
And if you had some
sort of custom control
that controlled the
magnification level
of an instrument, you'd want to
convey that magnification level
through the accessibility value.
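Pulled together, those basic properties might look something like this on a custom control. This is a minimal sketch in Swift 3-era syntax; the control and all of its names are hypothetical, not from the session:

    import UIKit

    // Hypothetical custom control used only to illustrate the basic
    // UIAccessibility properties discussed above.
    class MagnificationControl: UIControl {
        var magnification = 1 {
            didSet {
                // Convey the conceptual value through accessibilityValue.
                accessibilityValue = "\(magnification)x"
            }
        }

        override func awakeFromNib() {
            super.awakeFromNib()
            // Surface this item to assistive technologies at all.
            isAccessibilityElement = true
            // A concise identifier VoiceOver can announce.
            accessibilityLabel = "Magnification"
            // Traits are a bitmask, so OR in everything that applies.
            accessibilityTraits |= UIAccessibilityTraitAdjustable
            // accessibilityFrame is computed for UIViews from the view
            // hierarchy (in screen coordinates), so we leave it alone.
        }

        // Adjustable elements respond to swipe up/down with these:
        override func accessibilityIncrement() { magnification += 1 }
        override func accessibilityDecrement() { magnification = max(1, magnification - 1) }
    }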
Okay, so what do you do
if you do custom drawing
where each logical piece of your
interface doesn't map directly
to a UIView?
When you need to do that,
use UIAccessibilityElement.
UIAccessibilityElement objects
represent logical accessible
regions onscreen.
And they interact with the
UIAccessibility protocol just
like your views do.
You can use them, for instance,
to expose each individual piece
of a control that's built
as a single UIView.
So to do so, make the element
whose pieces you want to expose
into an accessibility container
by setting its accessibility
elements property to an array
of UIAccessibilityElement
objects
that represent each
individual piece of the control.
So, UIAccessibilityElement
is a natural choice to expose
that graph we saw earlier where
I was doing some custom layers
to present the graph bars.
But if you're a very astute
listener you may have noticed a
slight snag.
UIAccessibilityElement
objects aren't views,
so we need to set their
accessibility frame manually.
However, those graph bars
were inside of a scroll view,
so their frames and
screen coordinates change.
Now, to rectify this
we could subclass,
or we could set accessibility
frame after each scroll.
Or, in iOS 10 we can
set accessibility frame
in container space.
When you set this property to
a container-relative frame,
UIAccessibilityElement will
automatically follow its
container's position onscreen.
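As a minimal sketch, assuming a hypothetical view that draws its bars itself, the pattern looks something like this:

    import UIKit

    // Hypothetical custom-drawn view; barFrames is where each bar is
    // drawn, in the view's own coordinate space.
    class BarGraphView: UIView {
        var barFrames: [CGRect] = []

        func rebuildAccessibilityElements() {
            accessibilityElements = barFrames.map { frame -> UIAccessibilityElement in
                let element = UIAccessibilityElement(accessibilityContainer: self)
                // New in iOS 10: a container-relative frame. The element
                // now tracks its container onscreen, so we never have to
                // recompute screen coordinates after a scroll.
                element.accessibilityFrameInContainerSpace = frame
                return element
            }
        }
    }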
So those are the
most basic methods
in the UIAccessibility
protocol along
with UIAccessibilityElement.
And you'll get most of the
way there with just those.
So what are the takeaways?
First, know that the methods
in UIAccessibility protocol are
how you expose your interface
items to accessibility.
Through the methods of this
protocol your app answers
questions about what
its interface items are
and what they can do.
Now, for many classes
support is baked in,
so there's nothing
for you to do.
But you may need to set
or override properties
to get the behavior you need.
Finally, if you have
pieces of your interface
that don't map directly
to views,
use UIAccessibilityElement.
So, let's go in and use
what we just learned
to make our app accessible.
The first thing we're
going to do is we're going
to fix the fact that that
table view cell didn't convey
that it was able to be
activated through VoiceOver.
So, the way we're going to do
that is, in its awakeFromNib, all we're going to do is OR in the trait
UIAccessibilityTraitButton.
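In code, that fix might look something like this; a minimal sketch, with a hypothetical cell class name:

    import UIKit

    class RouteCell: UITableViewCell {
        override func awakeFromNib() {
            super.awakeFromNib()
            // OR in the Button trait so VoiceOver speaks "Button" and
            // users know this cell can be activated.
            accessibilityTraits |= UIAccessibilityTraitButton
        }
    }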
Done. All right, now
let's fix the label
for that Favorite button.
What we're going to do here
is when we set the route
for this cell we're going
to grab the Favorite button
and set its accessibility
label property to say Favorite
and then the name of the route.
I'm putting the name
of the route in there
so it's unambiguous which
route I'm going to Favorite
if I activate that button.
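A sketch of that; the Route model and the outlet are hypothetical stand-ins for the app's own code:

    import UIKit

    struct Route { let name: String }   // hypothetical model type

    class RouteListCell: UITableViewCell {
        @IBOutlet var favoriteButton: UIButton!

        var route: Route? {
            didSet {
                guard let route = route else { return }
                // Include the route name so it's unambiguous which route
                // gets favorited when the button is activated.
                favoriteButton.accessibilityLabel = "Favorite, \(route.name)"
            }
        }
    }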
All right, let's move
on to that graph we saw.
So, like I said, I'm doing
some drawing here with layers.
So we're going to use
UIAccessibilityElement.
Now, what I want to do
here is first I'm going
to add some ivars to keep track
of the UIAccessibilityElement
objects.
So, just four ivars, one
for each bar of the graph
that are UIAccessibilityElement objects.
Next I'll add a short
little helper function
to help me translate
from a numeric rating
into a user-friendly string.
So, rated X of 5.
And then, for each of
these didSet callbacks
for these rating
properties I'm going
to set the accessibility value
of the appropriate accessibility
element to a friendly string
that describes the rating.
So, for smells -- oops.
Put distractions first.
Smells. Greenery.
And friends.
So again, just grabbing
the correct element,
setting its accessibility value
to be the value string
for the rating.
Okay, now we need to
actually set the ivars
to be valid accessibility
elements.
So in awakeFromNib here, for
those four ivars I'm just going
to set it equal to a
UIAccessibilityElement
with our self as the
accessibility container.
Now, we're going to set the
labels for these elements
to just be the text from the UILabel that's already there.
And then finally, we'll set
the accessibility properties
of our self, because we want to
be an accessibility container,
to just be those four elements.
So now we have valid
accessibility element objects
with a valid label,
or a good label,
and a good accessibility value.
Now we just need
to set their frame.
So what we're going to
do is when we're laying
out the graph bars, we'll
just grab the appropriate
accessibility element and
set its accessibility frame
in container space.
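Condensed to two of the four bars, the whole graph fix might look something like this sketch (all names are hypothetical):

    import UIKit

    class RatingsGraphView: UIView {
        private lazy var distractionsElement: UIAccessibilityElement =
            UIAccessibilityElement(accessibilityContainer: self)
        private lazy var smellsElement: UIAccessibilityElement =
            UIAccessibilityElement(accessibilityContainer: self)

        // Translate a numeric rating into a user-friendly string.
        private func valueString(for rating: Int) -> String {
            return "Rated \(rating) of 5"
        }

        var distractionsRating = 0 {
            didSet { distractionsElement.accessibilityValue = valueString(for: distractionsRating) }
        }
        var smellsRating = 0 {
            didSet { smellsElement.accessibilityValue = valueString(for: smellsRating) }
        }

        override func awakeFromNib() {
            super.awakeFromNib()
            // Labels come from the text of the UILabels already onscreen.
            distractionsElement.accessibilityLabel = "Distractions"
            smellsElement.accessibilityLabel = "Smells"
            // Become an accessibility container by exposing the elements.
            accessibilityElements = [distractionsElement, smellsElement]
        }

        // Called from wherever the bar layers are laid out.
        func didLayout(barFrame: CGRect, for element: UIAccessibilityElement) {
            // Container-relative, so the elements follow scrolling for free.
            element.accessibilityFrameInContainerSpace = barFrame
        }
    }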
Okay. Then the last thing we
needed to fix is the label
of the cells in the Route Steps view.
So, I'll go to the
Route Step cell.
And when we set the
appropriate waypoint
that that cell represents,
what we're going
to do is add some code
to manually set the
accessibility label
to be the waypoint's name and
then the waypoint's description.
And then if it's a
hazard we'll add "Hazard."
And then the hazard description.
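As a sketch, with a hypothetical waypoint model:

    import UIKit

    // Hypothetical model type standing in for the app's own.
    struct Waypoint {
        let name: String
        let stepDescription: String
        let isHazard: Bool
        let hazardDescription: String?
    }

    class RouteStepCell: UITableViewCell {
        var waypoint: Waypoint? {
            didSet {
                guard let waypoint = waypoint else { return }
                var label = "\(waypoint.name), \(waypoint.stepDescription)"
                if waypoint.isHazard, let hazard = waypoint.hazardDescription {
                    // Convey in speech what the red text and icon convey visually.
                    label += ", Hazard, \(hazard)"
                }
                accessibilityLabel = label
            }
        }
    }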
Okay, so before we build
and run this, let's go back
to the slides and think about
whether there's anything
that a sighted user
can do easily
that a VoiceOver user still
will have trouble with.
All right.
So again, first, that route map.
VoiceOver users can discover
the steps on the map, however,
the route order is lost because
VoiceOver is reading those
elements in a spatial,
top-down, left to right order.
In addition, someone who's
sighted can easily see the
hazards on that graph by
the distinctive icons.
But someone using VoiceOver will
have to navigate through all
of the waypoints on the graph
to discover all the hazards.
Now, for the route step list,
there's a similar problem
in that it's easily
skimmable for a sighted user
who can scroll through quickly
looking for the cells with the distinctive red text and distinctive icon.
But for a VoiceOver user,
it's not as skimmable.
They have to go through
each individual cell.
So how can we fix these issues?
We're going to use the rotor.
The rotor is a VoiceOver feature
that lets someone navigate
across an interface by searching
for elements of a given type.
So one of the built-in
rotors, for instance,
is the Headings rotor,
which allows someone
to navigate an interface by
jumping between the headings.
To use the rotor you take two
fingers and twist onscreen
as though you're
manipulating a physical dial.
Do that to select the rotor
setting to use, and then swipe
up or down with one
finger to search previous
or next with that rotor.
So this year we're
introducing the ability for you
to add your own custom
search rotors.
We hope you'll take
advantage of this
to add a totally new
navigational power to your apps
when people are using
them with VoiceOver.
On iOS, to add your
custom rotors,
set the accessibilityCustomRotors property on a superview that contains the elements through which you wish to search, or on that superview's view controller.
You'll want to set
this to an array
of UIAccessibilityCustomRotor objects,
which are initialized
with a name,
and also an item search block
that's called each time someone
swipes up or down to
search for the previous
or next match for that rotor.
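Here's the basic shape as a minimal sketch; the search block is stubbed out here, and the demo below fills it in for real:

    import UIKit

    func installHazardsRotor(on viewController: UIViewController) {
        let hazardsRotor = UIAccessibilityCustomRotor(name: "Hazards") { predicate in
            // predicate.currentItem is where the search starts, and
            // predicate.searchDirection is .next or .previous.
            // Return the matching UIAccessibilityCustomRotorItemResult,
            // or nil if there is no previous/next match.
            return nil
        }
        viewController.accessibilityCustomRotors = [hazardsRotor]
    }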
All right.
Let's switch back over
to the demo machine
and see this new API in action.
Okay. So, let's first
add a Hazards rotor
to the Route Step
View controller
so someone can easily navigate
to the appropriate table cells
that represent hazards.
So, I'm going to open up the
Route Steps View controller.
And I'm going to set this
property inside of awakeFromNib.
So basic skeleton here.
I'm going to make a Hazards
rotor an accessibility custom
rotor object with the name
Hazards and a callback.
And I'm going to set the accessibilityCustomRotors property to be just that rotor.
And inside of the callback here,
the search block, I'm just going
to get our reference
to our data model here,
an array of waypoints,
and determine whether we're
going forwards or backwards
by inspecting the
predicate that we're passed.
Next, we're going to find where
we need to start our search.
So, by default, if
there is no selection
in the table view right now,
we're going to start
either before the first item
or after the last item,
depending on whether we're
going forwards or backwards.
So we go into the list of cells.
Otherwise, if we
have a current item,
we're going to get it as a cell.
And then start at the
appropriate row that represents
that item in our data model.
Okay. Then we're going to search
through our data model
with this loop here.
And we're going to search
until we find a match.
And a match is just a
waypoint that is a hazard.
So if we find a result, we're
going to get an index path
for that result, scroll to
that row in the table view,
grab the cell that we
potentially just scrolled
onscreen, and return a UIAccessibilityCustomRotorItemResult with the cell as the target element.
Now, it also supports
a target range property
in the initializer.
If we conform to UITextInput,
we could use this to rotor
through different
pieces of text.
However, we're not doing that,
so we're just going
to leave this as nil.
And if our while loop terminates
without returning anything,
we'll return nil to say we
weren't able to find a result.
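Put together, the search block we just walked through might look something like this condensed sketch; the waypoint model and table view wiring are hypothetical stand-ins:

    import UIKit

    struct Waypoint { let isHazard: Bool }   // minimal stand-in model

    func makeHazardsRotor(waypoints: [Waypoint],
                          tableView: UITableView) -> UIAccessibilityCustomRotor {
        return UIAccessibilityCustomRotor(name: "Hazards") { predicate in
            let forward = (predicate.searchDirection == .next)

            // Start just outside the list, or at the currently focused row.
            var row = forward ? -1 : waypoints.count
            if let cell = predicate.currentItem.targetElement as? UITableViewCell,
               let indexPath = tableView.indexPath(for: cell) {
                row = indexPath.row
            }

            // Walk the data model until we hit the next/previous hazard.
            row += forward ? 1 : -1
            while row >= 0 && row < waypoints.count {
                if waypoints[row].isHazard {
                    let indexPath = IndexPath(row: row, section: 0)
                    // Scroll the match onscreen so we can hand back its cell.
                    tableView.scrollToRow(at: indexPath, at: .none, animated: false)
                    if let cell = tableView.cellForRow(at: indexPath) {
                        // targetRange is only for text (UITextInput), so nil here.
                        return UIAccessibilityCustomRotorItemResult(targetElement: cell,
                                                                    targetRange: nil)
                    }
                }
                row += forward ? 1 : -1
            }
            return nil   // no previous/next hazard found
        }
    }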
Let's do a similar thing
in the route map cell.
Sorry, we already have this helper here, which searches our
annotation views starting
from one annotation view
going forwards or backwards
with a given predicate
that corresponds
to whether we should return
that annotation view based
on what data model
item it represents.
So we're going to use this
to do a similar thing.
We're going to override
awakeFromNib.
We're going to make a
short closure that's going
to help us make multiple rotors,
because we're basically
all we're changing
for the various rotors
is what items we return.
And then we're going to use this
when setting our accessibility
custom rotors property.
We're going to have
a Route Steps rotor
that just returns
every waypoint.
And a Hazards rotor that
returns just the hazards.
Okay, so in this factory,
what are we going to do?
Well, we need to create a UIAccessibilityCustomRotor object with the appropriate name.
We're going to use our Search
Annotation Views helper.
Pass it the current element.
Whether we're going
forward or backward.
And what the predicate was.
And again, if we were able to
find a result, we're just going
to wrap it up in a custom rotor item result and return it.
Otherwise we'll return
nil, saying we weren't able
to find a previous or
next match to the rotor.
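A sketch of that factory pattern; the search helper's signature here is hypothetical, standing in for the app's existing code:

    import UIKit
    import MapKit

    // Hypothetical stand-in for the app's annotation-view search helper.
    func searchAnnotationViews(from current: MKAnnotationView?,
                               forward: Bool,
                               matching test: (MKAnnotationView) -> Bool) -> MKAnnotationView? {
        // ... the app's existing forward/backward search ...
        return nil
    }

    // The factory: both rotors share the search, varying only the match test.
    func makeRotor(named name: String,
                   matching test: @escaping (MKAnnotationView) -> Bool) -> UIAccessibilityCustomRotor {
        return UIAccessibilityCustomRotor(name: name) { predicate in
            let current = predicate.currentItem.targetElement as? MKAnnotationView
            let forward = (predicate.searchDirection == .next)
            guard let match = searchAnnotationViews(from: current,
                                                    forward: forward,
                                                    matching: test) else {
                return nil   // no previous/next match for this rotor
            }
            return UIAccessibilityCustomRotorItemResult(targetElement: match,
                                                        targetRange: nil)
        }
    }

    // Usage: one rotor for every waypoint, one for just the hazards
    // (the hazard test on the annotation is hypothetical):
    // accessibilityCustomRotors = [
    //     makeRotor(named: "Route Steps") { _ in true },
    //     makeRotor(named: "Hazards") { $0.annotation is HazardAnnotation }
    // ]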
All right.
Let's build and run this.
>> Avenue --
>> There we go.
>> Heading.
>> All right.
So let's look at
whether our changes took.
>> Avenue Loop.
0.8 miles.
Button.
>> Okay. So VoiceOver
spoke "Button."
That's great.
Someone can know that they
can activate this item now.
Let's take a look at
this Favorite button.
>> Favorite, Avenue Loop.
Button.
>> Great.
>> Selected.
Favorite Avenue Loop.
>> So the selected
trait was taken care
of for us by the frameworks.
So I'd like to activate
this cell.
>> Avenue Loop.
Avenue Loop.
Avenue Loop distractions.
Rated 3 of 5.
Smells. Rated 2 of 5.
Greenery. Rated 5 of 5.
Friends. Rated 4 of 5.
>> Excellent.
So our graph bars
are now accessible.
>> Avenue Loop route.
Broadway and Ridgeway.
>> Okay. Now let's try
and use our new rotors.
>> Headings.
Words. Characters.
Hazards. Route steps.
Hazard. Hazard.
Ridgeway and Piedmont.
Piedmont and 41st.
41st and Broadway.
>> Great. So our Route
Steps rotor works.
>> Use the rotor to access --
>> Excuse me.
>> Hazards.
Hazard. Ridgeway and Piedmont.
Hazard. Boys' Choir on Ridgeway.
>> And the Hazards rotor
lets us jump right to
and through the list of
all the hazards there.
So, let's check out
the Steps tab again.
>> Steps. Selected.
Steps. 41st and Broadway.
Start at --
>> That was me.
>> Hazards.
Boys' Choir on Ridgeway.
Boys' Choir between Gilbert
and Montgomery on Ridgeway.
Continue along Ridgeway.
Hazard. Be care --
>> Oh, sorry.
The switch was cut off there.
So you heard, though,
that it said "Hazards."
So our modification to the
accessibility label took.
It's now conveyed that these
cells do represent hazards.
And in addition, we were
able to jump directly
to them using the Hazards rotor.
All right.
Let's jump back to the slides.
So, a quick word on tvOS.
tvOS apps are built with UIKit,
so everything we've been talking
about earlier applies
directly to them as well.
But there are a few
considerations you need to keep
in mind when building your
accessible TV apps that stem
from the fact that tvOS has
a focus-driven interface.
So a common pattern on
tvOS is to use headers
to group content onscreen
and to separate it visually
from the other content.
You can give a similar
experience
to someone using your app
with VoiceOver through the use
of the accessibilityHeaderElements API.
Set this property on your accessible views, their superviews, or on the applicable view controller
to be the headers that are
associated with that content.
Then when someone
navigates to your items,
VoiceOver will read
the associated headers
in a distinctive pitch to give
someone using your app a sense
of place in your content.
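A minimal sketch of that on tvOS, with hypothetical view and label names:

    import UIKit

    class GalleryViewController: UIViewController {
        let availableWorksLabel = UILabel()
        let worksCollectionView = UICollectionView(frame: .zero,
                                                   collectionViewLayout: UICollectionViewFlowLayout())

        override func viewDidLoad() {
            super.viewDidLoad()
            availableWorksLabel.text = "Available Works"
            // Associate the header with the grouped content; VoiceOver
            // reads it in a distinctive pitch when focus lands here.
            worksCollectionView.accessibilityHeaderElements = [availableWorksLabel]
        }
    }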
So I recorded a short video
here of an art gallery app.
And when I play it,
I'm going to move focus
down into the collection view of
available works at the bottom.
You're going to hear
VoiceOver say "Available Works"
in a distinctive pitch.
And that's because
I set that label
to be the accessibility header element of the collection view.
Now, you'll also
hear something else.
When I pause, VoiceOver will
start reading through all
of the non-focusable
accessibility elements onscreen.
So let's hear that now.
>> Available works.
Grain. Button.
Title. Grain.
Author. Vivian Li.
Description.
Close up of a strand of grain.
It glistens with morning dew.
>> So, the reason I showed you
that secondary Read All behavior
is to hammer home that all
of the information your app
provides needs to be exposed
through an accessibility
element,
even if it's not
natively focusable
by the tvOS focus engine.
When you do that, when you
pause navigating with VoiceOver,
VoiceOver will then read all
of the non-focusable
content onscreen
so that information
will be conveyed
to someone using the
app with VoiceOver.
Okay. Let's wrap up.
So, when you're out there
implementing your apps
with an eye to accessibility,
keep in mind what
you've learned here.
First, accessibility
is about everyone.
It's about making the great
experiences your app provides
available to as many
people as possible.
You'll want to design
with accessibility
in mind from the start.
Is anything difficult to use
in your interface for someone
with a motor disability?
Is anything too low contrast?
Excuse me.
UIKit helps your apps become
accessible through your adoption
of the accessibility APIs.
And finally, accessible apps are
what make our platforms great.
So thanks so much for being here
and learning about
how to do that.
So for more information you can
check out this session online.
It's Session Number 202.
And there are some
related sessions this week
that you should check out.
At 10:00 a.m. in this very
room there's a session
on Inclusive App Design.
So make sure you stick around.
And on Wednesday at 10:00 a.m.
in Nob Hill there's the
session Auditing your Apps
for Accessibility, which again
is where you can learn more
about that new accessibility
inspector,
which I definitely
recommend you check out.
So that's all I have for today.
Thanks so much and have
a great conference.
[ Applause ]