WWDC2014 Session 210

Transcript

>> Hi everyone.
I'm Clare and I'm on the
iOS Accessibility Team here
at Apple.
Today, I'll be sharing with
you about accessibility on iOS,
and how you can develop your
apps to be used by everyone.
Our talk today is going
to be in two parts.
First, I'll give an
introduction to accessibility.
I'll talk about some of
the features we offer
on iOS currently, as well as
some new things coming in iOS 8.
In the second part of this talk,
I'll be showing you how you
can make your apps accessible,
both by adding visual
accommodations,
and by implementing
Semantic Accessibility.
So first, what is accessibility?
Accessibility is all
about making things
usable for everyone.
On iOS, we look at four
broad areas of user needs.
We want users with physical
and motor challenges,
users with learning needs,
users with hearing loss,
and users with different
vision needs,
to be able to experience
iOS fully.
Our goal is equal
access for everyone.
So let's look at some examples
of how we accomplish this.
Some users may have a physical
impairment that prevents them
from using the touch screen.
In fact, their only input
device may be a switch.
A switch could be something
like a button mounted
on a wheelchair headrest.
So how do we make the rich
touch interface of iOS usable
by someone who can
only use a switch?
Our answer is Switch Control.
When Switch Control is enabled,
a cursor appears on the screen,
and it moves from item to
item with a specific timing.
When the user activates their
switch, that lets them interact
with a highlighted element,
so they can do things
like tap or scroll.
Next, let's look at learning.
iOS devices have been incredibly
powerful in the classroom,
because of their
easy to use interface
and their engaging apps.
For the same reasons,
they've been great
for children with autism.
But for these users, we faced
some unique challenges at first.
For one, when a child with
autism is learning in an app,
they might decide to
press the Home button.
And now the device is
full of distractions.
So to address this issue,
we created Guided Access,
which is a way for
parents or educators
to keep the device
in a specific app.
When Guided Access is enabled,
pressing the Home button
or the Lock button
have no effect,
and so the child can have
an uninterrupted learning
experience, free
from distractions.
Next, let's look at hearing.
Many users with hearing
loss use hearing aids
to help amplify the
sound around them,
but traditional hearing aids
come with a couple of issues.
First, they require that
you carry a remote with you
so that you can adjust
the volume,
and that can be cumbersome.
And second, call quality
can be less than optimal
when using a hearing aid.
When you have a phone
next to a hearing aid,
either the hearing aid
amplifies all the sound,
including background noise,
or it picks up the signal
from the phone's
electromagnetic field,
but that signal can also
be noisy, creating static.
So to address these two
issues, we partnered
with hearing aid
manufacturers around the globe
to create made-for-iPhone
hearing aids.
These hearing aids connect
directly to your iPhone,
and you can do things
like adjust the volume
of your hearing aids, without
the need for a separate remote.
In addition, audio streams
directly from your iPhone
to your hearing aid,
so it becomes effectively
a Bluetooth headset.
And that improves call
quality dramatically.
Finally, let's look at
the broad area of vision.
When the iPhone was
first introduced,
many people were concerned that
blind users would not be able
to take advantage
of its features.
Because if you think about it,
how do you use a touch screen
if you don't see the screen.
It's going to be difficult
because you won't necessarily
know what you're touching,
and you might accidentally
tap a button
that you didn't mean to tap.
So to address this,
we created VoiceOver.
When VoiceOver is enabled,
a user can touch anywhere
on screen to hear what
they just touched,
before they take an action
like tapping a button.
We also have accommodations
in our settings for users
with low vision who may not
necessarily need the full power
of VoiceOver.
For example, some users
prefer bolder fonts,
in which case they can enable
bold text to get thicker text.
Some users may need
higher contrast, and if so,
we offer several options,
including the ability
to reduce transparency
on the system.
And finally, for users who
are more sensitive to motion,
we offer a Reduce
Motion setting,
which provides more
subtle transitions.
For example, when you launch an
app with Reduce Motion enabled,
it cross-fades into view.
So that's some of what
we offer on iOS currently
to support equal access.
But we've been working hard, and
I'd like to highlight just some
of the new features
we're offering in iOS 8.
First, Guided Access
Time Limits.
Time limits are all
about transitions.
When a child is immersed in an
app, and then it's time to move
on to the next activity,
that transition can
cause some anxiety.
With time limits, a caretaker
can assign a specific amount
of time that a child
should use an app.
When the time limit
is almost reached,
the child receives a warning,
helping them to prepare to move
on to the next activity.
And when the time limit
is over, the device locks.
Next, we're bringing
Alex to iOS.
Some of you may have
already met Alex.
But for those who haven't,
I'll let him introduce himself.
>> Hi everyone.
I'm Alex. And I'm
the US English voice.
I've been living on Macs for
years, and now I'm excited
to make myself at home on iOS.
>> So Alex will be
available in all
of our assistive
technologies that use speech,
including VoiceOver,
Speak Selection,
and a new feature
we're introducing
in iOS 8, called Speak Screen.
Speak Screen is great for anyone
who might have trouble
reading the screen.
When it's enabled, you can do
a simple gesture or ask Siri
to speak the screen to you,
and the device will start
reading whatever's on screen.
You also get options, like
the ability to pause, rewind,
or even slow down the speech.
And finally, I'd like
to highlight Zoom.
Some of you may have been
using Zoom for years on iOS,
but we've re-imagined it in
iOS 8, and have a whole new way
for you to interact with it.
But before I introduce
that, let's first talk
about what Zoom does currently.
Zoom magnifies the
entire screen,
and then you use three-finger
gestures to navigate.
This approach comes
with a few issues.
First, if you're using
three-finger gestures,
that means you're covering up a
significant part of the screen,
especially on iPhones.
Also using three-finger
gestures means it's hard
to use a phone on the go.
You can't use the phone in a
single hand, you always need
to have one hand to hold
it, and the other hand
to do the three-finger gestures.
And finally, typing can be
tricky with Zoom enabled.
You have the choice
between zooming
in on the content
you're creating,
or having access to
the full keyboard.
But you can't have
both at the same time.
So to address all
of these issues,
we're introducing a new
windowed mode for Zoom.
And I'd like to demo
that to you now.
This is windowed Zoom.
Notice how only part of
the screen is magnified?
I can change what part is
magnified by dragging the edge
of the window with
a single finger.
I can also tap the edge of the
window to get resize handles,
so I can change how much
of the screen is magnified.
If I tap on the arrow at the
edge of the window, I get a menu
with several options,
including the ability
to adjust my Zoom level.
Now, let's go into notes.
I'm going to create a new
note, and watch what happens
when the keyboard comes up.
The Zoom window automatically
got out of the way
of the keyboard, so I have
full access of the keyboard,
but my content is zoomed.
And as I type, notice how the
Zoom window follows my cursor.
So I can always see
what I just typed.
So that's some of what we've
been doing here at Apple
to promote equal
access for everyone.
But for the second part of
the story, we need your help.
iOS is a great platform because
of all the apps you guys create,
and it's up to you to make sure
that your apps can
be used by everyone.
To that end, I'll be showing
you now how you can do
that with our API.
Our tutorial is going
to be in two parts.
First, I'll show you how to add
visual accommodations for users
with different vision needs.
And second, I'll
be showing you how
to implement
Semantic Accessibility,
so that an assistive technology like
VoiceOver can navigate your app.
Earlier in this talk,
I showed some
of the visual accommodations
we offer on iOS for users
with different vision needs.
Now, we're providing API,
so you can make those
accommodations
in your apps as well.
We have a way for you to detect
whether the user has bold text,
reduced transparency,
darkened colors,
or Reduce Motion enabled.
And that makes it
really easy for you
to make specific adjustments
in your app for those users.
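Here's a rough sketch of what those checks look like in Swift. The helper name is just illustrative, not something from the sample project; the four queries are the standard UIAccessibility checks.

```swift
import UIKit

// A minimal sketch of querying the four settings mentioned above.
// (The function name is illustrative; the checks are standard UIAccessibility API.)
func logAccessibilityPreferences() {
    if UIAccessibility.isBoldTextEnabled {
        print("Bold Text is on")
    }
    if UIAccessibility.isReduceTransparencyEnabled {
        print("Reduce Transparency is on")
    }
    if UIAccessibility.isDarkerSystemColorsEnabled {
        print("Darken Colors is on")
    }
    if UIAccessibility.isReduceMotionEnabled {
        print("Reduce Motion is on")
    }
}
```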
So I'd like to show you
how you can do that now,
using a sample app we've
created for this presentation.
Okay, so first we're going to
identify some areas of our app
that need visual accommodations.
To do that, I've
gone to my simulator,
and I've enabled several of
the accessibility settings.
In this case, I've got
Bold Text, Reduce Motion,
and underneath Increase
Contrast,
I've also got Reduced
Transparency
and Darkened Colors.
So that way, when we build
and run our app from Xcode,
we'll see what a user with
those settings will see.
So now, I'm going to Xcode.
I'm going to click
on build and run.
So this is a simple dating app.
At the top, you have
the title of the app,
which is HelloGoodbye.
Notice how the title
is very bold,
and the background
is fully opaque.
That's because we're using a
standard navigation controller.
So we get the bold text
and reduced transparency
behaviors for free.
But because we're doing
some custom UI in our app,
there's still some
work we need to do.
You probably already noticed
how we have transparent overlays
over a photo background.
When reduced transparency is on,
we should really be
making those opaque.
In addition, the text in the
buttons should be thicker
because we have bold
text enabled.
Now I'm going to click
on the Profile button.
So here's where as a user
I can adjust my settings.
Again, we have the transparent
overlay, and the text
that should be bolder.
At the bottom of this view,
we have a preview tab,
which I can slide up
to see how other users
in this app will see my profile.
Notice how my info is in gray
text under each of the headings.
When the user has
darkened colors enabled,
it would be better if
that text were darker
to provide higher contrast
against the white background.
Now, I'm going to click on
Back and I'm going to go
into the Matches page.
So here's where I
can see matches
that the app is suggesting
for me.
And I can say hello or
goodbye to each one.
I like this guy's
elevator pitch,
so I'm going to say hello.
And watch what happens
when I swipe up.
Notice how the next
match animates on screen
as though it's coming
from far away.
So that's an example
of an animation
where we should provide
an alternate to users
who are more sensitive
to motion.
And now, let's do all this,
using the API in Xcode.
So the first thing we're
going to address is the text.
Let's make sure that it's
thicker when bold text is on.
To do that, I'm going to click
on the Style Utilities file
on the left, which is
where all of our colors
and fonts are defined.
I'm going to click on
the drop-down at the top,
to find our font methods.
Notice how both of these methods
are returning Avenir-Light
as a custom font.
Now, if we were using the system
font, or the preferred font
for text style API, we would
be getting bold text for free.
But because we've picked
a specific font to use,
we also need to pick a font to
use when bold text is enabled.
So to do that, I'd like
to create a helper method,
that returns a font name.
All we're doing here is first
checking whether bold text
is enabled.
If it is, we can return a
thicker version of our font.
Otherwise, we can return the
same font we've been using.
And now, we just need to use our
helper in our two font methods.
So that takes care of bold text.
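As a sketch of that helper in Swift — the exact font names and point sizes here are assumptions for illustration, not taken from the sample project:

```swift
import UIKit

// Sketch of the font-name helper described above. "Avenir-Medium" as the
// thicker face and the point sizes are assumptions for illustration.
func standardFontName() -> String {
    if UIAccessibility.isBoldTextEnabled {
        return "Avenir-Medium"   // a thicker face when Bold Text is on
    }
    return "Avenir-Light"        // the font the app was already using
}

// The two font methods then build on the helper.
func standardFont() -> UIFont {
    return UIFont(name: standardFontName(), size: 14) ?? .systemFont(ofSize: 14)
}

func largeFont() -> UIFont {
    return UIFont(name: standardFontName(), size: 24) ?? .systemFont(ofSize: 24)
}
```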
Now, let's look at those
transparent overlays.
I'm going to click on the
drop-down at the top and find
where that transparent
overlay color is defined.
Notice how right now we're
returning an 80 percent white.
When reduced transparency
is enabled,
we should return a 100 percent
white, so we do that here.
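A sketch of that color change, assuming a simple helper rather than the sample project's actual method:

```swift
import UIKit

// Return fully opaque white when Reduce Transparency is enabled,
// otherwise the original 80 percent white overlay.
func overlayColor() -> UIColor {
    if UIAccessibility.isReduceTransparencyEnabled {
        return UIColor(white: 1.0, alpha: 1.0)
    }
    return UIColor(white: 1.0, alpha: 0.8)
}
```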
Okay. Now, let's make
sure that the gray text
in the profile preview is
darker when Darken Colors is on.
To do that, I'm going to
click on the drop-down again,
and find where that
color is defined.
So here, we're just
returning gray.
Let's first check whether the
user has Darken Colors enabled,
and if so, we can
return black instead.
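And the same idea for Darken Colors, again as a sketch:

```swift
import UIKit

// Use black instead of gray when Darken Colors is enabled,
// for higher contrast against the white background.
func detailTextColor() -> UIColor {
    return UIAccessibility.isDarkerSystemColorsEnabled ? .black : .gray
}
```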
Finally, let's look at that
zooming animation that we saw
on the Matches page, where
the new match appears
as though it's coming
from far away.
To address that,
I'm going to click
on the Matches View
Controller on the left.
Now I'm going to
open the drop down,
and find where we
animate the cards.
So notice how here we're
zooming the new card into view.
Now, for users who have
Reduce Motion enabled,
a better experience would be
to fade the card into view,
because that animation does
not utilize a motion effect.
So I'm going to create
a helper method
that fades the card into view.
All we're doing is
setting the alpha
of the card with animation.
And then at the point where
normally we would Zoom the card,
let's first check whether
Reduce Motion is enabled.
If it is, we use our
fading method instead.
We can use the Zooming
method for other users.
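As a sketch of that branch in Swift — the card view, the scale, and the 0.3 second duration are placeholders, not the sample project's actual values:

```swift
import UIKit

// Sketch of the Reduce Motion branch described above.
func fadeIn(_ card: UIView) {
    card.alpha = 0
    UIView.animate(withDuration: 0.3) {
        card.alpha = 1          // simple cross-fade, no motion effect
    }
}

func zoomIn(_ card: UIView) {
    card.transform = CGAffineTransform(scaleX: 2.0, y: 2.0)
    card.alpha = 0
    UIView.animate(withDuration: 0.3) {
        card.transform = .identity
        card.alpha = 1          // the original "coming from far away" effect
    }
}

func present(card: UIView) {
    if UIAccessibility.isReduceMotionEnabled {
        fadeIn(card)            // alternate animation for motion-sensitive users
    } else {
        zoomIn(card)
    }
}
```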
Okay, now let's build and run
our app with these changes.
Here's our app again.
Notice how the transparent
overlays are gone,
and have been replaced
by opaque white,
because reduced transparency
is enabled.
In addition, the text on
the buttons is much thicker.
When I click on Profile, again,
the same issues have
been resolved here.
I can slide up on the Preview tab,
and now notice how my
information is darker
and higher contrast against
the white background.
Now I'm going to click on Back
and click on the Matches button.
Again, I'm going to say
hello to my favorite match,
and watch what happens
when I do that.
The next match fades onto view,
because we have Reduce
Motion enabled.
So that's how easy it is for you
to make visual accommodations
in your app for users with
different vision needs.
Next, let's look at implementing
Semantic Accessibility,
so that VoiceOver users
and Switch Control users
can also use your app.
Semantic Accessibility is what
allows an assistive technology
like VoiceOver to get
information from an app,
and to control the
app doing things
like tapping buttons
or adjusting sliders.
Here's an example of Semantic
Accessibility at work.
Let's say I've got the
calculator app open
and VoiceOver is enabled.
When I touch on top of the 5
button, VoiceOver intercepts
that touch, and then it asks the
app what it has at that point.
The app responds with an element
that represents the 5 button.
And that allows VoiceOver to
draw a cursor around the 5,
and to speak 5, so that I
know what I've just touched.
If I then decide to
tap the 5 button,
I can do a double tap
anywhere on screen,
and that tells VoiceOver to
activate the selected item.
So that's how VoiceOver users
can use the calculator app.
So how do you make sure
that your app works
with VoiceOver in
this way as well?
Well, the good news is that
most Semantic Accessibility is
already built into iOS, so you
get a lot of stuff for free.
But you should still
audit your app
with VoiceOver on the device.
And there are a couple questions
you should ask yourself.
First, can VoiceOver
speak everything
that needs to be seen?
And second, can VoiceOver
do everything
that a user would want to do?
Now, I'm going to be referring
a lot to VoiceOver in this talk,
but all of the work we're going
to do in our app also applies
to Switch Control, so
just keep that in mind.
To set up an audit of your app
for Semantic Accessibility,
I recommend that you go to
your Accessibility settings
on the device, and set
your accessibility shortcut
to VoiceOver.
That will allow you to
triple click the Home button
to quickly enable or
disable VoiceOver.
If you've never used
VoiceOver before,
here are some tips
to get you started.
When you tap something,
that selects and speaks.
So, for example, in this screen,
if I tap on the accessibility
table cell,
the cursor moves there, and the
device speaks accessibility.
If I double tap anywhere on
screen, that tells VoiceOver
to activate the selected item.
So, in this case, if I did that,
the accessibility table
cell would get pressed,
and we would move to
the accessibility page.
When I swipe to the
right, that selects
and speaks the next element.
Right now, the Back button
is selected, so if I swipe
to the right, the Navigation
title becomes selected.
And finally, when you
have a scroll view,
you can use three-finger
swipes to scroll.
So if I want to see more of
the accessibility settings,
I can three-finger swipe
up to scroll down by page.
So that's all you need
to know to do an audit
of your own app with VoiceOver.
And now I'd like to show
you how you can do that,
with our sample app.
So here's our dating app
running on the device.
When I triple click
the home button --
>> VoiceOver on.
>> VoiceOver becomes enabled.
The first thing we'll do is
ask ourselves that question.
Can VoiceOver speak everything
that needs to be seen?
So first, I'm going to tap
on the title at the top.
>> HelloGoodbye.
Heading.
>> Okay, so VoiceOver
can see that.
Now, let's touch on the
logo [beeping sound].
It's kind of quiet, but that
sound we just heard means
that VoiceOver doesn't
think there's anything
under my finger.
So it looks like VoiceOver users
won't be able to read this logo.
Now, I'm going to tap
on the profile button.
>> Profile button.
>> And VoiceOver sees that.
When I swipe to the right,
that selects the next element.
So when I do that.
>> Matches.
Button.
>> VoiceOver finds
the Matches button.
If I swipe to the right one
more time we hear that sound,
which means there's nothing
else after this element.
Which is correct in this case.
So it looks like this view
is in pretty good shape,
except for the missing logo.
Now, I'm going to
tap on profile.
>> Profile button.
>> And I'm going to
double tap to activate it.
>> Profile, Profile,
Back, Back button.
>> Let's swipe to
the right a few times
to see what VoiceOver sees.
>> Profile.
Heading. Your age.
19 percent.
Adjustable.
Swipe up or down with one
finger to adjust the value.
>> So that was a little strange.
We just heard "your age,"
followed by "19 percent,"
it would make a lot more sense
if VoiceOver said the age
here, which is actually 37.
And the instructions
said to swipe up
or down to adjust the value.
So if I swipe up,
watch what happens.
>> 28 percent.
>> My age is jumping
up by 10 years.
So it doesn't look like
VoiceOver users will be able
to set their age accurately
right now in our app.
Now, I'm going to swipe to
the right a few more times
to see what else VoiceOver sees.
>> 47. Hobbies.
Music. Swing dance.
Wine. Text field.
Double tap to edit.
>> Let's make sure
that VoiceOver users
can edit their hobbies.
So I'm going to double tap.
>> Text field is editing.
Music. Swing dance.
Wine.
>> And the keyboard came up.
So that looks like VoiceOver
users will be able to do that.
Let's just make sure that the
Done button, which appeared
in the top right, is also
visible to VoiceOver.
So I'll tap on that.
>> Done button.
>> And it is.
When I double tap.
That takes us out of editing.
So it looks like the
hobbies are in good shape.
And the elevator pitch field
is very similar to hobbies,
so we'll skip that for now.
Now I'm going to touch on the
Preview tab at the bottom.
>> Preview.
>> So it's just saying
"preview,"
and if I couldn't see the screen
with that visual indication
of a tab, it might not be
clear to me what this does.
I mean, I might try to double
tap it, and so when I do that --
>> Preview.
>> Nothing happens, because this
tab doesn't respond to a tap.
It needs to have a
slide gesture up.
So right now, VoiceOver users
cannot preview their profile.
Now, I'm going to touch
on the back button.
>> Back. Back button.
>> And double tap.
>> [Beeping sound]
HelloGoodbye, heading.
>> I'm going to tap
on Matches now.
>> Matches button.
>> And double tap.
>> Matches, Matches,
Back, Back button.
>> Okay. Again we're going to
swipe to the right a few times
to see what VoiceOver sees.
>> Matches.
Heading. Swipe up
arrow to say hello,
swipe down arrow to say goodbye.
Ellipses.
>> Okay, that's also
a little strange.
Visually when you look at
this logo, or this label,
you see swipe up to say hello.
But VoiceOver is saying
swipe up arrow to say hello.
And that's because
VoiceOver is reading each
of those characters literally.
So it would be better if
VoiceOver described this
in a more natural way.
Now, let's swipe to the
right to get to the photo.
>> Age.
>> Except we didn't
get to the photo.
So now let's try touching it
directly [beeping sounds].
And again, I get that sound,
which means VoiceOver doesn't
think there's a photo there.
Now, some of you might be
wondering whether that's
even important.
Because after all, a VoiceOver
user doesn't see the photo,
right?
But it turns out
it is important,
for a couple of reasons.
First, if I'm a VoiceOver user,
using this app, I need to know
that other people in this
app can see my photo.
Even if I don't take full
advantage of the photo,
it's important to
know that it's there.
And second, remember
that Semantic Accessibility
is not just about VoiceOver.
So it's always a good idea
to expose any meaningful
info from your app.
Okay, now I'm going
to touch on age again.
>> Age.
>> And I'm going to swipe
to the right a few times.
>> Hobbies.
32. Cooking.
Bubble tea with friends.
Travel.
>> If you're paying
close attention,
you may have noticed how
VoiceOver went from age
to hobbies, and then back to 32.
So when you're looking
at the visual layout
of this profile card, it's
clear what this means,
but VoiceOver users
are often swiping
to visit different elements,
because that can be easier
than touching exactly
where an element is.
And so it's important
to have a logical order
for how these elements
are visited.
It's also important for
Switch Control users.
The Switch Control cursor
will move from element
to element in the same order.
And it's important for that
order to be predictable,
so that a Switch Control user
can activate their switch
at the right time.
Okay. So last thing, remember
that our second question was
"Can VoiceOver do
everything that needs
to be done in this app?"
The point of the Matches page
is to allow users to say hello
or goodbye to their matches.
But VoiceOver users won't be
able to do this right now,
because swipe gestures get
intercepted by VoiceOver,
and we'll need to
provide a different way
for VoiceOver users to
do the same actions.
So that's the kind of process
you can take with your own app
to see whether it is
semantically accessible.
Now, let's look at resolving
those issues we encountered.
First, I'm going to
introduce the most basic
accessibility API.
There are only 2
properties here,
and if you remember
nothing else from this talk,
please remember these 2.
Just by using these 2
properties, you can solve a lot
of accessibility issues.
The most important
property you need
to know is the
isAccessibilityElement property.
This is a Boolean
that determines whether a
view is visible to VoiceOver.
By default, it returns yes
for standard UIKit controls
and labels, which is why even
though we haven't done any
Semantic Accessibility
work in our app yet,
a lot of the text is
already visible to VoiceOver.
And the second-most
important property you need
to know is the
accessibilityLabel property.
This describes a view, and
VoiceOver will speak this
when a VoiceOver user
touches on a view.
If you're creating your
views in Interface Builder,
you can set these properties
directly from Xcode's UI.
On the right-hand side of Xcode,
you'll find an identity
inspector.
And there's an accessibility
section there.
The Accessibility enable
check box corresponds
to isAccessibilityElement and
the label field corresponds
to accessibilityLabel.
If you're creating your views
in code, it's still really easy
to set these properties
as shown here.
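A minimal sketch of what that looks like in Swift — the helper is illustrative; the two property assignments are the point:

```swift
import UIKit

// Sketch of setting the two basic properties in code, mirroring the
// Accessibility check box and Label field in Interface Builder.
func exposeToVoiceOver(_ view: UIView, describedAs label: String) {
    view.isAccessibilityElement = true   // make the view visible to VoiceOver
    view.accessibilityLabel = label      // what VoiceOver speaks when it is touched
}
```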
So now let's look at
using these two properties
and see how much more
accessible we can make our app.
So the first thing we're
going to do is make sure
that VoiceOver users can read
the logo on the first page.
To do that, I'm going to go
click on the StartViewController
on the left, which is
where we create that logo.
So I'm going to click
on the drop-down
and find our viewDidLoad method.
So here's the logo.
It's just an image
view right now.
And first we're going to set
isAccessibilityElement to yes,
so that VoiceOver can see it.
And then we also need to make
sure VoiceOver describes it
correctly, so we give it
an accessibilityLabel.
In this case, we're just going
to give it the text
that's inside the logo.
Notice also that we're
localizing the string,
because VoiceOver
speaks in many languages.
Next, let's make sure
that the photo that's
in the profile card can
also be seen by VoiceOver.
To do that, I'm going to click
on the card view
class on the left.
And I'm going to open
the drop-down to find
where we add our ProfileViews.
So here's where we're
adding the photo.
Again, we set
isAccessibilityElement to yes,
and we give it an
appropriate accessibilityLabel.
In this case, we're just going
to call it profile photo.
Finally, let's look
at the instructions
for saying hello or goodbye.
Right now, VoiceOver is saying
swipe up arrow to say hello.
Let's fix that.
So to do that, I'm
going to click
on the Matches View Controller
on the left, and we're going
to open the drop-down
at the top to find
where we add those
swiping instructions.
So here's a label that
says those instructions.
It's already in
accessibilityElement by default,
so all we need to do is change
the accessibility label.
We're going to give it one
that is just like the text
on the label, except we're
replacing those unicode arrows
with words.
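As a sketch, with the exact strings assumed from what VoiceOver should speak in the demo:

```swift
import UIKit

// Sketch of the instructions label fix: keep the on-screen text with its
// unicode arrows, but give VoiceOver a label that uses words instead.
func configureSwipeInstructions(_ instructionsLabel: UILabel) {
    instructionsLabel.text = "Swipe ↑ to say hello, swipe ↓ to say goodbye…"
    instructionsLabel.accessibilityLabel = NSLocalizedString(
        "Swipe up to say hello, swipe down to say goodbye",
        comment: "Spoken version of the swiping instructions")
}
```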
Now I'm going to build and run
the app with these changes.
>> [Beeping sound]
Extras folder.
Two apps. Double tap to
open [beeping sound].
>> HelloGoodbye.
Extras folder.
Two apps. Double tap to open.
Hello goodbye [beeping sound].
>> So here's our app
again with these changes.
The first thing I'm going
to do is touch the logo.
>> HelloGoodbye.
Meet your match.
Image.
>> So now VoiceOver
users can read this.
Next, I'm going to touch
on the Matches button.
>> Matches button.
>> And double tap.
>> [Beeping sound]
Matches, Back, Back button.
>> Let's first look at what
the instructions say right now.
>> Swipe up to say hello.
Swipe down to say goodbye.
>> Okay, so that's much
easier to understand.
And now when I swipe
to the right.
>> Profile photo, image.
>> VoiceOver is able to see
the photo in the profile.
So that's how, even with
just these two properties,
you've already made other
parts of our app accessible.
But because we've got some
more custom UI in our app,
there's some additional
API we'll need to use
to give VoiceOver users a
really great experience.
First, I'd like to introduce the
concept of Accessibility Traits.
Traits describe the
function of a view.
For example, in the familiar
timer app, each of the wheels
of time has a trait
of adjustable.
And that means that its value
can be changed up or down.
The Start button has
a trait of button.
And that gives the user a
hint that this is something
that can be activated.
Again, if you're creating your
views in Interface Builder,
it's really easy to set these
traits directly from Xcode's UI.
In the same section we
talked about earlier,
you'll see a bunch of
check boxes for traits.
Some of these check boxes
may already be checked,
in which case I would generally
recommend leaving those alone,
but add check boxes
next to traits
that you think are
important for your view.
In code, it's also
straightforward
to set the Accessibility Traits.
But just remember that it is
a bit mask, so if you want
to add a trait in addition
to others, use a bitwise
or operator, as shown
here, to combine them.
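In Swift, the same bit mask is exposed as an option set, so a sketch of adding a trait alongside whatever the view already has looks like this:

```swift
import UIKit

// Sketch of adding the button trait without disturbing any existing traits.
// In Objective-C this is a bitwise OR of the trait masks; the Swift option
// set performs the same combination.
func markAsButton(_ view: UIView) {
    view.accessibilityTraits.insert(.button)
}
```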
Next, let's talk about
Accessibility Value.
Accessibility Value is a string
property, so in that way,
it's similar to accessibility
label.
Except you use Accessibility
Value
for things that are dynamic.
For example, you
use this on views
that have the adjustable trait
to describe the numerical value.
Right now, in our app, the age
slider has an Accessibility
Value of 19 percent.
So that's a case where we can
override Accessibility Value
to return the age instead.
Next, we'll talk about actions.
So actions are how VoiceOver
can control your app.
Two really common ones are the
Increment and Decrement actions.
So these are actions that
are associated specifically
with views that have
the adjustable trait.
And they allow the
user to increase
or decrease an element's value.
So again, in our app, the
age slider right now can be
incremented or decremented,
but right now it's
doing that by 10 years.
We can override these
methods to allow the user
to change their age
by a single year.
Another really important
action is the Activate action,
and that's what happens
when a user double taps.
Normally, when you
double tap something,
that just taps the view.
But you can override this
method on views that respond
to a specific gesture, and
that way VoiceOver users can do
that action as well easily.
For example, in our app, we can
use this for the Preview tab,
which currently only
responds to a sliding gesture.
And finally, we have
notifications,
which allow your app
to tell VoiceOver
when something has changed.
One really important
notification is
the LayoutChangeNotification.
You should send this when you're
changing the layout of your app,
for example, adding
or removing views.
For example, in our
app right now,
when you reveal the preview,
VoiceOver might not realize
that new elements
have come on screen.
Once you post a layout change
notification, VoiceOver checks
to find the elements
that are there currently.
Now, let's put all this
together in our sample app
to resolve some more of
those issues we encountered.
So the first thing we're
going to do is make sure
that VoiceOver users
can edit their age.
To do that, I'm going
to go to Xcode and click
on the AgeSlider
class on the left.
Here let's override
accessibilityValue,
and that way we can return the
age instead of a percentage.
All we're doing is taking
the value of the slider,
which corresponds to the age,
and making it into a string.
Next, let's make sure that
VoiceOver users can adjust
their age up or down
by a single year.
So to do that, let's override
accessibilityIncrement.
All we're doing here is adding
1 to the slider's value.
We also send the Value
Change Control Event,
that way other parts
of our UI that rely
on the slider's value
can update accordingly.
We do the same thing for
accessibilityDecrement.
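Putting those three overrides together, a sketch of the AgeSlider subclass might look like this, assuming the slider's value is the age in years, as in the talk:

```swift
import UIKit

// Sketch of the AgeSlider overrides described above.
class AgeSlider: UISlider {

    // Speak the age itself rather than a percentage.
    override var accessibilityValue: String? {
        get { return String(Int(value)) }
        set { super.accessibilityValue = newValue }
    }

    // Swiping up with VoiceOver adjusts the age by a single year.
    override func accessibilityIncrement() {
        value += 1
        sendActions(for: .valueChanged)   // let the rest of the UI update
    }

    // Swiping down does the same in the other direction.
    override func accessibilityDecrement() {
        value -= 1
        sendActions(for: .valueChanged)
    }
}
```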
And now, one nuance to this.
After these changes, our age
slider has all the information
that the user needs to know.
Namely, the user's age.
So from a Semantic
Accessibility standpoint,
that label that's showing up
next to the slider is redundant,
and might even be confusing.
So let's just hide it.
To do that, I'm going to click
on the Profile View Controller
on the left, and open
the drop-down to find
where we add that label.
To hide it, we simply set
isAccessibilityElement to no.
Okay. Next, let's make sure
that VoiceOver users can
preview their profiles.
To do that, let's go
in Xcode and click
on our PreviewLabel class.
So here's our roadmap for this.
We're going to override
accessibilityActivate
on the Preview Label, so that
we can control what happens
when the user double taps.
But we need to tell the
Profile View Controller
to show or hide the preview.
So we'll need a way
to communicate that.
To do that, let's set
up a delegate protocol.
And all this delegate
protocol has is a single method
that we're going to call when
the label gets activated.
Of course, we also set
up a delegate property
so that the View Controller
can set itself as the delegate.
Now let's go to the
implementation file.
Here we're going to override
accessibilityActivate.
All we're doing is
letting the delegate know
that the label was activated,
and we return yes to indicate
that the action succeeded.
While we're here, let's
also add a button trait
to the Preview Label, and that
way VoiceOver users will have a
hint that they can
double tap to activate it.
To do that, I'm going to
override accessibilityTraits,
and first recalling super,
to get any traits we
would have by default.
Then we use the bitwise
or operator to combine
that with a button trait.
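Here's a sketch of the PreviewLabel side of that roadmap; the protocol and method names are stand-ins for the sample project's own:

```swift
import UIKit

// Sketch of the PreviewLabel changes described above.
protocol PreviewLabelDelegate: AnyObject {
    func didActivatePreviewLabel(_ label: PreviewLabel)
}

class PreviewLabel: UILabel {
    weak var delegate: PreviewLabelDelegate?

    // A VoiceOver double tap lands here instead of as a plain tap.
    override func accessibilityActivate() -> Bool {
        delegate?.didActivatePreviewLabel(self)
        return true   // the action succeeded
    }

    // Advertise that the label behaves like a button,
    // on top of whatever traits it has by default.
    override var accessibilityTraits: UIAccessibilityTraits {
        get { return super.accessibilityTraits.union(.button) }
        set { super.accessibilityTraits = newValue }
    }
}
```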
Okay. Now, let's go back to
our Profile View Controller,
to implement the
delegate protocol.
So first, I'm going to scroll
to the top of the file,
and declare that this
view controller implements
that protocol.
Next, I'm going to open the
drop down and find where we add
that Preview Label to the view.
Here we set the View
Controller as its delegate.
And finally, let's
implement that protocol.
I'm going to scroll
to the bottom.
And here we've implemented the
only delegate protocol method
we have.
So when the Preview
Label is activated,
we'll first check whether the
preview is currently showing.
If it is, we can hide it.
Otherwise, we reveal it.
One additional thing.
Recall that when you
hide or show the preview,
you're changing the
layout of the screen.
So we should tell VoiceOver
that that's happened.
To do that, let's go
to the implementation
of reveal card and dismiss card.
So I'm going to click on
the drop down of the top
and find those two methods.
Both of these methods
are animating the card
on or off screen.
In the completion
for that animation,
we post the layout
change notification.
Let's do the same
thing down here
when dismissing the preview.
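A sketch of what that completion block looks like; the reveal animation itself is a placeholder, the notification posting is the point:

```swift
import UIKit

// Sketch of posting the layout change notification once the reveal
// animation finishes. The card view and duration are placeholders.
func revealCard(_ cardView: UIView, in container: UIView) {
    UIView.animate(withDuration: 0.3, animations: {
        cardView.frame.origin.y = container.bounds.height - cardView.frame.height
    }, completion: { _ in
        // Tell VoiceOver the layout changed so it re-checks which
        // elements are currently on screen.
        UIAccessibility.post(notification: .layoutChanged, argument: nil)
    })
}
```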
Okay.
>> Now let's build
and run our app
with these changes
[beeping sound].
>> HelloGoodbye.
>> Double tap to open.
>> [beeping sound].
>> Extras folder.
Two apps. Double tap to open.
HelloGoodbye.
>> Okay, so here we are in
the app with our changes.
First I'm going to tap
on the profile button.
>> Profile button.
>> And I'm going to
double tap to activate.
>> Profile, Profile,
Back, Back button.
>> First, let's touch
on the age slider.
>> 37. Adjustable.
Swipe up or down with one
finger to adjust the value.
>> So notice how
that's much clearer.
We're actually seeing the
age now, and when I swipe
up to adjust the value.
>> 38.
>> It goes up by a single year.
When I swipe to the right --
>> Hobbies.
>> Notice how we've skipped
over that redundant
label to the right.
Now, I'm going to touch
on the preview tab.
>> Preview button.
>> So this time, because
it says it's a button,
I know I can do something
with this.
So I'm going to double tap.
>> Preview.
>> And notice how
the preview appears.
So when I swipe to the right.
>> Profile photo.
Image.
>> I can see stuff
inside the preview.
If I tap on preview again --
>> Preview button.
>> And double tap.
>> Preview.
>> That hides the preview, so
when I swipe to the right now,
I hear that sound,
VoiceOver knows
that the preview
has gone off screen.
So that's how we can make
even custom UI accessible
to VoiceOver.
To resolve the last remaining
issues we encountered during our
audit, we're going to use some
new API we've added in iOS 8.
First, let's talk about the
UIAccessibilityContainer
protocol.
Some of you may already
realize that that isn't new,
but we've updated it in iOS 8
to make it much easier to use.
For those who aren't
familiar with this protocol,
UIAccessibilityContainer
is a way to return a list
of accessibility
elements from a view.
You'll need to do this
if you have elements
that don't correspond to views.
For example, you might have
a view with a drawing in it,
and you might want
different parts of the drawing
to be separate accessibility
elements.
In that case, you would create
an accessibility element
for each part of the drawing,
and return those elements using
the UIAccessibilityContainer
protocol.
And you can see last year's
talk, for a great example of how
to create your own elements.
But what many people
don't realize is
that this protocol
is also a great way
to return existing elements
in a specific order.
So for example, if
VoiceOver is navigating things
in a different order
from what we want,
we can use this protocol
to return those elements
in the order that we want
VoiceOver to see them.
And now for what's changed.
So currently, in order
to use this protocol,
there are three methods
you need to override.
In iOS 8, there's a
single property called
accessibilityElements, which
you can set to return a list
of those elements, and
notice how now you don't need
to override or subclass, and
you don't have to deal
with three separate methods.
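As a sketch, ordering the profile card's existing subviews for VoiceOver might look like this; the view names are assumptions based on the card described in this talk:

```swift
import UIKit

// Sketch of the iOS 8 accessibilityElements property: hand VoiceOver the
// existing subviews in the order they should be visited.
func orderCardElements(photo: UIImageView,
                       ageTitle: UILabel, ageValue: UILabel,
                       hobbiesTitle: UILabel, hobbiesValue: UILabel,
                       on cardView: UIView) {
    cardView.accessibilityElements = [photo, ageTitle, ageValue,
                                      hobbiesTitle, hobbiesValue]
}
```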
So we think people are going to
find that much easier to use.
[ Applause ]
Thank you.
Okay next up we're
introducing a new class called
the CustomAction.
You should use CustomActions
when you have more
than one action for an element.
For example, in the
app switcher,
each of the app thumbnails
can either be tapped
to launch the app, or you can
slide it up to close the app.
And so we expose a CustomAction
for closing the app,
and that way VoiceOver users
can do that action easily
without needing to do
the sliding gesture.
In our sample app, we
can also use this API
to allow VoiceOver
users to say hello
or goodbye to their matches.
The custom action
class looks like this.
You create a custom
action object with a name
that VoiceOver uses
to describe it.
And you also give it a
target and a selector,
so similar to what you
would pass to a UI button.
And that determines what happens
when the action is performed.
Once you've created all of
your CustomAction objects,
you can assign them
to a view using the
accessibilityCustomActions
array property.
So now, let's use this in our
app to fix the remaining issues.
First, what we're going
to do is make sure
that VoiceOver navigates
the views
in the profile card
in the correct order.
Namely, we don't want
it to go from age
to hobbies and back to 32.
So to fix that, let's click on
the card view class on the left.
Here's our method where we
add all the profile views.
When I scroll to the bottom,
after all of our subviews
have been created,
we can set the
accessibilityElements property
of our card view to an
array of its subviews
in the order you
want them to appear.
Notice how I put the age title
right next to the age value.
Whoops. Okay.
Next let's make sure that
VoiceOver users can say hello
or goodbye to their matches.
To do that, I'm going to click
on the Matches View
Controller on the left.
First, let's find out what we're
doing when a user swipes up
or down without VoiceOver
enabled.
So I'm going to click
on the drop down
and find our swipe gesture
recognizer handlers.
So both of these handlers
are simply calling
in to methods called say
hello and say goodbye,
which are implemented
right above.
These methods are also great
candidates for the selector
that we pass to the
CustomAction objects.
The only thing we'll need
to change is make sure
that they return Booleans
to indicate whether
the action succeeded.
In most cases, you
can just return yes.
Oh, one more thing.
When you say hello or goodbye
to a match, the layout changes.
So again, we're going to post
a layout change notification.
Both of these methods
are calling
in to a helper called
animateCardsForHello,
which is implemented
right above.
So in the completion
for this animation,
we should post our
layoutChangedNotification.
Okay. So that's all
the setup we need.
Now, let's create those
CustomAction objects.
I'm going to click on the
drop down at the top and find
where we add the card
to the Matches page.
So here's where we're
creating the card,
and we're assigning swipe
gesture recognizers to it.
Alongside those swipe
gesture recognizers,
let's now create
our CustomActions.
First, I'm going to create
an action for saying hello.
So here's how that looks.
We give it a name, which in
this case is simply say hello.
We pass in the View Controller
as a target, and as a selector,
we pass in the method
that we just modified
to return a Boolean.
We do the same thing
for saying goodbye.
And now, we want the user
to be able to say hello
or goodbye wherever they
are in the profile card.
So for every element
in the card view,
we're going to set its
accessibilityCustomActions
property to an array
of these two actions.
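Putting the whole pattern together, a sketch of the Matches side might look like this; the method and property names are stand-ins for the sample project's own:

```swift
import UIKit

// Sketch of the custom actions described above.
class MatchesViewController: UIViewController {

    let cardView = UIView()

    // Also the targets of the swipe gesture recognizers; they return a Bool
    // so they can double as custom action selectors.
    @objc func sayHello() -> Bool {
        animateCardsForHello(true)
        return true
    }

    @objc func sayGoodbye() -> Bool {
        animateCardsForHello(false)
        return true
    }

    private func animateCardsForHello(_ hello: Bool) {
        UIView.animate(withDuration: 0.3, animations: {
            self.cardView.alpha = 0   // placeholder for the real card animation
        }, completion: { _ in
            // The layout changed, so let VoiceOver know.
            UIAccessibility.post(notification: .layoutChanged, argument: nil)
        })
    }

    func installCustomActions() {
        let hello = UIAccessibilityCustomAction(
            name: NSLocalizedString("Say hello", comment: ""),
            target: self, selector: #selector(sayHello))
        let goodbye = UIAccessibilityCustomAction(
            name: NSLocalizedString("Say goodbye", comment: ""),
            target: self, selector: #selector(sayGoodbye))
        // Every element in the card offers both actions, so the user can say
        // hello or goodbye from wherever the VoiceOver cursor happens to be.
        for element in cardView.accessibilityElements ?? [] {
            (element as? NSObject)?.accessibilityCustomActions = [hello, goodbye]
        }
    }
}
```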
Okay. Now, let's build and
run with these changes.
[ Background Sounds ]
>> Double tap to open.
HelloGoodbye [clicking sound].
HelloGoodbye.
>> Okay. Let's tap on Matches.
>> Matches button.
>> And double tap to activate.
>> [Beeping sound] Matches.
Matches. Back.
Back button.
>> So the first thing I'm going
to do is tap on the photo,
and start swiping to the right
to see what order VoiceOver
sees these elements.
>> Profile photo.
Image. Age.
32. Hobbies.
Cooking. Bubble tea
with friends.
Travel.
>> So that's much
easier to understand.
Next, let's touch any
element in the profile card.
In this case, I'm going
to touch on the photo.
And listen to what it says.
>> Profile photo.
Image. Swipe up or down
to select a CustomAction.
>> So we're given the
option to choose one
of the actions we created.
When I swipe down --
>> Say hello.
>> I hear the description
of the action.
If I want to take that action, I
can double tap, [clicking sound]
and so now VoiceOver
users can also say hello
or goodbye to their matches.
And that's how easy it is
to use the new API
we've added in iOS 8.
[ Applause ]
>> So we've come to
the end of our talk,
and there are a few
takeaways I'd like you
to have when you leave.
First, iOS users are diverse.
Keep in mind that there
may be users of your app
who have different needs,
which you should accommodate.
When you accommodate more users,
you're widening your user base.
When more people can use
your app, more people will.
And finally,
adding accessibility is a low
effort, high reward business.
We've just seen how, with only a
few code changes, we took an app
that was just unusable
for many users,
and made it into an experience
that they, too, can enjoy.
So without needing
a lot of work,
you can make a real
difference to users.
For more information, you
can contact our Evangelist,
Jake Behrens, or you can consult
the Accessibility Programming
Guide for iOS.
There are also the
developer forums,
where you can ask other
developers your questions.
There are several related
sessions to this one.
Prior to this talk,
we had a talk
about accessibility on OS X.
If you missed it, you
can watch the video.
Tomorrow there will be a talk
about designing apps for people
on the autistic spectrum.
And on Friday, you can learn how
to make even complex web
applications accessible.
Thank you all for coming.
I hope you enjoyed the talk.
And have a great rest
of the conference.
[ Applause ]