Transcript
[ Music ]
[ Applause ]
>> Good afternoon.
Welcome to Introducing
PencilKit.
I'm Will Thimbleby, and I want
to talk about this fantastic
little device.
This is the Apple Pencil and it
can truly transform your user's
experience on iPad.
In fact, it's the most distinctive aspect of iPad.
From preschoolers to Isaac
Newton, from your To-Do list to
fine art, the Pencil is often
where it starts.
It's available on the fantastic
iPad Pros and across our entire
lineup of iPads, from the
smallest Mini to the largest
Pro.
It's fantastic for photo
retouching, annotating,
doodling, in fact, anything you
want to do that requires
precision.
In iOS 13, we've made great strides in latency, and we've added a fantastic new tool palette.
We're introducing PencilKit to make it that much easier for you to add Pencil support to your apps.
And we're also introducing Markup Everywhere to allow users to annotate and mark up your app's content even if your app doesn't do anything with Pencil.
We're going to talk about what makes a great Pencil experience, how PencilKit can help you build one, and finally, how Markup Everywhere lets you provide your content so that users can mark it up.
What makes a great Pencil
experience?
Well, a great Pencil experience
is one that fully uses
everything the Apple Pencil has
to offer.
And that means fully using the
precision the Apple Pencil has
to offer.
It lets you touch a pixel, and it gives you that information 240 times a second.
It's uniquely expressive, giving you force, azimuth, and altitude, allowing you to create really expressive marks that bring your apps to life.
And finally, the second-generation Apple Pencil has Pencil taps, which allow the user to change modes in your app without ever putting the Pencil down.
There are now three different
types of Pencil.
The first-generation Apple
Pencil, the second-generation,
and the Logitech Crayon.
These all have slightly
different capabilities.
They all have that same great
precision.
They all have the same azimuth
and altitude that allow you to
create your expressive marks.
The first-generation Pencil adds
force to that.
And the second-generation Pencil
adds tap gestures.
I'm going to talk about a few of
the more complicated aspects of
supporting Apple Pencil well.
But if you're just starting out and want to build a custom drawing experience, I especially recommend ''Leveraging Touch Input on iOS'' from WWDC 2016.
Understanding how a Pencil works
is key to understanding what it
does.
And so, I'm going to take you a
little bit behind the scenes and
tell you how the Pencil
functions.
The Pencil creates a uniquely precise touch location on the screen and gives you that data at 240 hertz.
As you know, it gives you
azimuth, which is the angle
around the perpendicular of the
iPad.
And it gives you altitude or
tilt, which is the steepness at
which you're holding the Pencil.
What you may not know is how it
does this.
The Pencil generates a second touchpoint on the surface of the iPad and, using trigonometry, calculates azimuth and altitude from it.
Finally, the Pencil has an axial
force sensor that detects the
pressure and it sends that data
over Bluetooth.
There are a few consequences of
this.
The first is that azimuth and altitude may be estimated when that second touchpoint is obscured, perhaps by a finger or the edge of the screen.
Azimuth is ambiguous when the Pencil is nearly perpendicular to the iPad, because the two touchpoints nearly coincide.
And finally, the force data, which comes via a different mechanism, is delayed relative to the touch-location data.
Handling these more complicated
aspects of the Pencil is key to
building a great Pencil
experience.
As I draw in from the edge of the screen, I'm getting estimated azimuth and altitude.
As I continue to draw onto the screen and the second touchpoint lands on the surface, you get the correct values.
Now, instead of leaving it looking something like this, you should back-fill those correct values to replace the estimated ones.
Second, as you draw, there's a region behind the Pencil that is using estimated force.
You should be continuously listening for force updates so you can draw with the correct values.
This remains true even after the
Pencil is lifted off the surface
of the iPad.
There's a region of the stroke
that's still waiting for those
final force values.
You need to keep listening for them even after the touch has ended.
Now, one aspect of this is that the user can start drawing the next stroke before the last stroke has received all of its final values.
I'd recommend using a serial queue so that you handle only one stroke at a time.
The delay is short enough that the user won't notice, but you want to handle the data correctly.
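To make that back-filling concrete, here is a minimal sketch using UIKit's touch-estimation APIs; the StrokePoint model and the rendering are hypothetical stand-ins for your own drawing code, and it only handles one touch per event for brevity.

    import UIKit

    // Hypothetical stroke model for this sketch.
    struct StrokePoint {
        var location: CGPoint
        var force: CGFloat
        var azimuth: CGFloat
        var altitude: CGFloat
    }

    class StrokeCanvasView: UIView {
        private var points: [StrokePoint] = []
        // Points still waiting for final values, keyed by estimation update index.
        private var pendingPoints: [NSNumber: Int] = [:]

        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first else { return }
            points.append(StrokePoint(location: touch.location(in: self),
                                      force: touch.force,
                                      azimuth: touch.azimuthAngle(in: self),
                                      altitude: touch.altitudeAngle))
            // Remember any point that still expects updated, non-estimated values.
            if !touch.estimatedPropertiesExpectingUpdates.isEmpty,
               let index = touch.estimationUpdateIndex {
                pendingPoints[index] = points.count - 1
            }
        }

        // Called later (possibly after touchesEnded) with the final values.
        override func touchesEstimatedPropertiesUpdated(_ touches: Set<UITouch>) {
            for touch in touches {
                guard let index = touch.estimationUpdateIndex,
                      let i = pendingPoints[index] else { continue }
                points[i].force = touch.force
                points[i].azimuth = touch.azimuthAngle(in: self)
                points[i].altitude = touch.altitudeAngle
                if touch.estimatedPropertiesExpectingUpdates.isEmpty {
                    // No more updates are coming for this point.
                    pendingPoints.removeValue(forKey: index)
                }
            }
            setNeedsDisplay()
        }
    }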
Another part of handling Pencil
well is latency.
With physical pencils, the
pencil is intrinsically tied to
the line that's being drawn.
On a digital device, there's
often a gap between where the
implement is and where the line
on screen is.
Keeping this as small as
possible is critical to
maintaining that feeling of
drawing on paper.
We care so much about latency at
Apple, that we have robots
testing it continuously.
This is a high-speed capture
from one of our tests.
It's at 800 frames per second
and you can see just how small
the distance between the pencil
and the line is.
Let me show you that in real
time.
Did you get it?
I'll show it to you again.
So, a few tips for providing the
best latency.
First, you need to be rendering
in Metal.
You only have a few milliseconds
each frame and you need to be
doing so consistently frame
after frame to provide good
latency.
In iOS 13, we've made great
strides improving prediction.
You should be using predicted
touches to reduce your latency
even further.
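A sketch of using predicted touches, together with the coalesced touches delivered between frames; addPoint(_:isPredicted:) is a hypothetical hook into your renderer.

    import UIKit

    class LowLatencyCanvasView: UIView {
        override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
            guard let touch = touches.first, let event = event else { return }
            // Render every real sample, including coalesced ones.
            for t in event.coalescedTouches(for: touch) ?? [touch] {
                addPoint(t.location(in: self), isPredicted: false)
            }
            // Render predicted samples to shrink the visible gap behind the
            // Pencil; replace them with real samples on the next frame.
            for t in event.predictedTouches(for: touch) ?? [] {
                addPoint(t.location(in: self), isPredicted: true)
            }
        }

        // Hypothetical renderer hook: predicted points are drawn temporarily
        // and discarded when the next real samples arrive.
        func addPoint(_ point: CGPoint, isPredicted: Bool) { /* ... */ }
    }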
Finally, if you're building a drawing app, for the absolute best latency you should avoid things like transparent Metal layers.
And you should avoid things like visual effect views with blurs and overlays on top of your Metal layers.
One aspect of this that sometimes gets overlooked is that the default navigation bar and, in fact, the home affordance can add extra rendering cost.
The final part of supporting the
Pencil is the Pencil Tap
gesture.
And this is a great way to let
the user switch modes without
having to put the Pencil down.
To do so, you should use
UIPencilInteraction, set
yourself as the delegate, and
when the user taps, you'll get
called back.
When you do, you should respect the user's preferred tap action, which is something the user will have chosen in Settings.
If you can, respect this choice.
If it doesn't make sense for your app, the Pencil tap should still only be used for nondestructive mode switching.
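A minimal sketch of that setup; the mode-switching helpers at the bottom are hypothetical placeholders for your app's own behavior.

    import UIKit

    class DrawingViewController: UIViewController, UIPencilInteractionDelegate {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Install the Pencil tap interaction and set ourselves as delegate.
            let interaction = UIPencilInteraction()
            interaction.delegate = self
            view.addInteraction(interaction)
        }

        func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
            // Respect the action the user chose in Settings.
            switch UIPencilInteraction.preferredTapAction {
            case .switchEraser:     toggleEraser()
            case .switchPrevious:   switchToPreviousTool()
            case .showColorPalette: showColorPalette()
            case .ignore:           break
            @unknown default:       break
            }
        }

        // Hypothetical mode-switching helpers for this sketch.
        func toggleEraser() {}
        func switchToPreviousTool() {}
        func showColorPalette() {}
    }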
So, up until today, building a
great Pencil experience was a
lot of work.
We provide some great APIs and
fantastic hardware and you have
created some of the best drawing
experiences in the world.
I've highlighted a few of the
more complicated aspects of
supporting Apple Pencil well.
And if you haven't already,
consider these as polish.
But for those of you who are
just getting started, we'd love
to make your lives a whole lot
easier today.
So, I'm very happy to introduce
PencilKit.
[ Applause ]
PencilKit is the framework that
we use across our entire
operating system.
We use it in Notes for providing
low latency drawing and
notetaking.
We use it in Pages for marking
up documents.
And we use it in Markup
Everywhere for annotating
screenshots and PDFs of your
app's content.
We gave it to a few developers, and Pinterest and Canvas added features to their apps in very little time.
You can add it to your app in
just three lines of code.
Here, we create a canvas, add it
to your view hierarchy, and
choose an ink.
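Those three lines look something like this sketch, assuming you're in a view controller's viewDidLoad:

    import PencilKit

    class CanvasViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            // Create a canvas, add it to the view hierarchy, and choose an ink.
            let canvasView = PKCanvasView(frame: view.bounds)
            view.addSubview(canvasView)
            canvasView.tool = PKInkingTool(.pen, color: .black, width: 30)
        }
    }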
And with that, you get the same industry-leading low latency that we have across our operating system.
The same great expressive inks
that we've spent hours
perfecting.
And the same fantastic UI and
tool palette.
With a few more lines of code,
you can create something a
little bit more comprehensive
like our sample app.
I'd love to show that to you,
now.
So, here we have our sample app.
It's a little drawing app.
It's got a few thumbnails of
things I've been working on
earlier.
And to give you an idea of the
breadth of what PencilKit can
do, I'll just show you some of
these.
Here are some notes that have been taken, and we can scroll through these beautiful notes.
And here's a beautiful flower my
friend Andy drew.
This gives me an opportunity to
also show how PencilKit reacts
to Dark Mode.
If I bring down Control Center,
I can switch into Dark, and the
flower that looked great in
Light looks just as stunning in
Dark Mode.
As I come out, the thumbnails are rerendered and you can see that the notes are just as legible.
But luckily for me, I get to do
what every engineer dreams of
and that is drawing on stage.
So, I'm going to switch back to
Light Mode and I'm going to
continue drawing a logo I was
drawing earlier.
At the bottom here, you can see
our great new tool palette UI.
And with a single finger, I can
drag it around the screen to
where it's comfortable.
For now, I'm going to leave it
at the bottom of the screen.
To finish this, I'm going to use
the Ruler and marker to add some
color.
So, I can tap the Ruler to bring
the Ruler in.
And with two fingers, I can
position the Ruler.
I can draw a straight line along
the Ruler but I can also use the
Ruler to mask.
So, I'll do that.
Let me add some more color.
I'll tap the Ruler to dismiss
it.
And I'll add some color in at
the bottom.
Now, one of the most exciting things PencilKit is doing is starting to bring together the worlds of bitmap and vector, object and pixel.
And one of the areas you see
there is the Eraser.
So, I'm going to use the
double-tap gesture on the Apple
Pencil to change the Eraser.
Just like that, I've switched
modes.
And this is the Pixel Eraser
which lets me draw around and
cut out parts of this logo.
Now, if I tap the Eraser, I
switch to the Object Eraser.
And using Object Eraser, I can
draw around this and I can just
delete the bits that I've cut
out.
Just like that, I've used vector
and bitmap operations to create
a drawing.
Let me show you that again.
I'll tap. I'll get the Pixel
Eraser.
I'll take a cutout.
And this time, I'll use the
Lasso Tool, which is next to the
Eraser.
I can move that where I want, or
I can tap on it and delete it.
Now, that I've created my work
of art, I really ought to sign
this.
Luckily, our sample app has a
signature feature.
And up here in the top right, I'm going to tap Signature, and you'll notice as I do so, the tool palette goes away and we have a custom picker up here which lets me pick just a black or blue ink.
For now, I'm going to stick with
black.
I'm going to add my signature.
Now that I've done that, I'm
going to tap to sign my drawing.
There.
I think that's a great place to
leave it.
I think I could frame that.
That's cool.
[ Applause ]
So let's talk about the
architecture of PencilKit.
The main thing you'll be using
is the PKCanvasView.
This provides the drawable
region for your app.
PKDrawing is the data model.
It captures all those beautiful
strokes.
PKToolPicker provides the UI
that floats around the screen.
And PKTools are the tools that
provide those inks and
interactions that happen on your
canvas.
PKCanvasView is a UIScrollView subclass that lets you pan and zoom.
It lets you choose what the user's interaction does by setting the tool.
And it lets you get and set the data model using the drawing property.
PKDrawing is the data model of
PencilKit.
And this is the one piece of
PencilKit that is available on
macOS.
It has a data format and it
allows you to load and store
drawings to data.
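A sketch of loading and storing, assuming a fileURL of your choosing:

    import PencilKit

    func save(_ drawing: PKDrawing, to fileURL: URL) throws {
        // Serialize the drawing's data format to disk.
        try drawing.dataRepresentation().write(to: fileURL)
    }

    func loadDrawing(from fileURL: URL) throws -> PKDrawing {
        // Reconstruct a drawing from previously saved data.
        try PKDrawing(data: Data(contentsOf: fileURL))
    }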
You can use these drawings to
generate images for sharing or
thumbnails.
Let's take a look at how the
sample app generates thumbnails.
Because all of these types are value types, we can safely do this work on a background queue.
Because we want to generate those thumbnails in Light or Dark Mode, depending on which mode the app is in, we can use UITraitCollection's performAsCurrent.
We use the drawing to generate
that image.
And then, finally, we can set
that image back on the main
thread.
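A sketch of that flow; the queue, rect, and scale values here are illustrative rather than taken from the sample app.

    import PencilKit

    func generateThumbnail(for drawing: PKDrawing,
                           rect: CGRect,
                           traits: UITraitCollection,
                           completion: @escaping (UIImage) -> Void) {
        // PKDrawing is a value type, so rendering off the main queue is safe.
        DispatchQueue.global(qos: .userInitiated).async {
            var image = UIImage()
            // Render with the app's current Light or Dark appearance.
            traits.performAsCurrent {
                image = drawing.image(from: rect, scale: 2)
            }
            // Hand the finished image back on the main thread.
            DispatchQueue.main.async { completion(image) }
        }
    }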
So, now I'd love to hand over to
Jenny who'll talk more about the
fantastic tools and Tool Picker
and the great things that
PencilKit can do.
[ Applause ]
>> Thanks, Will.
Hi, I'm Jenny.
Let's continue our tour through PencilKit by walking through some of the great tools that PencilKit offers you.
These tools are located in a
brand-new Tool Picker.
It's dynamic.
It floats above everything.
I can drag it from edge to edge
or even dock it to the bottom to
really let it get out of my way.
As Will mentioned, these tools are PKTool types.
And the marking tools are PKInkingTool types.
You can specify one of three ink types: pen, marker, or pencil.
Each of these tools is super dynamic and expressive.
And you can see how, even within a single stroke, the width and opacity change based on different Pencil properties like force, azimuth, altitude, or velocity.
You can set this tool on the canvasView to choose which ink the canvas uses.
If you set the canvasView as an observer of the Tool Picker, it will set the ink on the canvasView for you under the hood.
However, if you have something like the signature pane and you don't want the Tool Picker, you'll set the tool in your application yourself.
For a PKInkingTool, you'll specify the type (pen, marker, or pencil), the color, and the width.
For the width value, each ink type has a default width.
However, as we saw before, this width isn't a fixed value; it changes based on different Pencil properties.
Rather, the width represents a base value for average Pencil characteristics.
You can also query the valid
width range for each of those
ink types.
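A sketch of querying those values, assuming the defaultWidth and validWidthRange properties on PKInkingTool.InkType described above; whether you clamp a custom width like this is up to your app.

    import PencilKit

    let inkType = PKInkingTool.InkType.pencil
    let baseWidth = inkType.defaultWidth        // base width for average use
    let widthRange = inkType.validWidthRange    // valid widths for this ink type

    // Clamp a custom width into the valid range before creating the tool.
    let width = min(max(baseWidth * 2, widthRange.lowerBound),
                    widthRange.upperBound)
    let tool = PKInkingTool(inkType, color: .systemBlue, width: width)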
You can see this here as I use
the pencil tool.
As I hold the pencil more
vertically, the pencil stroke is
thinner.
But as I hold it more
horizontally, the stroke is
actually thicker.
As I change the thickness in the
Tool Picker, the thickness
scales accordingly.
We also have PKEraserTool, for which you can specify either a vector or bitmap type, where vector corresponds to objects and bitmap corresponds to pixels.
We've actually worked really hard to unify those two worlds of vector and bitmap.
Instead of just erasing the pixels on the screen, we also slice through those strokes so that you can separate them out or object-erase them later.
For selection, we also have PKLassoTool.
With the Lasso Tool, any strokes that you intersect are selected, and then you can drag them around; cut, copy, and paste them; or even drag and drop them to different applications.
New to iOS 13, we also have a great Ruler tool.
And it's important to note that the Ruler is not a PKTool.
Rather, it's a property that you toggle on the canvas to show or hide the Ruler.
You can either draw against it to snap straight lines, or you can mask against it, like Will did with the water and the grass in the apple.
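A quick sketch of selecting these tools, assuming the canvasView from earlier:

    import PencilKit

    // Pixel erasing slices through strokes; object erasing removes whole strokes.
    canvasView.tool = PKEraserTool(.bitmap)
    canvasView.tool = PKEraserTool(.vector)

    // Lasso selection.
    canvasView.tool = PKLassoTool()

    // The Ruler is a canvas property, not a PKTool.
    canvasView.isRulerActive = true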
Now, that we've gone through
some of the amazing tools, let's
look at the PKToolPicker and how
we can get it on the screen.
An important thing to note with
the Tool Picker is that it's not
a view.
Rather, it's an object that
shows or hides the view and it's
separate from the Canvas View.
It's also important to note that it floats above everything, and it's very similar to the keyboard in that its visibility is based on first responders.
So, let's walk through some code
and see how we can do that.
First, we'll ask for the shared
Tool Picker for the window.
We'll add the Canvas View as an
observer.
And by doing so, whenever you
change the tool in the Tool
Picker, you'll also change the
tool on the Canvas View.
The Tool Picker also has a list of responders.
If you set visible to true and your object becomes first responder, the palette will show up.
If you set visible to false, your object is removed from that list and the palette will hide.
And so, when the Canvas View becomes first responder, we want the palette to show, so we'll set visible to true.
Finally, we'll make the Canvas
View becomeFirstResponder so the
palette will show up.
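Put together, that setup looks something like this sketch, run once the view is in a window (for example, in viewDidAppear):

    import PencilKit

    if let window = view.window,
       let toolPicker = PKToolPicker.shared(for: window) {
        // Keep the canvas's tool in sync with the picker.
        toolPicker.addObserver(canvasView)
        // Show the palette whenever the canvas is first responder.
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        canvasView.becomeFirstResponder()
    }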
You can see this in our sample
app.
Once we set the Canvas View to
becomeFirstResponder, the
palette will be visible.
However, in our sample app, we also have a case with the signature pane where we only want to offer blue or black ink, and we don't want the palette to be shown.
To handle that, we make the signature's Canvas View becomeFirstResponder, which then makes the Tool Picker go away.
When you dismiss that Signature View Controller, it automatically resigns first responder under the hood, which causes the tool palette to show up again.
One of the other things you'll want to keep in mind with this responder-based visibility is that you may already have objects in your application that take first responder, like, for example, the Edit menu.
And you'll basically want the palette to show even when your Edit menu is up.
To do this, you'll simply set visible to true for your object so that the palette stays visible.
Another thing you'll want to
consider with the Tool Picker is
regular versus compact size
classes.
You'll notice in the regular
size class that it floats above
everything, you can move it
around.
However, in the compact size
class, it's actually fixed and
docked to the bottom.
And so, what does that mean in
your application?
Well, let's say you have a full-screen app, where the photo spans mostly edge to edge.
Maybe the Tool Picker is obscuring some of the photo in the regular size class, but you can just move it out of the way, so it's fine.
However, in the compact size
class, it actually obscures kind
of the most interesting part of
this photo.
And so, what you'll need to do
in the compact size class is to
make sure to adjust your view's
frame or your scroll view insets
to account for the obscured
frame from the Tool Picker.
You can do this by just listening for the Tool Picker's frame changes with the observer method toolPickerFramesObscuredDidChange.
You'll get this callback whenever the picker moves between floating and docked.
At that point, you can adjust your content accordingly by asking for the frame obscured in your view.
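A sketch of that observer method, assuming a view controller with a canvasView property that was registered as an observer of the picker:

    import PencilKit

    extension CanvasViewController: PKToolPickerObserver {
        func toolPickerFramesObscuredDidChange(_ toolPicker: PKToolPicker) {
            let obscured = toolPicker.frameObscured(in: view)
            if obscured.isNull {
                // Floating: nothing is covered.
                canvasView.contentInset = .zero
            } else {
                // Docked: keep scrollable content above the palette.
                canvasView.contentInset.bottom = view.bounds.maxY - obscured.minY
            }
            canvasView.scrollIndicatorInsets = canvasView.contentInset
        }
    }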
Another thing you'll want to
consider with the Tool Picker
are your undo and redo buttons.
You'll notice that in the regular size class, the undo and redo buttons are baked into the palette and provided for you.
However, in the compact size class, they're not in the palette at all.
So, you'll need to make sure that in the compact size class you show your own undo and redo buttons.
Now that we've walked through
how you can get a basic Canvas
View and Tool Picker on the
screen, let's walk through some
more of the advanced behaviors
that you can have in PencilKit,
starting with some of the Canvas
View delegates.
You might want to update your
app based on what the user is
drawing.
You can do so by listening for the pencil or touch down, in which case you'll get a canvasViewDidBeginUsingTool callback.
On pencil or touch up, you'll get a canvasViewDidEndUsingTool callback.
However, at this point your drawing is not yet fully updated because, as Will mentioned, it's not until those final force values come in that you get a final canvasViewDrawingDidChange.
Only at this point are you guaranteed to have a final, finished drawing.
And so, at this point, you can query the drawing from the canvas and update your model objects, generate thumbnails, or save if necessary.
You might also want to load a drawing into your Canvas View.
You can do so by setting the drawing property.
At that point, we'll start loading in the tiles.
However, it's not until those tiles are done loading that you'll get a canvasViewDidFinishRendering callback.
You'll also get this callback after scrolling or zooming.
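Taken together, those callbacks look something like this sketch; DrawingViewController and its save(_:) method are hypothetical stand-ins for your own controller and model layer.

    import PencilKit

    extension DrawingViewController: PKCanvasViewDelegate {
        func canvasViewDidBeginUsingTool(_ canvasView: PKCanvasView) {
            // Pencil or finger went down.
        }

        func canvasViewDidEndUsingTool(_ canvasView: PKCanvasView) {
            // Pencil or finger lifted; delayed force values may still arrive.
        }

        func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
            // The drawing now reflects all final values: safe to update
            // model objects, generate thumbnails, or save.
            save(canvasView.drawing)
        }

        func canvasViewDidFinishRendering(_ canvasView: PKCanvasView) {
            // All tiles finished loading after setting the drawing,
            // scrolling, or zooming.
        }
    }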
Speaking of scrolling, we not only let you draw with your Pencil, but we also let you draw with your finger.
And since PKCanvasView is a scroll view, that means one finger draws and two fingers scroll.
This is toggled via the
allowsFingerDrawing property on
the Canvas View.
When this is set to true, one
finger and pencil draw while two
fingers scroll.
Now, this is the default
behavior on the Canvas View.
And if this is not what you
want, you can actually set this
property to false.
In which case, only Pencil will
draw and one finger will scroll.
However, you should keep in mind
contexts like iPhone where
Pencil is not available.
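In code, that toggle might look like this, assuming the canvasView from earlier:

    // Default: finger and Pencil both draw; two fingers scroll.
    canvasView.allowsFingerDrawing = true

    // Pencil-only drawing; one finger scrolls. Keep contexts like iPhone,
    // where Pencil isn't available, in mind before turning this off.
    canvasView.allowsFingerDrawing = false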
If you have some more complex
interactions in your app, we've
also exposed the
drawingGestureRecognizer for
you.
With that, you can set up gesture recognizer exclusions or failure requirements.
There's a great talk from WWDC 2017, ''Modern User Interaction on iOS''.
As Will mentioned, we also use PencilKit across the system, including Screenshots and Markup.
And in these contexts, you can draw over content.
your app as well, by setting the
opaque flag to false and setting
the background color to clear.
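For example, a minimal sketch with an existing canvasView layered above your content:

    // Let the content underneath show through the canvas.
    canvasView.isOpaque = false
    canvasView.backgroundColor = .clear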
In iOS 13, we also introduced Dark Mode, which is a fantastic way to see your content in a completely different way.
Your PencilKit canvases also
still look amazing as the colors
dynamically adjust to maintain
legibility.
You can see how I originally
wrote this note with black ink
over a white background.
But in Dark Mode, it changes to
mostly white ink over a black
background.
Not only do my notes still
maintain legibility, but they
also still look fantastic.
By default, your canvases will
also dynamically adjust their
colors if they're in Dark Mode.
However, if this is not what you
want, you can set the
overrideUserInterfaceStyle to
always be light.
You'll especially want to do this if the content you're marking up doesn't change, like when you're marking up over an image or a PDF.
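That override is a single line on the canvas:

    // Keep ink colors fixed when marking up static content like an image or PDF.
    canvasView.overrideUserInterfaceStyle = .light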
Now that we've walked through
PencilKit and shown you how you
can make an amazing drawing
experience in your application,
let's look at Markup Everywhere, a new feature which allows your application to pass any content to be marked up.
This is surfaced through a new API on Screenshots that lets you provide your full content whenever the user takes a screenshot of your app.
You can see this adopted in
Safari.
Here, I have apple.com.
I'm going to take a screenshot by using the new Pencil gesture, pulling in from the corner.
So, I have the screenshot I know and love, but I can't see the rest of the page.
I'll tap the full-page segment at the top, and now I have the full scrolling webpage to mark up and share.
[ Applause ]
Yeah.
[ Applause ]
You can also see this
implemented in our sample app.
Again, with the new Screenshot
gesture, I'll tap that full-page
segment and now I have the whole
note for me to send off.
You can also see this in interesting use cases like Maps.
Again, using the new Screenshot
gesture, I'm going to take a
screenshot.
But you can see my screenshot is
kind of covered up by the
chrome.
When I tap the full-page segment
at the top, I'll be able to see
my map without any of the chrome
in the way.
I can see the roads and
restaurants underneath.
You can easily adopt this in
your application with only a few
lines of code.
You'll start by setting yourself
as the delegate of the
UIScreenshotService on
UIWindowScene.
UIWindowScene is new to UIKit this year.
And you can learn more by
referencing the ''Introducing
Multiple Windows on iPad'' talk.
Once you've set yourself as the delegate, you'll deliver the full content, which is expressed as PDF data.
You might already have this information generated for actions like sharing or printing.
But in case you don't, there's a great talk from WWDC 2017, ''Introducing PDFKit on iOS''.
So, once you have that PDF data, you'll implement the delegate method screenshotService(_:generatePDFRepresentationWithCompletion:).
You'll pass that PDF data to us in the completion handler, along with two other pieces of metadata, which help us ensure a smooth, seamless transition when you go from screenshot to full page.
The first piece of metadata is
the indexOfCurrentPage.
This is useful in cases like
Keynote.
Let's say I take a screenshot of
slide seven.
When I switch to the full-page
segment, it'll automatically
jump me to page seven.
The second piece of metadata is
the rectInCurrentPage.
This is useful in cases like
Safari.
Here, I have this long scrolling
page.
And I'm going to scroll to the
bottom because I'm really
feeling excited about the new
iPad Pro.
So, I'll take a screenshot at the bottom there.
Now, when I tap the full-page segment at the top, instead of awkwardly jumping me to the top, it'll actually take me to the same rect where I took the screenshot.
The one important thing to note about this rect is that we expect it in PDF coordinates.
And so, what does this mean?
In view coordinates, the origin is at the top left.
In PDF coordinates, however, it's at the bottom left.
So, you'll need to make sure to do the appropriate coordinate transformation and send us that rect in the correct coordinate space.
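A sketch of the whole flow; makePDFData, currentPageIndex, visibleRectInCurrentPage, and pageHeight are hypothetical stand-ins for your own document model.

    import UIKit

    class DocumentViewController: UIViewController, UIScreenshotServiceDelegate {
        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // Become the delegate of the window scene's screenshot service.
            view.window?.windowScene?.screenshotService?.delegate = self
        }

        func screenshotService(_ screenshotService: UIScreenshotService,
                               generatePDFRepresentationWithCompletion completionHandler:
                               @escaping (Data?, Int, CGRect) -> Void) {
            let pdfData = makePDFData()
            let pageIndex = currentPageIndex
            var rect = visibleRectInCurrentPage
            // Flip from view coordinates (origin top left) to
            // PDF coordinates (origin bottom left).
            rect.origin.y = pageHeight - rect.maxY
            completionHandler(pdfData, pageIndex, rect)
        }

        // Hypothetical document-model stubs for this sketch.
        func makePDFData() -> Data { Data() }
        var currentPageIndex: Int { 0 }
        var visibleRectInCurrentPage: CGRect { .zero }
        var pageHeight: CGFloat { 792 }
    }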
And so, we've shown you how with
existing Pencil APIs, you can
build your own powerful custom
drawing engine.
But it will require a little bit
of elbow grease.
You'll need to listen for estimated touches and delayed force values, all while rendering quickly and asynchronously.
But now, with the new PencilKit
APIs, we've made it super easy
for you to integrate drawing
into your application.
And you'll get the same
expressive low-latency
experience that we have across
all of iOS.
Finally, you can also adopt other great Pencil APIs in UIKit, such as UIPencilInteraction to handle double taps on the new Apple Pencil.
Or the new UIScreenshotService
API so that you can deliver full
content to be marked up
everywhere.
For more information, you can reference the URL for this session.
Now, go off and build some amazing Pencil applications.
And we hope you have a great
WWDC 2019.
[ Applause ]