WWDC2018 Session 603

Transcript

[ Music ]
[ Applause ]
>> Good afternoon, everyone, and
welcome to Integrating Apps and
Content with AR Quick Look.
My name is David Lui, and I'm so
excited to talk to you about AR
Quick Look.
As you all saw yesterday at the
Keynote, we brought this new
capability of previewing 3D
models in AR space throughout
iOS.
AR is becoming the tool to see
what objects look like in the
world, from a new sofa you're
purchasing for your living room
to that cute animated character
you're reading about in your
book.
AR Quick Look was designed to enrich any application on the OS with AR content, providing a simple way to adopt AR previewing for a consistent viewing experience.
So, in this session, we'll take a deeper look at what AR Quick Look is, how to design your app and web page to take advantage of AR Quick Look, and how to create content specifically for it.
So, let's get started.
So, let me show you what AR Quick Look is with the Files app.
Let's preview the 3D silver
pitcher model and push it into
AR.
We find the table, and we place
the pitcher on the surface.
You can see the pitcher at 1-to-1 scale and get a better understanding of how it fits with your surroundings.
On my table, I have an Apple
mug, some Post-its, and a
kettle, and it looks so
realistic with these items.
And what's really amazing is
that you can see the colorful
Post-its reflecting on the lower
body of the pitcher, giving this
illusion that it's actually
really there.
So, it's really magical how the
virtual pitcher blends in with
the physical world.
To help get my lovely pitcher into the world, we're using AR Quick Look, the way to preview 3D content in AR.
AR Quick Look is built in and deeply integrated into the OS to allow previewing from any application and website.
The deep integration comes from
us using, of course, Quick Look
technologies and expanding its
capabilities to enhance the
system experience with AR.
The AR portion of Quick Look is available in iOS 12 on ARKit-compatible devices; on devices that don't support ARKit, only object mode is available.
You'll see this new AR file icon
throughout the OS to represent
the new 3D file format usdz.
We invite you to Quick Look the 3D content in our viewer and see how it looks in your world.
So, let's tap on the icon to
launch AR Quick Look.
AR Quick Look handles setting up
the AR experience like plane
detection, object placement,
gesture manipulations, and
creating contact shadows.
But these are only a subset of all the viewer capabilities that, together with rendering of the 3D model, provide an immersive experience.
So, use AR Quick Look if your
goal is to preview 3D objects
and push them into the world.
By adopting our new system viewer, you get a consistent 3D content previewing experience, so you don't need to create your own.
It's really easy to adopt and integrate the viewer into a website or application, and it can be as simple as embedding a usdz file if your application already supports and uses Quick Look.
Just as you saw from the
previous slide, AR Quick Look
does all the heavy work for you
in providing and creating a
great AR experience.
And it's truly magical how you get an amazing AR experience just by providing the 3D content. We make this super easy for you, so you don't have to deal with any AR specifics.
So, how about we see AR Quick
Look in action with the web
demo.
[ Applause ]
So, here, I have the Files app up, and I have a catalogue of 3D objects.
You'll notice these rich and beautiful thumbnails with the AR badge stamped at the top right-hand corner, and that's there to tell you that there's an AR experience behind this.
So, I have a lot of great models
to choose from, but my personal
favorite is this little purple
guy at the top right-hand
corner.
So, Radar is our bug tracking
software, and if you don't know,
the purple aardvark is our
mascot.
So, I'll go ahead and Quick Look
the purple aardvark into the
viewer where we can see her in
our studio-like environment,
which we call object mode.
So, you can really pan around
and see the model from various
different angles.
We can pinch to resize the object and make it look a lot larger, and really see the fine details that went into this model.
So, you can see how there's
animation baked into this model,
and AR Quick Look supports
transform animation and skeletal
animation, so the purple
aardvark is really lively and
animated with personality.
And just like a photo, you can double-tap on the model to reset the position and the size.
But of course, the beauty really
comes from pushing my purple
aardvark into the world.
So, I'll tap on the AR tab, and
there, the purple aardvark is in
the world.
And we were able to place the model so quickly because we started the AR session the moment AR Quick Look was launched, gathering the feature points of the environment so the object could be pushed out into the world quickly.
So, right here, you can notice that the controls are hidden for a fully immersive experience: controls like the status bar and the on-screen controls, and on iPhone X, we hide the home indicator.
So, you can go ahead and rotate
the objects using your standard
iOS rotation gestures.
You can scale the object down
using your two-finger pinch and
scale it larger if you want it
to be a lot cuter.
And so, you can see that
animation there where the purple
aardvark is just fixing all my
bugs and just slurping them up.
So, I can go ahead and
double-tap to reset the size
back to 100%, and you can
translate the model by just
tapping on it and dragging it to
a different place on the table.
Or you can tap on it and move
the device around for device
movement, really seamless
interactions.
Of course, since this is an AR experience, we really encourage you to walk around, be really engaged and immersed, and see that animation, see the model that's pushed out into the world.
So, we've also provided a
contact shadow.
So, you'll notice right there
just sitting on the edge, this
tells you that the purple
aardvark is grounded and placed
on that table in the world.
And what makes the aardvark look
so amazing right now is that
we're matching the color
intensity and the temperature
that's reported by ARKit from
the world, and we're using that
with our lighting setup.
So, I think this is really
amazing.
I want to remember this moment.
So, I'm going to go ahead and
tap to show the controls.
I'm going to take a snapshot of
this.
So, right here, I'm going to go ahead and mark up the moment, and this is really amazing, so I can save this and take a look at it later.
But with AR Quick Look, we
really wanted to make AR
accessible to everyone, and we
do mean everyone.
Accessibility is one of our core
values, and so we made AR Quick
Look work with VoiceOver and
switch control.
So, let me show you the
VoiceOver flow right now.
So, I'll triple click the home
button to activate VoiceOver.
>> VoiceOver on.
Files. Close.
Button. Landscape.
>> So, I navigate over to the AR
tab.
>> Home button to the right.
AR button.
Selected AR.
Radar aardvark is now positioned
in the world.
>> And what's really cool is now
you get audible feedback for
when my Radar aardvark is off
screen.
>> Radar aardvark is now off
screen.
>> And when it's back on screen.
>> Radar aardvark is now on
screen.
>> And this is really amazing, and we encourage developers and users to think about the full accessibility workflow in your applications and websites as you adopt and integrate AR Quick Look.
[ Applause ]
So, I just showed you AR Quick Look integrated into the Files app, but that's one of six first-party apps to have adopted and integrated the viewer. The others are Mail, Messages, Notes, News, and Safari.
And the great news is, with AR Quick Look built into Safari itself, developers can integrate it into their websites by using our new HTML markup that I'll talk more about later.
Lastly, you can adopt the viewer in your iOS application to provide a more immersive experience with AR.
So, sharing a 3D model between
all these apps requires the
models to be bundled up in a
single sharable file.
To enable this, AR Quick Look
supports usdz, which is a new
file format for mobile
distribution of 3D models.
usdz packages the models and textures into a single file for efficient delivery of 3D content, without having to work with reference files. It's based on Pixar's open-source Universal Scene Description format, or USD for short.
And usdz is supported on iOS and
macOS, in SceneKit and Model
I/O, and later in the session,
we'll show you how to convert 3D
models into the usdz format
using the new usdz Converter
tool that ships as part of Xcode
10.
Now that you know more about AR
Quick Look, let's talk about how
to integrate the viewer itself.
You can integrate AR Quick Look
in two different mediums.
The first is in an application.
Apps that showcase a lot of
photos and text can provide
better imagery and a really
immersive experience with AR
Quick Look integrated.
And here, we're using the Quick Look API to facilitate this integration, to preview usdz files and load them into the viewer.
The second is in websites in Safari.
So, integrating AR Quick Look in news articles and product pages of online storefronts really complements the reading and shopping experience by providing a new way to visualize the content and develop a better understanding of what you're actually seeing. Here, we'll be writing HTML to allow previewing AR content on the web.
So, let's get started with
applications and how the Quick
Look API is used to provide an
AR experience.
So, if you're not already
familiar, let me quickly explain
Quick Look.
Quick Look is all about previewing documents like Keynote presentations, PDFs, images, and now 3D model files like usdz. Quick Look itself is an iOS framework that not only provides support to preview documents, but also lets you create and provide custom previews for your custom file types.
It comes with everything you
need to preview documents.
You can control the transition
and the different presentation
modes for Quick Look, like being
embedded inline or incorporating
a push navigation flow and the
standard modal presentation.
And what's really amazing is that security is handled for you: the contents of the usdz file are safely read to present a preview for you.
Now, there's a dedicated in-depth session about Quick Look and previewing all file types, including the new usdz file type, that I encourage you to check out.
So, let's take a quick look at
the preview flow.
Here's the app's view controller on the left.
It has a grid of thumbnails for
various 3D models.
When someone taps on one of
these thumbnails, I want to show
a Quick Look preview of the
model for that thumbnail.
The first thing I want to do is create a QLPreviewController. This is the view controller that allows you to preview documents. And I set myself as its dataSource and delegate, so it can ask me questions about what should be previewed.
With that configured, I present
that QLPreviewController.
Now, it doesn't appear
immediately, however.
It first asks me some questions.
It starts by asking how many
items do you have for me to
preview?
And for usdz files, the answer
is one.
Next, it asks me for a URL for the first item to be previewed, and I respond with the file URL from my app bundle for the model that was tapped.
Finally, it asks me for the view
to use as a source view for the
transition animation for when
the preview is presented.
And here, I want a seamless animation from the view that was tapped to the view that's going to be presented. And so, I return View #3.
With all that configured, we're
ready to present our preview
controller, and so it's going to
animate and scale up from the
view that was tapped.
And this is what it looks like
in code.
In order to preview and present
documents, we need to
instantiate a
QLPreviewController.
It's part of the Quick Look
framework.
It has a dataSource and a
delegate, which I'll get into in
a bit.
And then we present the previewController modally. Pretty simple, right?
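As a rough sketch, that flow might look like this in Swift (the class and property names here are illustrative, not from the session; the dataSource and delegate conformances are sketched below):

    import QuickLook
    import UIKit

    class ModelGalleryViewController: UIViewController {
        // File URL of the usdz model whose thumbnail was tapped (illustrative).
        var selectedModelURL: URL!
        // The thumbnail button that was tapped, reused for the zoom transition.
        var tappedThumbnailButton: UIButton!

        // Called when a model thumbnail is tapped.
        func presentARQuickLook() {
            let previewController = QLPreviewController()
            previewController.dataSource = self  // asks what should be previewed
            previewController.delegate = self    // asks for the transition view
            present(previewController, animated: true)
        }
    }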
So, let's take a look at the
protocol for a dataSource.
So, as I just mentioned, we need
to conform to the
QLPreviewControllerDataSource
protocol.
This protocol has two required functions that provide data to the Quick Look preview controller. The first is numberOfPreviewItems(in:), and here you return the number of items that your preview controller should preview. AR Quick Look only allows previewing a single object in one dedicated session, and so we return one item.
The second function we're implementing is previewController(_:previewItemAt:), and here you return the one item that's being requested to be previewed. This is the URL of where to find the 3D object on disk.
Now, this should look very familiar to a lot of you, and that's because we're using a very similar dataSource and delegate pattern to UIKit's, like table views.
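As a minimal sketch of that conformance, continuing the illustrative example from before (selectedModelURL is the hypothetical property holding the tapped model's file URL):

    extension ModelGalleryViewController: QLPreviewControllerDataSource {
        // AR Quick Look previews exactly one object per session.
        func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
            return 1
        }

        // Return the on-disk URL of the usdz file to preview.
        // NSURL conforms to QLPreviewItem, so we bridge to it.
        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            return selectedModelURL as NSURL
        }
    }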
So, moving to the protocol for
that delegate, there are a few
ways to present the preview.
AR Quick Look is intended to be
presented full screen.
To share the same consistent
experience with AR Quick Look, I
recommend you configure your
Quick Look PreviewController's
appearance to have the same zoom
transition animation.
So, in the Files app, you can see the nice transition from the thumbnail view to the viewer, and back down as it's dismissed.
In code, the easiest and recommended way to achieve this effect is to implement previewController(_:transitionViewFor:) and return a UIView.
So, Quick Look does all of the
heavy work here for you and uses
the UIView to infer the
rectangular frame that the
animation should start from when
presenting and ending when
dismissing.
If possible, I suggest the returned UIView have a square frame, for the most seamless transition.
AR Quick Look nicely handles the
transition from this view to the
full screen viewer on
presentation and dismissal,
creating this effect where the
model just magically appears.
And for an overall better experience, I recommend using a UIButton as the view and setting the thumbnail as the button's image.
That way, you get visual
feedback whenever the view is
tapped with the button
highlight, and this is extremely
important so that the user knows
that some action is about to
happen.
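Here's a minimal sketch of that delegate method, again continuing the illustrative example (tappedThumbnailButton is the hypothetical UIButton showing the thumbnail):

    extension ModelGalleryViewController: QLPreviewControllerDelegate {
        // Quick Look uses this view's frame as the start and end point of
        // the zoom transition on presentation and dismissal.
        func previewController(_ controller: QLPreviewController,
                               transitionViewFor item: QLPreviewItem) -> UIView? {
            return tappedThumbnailButton
        }
    }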
And that is how to integrate AR
Quick Look into your iOS
application.
It's never been easier to introduce AR previewing capabilities in your apps today.
Now, let's talk about
integrating AR content on the
web.
We really want AR content to be
accessible online.
So, starting in iOS 12, Safari
has built-in support for
previewing usdz files in AR.
By hosting usdz files on your website, you get a way of delivering previews of 3D objects.
So, not only can you read about
the object and see it through
the image, you can now
experience what the model looks
like in your world.
So, here, I have a mock-up of my online garden store with various thumbnails of gardening tools I have listed for sale.
For better integration of AR in my web content, I'll use the new HTML markup that I'm about to show you shortly. This opens up and allows customers to preview my gardening tools before they actually purchase them.
So, by following this markup, you get the badge that I mentioned earlier rendered on the image in the top right-hand corner, and that's there to tell you that our system viewer is behind this, so you can preview my watering can in AR.
So, you can provide your own cover artwork associated with the 3D object, and this can be a portrait shot or a nice hero image like I've chosen for my lovely watering can.
And with this flow, we support drag and drop and, of course, long-press actions from Safari like Add to Reading List or Copy and Share.
And the best part is you get a more immediate and optimized workflow into AR Quick Look: you're not following any links, and you get a response without updating your web URL.
The HTML markup is very straightforward and simple, and this is it. By adding this attribute, you turn your web page into an AR experience. The requirement is an a element with rel="ar", and this tells WebKit that this is AR content.
You then provide the link to the
location of the usdz file.
Lastly, we have a single child that is the image element, and this is the cover artwork I mentioned earlier, which you provide as the thumbnail for the 3D content. The badge will appear above your thumbnail image, inviting you to preview and experience the object in AR.
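Putting those rules together, the markup looks something like this (the file names are placeholders):

    <a rel="ar" href="watering-can.usdz">
        <img src="watering-can.jpg">
    </a>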
And this can also be done with the picture element. Similar to the image element, the rule here is a single picture child of the a rel="ar" tag.
The picture itself can have all
the children it needs, from
being very simple to having very
complicated roles about which
child source image it picks to
render based off of device,
size, and other platform logic.
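For example, a sketch using a picture element (the source selection here is purely illustrative):

    <a rel="ar" href="watering-can.usdz">
        <picture>
            <source srcset="watering-can-large.jpg"
                    media="(min-width: 600px)">
            <img src="watering-can.jpg">
        </picture>
    </a>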
usdz content must be served with the correct media type, so make sure that the MIME type is set for these files. The registration of an official type is in progress, so in the meantime, WebKit is accepting both of these media types, and you can set them in your configuration if you're using Apache.
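As a sketch of that server configuration (the transcript doesn't capture the exact type strings shown on the slide; model/vnd.usdz+zip is the media type that was later officially registered, so treat this as an assumption):

    # Apache: serve .usdz files with a usdz media type.
    AddType model/vnd.usdz+zip .usdz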
And that's how to integrate AR
Quick Look.
The goal was to make it really easy to put AR everywhere, not just on websites in Safari but also in apps on iOS.
Now, you might be wondering, where can I get usdz content? I'll invite my colleague Dave, who will talk more about how to create content for AR Quick Look.
[ Applause ]
>> Thanks, David.
Well, let's start by taking a
look at what it is that we want
to achieve with our 3D models.
Here, I have a model of a teapot
that I've loaded in AR Quick
Look, and we can see that it
sits perfectly on the ground, as
indicated by its shadow.
It spins around its natural
pivot point, the center of the
bowl, and if I place it in the
world, we can really believe
that it's there on the surface
in front of me.
We even see the environment
reflected in its shiny metal
handle.
So, how do we achieve this
effect with our 3D models?
Well, when we're creating our
models, there are six things we
need to get right, and we'll see
how to do each of these in the
remainder of the session.
We need to set the model's
placement, we need to set its
physical size to be correct in
AR, we need to create any
animation we want for our model,
add its contact shadow, modify
its appearance, and then add any
transparency that our model
needs.
Once these six things are done,
we can optimize and export our
models for use in AR Quick Look.
We'll start with placement, and
there are three things we need
to get right in our 3D software
to make our model appear
correctly in the world.
The first is that the object
should face toward the camera,
toward positive z.
Secondly, the base of the object
should sit on the ground plane,
the plane where y = 0.
And finally, we should put its
natural pivot point at the
origin.
So, let's see how that looks for
a sample model.
Here, I'm in my 3D modeling
software.
I have the origin at the center.
I have positive x heading off to
the side, positive y heading up
from the origin, and positive z
toward the camera.
Now, if I import my teapot without having set it up, we can immediately see there's a problem here. The model is below the ground plane.
If we look at it from the side,
we can see this model has been
created around the origin.
So, let's move it up such that
its base is set on our plane
where y = 0.
And now, it's ready to position
in the world on the ground.
If we look at it from the front,
however, there are still some
esthetic improvements we can
make here.
This kind of looks like a
teapot, but it could look more
like a teapot.
This is what we think of when we
think of a teapot, with a spout
to the side.
When you're thinking how to
position your objects for this
optimal recognition, think how
you would place that object if
you put it on a shelf or in a
display cabinet.
If in doubt, ask a young child
to draw the object for you.
Chances are the profile they
choose is the one you should
use.
Now, this profile really matters. Not only is it what people will see when your object first loads in AR Quick Look, but it's also the profile that's used for the object's thumbnail, and these two deliberately match when the thumbnail is in Files or in Messages, so we get that seamless transition between the two.
So, choosing a good profile is
really important.
We'll head back into our 3D
software and rotate our model so
the spout is to the side.
As one final check before we
export it, let's take a top-down
view and make sure that that
bowl is centered on the origin,
so it will spin around the point
we want it to.
The second thing we want to set
up for our model is its physical
size.
Here, I have a gramophone record
player, and size matters here.
This record player is designed
to play 12-inch records, so we
need to make sure it's the right
size in the world.
If I export this model and load
it in object mode in AR Quick
Look, it looks great because we
always scale models to fit in
this mode.
But if I place it in the world,
we can see there's a problem
when I put it next to a 12-inch
record.
It's far too small.
Right now, it's set up to play 6-inch records, and those don't exist.
So, let's head back into my 3D
software and create a reference,
12-inch cylinder.
Now, I know this is exactly 12
inches in diameter because I
know the units that this project
is working in.
I can use this reference
cylinder to resize my turntable
and make sure that the platter
is exactly the right size to
play a 12-inch record.
Having done so, if I re-export
my model, it looks exactly the
same in object mode because it's
still scaled to fit.
But when I place it in the
world, we can now see it's the
right size to play this 12-inch
record.
We could imagine picking it up
and placing it on that
turntable.
Now, not all objects have a
natural size.
This little guy is Chatterbox,
and he doesn't exist in the real
world.
He only exists in our
imaginations.
But he's pretty cute, and he has
a fun animation, and he's good
to take a photo with.
So, we've created him at desktop size, which makes him easy to place in the world in common environments. Then people can resize him: shrink him down, move him around, or scale him up and make him the size they want him to be for a photo.
But by making him desktop size
by default, he's always easy to
place.
We saw here that Chatterbox has animation, and this is something we recommend you use where it helps, bringing that object to life with an idle animation that makes it feel more like it's in the world. Note that animations always loop, and the animations you create can use skeletal and/or transform animation to bring them to life.
Here's another example.
This is a cartoon style chicken
character without animation, and
if I place him in the world, he
looks pretty cool, but he's kind
of static.
He doesn't really feel like he's
alive there in the room with me.
So, there's more we can do here.
So, if we add a little bit of
skeletal animation to our
character to make him bob his
head, now when we place him in
the world, he feels much more
realistic, much more like he's
there on the table in front of
me.
Some tips as you add animation
to your models.
Firstly, choose animations that
enhance the AR immersiveness,
that make you feel more like
that object is there in the
room.
Now, because people can manipulate your objects, rotating them, scaling them, and moving them around, it's important not to move them away from the origin.
If I create a car that zooms around in a meter-wide circle, when someone tries to put their finger on that car to pick it up, it's already moved away and zoomed off somewhere else.
So, that gesture is very
confusing to somebody trying to
manipulate that object.
So, try and keep those objects
centered at the origin.
For the same reasons, try and
keep a consistent bounding box
throughout the animation as
well.
Even if the object is at one
location, it's much easier for
me to manipulate it if it stays
in the same place under my
fingers throughout the entire
gesture.
As a result, prefer animations
that make sense at a static
location.
If you have a T-rex character,
rather than giving it a walk
cycle that would make it look
like it was running on the spot,
make it roar or stamp its feet
or something else that makes
sense at a static location.
Now, there is an alternative.
You can also consider creating
animations as self-contained
scenes to help with these
gesture manipulations.
And let's see an example of how
that can work.
Here, I have a swimming koi
fish, and it makes sense for
this fish to have some
translation.
So, we created a scene with an aquarium base, and this means when we place this in the world, if we rotate it or move it, the thing we're manipulating is the scene, not the fish, and so the gesture makes a lot more sense.
It also means by having a base
to our scene that we have a
consistent bounding box
throughout the entire animation.
Another thing that really brings
your models to life in the world
is that contact shadow.
Now, note here that AR Quick
Look provides the contact shadow
for you.
This means that it can turn the
shadow on and off as you
transition between modes.
It also means it can apply
ambient lighting conditions to
the shadow as the lighting
changes around you.
Because of this, don't bake a
contact shadow into the models
you provide.
If you do, you'll end up with
two shadows.
Note also that the first frame
of any animation is used to
create the shadow for a model,
and sometimes this means it's
important to choose the right
first frame.
We'll see an example.
Here, I have a model of a Newton's cradle, a desktop toy, and it has a swinging animation, a transform animation, to bring it to life.
Now, if we freeze this on its first frame, where one of the ball bearings is out to the side, we can see there's a problem.
The shadow we've generated from
this first frame also has that
ball bearing out to the side.
And so, if we animate to the
other extreme of this animation,
at this point, the shadow
doesn't really make sense.
Now, because animations loop, we
could choose a different
starting point, a more neutral
one where the ball's in the
center and still have the same
animation, the same overall
effect.
But now, by choosing a better
first frame, our shadow makes
sense throughout the entire
animation.
One of the biggest things we can
modify about our model to make
it feel real in the world is the
model's appearance.
Now, here, AR Quick Look uses a Physically Based Rendering, or PBR, shader, and this gives us six things that we can modify about our model's appearance to make it feel more real in the world.
We do this by providing
textures, images, for each of
these six things if we want to.
The first is the model's albedo.
This is the base color of the
model, its underlying color.
The second is indicating which
parts of the model are metallic.
This changes whether they are a
conductor or an insulator, which
changes how they interact with
the physics of light in the real
world.
We can also specify which parts of our model are rough or shiny by providing a roughness texture.
We can add in a normal map. This creates the illusion of depth and variance within the model's surface, without changing the underlying mesh.
We can add an ambient occlusion
texture to specify where the
model casts shadows on itself in
the crevices and nooks and
crannies of the model.
And if our model emits light, we can provide an emissive texture for things like a TV screen or a computer monitor.
Now, if you're just getting
started with PBR, we have a
session on Thursday about
creating PBR-based models for
any AR experience, and I highly
recommend you check it out.
For now, we'll take a look at
how these six textures affect AR
Quick Look particularly.
So, I'll start with a model of a
TV.
Right now, it has no textures.
It's pretty bland.
We'll bring in an albedo texture
to give it some base color, and
it's much more recognizable as a
TV now, but it's still very
flat.
It doesn't feel realistic.
The next thing we'll do is introduce a metallic texture for the trim around the TV screen and the antenna on top, to indicate that they're made from metal.
Note here that our metallic
texture is pretty much black and
white.
Things either are a metal, or
they aren't.
But the metal still doesn't look as realistic as we want it to, and that's because the metal, like all of the model, is entirely rough. There's no shine; there are no smooth surfaces on this TV right now.
To fix that, we'll bring in a
roughness texture, and at this
point the model really comes to
life.
The metal around the TV screen
and the aerial now looks like
metal.
It looks like chrome, and the
wood on the top of our TV
actually looks like it has a
varnish or lacquer effect.
We can believe it much more
realistically as a TV.
And roughness is one of the
biggest points of creative
control you have over the look
of your model and how it feels
in the real world.
And you'll see, we have quite a
lot of variance in our roughness
texture, even within the same
parts of the model, to give more
realism and to bring it more to
life.
Next, we'll add a normal map to
add the illusion of variance,
particularly in the woodgrain on
top of the TV here.
The underlying mesh hasn't
changed, but it feels more 3D.
Also on the speaker grille.
Because we have some crevices
and some hidden areas on the
front of the TV that would cast
shadows on themselves, we'll add
in an ambient occlusion texture.
The effect is pretty subtle, but
it does add to the depth of the
model and make it feel more real
when it's in the world.
And because this is a TV with a
screen, we'll add an emissive
texture as well to indicate that
the screen emits light into the
world.
The effect of this is best seen
when pushing the object into a
room.
Here, we can see that the main
body of the TV reflects the
ambient lighting conditions of
the room it's in, but the screen
is brighter.
It's actually emitting light.
An emissive texture is a fairly special case, not something you'd use every day. But if the object you're modeling would glow if you turned the lights off, it might be a good candidate for an emissive texture.
We can also make parts of our models transparent. This is useful for objects made partly out of glass, for example.
If you do this, be sure to use a
separate material for the
transparent and non-transparent
parts of your model to ensure
they render correctly.
You do this by providing an
albedo texture that has the
transparency in its alpha
channel.
So, a transparent PNG file, for
example.
Note that transparency is really
intended for see-through parts
of the model, not for creating
cutouts like a leaf edge or a
butterfly wing.
We'll see this in action for our
TV model from before.
In addition to the mesh we
already have for the base of the
model, I'm going to add a second
mesh for a curved screen, a
screen cover that's made of
glass, with a slight amount of
dirt on the screen.
Here's how it looks in AR Quick
Look.
We can see that the light
reflects on the screen and
shines off the curved surface,
and if we turn it to the side, we can see the color of the screen there as well, where the glass has some grime to it.
We achieve this by using this
kind of texture for our screen.
It's pretty simple.
It's just adding that small
amount of dirt.
But crucially, it has the
transparency in its alpha
channel to indicate that the
screen is see-through, and we
can see the main screen behind
it.
When you're setting up the textures for these individual properties, these are the formats to use. Albedo should be RGB, or RGBA if you're also providing transparency. Normal and emissive should also be RGB. And metallic, roughness, and ambient occlusion should be grayscale. You can use any image format supported on iOS, and note that these textures should be square, with power-of-two sides. So, 2K, 1K, 512 pixels, and so on.
David mentioned earlier that AR
Quick Look is available on all
devices that support iOS 12, and
this is actually quite a range
of devices with different
capabilities.
Now, because AR Quick Look is a
system-wide extension, this
means it has to share the
available system memory with the
applications that launch it.
And so, there's only so much
memory available to display and
load these models.
To help with this, we recommend that you create, optimize, and test your models for high-memory devices like the iPhone 7 Plus, 8 Plus, and X, and the 12.9-inch iPad Pro.
If you do this, AR Quick Look
will dynamically downsample the
textures in your model for other
devices of different
capabilities, if needed.
This means you can make your models look great on those devices but still know they'll work everywhere. Note that the meshes and the animations won't be modified; it's only the textures that will be downsampled.
You're probably thinking at this
point, okay, so how big can my
model be?
How big and complex can my
textures be?
And the answer is, there is no one answer. Many factors affect a model's memory usage: its mesh and animation complexity, and its texture size and count.
But as a guide, for a model that uses a single PBR material with those six textures we saw earlier, if you aim for about 100K polygons, one set of 2K-by-2K PBR textures, and about ten seconds of animation, that's typically good for those high-memory devices we mentioned earlier.
If in doubt, always test your
model on a device to make sure
that it renders and loads as you
would expect.
As you optimize your models for
use in AR Quick Look, a few
things to keep in mind.
Do be sure to freeze any
transforms in your models and
merge any adjacent vertices to
tidy them up ready for export.
And if you can, try and use a
single material and texture set
for the entire model.
This enables you to optimize how
you pack the different parts of
that model into those square
textures that you're using.
And if you don't need a texture,
don't include it.
If none of the parts of your
model are made from metal,
there's no need to provide a
metallic texture.
The default is non-metallic for
any model that's loaded.
When you're choosing where to
spend your texture budget,
prioritize the areas that give
most realism to that model.
For some models, that might be a
high-resolution albedo texture.
For others, it might be better
spent on roughness or the normal
map.
And remember, the pixels actually have a physical size in AR. If you're creating a model of a thimble, and it's one inch across, there's probably no need to use a 2K-by-2K texture for it.
Those pixels will be tiny in the
real world.
And although we want models to
look great, we also need to
balance texture size and quality
against file size.
If a model is being downloaded
from a website, we want it to be
a good size for downloading and
also for sharing via messages
and mail and other means.
So, with all of that in mind,
how do we make these things?
Well, for this, we have the usdz Converter. This is a command line tool that will convert existing 3D models to the usdz format.
It ships inside Xcode 10, and in addition to creating usdz files, you can also use it to map PBR textures to the meshes and submeshes inside those existing models.
It supports three input formats: OBJ files, Alembic files, and existing USD files, either USDA or USDC, the ASCII and binary versions of USD.
Before we create one, let's take
a look at what it is we're going
to make.
Let's look inside a usdz file.
In essence, these are
uncompressed zip archives.
The first file is always a usdc
file.
This contains the model's mesh, its animation if it has one, and any material definitions it needs.
And then, the remainder of the
files in the archive are any
textures, any images, like the
ones we saw earlier on.
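So the contents of a usdz archive look something like this (file names illustrative):

    Model.usdz
    ├── Model.usdc       (mesh, animation, materials; always the first file)
    ├── albedo.png       (textures referenced by the usdc file)
    ├── roughness.png
    └── normal.png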
If you're thinking of creating your own tooling to build usdz files, the great news is that this is an open format, and Pixar has published the full specification for usdz on the graphics.pixar.com site.
To create one with the usdz Converter, first note that we call the tool with xcrun because it ships inside Xcode. We pass in the name of the model we want to convert, an OBJ file in this case, and the file name of the usdz we want out the other end.
To map PBR textures to the
meshes they're in, we use the -g
option, followed by the name of
one of the groups, a mesh or a
submesh that we want to map them
to, and then we can provide any
number of these textures for
that particular group.
To map multiple textures to
multiple groups, we can use the
-g option multiple times, as
many as we need, to map all the
textures in our model.
And if you're not sure what the
groups in your model are called,
you can pass in the -v option
for verbose output.
This will print the names of the groups found in the model, plus other useful information about the conversion process.
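As a sketch of those invocations (group and file names are placeholders; the texture-map option names, like -color_map, follow the converter that ships in Xcode 10, but check xcrun usdz_converter -h for the authoritative list):

    # Basic conversion from OBJ to usdz:
    xcrun usdz_converter Teapot.obj Teapot.usdz

    # Map PBR textures to a group (a mesh or submesh), with -v for
    # verbose output that prints the group names found in the model:
    xcrun usdz_converter TV.obj TV.usdz -v \
        -g TVFrame \
        -color_map TV_albedo.png \
        -metallic_map TV_metallic.png \
        -roughness_map TV_roughness.png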
In addition to making your own
models, we've also put together
a gallery of example models that
show good practices for PBR.
If you visit this gallery on an iOS 12 device, you can tap any of these models, open them in AR Quick Look, and place them in the world.
This gallery also showcases the
Safari integration that David
showed you earlier on.
So, if you have iOS 12 on a device, visit developer.apple.com/arkit/gallery and give them a try.
So, in summary, AR Quick Look is
a system-wide way to view AR
content in the real world.
It can be integrated into your
own apps and websites.
It uses the usdz file format for
3D model distribution and
sharing.
It supports PBR, animation, and
transparency, and you can use
the usdz converter tool that
ships in Xcode 10 to convert
your existing models to usdz.
For more information on the
subjects in this talk, check out
the session's website.
And we also have ARKit labs
throughout the week.
So, please do bring any questions you have about AR Quick Look, and we'd be really happy to help answer them.
Thank you.
[ Applause ]