Transcript
X-TIMESTAMP-MAP=MPEGTS:181083,LOCAL:00:00:00.000
>> Hello. My name is Norman.
I'd like to give you a quick
tour of some of the details
of what's new in SpriteKit.
So last year we announced
SpriteKit at a conference,
a high performance
2D game engine
with built-in physics support.
We also built tools
right inside of Xcode
to help developers improve their
game content iteration time
like the particle editor and the
automatic [inaudible] generator.
And this year we made
SpriteKit even better.
We have awesome graphics
technologies like shaders,
lighting and shadows
that help you really bring
out the liveliness of your game.
We also have good
simulation technologies
like per-pixel physics
collision, physics fields,
inverse kinematics
and constraints.
They really allow you to
build rich user interactions.
So that's not all.
In Xcode 6's release we now
have a SpriteKit editor built
into the IDE that allows
you to create game scenes
without writing any code.
And you can use the very same
editor to interact with any
of the new and existing
SpriteKit features.
So how cool is that?
So, for the agenda of
this session, we're going
to start by looking at how
to use custom shaders
to customize the drawing
behavior of your sprites,
and how to add lighting and
shadows to your scene.
And we have a lot
of new physics updates I'd
like to tell you about.
And the integration
with SceneKit really
brings new possibilities
to your games.
It also blurs the lines
between the actual 2D game
and 3D game implementations.
And lastly, I'm going to give
you a quick tour and demo
of the new SpriteKit editor
and some other improvements
in already existing APIs.
So the first thing I want
to talk about is shaders.
SpriteKit does a fantastic job
of abstracting the lower-level,
platform-dependent graphics
APIs away from our users.
But from time to time there's
always a very special effect
they want to achieve: say a
custom blur, or a heat-signature
post-effect when you turn on
thermal vision, or, in a space
game, a custom warping effect
on just the sprite when the
spaceship is hit and taking damage.
Shaders are the relevant
solution in these cases,
and in this release we
give you that ability.
So here we have a
SpriteKit shader demo.
We're actually just
rendering a single sprite
without any textures but
the sprite itself is running
on custom shaders.
So the shader does a
lot of things here.
It's rendering all the
stars in the background;
it's also drawing the grid;
it also provides the warping
effect of the grid; and when
the user touches any area
of the screen, it renders
an energy effect.
So all of these are
done in shaders.
So to do this we're also passing
the actual touch locations
down to the shader as uniforms.
So the shader knows, "Okay,
the user just touched this area.
Now let's render
the warping here
and the energy effect
associated with it."
So let's have a look at what the
overview of a shader looks like.
So shaders help you
really customize the way
that sprites are drawn.
You have 100 percent control of
how every single pixel is going
to be output on the screen.
Shaders use the C-like
GLSL syntax,
a very powerful tool
for all kinds of effects,
from image processing
all the way to the fast motion
blurs used in 3D games.
And whether you are
new to SpriteKit
or existing SpriteKit users,
you can drop this custom shader
in with just a single
line of code.
So to use custom shaders we
created a brand-new object
called SKShader.
It is essentially a
container that holds the GLSL,
the OpenGL ES fragment shader.
So we're following the GLSL --
OpenGL ES 2.0 standard and once
you write a shader it can be
deployed to both OS X and iOS.
So here is a list of the
nodes that support custom
shaders in SpriteKit.
So SKSpriteNode supports
custom shaders,
and SKShapeNode now supports
custom shaders on both of its
drawing passes,
stroking and filling.
And SKEmitterNode:
with a particle effect you
can have a custom shader running
on every single emitted
particle in the scene.
And lastly, SKEffectNode
and SKScene now support
custom shaders.
So that gives you
the powerful ability
to not only use CI filters
for a full-screen effect,
but also to use a shader
for any possible effect
you want to implement.
So, as soon as the shader is
uploaded to the sprite
and run at the pixel
level, we're also passing a lot
of built-in attributes
to the shader,
so you don't have to set
up brand-new attributes
or uniforms and pass them in.
So, for example, here
we're passing u_texture,
which tells you what texture
is being used for the current
sprite, as well as the texture
coordinate and the size
of the sprite in pixels.
So how do shaders
work in SpriteKit?
Now that you know about the
SKShader class, it has two
properties that you can set.
The first one is
the shader source.
You can attach the source
in two ways:
one is creating the
shader from a file,
and you can also set the
shader source from a string.
And the shader itself also
has an array of uniforms.
These are optional, but if your
shader requires any
external parameters
set from your game,
you can add SKUniform objects
to the uniforms property.
So now once we have
these two properties set,
the shader is loaded
and is ready to go.
And here we have a scene,
for example,
with three
separate sprites.
You can assign a shader
directly to the scene.
You can assign the shader
to any of the sprites
and they pick up
the effect right away
in the next frame.
So to create a shader we have
the new SKShader API.
You can create it with a file:
shaderWithFileNamed:, passing
something like "blur.fsh".
The shader will look like any
OpenGL fragment program:
it has a main function
and returns its value
in gl_FragColor.
And if the shader needs any
custom uniforms you can
just set a uniforms array here,
creating each one with
SKUniform's uniformWithName:
initializers.
We give the first one the name
u_red; that's a float type.
And with the next one we're
setting a u_texture,
passing a secondary
texture into the custom shader.
So the supported types are
float, texture, vector 2,
3 and 4, and matrix
2x2, 3x3 and 4x4.
And here is the full list
of the custom symbols
that we pass into the shader
when it runs.
So in terms of uniform, you get
direct access to the texture
of the sprite, the
sprite size in pixels,
the global time (in case you
need to do any of the animation
in terms of color or shapes)
and you also have access,
if you have a custom shader
that's running on a shape,
to the shape path length.
In terms of varying we're
passing the texture coordinate,
the color mix, as well as
the distance of the path
and we also have a very
convenient function you can
call directly in the shader,
called SKDefaultShading.
It's as if you let
SpriteKit render
the current sprite with its
default behavior and hand you
the resulting pixel value.
So shaders are cool, and
there are some best practices
I'd like to call out.
Number one, I'd
recommend using built-in
uniforms as much as possible.
We do pass a lot of
uniforms in for you to use.
That gives you a lot
of raw attributes
of what we're doing
at the OpenGL level.
But if you do need to access
those same values,
please use the built-in ones
to minimize redundancy.
We would also recommend
avoiding changing the shader
source in the middle of a frame,
because that forces a
shader recompilation
in the backend.
Also, when the shader is
loaded, we recommend
setting all the
uniforms up front.
You can change uniform
values per frame,
and that's nice and fast,
but adding or removing
uniforms will cause another
shader recompilation.
Also, in terms of draw
call performance, we'd
like you to share the same
shader instance as much
as possible, because the same
shader running on multiple
drawing instances gets
batched together.
And we recommend initializing
the shader at load time,
and initializing it from a
filename rather than from a
string, because if multiple
shaders share the same shader
source, SpriteKit will pick
them up as identical shaders,
so draw call batching
performance remains high.
So the summary of using
custom shaders: it allows you
to completely customize the
way sprites are being rendered
on the screen.
You have raw access to a
lot of built-in attributes
that we pass
down to the shader,
and you can create an
infinite number of cool
and unique effects
for your game.
Next I want to talk about
lighting and shadows.
Lighting and
shadows really bring
out the liveliness
of your games.
Say, if I'm building
a dungeon game,
and walking in a
long corridor...
All of a sudden the
area gets dark,
and all of a sudden
I see a very dim,
shaky light at the
end of the hallway...
that kind of brings out
that really scary atmosphere
and mood for the player.
So to create a light
in SpriteKit we introduce a
new type called SKLightNode.
You can add it to the scene and
position it anywhere you want,
and it can light any
of the sprites participating
in the current light.
You can change the color,
shadow and falloff of the light,
and we support up to 8 lights
per sprite.
And the bottom line
is SKLightNode is just
another SKNode.
You can move it around,
or parent it to another sprite
if you want the light to
follow one of your sprites.
You can run actions on it,
have it follow a path.
It's really cool.
So now let's look at some
of the basic
properties of SKLightNode.
So here we have a scene.
It's very bland.
We add a light source
in the scene.
Now we can change
the lightColor.
If we set the lightColor
to yellow,
it adds a yellow tint
to the lit areas.
And now if we set a
shadowColor to be gray...
and now the boxes start
casting shadows...
and lastly if we want
to see a little bit more
out of the scene we
set the ambientColor
to really bring the scene
up a little bit more.
So let's look at the additional
attributes for SKLightNode.
Number one is falloff;
falloff controls the effective
radius of the light.
And we also take what
works really well
with our physics properties:
category bit masks.
So with SKSpriteNode you
now have individual
bit mask control over whether
the sprite participates
in the lighting, whether
it casts shadows
and whether it receives
shadows.
You have fine control
of exactly how you want
to control these 3 attributes
for each individual sprite.
And since we are
talking about SKSpriteNode,
it also has a brand-new
property called normalTexture
that you can assign
a normal texture to.
Normal textures are heavily
used in 3D games; a normal
texture uses each pixel's RGB
value to describe where the
normal vector of the surface
points, so the lighting
calculation can use those
normal vectors.
So here we're using
the exact same formula.
So to render the scene
with a normal map we use the
traditional A + B = C formula,
which means you supply
the texture --
you supply the normalTexture
-- SpriteKit will do its magic
and give you the result,
which is lighting up the scene
with the bumpy surface
that's being mapped.
So here, as you can see,
I can just set normalTexture
directly on the SKSpriteNode
by loading it
from a normal-map .png file.
Now, being SpriteKit, we
like to make our users'
lives as easy as possible.
So in addition to the A + B =
C formula we get you directly
from A to C.
You do not need to
supply any NormalMap.
So, what we do is we take
the source texture image
and perform image analysis
on every single pixel,
and based on the
brightness of each pixel,
we have a multi-pass algorithm
that generates the
best normal map
to describe the
current picture.
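The brightness-based generation described above can be sketched in plain Python. This is an illustrative sketch of the underlying math only, not SpriteKit's actual multi-pass algorithm; the `normal_map` function and its `strength` parameter are hypothetical names of my own:

```python
import math

def normal_map(height, strength=1.0):
    """Derive per-pixel normals from a brightness (height) grid.

    `height` is a list of rows of floats in [0, 1]. For each pixel we
    estimate the slope from neighboring brightness values and build a
    unit normal vector (nx, ny, nz), the same data a normal-map texture
    encodes in its RGB channels. `strength` exaggerates the bumpiness,
    loosely analogous to a contrast knob.
    """
    h, w = len(height), len(height[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Central differences, clamped at the image edges.
            dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
            dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
            nx, ny, nz = -dx * strength, -dy * strength, 1.0
            mag = math.sqrt(nx * nx + ny * ny + nz * nz)
            row.append((nx / mag, ny / mag, nz / mag))
        out.append(row)
    return out
```

A flat image yields normals pointing straight out of the screen, while a brightness ramp tilts them toward the darker side, which is exactly what a lighting calculation then picks up as a bump.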
So if you give us a picture
that you took at the beach
of a bunch of rocks,
the rounded edges
come out really sharp,
yet the surfaces remain
very smooth...
So here it's very easy.
It's a magic one-liner.
All we need to do is set
sprite.normalTexture to a
normal map generated from the
existing sprite's texture.
[ Applause ]
That's not all.
But there is no
one-size-fits-all solution.
So, in addition we provide two
more parameters for you
to change the behavior
of the generated normal map:
smoothness, and contrast
(which is how bumpy you want
the surface to be).
In combination these will give
you an infinite number of looks.
For example, on our cobblestone
I just have two for loops
running over these parameters
with a 1-second delay.
As you can see it
actually changed the look.
You can get an infinite
number of looks out of it.
And the best part
about automatic NormalMap
generation is you can do this
on dynamic content.
If the user takes a picture
from outside and decides to put
that in the game, guess what?
You can light it without
any normal texture.
We can do all of this for you.
So to summarize lighting
and shadows,
they are very easy to use.
SKLightNode is just
like any SKNode.
You can run animation on it.
You can parent it.
You can set up colors and
everything is a one-liner.
Automatic normal map generation
really provides this dynamic
look for you.
You don't have to spend time
sitting down with artists
and trying to figure out how
to hand-paint a normal map.
"Oh, it's purple facing outward
or green facing to the left."
You don't have to
do any of that.
We take out the nitty-gritty
detail for you.
There are some best
practices in terms
of performance I'd
like to point out.
It's okay to have multiple
lights in the scene
and it runs reasonably fast.
But the number of lights
lighting the very same sprite
actually matters.
If you have more than two
lights lighting the same sprite,
you might not be able
to stay at a constant 60
frames per second on certain
iOS hardware.
It's just something
to point out.
Next I'd like to point out
all the new physics features
that we have.
So let's look at them.
Number 1, per-pixel physics
bodies.
We have constraints, which
allow you to remove the
boilerplate code in your updates.
We also have inverse kinematics,
which allow you to build
mechanical AI.
And we have physics fields
that apply field forces,
allowing you to build the
next space game simulation.
So let's look at the per-pixel
physics body creation.
With the current
implementation on the left side,
if I want to build a
gears game, I can't.
The best thing I can do
is create bounding circles
around these gears.
They don't actually
grind each other
and the teeth don't interlock.
Okay, maybe I can
fake it a little bit.
I reduce the radius
of the physics bodies
so that the teeth might overlap,
but they still don't
touch exactly.
And also, the angular
velocity doesn't transfer
from one gear to another.
But now, with a single line of
code, you can go from there
to a pixel-based physics body.
It's very easy.
[ Applause ]
So we introspect
the source image:
based on the alpha mask we
generate a rough shape,
and from that we generate the
minimal exact shape
that fits the
current sprite.
And now, to build
the gears demo,
you can just have a couple
of one-liners: keep the same
code, just change from
using bounding circles
to using texture-based physics
bodies, and you are good to go.
And the gears can be pinned,
they transfer velocity
and interlock;
it's really fun.
So, let's look at how we do it.
So, with the current API
to create a physics body
with a bounding box
we use physicsBody
and bodyWithRectangleOfSize will
give you an exact bounding box
of the dimensions that
you have specified.
Now, the new API is the same
kind of convenient initializer;
instead you're just
passing the texture
and the size of the texture.
For example, a hammer here will
give you the exact hammer body
that's traced with the outline.
You no longer have to
do that yourself or try
to build any approximation
of the physics bodies.
Just like automatic normal map
generation, there's no
one-size-fits-all solution.
If you have source
art that has a lot
of semi-transparent
pixels, we do allow you
to specify an alpha
threshold, which defines
which pixels are treated
as opaque during the process.
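The alpha-threshold decision can be illustrated with a tiny Python sketch; this is my own illustrative code, not SpriteKit's implementation, and the function name and grid representation are assumptions:

```python
def opaque_mask(alpha_grid, alpha_threshold=0.0):
    """Classify pixels as solid or empty for a per-pixel physics body.

    Each value in `alpha_grid` is a pixel's alpha in [0, 1]. A pixel
    counts as part of the body only if its alpha exceeds the threshold,
    mirroring how an alpha cutoff decides which semi-transparent pixels
    the traced outline should include.
    """
    return [[a > alpha_threshold for a in row] for row in alpha_grid]
```

With the default threshold of 0, any pixel that is not fully transparent is part of the body; raising the threshold drops faint anti-aliased fringes from the traced shape.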
So in summary, per-pixel physics
body is very easy to create.
They're very accurate.
And whether you're an existing
SpriteKit user or new
to the APIs, it's just a
one-line change in your code
to create really accurate
physics simulations.
So again I'd like to point
out the performance tips.
SpriteKit does a really good
job of optimizing this algorithm
and we provide a good
balance between performance
and accuracy, but the
texture size matters.
So if you pass in a
2K by 2K texture and scale it
down to 10 percent
by rendering a 200 by 200
sprite, we still have
to walk the whole 2K
by 2K pixels in order to
figure out the exact shape;
just something
to think about.
Next: a brand new
API, the constraints
to help you really simplify
the game update logic.
The motivation for creating
constraints, for us, is
to really remove the boilerplate
code in your updates.
So, a lot of times if I
want to move a character
but I want a health indicator
to follow the character
that's exactly 5 pixels above,
2 pixels behind...
I need to put that in
the update somewhere.
Or if I have a cannon and want
to orient the cannon
to follow the airplane:
guess what, I need to add code
to calculate the angle,
figure out the delta,
and translate that delta
rotation into the transform.
Or if the airplane wants
to land on the runway,
I need to orient it first.
We do all of that work
for you; we can do it
with simple constraints.
So now, because of the
constraints being added,
we needed some new stages
in our update loop;
we expanded it with
some new selectors.
Number 1, right after the
physics simulation, the scene
now kicks in the constraints.
Because constraints are
applied after physics,
you don't have
to worry about, "Oh,
is the character standing
on a conveyor belt?
Is it going to be pushed
over because the box
is right beside it?"
Constraints will take
care of that for you.
And right after the constraint
update,
you have another chance
to do an update:
we give you a callback,
didApplyConstraints,
so here you have a chance to do
any other last-minute cleanup.
And the basis of constraints
is again a new object,
called SKConstraint.
It wraps
a mathematical constraint
on the properties of the node
that you want to animate.
So the constraints
are then attached
to nodes via the new
constraints array.
And the scene will
apply the constraints
to the attached node.
So what kinds of constraints
can you set on a node?
You can set constraints
on position, orientation
and distance, and you
can quickly enable
and disable them between frames.
So for example, let's
look at a quick example
of how the orientToNode
constraint works.
So we just call SKConstraint's
orientToNode
initializer to follow a node.
Here the arrow is just
following the circle,
so we're just
passing in the circle.
[inaudible] the SKNode, and
the range of offset is 0;
and if you want the arrow to
lead the circle it's following,
or lag behind it, you
have that possibility.
So once the constraint is
created, I just set it directly
on the arrow's constraints
property.
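The math behind an orient-to-node constraint is essentially one atan2 call per frame. Here is a hedged Python sketch of that calculation; the function and parameter names are mine, not SpriteKit API:

```python
import math

def orient_to_node(follower_pos, target_pos, offset=0.0):
    """Return the rotation (radians) that points one node at another.

    This is the calculation an orient-to-node constraint performs every
    frame: take the angle of the vector from the follower to the target
    with atan2, plus an optional lead/lag offset.
    """
    dx = target_pos[0] - follower_pos[0]
    dy = target_pos[1] - follower_pos[1]
    return math.atan2(dy, dx) + offset
```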
Next, how do we set
a position constraint?
So here we create a positional
range of minus 100 to 100.
We can first set the limit
of constraint on the X axis.
This will limit the movement
in the X direction of the node.
Now we can also set
it on the Y direction,
that will give a
limitation on the Y axis.
And if you combine these two
together you are limiting the
movement of the current
node to a 200 by 200 box;
X-TIMESTAMP-MAP=MPEGTS:181083,LOCAL:00:00:00.000
movement of the current
node to a 200 by 200 box;
very, very simple to use.
You no longer have to
write all this update code
that will manually snap
the object back in the box.
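The X and Y range constraints just described boil down to clamping each axis independently. A minimal Python sketch of that idea (illustrative only; the function and parameter names are my own assumptions):

```python
def constrain_position(pos, x_range=None, y_range=None):
    """Clamp a node's (x, y) into optional per-axis ranges.

    Each range is a (lower, upper) pair. Passing only one range limits
    movement on one axis; passing both confines the node to a box,
    which is the effect of stacking an X constraint and a Y constraint.
    """
    x, y = pos
    if x_range is not None:
        x = min(max(x, x_range[0]), x_range[1])
    if y_range is not None:
        y = min(max(y, y_range[0]), y_range[1])
    return (x, y)
```

A node pushed to (250, -340) with both axes limited to (-100, 100) snaps back to the corner of the 200 by 200 box, with no hand-written update code.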
So in summary, it really
helps you remove a lot
of the boilerplate code,
making sure you just write
the code that's focused
on building the game
you want rather
than having to fix up things.
Also, because the constraints
property is an array,
you can add multiple
constraints to the same array,
and the order of evaluation
is based on the order of
insertion into the array.
And we offer a lot
of varieties of constraints
on position, orientation
and distance.
Next, I'd like to talk
about inverse kinematics.
This is usually a very
strange term for people
who don't have a mechanical
engineering degree or
who haven't written
an animation engine.
So inverse kinematics allows
you to use kinematics equations
to solve for the joint
parameters when you have a joint
hierarchy trying to reach
for things in 2D space.
So here I'm trying to make
the robot arm reach
for where my mouse cursor is
pointing, and I want to set
up the exact behavior of
how the robot will move.
Imagine doing that yourself:
you'd have to update it
every frame.
Okay, it's easy to
move the hand, but now
what is the lower
arm going to do?
Followed by what the upper
arm is going to do, to provide
that realistic behavior.
So you can do this for arms
and you can do this for legs.
You can blend it
with animation.
So number 1 of using inverse
kinematics: you need
to have a joint hierarchy.
So for example, if you look
at the robot arm it's a
joint of three pieces.
We start with the upper arm
(that's the root node) followed
by the lower arm (which
is attached as a child
to the upper arm) and then we'll
have the claw (which is attached
as a child to the lower arm).
So now, for each of these
joints, to create a realistic
look I need to set up
some constraints.
For example, my arm, my elbow
probably opens at 180 degrees
and closes at 30
and anything beyond
that range is going to snap.
I can't take it anymore.
So you can set up
that constraint
for each individual
node to create
that really realistic behavior.
So with SpriteKit how do
you do inverse kinematics?
How do I set up the constraint?
How do I set up the
parent-child hierarchy?
You don't have to do that,
because we use the
existing scene graph,
which already has the
parent-child relationships,
and we are good to go.
The only thing you need to set
up is the constraints
on how each joint
opens and closes,
to create the realistic
look that you want;
and we provide actions
to drive these chains.
So the joint rotates
around its anchor point.
So by default the anchor
point is at (0.5, 0.5),
and that's not really
realistic for my shoulder;
I'd probably set it
to (0, 0.5).
So the constraint that we set is
called SKReachConstraints.
It simply has
two properties:
the lower angle limit
and the upper angle limit.
Once you have these angles
specified, you can attach
it to any SKNode
in the scene.
And now you have a perfectly
working joint hierarchy.
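To make the mechanics concrete, here is a small Python sketch of one common inverse kinematics technique, cyclic coordinate descent, with per-joint angle clamping standing in for the lower and upper angle limits. This illustrates the kind of solve an IK system performs for you; it is not SpriteKit's actual solver, and all names are my own:

```python
import math

def forward(angles, lengths):
    """Forward kinematics: pivot positions plus the end effector."""
    pts = [(0.0, 0.0)]
    a = x = y = 0.0
    for rel, ln in zip(angles, lengths):
        a += rel  # accumulate relative joint angles
        x += ln * math.cos(a)
        y += ln * math.sin(a)
        pts.append((x, y))
    return pts

def ccd_reach(angles, lengths, limits, target, iterations=20):
    """Rotate each joint toward the target, clamping to its limits."""
    angles = list(angles)
    for _ in range(iterations):
        for i in reversed(range(len(angles))):
            pts = forward(angles, lengths)
            jx, jy = pts[i]          # this joint's pivot
            ex, ey = pts[-1]         # end of the chain
            cur = math.atan2(ey - jy, ex - jx)
            want = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += want - cur  # swing the end toward the target
            lo, hi = limits[i]       # enforce the reach constraints
            angles[i] = min(max(angles[i], lo), hi)
    return angles
```

With generous limits the chain's tip converges onto any reachable target; with limits locked at zero the chain cannot bend at all, which is the "anything beyond that range is going to snap" behavior the constraints prevent.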
How do I drive it?
To drive it, we provide
SKActions.
We have two variants,
reachToNode and reachToPosition.
So, whether you want to
reach to a moving target
or any stationary position
within the scene,
you can use the
corresponding variant.
So, here I have a quick
example of a one-liner
running the SKAction that
reaches for a target.
Here we have a simple
3-joint chain;
each joint has a
constraint opening
from 0 to 180 degrees.
As you can see, when the mouse
moves it actually obeys the
constraints and tries
not to overbend,
but it really gives you
that realistic look
of mechanical AI.
And now we also take the same
inverse kinematics solver to 3D.
It's much, much more interesting,
but it's implemented very
similarly to this
SpriteKit API.
So you have SCNIKConstraint,
and each node has a
constraints array
that you set these
constraints on.
You also have an animation
influenceFactor.
So here, I'll just
give you a quick demo.
You have a 3D scene with a
3D character playing a punch
animation; nothing
else is running.
But now with IK running
I can blend it on top
of the animation playback
at 60 frames per second,
making sure the hand is always
punching at a red target.
So now imagine the possibility
that I'm building a tennis game.
All I need now is
two animations,
a backhand swing
and a forehand swing;
everything else I'll let IK
take care of, so I don't have
to build an infinite number
of animation combinations
for the game to have
a realistic look.
So this really opened up
a lot of opportunities.
So, in summary, inverse
kinematics is really easy
to use.
You don't even need
to set it up.
The scene graph will
take care of it.
And the constraints can be set
on every single joint
that you have.
You can control the
opening and closing angle
and to drive these chains
you just run a single action
one-liner and tell
the joint to reach
out for a position or a node.
Next, I want to talk
about physics fields.
Now, physics fields are
a type of field force
that applies generated forces
to the objects that
are part of the scene.
So here I have a space
cannon launching cannonballs
that interact with 2 different
radial gravity fields.
As you can see, as the
cannonballs get closer
to the planet, the linear
acceleration gets converted
into angular velocity and
they start orbiting the planet
or shoot out.
So, we use fields
to simulate any physical field
forces, and fields can interact
with the physics bodies
in their region;
the region is where
we define the
field's effect area.
And we provide a lot of
different fields
with this release;
there are about 10 of them.
So, how do those fields
get updated?
Number 1, I need to have
field nodes in the scene graph.
They are just like any SKNode.
You can add them to a scene.
You can run action
on them as well.
You can parent it to
another sprite so if you want
to have a really big cookie
planet you can add a radial
gravity field as a child and
move that cookie planet around
and then the field is
going to follow it.
And if there are physics
bodies located
within this region and the
bitMask matches, the interaction
will start happening.
Now, to control fields,
fields provide a lot
of parameters that allow you
to get the exact look
and interactions
that you want.
Number 1, you can control
the region.
That's the area of effect: how
big an area I want the field
to interact with.
The strength, in combination
with falloff, controls
the magnitude
of the force applied
to each individual object
in the field.
The minimum radius is
just a clamp radius,
and the bitMask can be
used to differentiate
which physics bodies you want
to interact with this field.
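The way strength, falloff and the minimum radius combine can be sketched as a simple decay curve in Python. This shows the general shape of such a formula, not SpriteKit's exact one; the function name and parameters are my own assumptions:

```python
def field_force(strength, falloff, minimum_radius, distance):
    """Magnitude of a field's pull on a body at a given distance.

    The force scales with `strength` and decays with distance raised to
    the `falloff` exponent; `minimum_radius` clamps the distance so a
    body sitting right on top of the source doesn't receive a
    near-infinite force.
    """
    r = max(distance, minimum_radius)
    return strength / (r ** falloff)
```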
Now, let's look at the regions.
An SKRegion defines
the area of effect
for a particular field.
The region defines
this area in 2D space.
By default it is infinite and
you can create a rectangle,
circle or even create
a region from CGPath.
You can do a lot of complicated
operations on them like invert,
subtract, union and intersect.
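Invert, subtract, union and intersect behave like set operations on point-membership tests. A minimal Python sketch of that idea, with regions modeled as predicates; this is illustrative only, not the SKRegion implementation:

```python
def circle(cx, cy, r):
    """Region: all points within radius r of (cx, cy)."""
    return lambda x, y: (x - cx) ** 2 + (y - cy) ** 2 <= r * r

def rectangle(x0, y0, w, h):
    """Region: an axis-aligned w-by-h box anchored at (x0, y0)."""
    return lambda x, y: x0 <= x <= x0 + w and y0 <= y <= y0 + h

# Set-style combinators: each returns a new membership test.
def union(a, b):     return lambda x, y: a(x, y) or b(x, y)
def intersect(a, b): return lambda x, y: a(x, y) and b(x, y)
def subtract(a, b):  return lambda x, y: a(x, y) and not b(x, y)
def invert(a):       return lambda x, y: not a(x, y)
```

For example, subtracting a small circle from a larger one leaves a ring-shaped region: points in the ring are affected by the field, while points in the hole and outside the rim are not.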
So for example, here I'm
building Earth,
and the radial gravity around
the Earth, as you can see,
is pulling everything
towards the center.
So, in addition to physics
bodies, fields can also interact
with SpriteKit particle effects.
As long as you
set the fieldBitMask
on the particle effect,
every single emitted
particle can interact
with the field.
So here we have a noise field
that applies a coherent
noise force to each
of the emitted particles.
Now, let's look at some
of the basic fields
being provided,
so everyone can get a feel
for what fields
really look like.
So, by default we provide the
basic linear gravity field.
This just simulates
Earth's gravity in one direction,
and you can change the direction
at any time; if you point it
up, it will attract objects
in that direction instead.
And second, if I want to
simulate a space game
with a planetary
gravity effect,
we have the radial gravity
field node that you can use.
For example, here the object
carries a linear velocity,
but when it gets close
to the orbit the linear
velocity is converted
into angular velocity,
so the object actually
orbits around the planet.
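The orbit behavior falls out of basic numerical integration under a central inverse-square pull. A hedged Python sketch follows; this is my own toy integrator, not SpriteKit's physics engine:

```python
import math

def gravity_step(pos, vel, center, strength, dt):
    """One semi-implicit Euler step under an inverse-square radial pull.

    The acceleration points from the body toward `center` and scales as
    strength / r^2, so a body with the right tangential speed settles
    into a near-circular orbit instead of falling straight in.
    """
    dx, dy = center[0] - pos[0], center[1] - pos[1]
    r = math.hypot(dx, dy)
    a = strength / (r * r)
    # Update velocity first, then position (semi-implicit Euler).
    vel = (vel[0] + a * dx / r * dt, vel[1] + a * dy / r * dt)
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt), vel
```

Launched at radius 1 with unit tangential speed against unit strength, the body circles the center with its distance staying close to 1 over many steps.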
We also have a spring field.
This is an imaginary field:
imagine every single object
in the field actually has
a spring hooked from one end
and attached to the node.
So, here you can see they're
being oscillated back and forth.
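The oscillation a spring field produces is just Hooke's law integrated over time. A minimal Python sketch of one such step (illustrative; not SpriteKit's implementation):

```python
def spring_step(pos, vel, anchor, stiffness, dt):
    """One Euler step of a spring pulling a body toward an anchor.

    Hooke's law: acceleration is proportional to displacement and
    points back at the anchor, so the body overshoots and oscillates
    around it, which is the behavior a spring field imposes on
    everything in its region.
    """
    ax = -stiffness * (pos[0] - anchor[0])
    ay = -stiffness * (pos[1] - anchor[1])
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel
```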
And we also have noise fields
that apply a coherent
noise force
to every single object
participating in the field.
And electric fields
are particularly cool.
So, imagine each of the objects
have charges, positive charges
and negative charges and here
we have an electric field
that carries positive charges.
A positive charge attracts
objects with negative charges
and repels objects
with the same charge.
So, here the red particles --
red cannons actually --
carry a positive
charge, so they get repelled away.
And the green ones
carry a negative charge
and are attracted to
and interact
with the electric field.
So, we provide
physics fields
as building blocks,
like Legos.
Feel free to play
with them and
combine them together.
You can combine
them together
into bigger building blocks, and
each field can interact
with the other fields.
So for example, if I want
to implement one variation
of the Lorenz attractor I can
simply have 4 magnetic fields
sitting right by each
other with opposite charges.
And, what happens if I send
particles through the field?
So, that's what it looks like --
which is very cool.
[ Applause ]
So, in summary, the
fields are very fast;
they're very efficient.
We have a brand new
implementation,
and we actually
spent a lot of optimization
effort on this feature.
And you can use fields
to interact
with either physics
bodies or particles to have
that really fun interaction
experience for the user.
And you can also use fields
to interact with other fields
to have a combined effect.
Next, I want to talk about
integration with SceneKit.
We worked really closely
together with the SceneKit team
to make sure we have the
best possible experience
for bringing 3D content
into 2D games.
So, here we have a demo.
The spaceship is a 3D
object brought into a 2D
scene, and the same goes
for the asteroid: it's a
3D object that we bring
into the 2D background.
So, the integration
brings new possibilities
to developing 2D games so we
can now officially include 3D
content into SpriteKit games.
You can control any
of the 3D objects just
like any other SKNode.
You can run actions on it.
You can scale it.
You can make it follow paths or
even apply any 3D manipulations.
The two frameworks
are deeply integrated,
yet they remain
loosely coupled
as two independent solutions
for game developers.
So, it is rendered very
efficiently together:
SceneKit is rendering directly
into the OpenGL context.
We're not passing -- we're
not rendering to texture
and then passing texture around
between the two frameworks.
This is a very efficient
solution.
So, to bring 3D content
into 2D-based games
we created SK3DNode.
It is the toll-free
bridge allowing you
to incorporate any 3D content
into SpriteKit-based games.
So, once you have an SK3DNode you
can attach any SCNScene
to it in order for
it to render in the SKScene.
And once you have the SCNScene,
you set the scnScene
property on the SK3DNode, and it
will start rendering using SceneKit
in our render path.
So, how do we create
an SK3DNode?
We have the initializer
nodeWithViewportSize:,
where you
specify a static size.
You can attach any
SceneKit scene
or SceneKit objects through
the scnScene property.
You also have access to
the pointOfView property,
an SCNNode, which gives you
the point of view --
if there is a camera,
where the camera is looking
at the current scene.
And if the scene doesn't
have any lighting, you can use
the one-liner
autoenablesDefaultLighting,
which will
add a default light to the scene
so all the objects
are properly lit.
So, here is a quick example:
if we want to add a 3D alien
from a [inaudible] into an SKScene,
we create an SK3DNode,
initialize it
with a default viewport,
load the SCNScene, set the scene
and add it to the SKScene.
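In code, that example might look like this ("alien.dae" and the sizes are placeholders):

```swift
import SpriteKit
import SceneKit

// Embed a SceneKit scene inside the SpriteKit scene.
let node3D = SK3DNode(viewportSize: CGSize(width: 200, height: 200))
node3D.scnScene = SCNScene(named: "alien.dae")  // hypothetical asset
node3D.autoenablesDefaultLighting = true        // light it if the scene has no lights
node3D.position = CGPoint(x: 512, y: 384)
skScene.addChild(node3D)                        // skScene: your SKScene
```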
Now, the 3D alien object
is going to appear.
Now, the integration
also goes both ways.
SpriteKit now powers all the
texture needs for SceneKit
as well as sounds, so you can
use any SpriteKit texture object
directly in SceneKit,
including all
of the tools we built
in the last version:
the automatic
TextureAtlas generation
within Xcode as well
as the procedurally
generated normal maps.
So, you can automatically
generate a normal map,
put it on any 3D object and
the effect looks really,
really cool.
And SpriteKit and SceneKit
also share the same audio
playback interface.
So, having the integration
between the two frameworks really,
really adds a lot of
possibilities here.
You can have another level
of interaction with your user.
For example, you have a constant
3D background and all
of a sudden you see a 3D
object flying out of the screen.
It actually gives the user
a third perspective
of what a game looks like.
Lastly, I want to
talk about tools here.
So, for Xcode 6 we have released
a brand new SpriteKit editor.
It is part of the Xcode release and
you can use it to create any
of the game scenes
without writing any code.
You can also use it to interact
with any of the SpriteKit features.
In a nutshell, everything you
have seen here today can be done
inside of SpriteKit Editor
without writing any
lines of code.
It forces you also...
[ Applause ]
Thank you!
It also enables you to
use a data-driven model.
Writing games usually
involves a lot of data.
And we want to shift
the focus from hand-coding
each individual level
toward a generic,
data-oriented
programming model.
So, now with SpriteKit Editor
we actually separate the game
content from the game logic.
So, you no longer have to
manually add a spaceship,
set its position at 10, 10,
recompile
and launch the game and go,
"I'm about five pixels off.
Maybe I'll add 5 pixels and..."
It takes all the guessing work
out of your iteration process!
[ Applause ]
And we also provide
simplified game templates
in both Swift and Objective-C.
So, out of the box
you are good to go:
you have a brand new scene
created for you, ready
for drag-and-drop and
play, so you can create your level
and make sure your game
is running on day one.
Not only can you use
SpriteKit Editor as an editor,
you can use it as
a debugger as well.
So, if you're in the middle
of running your scene,
and one of your ships
goes missing,
you can use one line
of code -- just type
it in the debugger --
and you get an SKS file.
And guess what, you can
load that back into Xcode,
see what's going on in that
scene and trace back exactly
what the scene hierarchy is.
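The exact line isn't spoken in the session, but since .sks files are keyed archives, one plausible version of that debugger one-liner is (the output path is just an example):

```swift
import SpriteKit

// Serialize the live scene so it can be opened in the SpriteKit Editor.
NSKeyedArchiver.archiveRootObject(scene, toFile: "/tmp/debug.sks") // scene: the running SKScene
```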
So, in the case of, say...
If the spaceship got hidden
because of the Z order,
you can totally see
that within Xcode.
And if you have an existing
game that's not even written
for data-oriented programming,
you can use the same line
of code to serialize it out.
And if you need to retouch
it or add new features,
drag it into Xcode, use the
editor and add the new features in.
They are ready to go.
[ Applause ]
So, some of the basic
features that we provide
for the SpriteKit
Editor allow you
to do basic object
manipulation and placement,
they include position,
orientation and scale.
You can set up physics bodies:
bounding boxes, bounding circles
or even the brand new
per-pixel physics setup.
You can bring in 3D content
from [inaudible] directly
into a 2D scene, save
it and load it in the game,
and see the 3D object
ready to be manipulated.
And you can set up shadows
and lighting effects,
and inverse kinematics.
You can set up an inverse
kinematic joint hierarchy right
inside of Xcode and preview
that effect right here.
We also provide an integrated
shader editor, allowing you
to have a WYSIWYG effect
of editing your shaders
and tuning your shader uniforms.
So, I'm going to give
you a quick demo.
So, from Monday's talk, the
"State of the Union" demo,
hopefully you have seen
how to use SpriteKit Editor
to create physics bodies, set
up per-pixel physics collisions
and also interact with
field forces and 3D objects.
So, today the topics that
I'm going to cover are how
to use SpriteKit
Editor to set up lights
and shadows, how to use
inverse kinematics and how
to use the built-in
shader editor
to quickly iterate your shaders.
So, let's have a look.
So, here we have a brand
new lighting scene.
So, nothing is in
here and ready to go.
If you click in the object
library, it shows you the
existing textures that are
in the current project.
So, if I just drag
in a cobblestone
and I can make it
slightly bigger.
And to see any of the
SpriteKit widgets,
you just open the object library
and, because we're editing
a SpriteKit scene,
it's context-sensitive.
It knows these are the SpriteKit
objects that are relevant
for this editing experience.
And now let's drag a light in.
It doesn't do quite what I want
yet, because nothing is lit yet.
As you can see, the lighting
mask that we have on the sprite --
on the cobblestone --
is not set.
So, here if we set it to
1, you see that right away:
no code writing, nothing.
So, if you save this file,
load it in your scene,
this is exactly what
you're going to get,
because SpriteKit Editor
actually uses SpriteKit rendering
right inside of Xcode.
So, now we can move the scene
around, move the light around.
You can see different effects.
And now it's kind of
2D-ish and blandish.
Maybe we can change the texture
so we can automatically generate
a normal map on the fly.
Say, if I want to make
the stone a little bit sharper,
but also have a bit more
contrast, I can do that.
It's just two numbers.
And if I want to
make it slightly...
mm, maybe it's too sharp --
it might look very
uncomfortable to walk on.
So, I can
lower that, and that's a
slightly more subtle look.
And if I move that object
around as you can see
the light, real light.
[ Applause ]
So, now let's go ahead and
add a stone object here.
That's a little bit too big.
Let's make it smaller.
And I want the stone
to be lit as well
so let's set the
lighting mask to be 1,
same as the scene object, okay.
And because we want to have
maybe a 3D look for this,
maybe the stone will
need to cast shadows...
we just need to set the
shadow mask on that.
So, move it around and now
we can change some properties
of lighting.
So, I can change the
lighting color...
Oh that's a little weird.
A normal-ish, warm color
is the way to go, so there.
As soon as you hit File,
Save the scene is ready
and you are good to go.
You don't have to do SKLightNode
init, add it to the scene,
position equals -- none of that.
So, this is very cool.
So, next, what I want to show
is inverse kinematics,
so here I have a
preassembled robot.
I'm just going to give
you a quick overview.
So, here I have the arm objects.
The upper arm is parented
directly to the scene.
The lower arm is attached as
a child of the upper arm.
And we have the claw
attached to the lower arm.
So, to set
up inverse kinematics
for this robot I just need
to start simulating the scene,
select these objects, and I
can run inverse kinematics
on the robot right
inside of the editor
and see how it's set up.
And if that doesn't
look quite right, maybe I need
to set a little bit of a
constraint on there.
Maybe I need to limit it to, say,
90 degrees to 180 degrees
for that joint, and you
can have the same effect.
As you can see it
actually reaches back
and it will not over-bend
that arm at the elbow.
So, that's the inverse
kinematics, very easy to set up.
Again, you don't have to
write any lines of code
to see the effect or
use any of the features here.
Next, I want to go
over our shaders.
Using shaders is very easy.
So, here from the widget
library I can just pull
in a solid color sprite.
And here I happen to have a
custom shader that I can run.
So, how do I set
a custom shader?
SpriteKit Editor automatically
processes your workspace
and figures out which
FSH files you have.
So, here I have a single one.
I set that and boom,
I'm good to go.
So...
[ Applause ]
This is only half
of the equation.
Actually, the iteration
experience is even better.
So, here if I want to make a
change to the shader I just call
in the assistant editor.
If I just click on the object,
it knows which shader I used
and it brings up the
shader source side-by-side
and you're ready to edit.
You're ready to make changes and
monitor your whole workspace.
So, now say I notice the radius
or the center of the circle --
I want to shift it a little bit.
So, I change it to 0.2 and
0.2, and then what happens?
Usually you need to
rebuild your application.
You need to rerun
the application.
The application will upload
the new shader to OpenGL,
the GL driver will compile it, and
if you forgot a semicolon,
guess what?
You start that process again.
Here you just need
to do File, Save.
So, if I make -- Oh, it
automatically saved for me.
So, if I change it back to 0.5,
0.5 and File, Save, you see
the live change right here.
And what if I decide to
add a brand new uniform
to the shader?
So, here I want to
add a speed parameter
so I can control the
effect that I'm having here.
So, if I save that, guess what?
You get real OpenGL error
annotations right inside of Xcode.
So, here, because I
introduced a new uniform
that has not been declared,
that line fails, and
the two other places
that reference the new uniform
will fail as well.
So, how do we fix that?
Because we added this new
uniform, let's go ahead and add
it in the uniforms dictionary.
So, because we're calling it
"u_speed" we make
sure the name matches --
u_speed -- and give it a value of 0.
And if I change that to 1
I see the live effect right
in the editor and
make it spin faster.
Ooh, see there?
OK, let's make it slower.
So, live shader editing,
right inside of Xcode.
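Setting the same thing up in code might look like this (the shader file and uniform names mirror the demo but are assumptions):

```swift
import SpriteKit

let sprite = SKSpriteNode(color: SKColor.whiteColor(),
                          size: CGSize(width: 256, height: 256))

// Attach a fragment shader from the workspace.
let shader = SKShader(fileNamed: "circle.fsh")  // hypothetical .fsh file

// Declare the uniform the shader references; names must match.
shader.uniforms = [SKUniform(name: "u_speed", float: 1.0)]
sprite.shader = shader
```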
[ Applause ]
So that's the demo of
our SpriteKit Editor.
Lastly, I want to go over
some additional improvements
that we have done
to the existing SpriteKit
framework APIs.
So, for those of you who
are new to SpriteKit,
here is the brand
new update loop
that we have for this year.
As the frame starts we
start with update function.
So, this is where you can
set up your game logic.
After that, the scene will
start evaluating actions.
After the actions are evaluated,
the user gets a callback saying,
"Okay, if I animated or moved
this object from A to B,
do I need to do anything else?"
After the actions,
physics kicks in and steps
the physics simulation
for the frame.
Once physics simulation
is finished,
user gets another callback
with the simulated physics.
And this is where
you can set up, say,
if the player gets pushed off,
maybe I move back
by another 5 pixels.
And now, with the
brand-new constraints API,
the scene will
apply constraints right
at this moment.
After constraints are
applied, the user gets another
notification with
didApplyConstraints.
We also added one more
selector for the user to react
to, called didFinishUpdate.
This is absolutely the last
stop before SpriteKit packages
everything up and sends
it to the GPU
for the current frame.
And SpriteKit renders
the current frame
and the same loop continues
60 times per second.
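In an SKScene subclass, those per-frame callbacks line up like this (a bare sketch; the class name is arbitrary):

```swift
import SpriteKit

class GameScene: SKScene {
    // 1. Frame starts: game logic goes here.
    override func update(currentTime: NSTimeInterval) { }

    // 2. After actions have been evaluated.
    override func didEvaluateActions() { }

    // 3. After the physics simulation has stepped.
    override func didSimulatePhysics() { }

    // 4. After constraints have been applied (new this year).
    override func didApplyConstraints() { }

    // 5. Last stop before SpriteKit renders the frame (new this year).
    override func didFinishUpdate() { }
}
```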
Now, SKTexture got a
little bit revamped here.
We introduced a new type of
texture called mutable texture.
You can create it from data, and
it can be modified very efficiently
every frame.
We provide a callback
block allowing you
to make modifications
through the raw pointer.
So, here I'm just
making changes
to the raw pixel data
directly.
So, if you have a really cool
CPU-based post-processing effect
and you want to modify
a texture,
you have the freedom to do that.
If you have custom
data that you want to send
to a shader as input, by packing
up the data as textures,
you can do that as well.
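A minimal sketch of the mutable-texture callback (here just filling the buffer with opaque white):

```swift
import SpriteKit

// A texture whose raw pixels can be rewritten every frame.
let texture = SKMutableTexture(size: CGSize(width: 128, height: 128))
texture.modifyPixelDataWithBlock { pixelData, lengthInBytes in
    // pixelData points at raw RGBA8 bytes; fill with opaque white.
    memset(pixelData, 0xFF, lengthInBytes)
}
```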
Also SKTextures can
generate noise textures now.
It generates coherent
noise that gives you either
scalar noise or noise
vectors on a sphere.
So, it supports noise generated
in either grayscale
or color output.
If it's generated from
noise vectors, we have to stay
in the color output space.
So, here, to create the noise
texture, you just call
textureNoiseWithSmoothness,
where you can control the smoothness
as well as the size.
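For example (sizes and smoothness values are arbitrary):

```swift
import SpriteKit

// Grayscale coherent noise.
let noise = SKTexture(noiseWithSmoothness: 0.5,
                      size: CGSize(width: 256, height: 256),
                      grayscale: true)

// Vector noise (unit vectors), which stays in color output.
let vectorNoise = SKTexture(vectorNoiseWithSmoothness: 0.5,
                            size: CGSize(width: 256, height: 256))
```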
Now, SKShapeNode also received
a lot of revamp this year.
So, we added convenience
constructors for common shapes.
They include rectangles,
circles, ellipses and splines.
We also allow you to set texture
and shaders for both the stroke
and fill for the actual shape.
You can use ShapeNode
to interact
with your physics as well.
If you build up this very
complex shape using ShapeNode,
you can just access the
path property directly
to get a CGPath,
send it directly to physics,
and physics will create a
physics body for you.
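A small sketch of both ideas, with a hypothetical texture name:

```swift
import SpriteKit

// One of the new convenience constructors.
let shape = SKShapeNode(circleOfRadius: 40)
shape.fillTexture = SKTexture(imageNamed: "stone")  // hypothetical asset
shape.strokeColor = SKColor.whiteColor()

// Reuse the shape's CGPath to build a matching physics body.
shape.physicsBody = SKPhysicsBody(polygonFromPath: shape.path)
```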
We've also made creating pin
joints much, much easier.
So, with SKPhysicsBody we now
have a new property called
"pinned".
To pin an object to
another object you just need
to set one property.
So, here I have a big gear.
I want to pin it to the board.
I just set the property to
YES, and SpriteKit will figure
out the parent
coordinate-space conversion
and whether the parent
object has a physics body.
It will take care of all of
that detail for you, very cool.
Now, in addition to pin joint
we also make creating weld joint
really, really easy.
So, a weld joint is just
the same as a pin joint --
pinned equals YES --
but if it doesn't allow rotation,
that means it's welded
to its parent.
So, here I have a small gear
that will be welded
to the big gear.
So, we set 2 properties and the
physics is automatically set
up for you.
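With hypothetical gear assets, the two joints boil down to a property or two:

```swift
import SpriteKit

let bigGear = SKSpriteNode(imageNamed: "gear-big")     // hypothetical asset
let smallGear = SKSpriteNode(imageNamed: "gear-small") // hypothetical asset
bigGear.addChild(smallGear)

// Pin joint: the big gear rotates freely around its parent (the board).
bigGear.physicsBody = SKPhysicsBody(circleOfRadius: 50)
bigGear.physicsBody!.pinned = true

// Weld joint: pinned, with rotation disallowed.
smallGear.physicsBody = SKPhysicsBody(circleOfRadius: 20)
smallGear.physicsBody!.pinned = true
smallGear.physicsBody!.allowsRotation = false  // welded to the big gear
```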
In addition, physics bodies can
now be created as compound
physics bodies.
All you need to do is use
the new SKPhysicsBody
initializer
called bodyWithBodies.
You pass in an array of the
different physics body shapes.
And for example, the hammer here
is composed of 2 rectangles,
one for the head and one for the handle.
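The hammer example might be sketched like this (sizes and asset name are made up):

```swift
import SpriteKit

let hammer = SKSpriteNode(imageNamed: "hammer")  // hypothetical asset

// Two rectangles: one for the head, one for the handle.
let head = SKPhysicsBody(rectangleOfSize: CGSize(width: 60, height: 24),
                         center: CGPoint(x: 0, y: 50))
let handle = SKPhysicsBody(rectangleOfSize: CGSize(width: 12, height: 100))

// Combine them into a single compound body.
hammer.physicsBody = SKPhysicsBody(bodies: [head, handle])
```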
Now, SKTextureAtlas is
one of the key components
that allows users
to have efficient
graphics performance.
Anything that's in a
texture atlas allows OpenGL
to do efficient batching.
So, it's now supported for
both SpriteKit and SceneKit.
We support both retina and
non-retina resolutions.
So, if you have a game and
have all the assets put
in one folder the Texture Atlas
generator will separate them
for you.
So, you don't have to
pay the memory overhead.
If you're loading the Texture
Atlas on a retina device,
you don't have to load
the non-retina asset.
It also supports the
full 32-bit pixel format
and also the compressed
16-bit format.
Now, one of the big
changes: we now support
up to 4k by 4k resolution.
So, it's a simple change in the
Xcode settings for your project.
And in addition we
support Runtime Texture
Atlas generation.
So, if you have downloadable
content -- say, a
user downloaded a new
level and everything is coming
in as loose files, or users go
out and take some pictures
and decide to use them
in-game as a cube map --
you can simply pass them
into the SKTextureAtlas API
and we'll automatically
stitch them for you
and trim off the
transparent pixels.
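A sketch of that runtime path, with hypothetical downloaded image names:

```swift
import SpriteKit
import UIKit

// Stitch loose images into an atlas at runtime.
let images = [
    "enemy1": UIImage(named: "downloaded-enemy1")!,  // hypothetical files
    "enemy2": UIImage(named: "downloaded-enemy2")!,
]
let atlas = SKTextureAtlas(dictionary: images)
let enemyTexture = atlas.textureNamed("enemy1")
```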
So, in summary we really have
a lot of new features packed
in this year's SpriteKit
release.
We have a lot of cool
graphics technology
like custom shaders,
lighting and shadows.
We have really cool simulation
effects like inverse kinematics,
physics field, per-pixel
physics and constraints.
All of these features can
be done using one line,
or no lines at all if you
use the SpriteKit Editor.
So, the SpriteKit Editor is a
new addition to the Xcode family
and is a really,
really cool feature.
I highly encourage
you to use it.
And it's also a good learning
experience to see how any
of the new features interact
with each other within a scene.
And we can't wait to
see what you come
up with using all these technologies
and tools we provided
for this release.
So, with that said, if you
have any questions or feedback
or anything you want to see in
the future,
you can feel free to contact
our Developer Evangelists Allan
Schaffer and Filip Iliescu,
and we have a revamped SpriteKit
Programming Guide that's
on the Developer Portal.
So, if you want to pick
up the documentation
for these new features
they're already there.
And for the related session
right after this session is the
"Best Practices for
Building SpriteKit Games".
We're going to go in depth
on the best performance
practices and how to set
up a game right in order
to use these new features
and set up for scalability.
And as I said before,
we worked really closely
with the SceneKit team to
make sure SceneKit is also a
high-level 3D API
just like SpriteKit.
I highly encourage you to
check out the two SceneKit sessions
tomorrow in the same
room.
And with that said,
thank you very much,
this is the end of the session.
I hope you guys have the
rest of the -- a good week.
[ Applause ]