Transcript
[ Applause ]
>> Thank you for coming.
It's going to be a great week.
Today, in this session, we are going to talk
about all the great technologies that we make
for the graphics and media needs of your applications.
Apple delivers products that give users
incredibly immersive, vivid user experiences.
We saw that today with the release of iPhone 4.
iPhone 4, with its new high-resolution Retina display,
is the perfect platform for really bringing
vivid user experiences to life.
And of course, we see this with the iPad,
the revolutionary iPad, and the Macintosh.
[ Applause ]
>> At the heart of all of this are
the Graphics and Media technologies.
We deliver over twenty frameworks,
over twenty different graphics
and media frameworks for you to
leverage in your application.
These frameworks deliver a huge amount of graphics and
media technology that you can leverage to build these vivid,
immersive user experiences right in your application.
These are the same frameworks that
we use when we build our products.
They are the same technologies that
you see when we demo on stage, right.
These are really fantastic technologies.
These are technologies you should learn about and use to
their fullest, and you will be able to build applications
that really make your users love your products.
So we deliver these technologies at different levels of API.
At the high level on the iPhone, we have a Cocoa Touch
API which is an Objective-C API that makes it easy for you
to get access to the Graphics and Media technologies
and easy for you to incorporate into your product.
We, of course, have low level technologies
such as Quartz for high-quality 2D rendering,
OpenGL for high performance 3D, and
Core Audio for professional grade audio.
These low level technologies give you
the flexibility to do custom things
to get access to all the capabilities of the platform.
And then we have technologies such as Core Animation that
let you mix and match between the different technologies.
They let you take 2D graphics, 3D
graphics, and video, and blend them together
into really custom-immersive user experiences.
Core Animation is a very powerful technology for you
to build really immersive vivid user experiences.
So what we do at Apple is we build the graphics and media
technology of a common platform across all of our products.
So across the desktop Mac OS and across iOS, it is the
same graphics and media technologies, the same foundation.
This means that we've been able to take years of our
experience and we deliver that across all of the platforms,
giving you that same high-quality
professional grade technology.
So what this also means is that we have
optimized this technology for the hardware.
So whether you are trying to get high performance or
get access to all of the features of the hardware,
whether it be microphones and speakers and cameras, this
technology has been optimized for that hardware device.
And what is really important about
that is that the world is going mobile,
and optimized means that when you use our technologies
right, you also are optimizing for battery life.
And battery life is going to be
something very important for your users.
You want to make sure that when
you are building an application,
you are giving your users a good experience
with battery life when going mobile.
So looking at some of the applications
that you have built, there is BoinxTV.
BoinxTV takes our graphics and media
technologies and incorporates them in a way
that brings a live TV studio right on to the Mac.
It is really cool.
Or Pinball HD uses an intuitive Multi-Touch interface
combined with OpenGL ES to take an arcade classic
and make a really great application for the iPad.
Or FourTrack.
FourTrack takes Core Audio and makes a
multi-track recorder right on your iPhone.
That is pretty amazing.
So you have taken all our technologies,
all the graphics and media technologies,
and you have used them to build over 225,000 applications.
That is a huge number.
And of course, I'd like to say thank you because you guys
have built some applications that are absolutely amazing.
So for the rest of the talk, what I want
to focus on is I want to focus on iOS 4.
We are going to talk about some of the new and innovative
features that we are building into iOS 4, specifically,
OpenGL ES and the new extensions we are offering
there, AVFoundation for the audio and video needs
of your application, HTTP Live Streaming, the
best way to stream live video over the internet,
and Game Center for bringing social gaming into your game.
And to start, I am going to bring
Geoff Stahl up to talk about OpenGL ES.
>> Thank you.
[ Applause ]
>> OpenGL ES, it's the open, standards-based technology
that allows you to access the power of the GPU.
When we conceived iOS, we chose
OpenGL ES as a graphics foundation,
and John talked about layering; OpenGL ES
is a full participant in that layering.
Core Animation, for example, is built on top of OpenGL ES.
So no matter what high level API your application uses,
you get the power of OpenGL ES and the power of the GPU.
So before we go into the new features in iOS 4, let's
first review OpenGL ES and what it is.
OpenGL ES is a low level graphics API that uses vertices,
triangles, and triangular meshes to express geometry.
When that is combined with textures and pixel
data, you can render compelling 3D images
such as this Sergeant Shock [phonetic]
who would look right at home in a 3D game.
But it is more than this.
OpenGL ES can be used for high speed compositing as we see
with Core Animation, and also can be used for image effects.
And when we go to OpenGL ES 2.0 and we add shaders,
OpenGL ES can really unlock the power of the GPU.
So what are shaders?
Shaders are small programs written in
OpenGL shading language that run on the GPU
and can either affect the vertex or
fragment parts of the rendering pipeline,
thus allowing you to really customize your drawing
experience and unlock that full power of the GPU.
In this case, we applied dynamic lighting and shadows to
Sergeant Shock to really immerse him in your application.
I think the best way to show this is to look at a demo.
So I'd like to have David Biterman [phonetic]
help me with this demo called Extreme Ball.
[ Applause ]
>> So this demo was created by Digital
Legends for Imagination Technologies,
and it is an immersive 3D world,
a futuristic arena,
and what it shows is the power of OpenGL ES 2.0 and shaders.
First, we are looking at the triangular mesh
that makes up the world as you would expect.
It makes up the structure of the world.
And now, let us look at some of
the fine details in the simulation.
So you have the reflections in the floor, you have the water
with animated normal maps and that specular highlighting,
and the character itself: the facial
expressions, a physics-driven hair model.
The character is animated on a bone structure with motion capture;
vertex shaders animate the bones and skin the mesh, and
fragment shaders provide that lighting detail.
When we look at the character, we can see dynamic lighting,
we see reflections in the floor, and we see shadows cast.
This simulation pushes over a million
and a half polygons per second.
The balls themselves are very simple
geometric meshes, but covered with textures, normal maps,
and specular highlights, it looks
like you have more geometry than you would expect.
Really, all in all, this is a great demonstration of the
power of OpenGL ES 2.0, what you can do with shaders,
and all of this is running on a GPU that
you can hold in the palm of your hand.
Absolutely fantastic, amazing technology.
Again, written by Digital Legends
for Imagination Technologies.
>> Thank you, Dave.
[ Applause ]
>> We have reviewed OpenGL ES.
Let us talk about iOS 4.
So last year, we introduced OpenGL ES 2.0 and shaders.
This year we are going to introduce some really
fantastic new features to add on top of them.
>> So let us jump right in, starting off
with low cost full scene anti-aliasing.
We've added full scene anti-aliasing, which
reduces the aliasing artifacts in your scene
for very low cost while maintaining optimum frame rate.
So it markedly improves the rendering quality.
For example, if you look
at the ball from the previous simulation,
you will see there are aliasing
artifacts on the underside of that ball.
If we turn on full scene anti-aliasing, what you can
see is that these aliasing artifacts are markedly reduced.
This, and the last demonstration you saw,
was running with full scene anti-aliasing on.
You can see the frame rates and throughput we achieved.
Low cost full scene anti-aliasing, new in iOS 4.
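As a rough sketch of how the low cost anti-aliasing is driven from code, here is the APPLE_framebuffer_multisample extension in use: render into a multisampled framebuffer, then resolve it into the framebuffer that backs the screen. The buffer handles, sizes, and the 4x sample count here are illustrative placeholders, not the demo's actual code.

    // Create a 4x multisampled framebuffer (assumes width, height, resolveFramebuffer exist).
    GLuint msaaFramebuffer, msaaColor, msaaDepth;
    glGenFramebuffers(1, &msaaFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, msaaFramebuffer);

    glGenRenderbuffers(1, &msaaColor);
    glBindRenderbuffer(GL_RENDERBUFFER, msaaColor);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_RGBA8_OES, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, msaaColor);

    glGenRenderbuffers(1, &msaaDepth);
    glBindRenderbuffer(GL_RENDERBUFFER, msaaDepth);
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER, 4, GL_DEPTH_COMPONENT16, width, height);
    glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, msaaDepth);

    // ... draw the whole scene into msaaFramebuffer ...

    // At the end of the frame, resolve the samples into the on-screen framebuffer.
    glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, msaaFramebuffer);
    glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, resolveFramebuffer);
    glResolveMultisampleFramebufferAPPLE();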
One of the really important things that you saw
on the demonstration is the way your 3D
scene interacts with lighting and shadows.
So what we have done for iOS 4, we have enhanced shadows.
We have added support for two shadowing
techniques, both shadow mapping and shadow volumes.
We have added depth texture support for shadow mapping.
Shadow mapping is an image-based technique where you take the light and
you render the scene from the point of view of the light
into a depth texture, you accumulate those results, and
then, when you are rendering the scene from the user's point
of view, those results guide you to
what is in shadow and what is out of shadow.
Shadow volumes use a stencil buffer
and allow pixel-accurate shadows,
like you see in this example here.
If you are interested in either of these
more advanced rendering techniques,
we are going to be talking about them later this
week in the advanced OpenGL ES rendering sessions,
where we talk about exactly how this
demonstration right here was created. That is enhanced shadows.
So beyond shadows, a really important
thing in 3D rendering is textures.
And we have a standout example we used
earlier, Pinball HD, fantastic use of textures
to give you that immersive 3D Pinball experience.
So how do they do this?
Well, it is actually fairly simple, and it is a common
and convenient technique that many of you may already use
if you are using OpenGL ES, and that is the texture atlas.
So what is a texture atlas?
A texture atlas is a large texture that
you store all of your smaller textures into,
and then during your rendering, you reference out of that.
As you can see here, we have textures for the
bumpers, for the flippers, and for the table art,
all of that is contained in one texture and it's-- and
you reference that when you are drawing your scene.
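To make the "reference out of that" step concrete, here is a minimal sketch of addressing one piece of art inside an atlas. The atlas size and the pixel rectangle of the bumper art are made up for illustration, and atlasTexture is assumed to be an already-uploaded texture.

    // Pixel rectangle of the bumper art inside a 1024x1024 atlas (illustrative values).
    const float atlasWidth = 1024.0f, atlasHeight = 1024.0f;
    const float x = 256.0f, y = 128.0f, w = 64.0f, h = 64.0f;

    // Texture coordinates for the quad's four corners, normalized into the atlas.
    const GLfloat bumperUVs[] = {
        x / atlasWidth,       y / atlasHeight,
        (x + w) / atlasWidth, y / atlasHeight,
        x / atlasWidth,       (y + h) / atlasHeight,
        (x + w) / atlasWidth, (y + h) / atlasHeight,
    };

    // Bind the single atlas texture once, then draw every piece of table art
    // from it by switching texture coordinates instead of switching textures.
    glBindTexture(GL_TEXTURE_2D, atlasTexture);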
For iOS 4, we have added a new extension, texture max level,
which allows you even more control over texture filtering,
allows you to have pixel-accurate texturing, and
ensures you have the highest quality scenes possible.
And we also added shader texture LOD, which
is level-of-detail control for shaders.
This means you can write a single shader
in which you control the texture filtering.
So whether an object is near to you or far
away from you in the scene,
you can ensure smooth, alias-free texturing across
your scene; the shader can select the texture
and select the mipmap level it samples when texturing.
That is shader texture LOD: two new advanced
texture management routines now available in iOS 4.
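As a minimal sketch of what those two controls look like in practice (the level-count value and sampler names are illustrative): GL_APPLE_texture_max_level clamps which mipmap levels the GPU may sample, and GL_EXT_shader_texture_lod lets a fragment shader pick the mipmap level itself.

    // Clamp sampling to the first few mipmap levels of this texture.
    glBindTexture(GL_TEXTURE_2D, myTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAX_LEVEL_APPLE, 4);  // ignore mips beyond level 4

    // GL_EXT_shader_texture_lod adds texture2DLodEXT()/texture2DGradEXT(), so a
    // fragment shader can select the mipmap level explicitly, for example:
    //   #extension GL_EXT_shader_texture_lod : require
    //   gl_FragColor = texture2DLodEXT(u_sampler, v_uv, u_lod);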
The next thing we have added are
two additional texturing techniques.
The first one is all about integration with media.
YUV 422 textures allow native support
of video on the GPU with OpenGL ES.
With these, you do not have to do costly
format conversions, you maintain an optimal memory layout
for video textures, and you can really
integrate video with your 3D rendering
in one space. That is integration with media: YUV 422 textures.
The second thing we have added for
texturing is floating point textures.
The floating point texture is a little bit different.
Floating point textures allow some new rendering
techniques you could not do before on a handheld GPU.
In this case, I will talk about two of them.
One case: let's say you have a really complex
calculation that you want to run on a per-pixel basis,
running in a shader.
What you can do is precalculate
your data at a granularity that is right for your application,
store that in a floating
point texture, and upload it to the GPU. Then,
when your application is running, that
complex shader calculation collapses
down to a single texture lookup, ensuring your
application maintains optimum performance.
Great technique there.
Also, floating point textures can, of course, be
used for uploading high-precision numerical data.
Let us say you have a height field that you want to upload to
the GPU: put it in a floating point texture, upload it,
and you will be able to reference that data
at floating point resolution.
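Here is a minimal sketch of that second case, assuming the OES_texture_float extension is available on the device: a precalculated height field is uploaded as a single-channel floating point texture. The table size and contents are illustrative.

    // A 256x256 lookup table filled elsewhere with precalculated values.
    GLfloat heightField[256 * 256];
    GLuint floatTexture;
    glGenTextures(1, &floatTexture);
    glBindTexture(GL_TEXTURE_2D, floatTexture);
    // Float textures are typically sampled without filtering on this hardware.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 256, 256, 0,
                 GL_LUMINANCE, GL_FLOAT, heightField);
    // In the fragment shader, the expensive per-pixel calculation now collapses
    // to a single texture2D() lookup into this table.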
So we have added a number of new things to iOS 4.
We have also ensured that it is optimized for performance.
We think this is the most optimized
release ever for our graphic system.
We have added stencil wrap for high-performance
stencil shadows, we have added vertex array objects,
which allow improved vertex efficiency, especially
with multiple vertex buffers, we have added discard framebuffer,
which allows the GPU
to optimize its memory bandwidth use,
and we have paid particular attention to texturing.
We have improved glTexSubImage, and we have also
improved the overall texture upload speed throughout
the system.
All in all, this is the most optimized
release we have ever had.
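As a rough sketch of two of those performance features (the buffer handles, attribute index, and Vertex type here are assumed placeholders): a vertex array object from OES_vertex_array_object captures all of the vertex attribute bindings so a draw becomes a single bind, and EXT_discard_framebuffer tells the GPU it may throw away attachments it no longer needs, saving memory bandwidth.

    // One-time setup: record the attribute bindings into a VAO.
    GLuint vao;
    glGenVertexArraysOES(1, &vao);
    glBindVertexArrayOES(vao);
    glBindBuffer(GL_ARRAY_BUFFER, vertexBuffer);
    glEnableVertexAttribArray(positionAttrib);
    glVertexAttribPointer(positionAttrib, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), 0);
    glBindVertexArrayOES(0);

    // Per frame: one bind replays all of the attribute state above.
    glBindVertexArrayOES(vao);
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);

    // After presenting, the depth buffer's contents are no longer needed.
    const GLenum discards[] = { GL_DEPTH_ATTACHMENT };
    glDiscardFramebufferEXT(GL_FRAMEBUFFER, 1, discards);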
So how do we bring this all together?
Well, let us show another
demo that shows some of the power of OpenGL ES 2.0
and the use of some of the new features of iOS 4.
I'd like to invite up Torsten Reil, the CEO
of NaturalMotion to give the demonstration.
>> I would like to show you something today.
I've been very excited about it.
It is something that we have been working on
at NaturalMotion with our partners at MunkyFun.
We are trying to push the boundaries of what
is possible in terms of interactivity, realism,
and believability of a character on a handheld device.
To do this, we are using advanced OpenGL ES 2.0 features,
as well as our runtime animation engine, Morpheme.
Now, the character that we are working on is a horse.
It is a horse that is much less about my little pony and
much more about creating something that is authentic,
believable, and something you can
interact with in a natural way.
It forms the core component of a game that
we are working on together with MunkyFun,
and I would like to introduce Nick Pavis [phonetic], who's
the CEO of MunkyFun, to show us what this looks like so far.
[ Applause ]
>> So what you see here on this screen
is running live on iPhone 4 on iOS 4.
You see here first of all, starting with the close up of the
horse, we are using high resolution textures for the horse,
as well as advanced OpenGL ES 2.0
shaders to make the coat look natural,
but also for details such as the eyes on the horse.
Now, what you will see in a moment is
that Nick can interact with the horse.
You would not see his fingers here,
but he is stroking the horse,
and the horse is reacting to that
naturally and interactively.
We are using procedural animations for this driven by the
morpheme animation engine rather than canned animation.
Now, what Nick's done here is send the horse on to
the paddock and have it run around on the paddock.
So the scene itself is about 16,000 polygons,
the horse is another 10,000 polygons,
we have got full scene anti-aliasing running on this scene,
and you can see that the horse itself is running naturally,
again, using a morpheme animation engine.
So what we want to show you now is really, how we are
building this scene out starting with the beginning,
with the basic lighting which you see here.
We are now adding textures to this, and the
textures themselves are high resolution.
So for example, the grass alone uses four different high
resolution textures to avoid repetitiveness and tiles.
Next up, we are adding anti-aliasing to the scene, as you
have just seen, and it is particularly
useful for areas like the fence, for example; especially
when we do the slow camera pans, we get rid of the jaggies.
Again, that is something that is possible now in iOS 4.
And moving on, we are now adding vegetation here.
So you can see the trees have just
been added in the background,
as well as grass in the foregrounds close to the fence.
This is done using Alpha, again on OpenGL ES 2.0 and iOS 4.
And next up, we've added fog.
So the fog is used to make all the scene
components sit together in a natural way.
And we have implemented that again, not
really in the old style, but in a new style.
We are using desaturation shaders for this
again, to give us that extra realistic look.
We are now adding the horse to the scene.
There it is, obviously, right now, without any textures.
But what you see is again, that we are using dynamic
shadows using the shadow buffers in OpenGL ES 2.0.
And also, if you look carefully,
what you can see is that the green
from the grass itself is bleeding into the horse itself.
And the same is true for the sky as well.
That again, gives us an additional level of realism
which is only really possible now with OpenGL ES 2.0.
And Nick is now going to add the texture
on the horse here, as well as the shaders,
again, which gives the coat a realistic look.
Next up, I think Nick is going to send the horse off.
[ Laughter ]
>> Not 100 % realistic yet.
But I think, as a next step, he is
adding the animation engine to it.
And now, the horse is running.
And I think Nick's just actually triggered that jump.
You can see we get very smooth
transitions from one gait to the other one,
and we also have sound in the background running as
well, which is triggered off the animation events.
Now, Nick's just called the horse back to the gates,
presumably to give it a few more
strokes for having screwed up the demo.
So this is really where we are right now.
We are amazed at the power that is now available
on the iPhone and in the handheld device.
And we are really excited about the possibilities
that we now have, and that NaturalMotion together
with MunkyFun, for creating just amazing games.
So thank you very much.
[Applause]
>> Thank you, Torsten.
It's absolutely stunning what they can achieve with a
handheld device, OpenGL ES 2.0, and iOS 4.
Think about where we were three years ago.
Think where we were five years ago.
This is amazing.
This is a device that you hold in your hand, this
fits in your pocket, absolutely amazing power.
So what did we talk about?
OpenGL ES 2.0 and shaders unlock the
fantastic power of the handheld GPU.
Full scene anti-aliasing at very low cost.
Enhanced shadows: shadow mapping and shadow
volumes. New texture management techniques,
new texture formats for media integration unlocking new
rendering techniques, and finally, all of this is optimized
for performance, delivering what we think is
the absolute leading handheld graphic system.
I thank you very much, and I will turn the talk
over to Meriko to talk about our media systems.
[ Applause ]
>> This is awesome.
Thank you, guys, for being here.
I know it is a little bit late on a Monday
night, and you all got here early to see Steve.
We are thrilled to see you all here.
And I am incredibly excited to be talking to you
about the media systems on iOS 4 and iPhone 4 today.
I would like to start by talking about AV Foundation.
AV Foundation is a modern Objective-C 2.0 API that is
designed for maximum flexibility: professional-grade access
to time-based media with drop-in ease of use.
To set the stage, I would like to talk
a little bit about what we had in iOS 3.
In iOS 3, we gave you UIKit API to access the
camera, record data, trim it, and save it off.
We gave you a set of AV Foundation
API for working with audio.
This allows you to record audio, process
audio, and save it off to a file.
You guys responded with fantastic media-based applications,
things like Pro HDR for taking beautiful HDR photography,
Reel Director for recording and editing media,
and Skype for awesome quality audio conferencing.
We said, "Yes, please.
Could we have some more?"
So we spent the last year dramatically
extending AV Foundation.
So first off, we are giving you great access to the camera.
We are giving you control over the
camera and access to its output.
We have fantastic professional grade frame-accurate
editing services, and high-quality performance playback.
And finally, a really easy to use set of export API.
We use this technology to power our applications
inside of iOS 4, from our iPod media playback
to our camera applications to record and trim, and save.
These technologies are at the heart of FaceTime.
Did you guys all get to see FaceTime this morning?
We are-- yeah.
[ Applause ]
>> We are really, really, really proud of that feature.
And Randy showed you iMovie.
iMovie for iPhone 4.
iMovie is built entirely on exactly the same public
AV Foundation API that we are presenting to you in iOS 4.
[ Applause ]
>> It is pretty cool.
So because we're the graphics and imaging
and media teams, we built AV Foundation hand
in hand with our foundation technologies.
It works beautifully with OpenGL ES, it
sounds great with Core Audio and OpenAL,
it's best friends with Core Animation, and it is a first
class citizen in our multitasking operating system.
So I am not kidding when I say it is extensive.
We have more than forty new classes
in AV Foundation for you to work with.
I would like to talk to you a little bit about the
media you can access when you are using AV Foundation.
With all of these classes, you can access
data directly from the camera,
from your user's camera roll, or from their iPod library.
This means you have direct access to their
media libraries, their asset libraries,
and anything that you save off in your application space.
I would like to take a deeper dive into some of these areas.
We are really proud of the new camera on iPhone 4,
and we are giving you great access and control to it.
We are giving you frame level access, we are
giving you control over the hardware features,
and we are also giving you drop-in recording API.
To talk a little bit about recording, what you
are going to do is open an AV capture session,
and all you need to do is pick which preset you
would like to record for your application's use.
We have high, medium, and low presets that automatically
adjust to your user's iPhone, so you do not have to query
and figure out what your aspect ratio should
be or what the right bit rate should be.
So on an iPhone 4, we will record
16 by 9 high-definition video.
And on the iPhone 3GS, we will record
standard definition 640 by 480 video.
We also allow you to record stills
using the AV capture session.
You can record in JPEG format, and we also
allow you to record uncompressed.
We are going to record that in the native format for
your user's iPhone for maximal hardware acceleration.
So in iPhone 4, that is five megapixels of uncompressed
data for you to work with in your photography applications.
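As a minimal sketch of that recording setup (the output URL and delegate are assumed placeholders): create a capture session, pick a quality preset, wire in the camera, and start a movie file output recording.

    #import <AVFoundation/AVFoundation.h>

    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetHigh;   // or Medium / Low

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input) [session addInput:input];

    AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    [session addOutput:movieOutput];

    [session startRunning];
    // self must conform to AVCaptureFileOutputRecordingDelegate; outputURL is a placeholder.
    [movieOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];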
We give you excellent device control.
On all iPhones, you can set the point of
interest for exposure and for white balance.
I'm really looking forward to seeing what you guys do with
HDR and various kinds of image processing applications.
We also, on an iPhone 3GS and an iPhone 4, give you access
to set the focal point or the interest point for the focus.
And on iPhone 4, you can choose between your cameras
and whether or not you would like to enable the flash.
All of these functionalities are available
in automatic mode, just like our camera uses,
or in manual mode if you would like to
set that within your own application.
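Here is a minimal sketch of that device control in manual mode. The transcript also mentions a white balance point of interest; this sketch sticks to the focus and exposure points of interest plus the flash mode, and the 0.5/0.5 coordinates are illustrative (points of interest are normalized from 0 to 1).

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    if ([camera lockForConfiguration:&error]) {
        if ([camera isFocusPointOfInterestSupported]) {
            camera.focusPointOfInterest = CGPointMake(0.5, 0.5);
            camera.focusMode = AVCaptureFocusModeAutoFocus;
        }
        if ([camera isExposurePointOfInterestSupported]) {
            camera.exposurePointOfInterest = CGPointMake(0.5, 0.5);
            camera.exposureMode = AVCaptureExposureModeContinuousAutoExposure;
        }
        if ([camera hasFlash]) {
            camera.flashMode = AVCaptureFlashModeAuto;
        }
        [camera unlockForConfiguration];
    }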
You guys have also shown tremendous interest in
working with the data live off of the camera
as a live source of information for your application.
And in iOS 4, you have got it.
What we are doing is giving you direct and
unfettered access to the frame streaming directly off
of the camera to process and work with as you wish.
So this is great for things like object
tracking, it is great for bar code scanners,
and it is really great for augmented reality.
[ Applause ]
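As a rough sketch of how that frame access looks in code (the pixel format choice and the queue name are assumptions for illustration): attach an AVCaptureVideoDataOutput to the session and receive each frame as a sample buffer on a serial queue.

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.videoSettings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
        forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t frameQueue = dispatch_queue_create("camera.frames", NULL);
    [videoOutput setSampleBufferDelegate:self queue:frameQueue];
    [session addOutput:videoOutput];

    // Delegate callback: every frame streaming off the camera arrives here.
    - (void)captureOutput:(AVCaptureOutput *)output
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
               fromConnection:(AVCaptureConnection *)connection
    {
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // ... run object tracking, bar code scanning, or augmented reality on pixelBuffer ...
    }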
>> So we thought we put together a little demo for you.
So what we have got is an iPhone 4.
We are using the capture classes to put up a live
preview of the data streaming off of the camera.
In addition, we are doing image processing
on all of those frames to look for this tag.
And this tag is really just a unique black and white
bar code that we have coded into our application.
You will see that when David moves
it into frame, and you can see it,
we are pegging this OpenGL ES 2.0
tower of blocks onto that tag.
We are using that tag, and we are doing image
processing on that tag to see its shape.
So we can see it is tilted and we know where the floor is.
So if David tilts the tag, you can see that the blocks
track and move around the base, so they stick there.
This is totally live, high frame
rate, and great performance.
[ Applause ]
>> Because our games team put this together,
they could not help but put a physics model in.
So you can actually knock the blocks over.
[ Laughter ]
And they're still sticking to the floor; tilt
it, and they'll continue sticking to the floor.
It is pretty cool.
I hope this provided some inspiration
for your applications using the camera.
So in iOS 3, we gave you really basic
access to editing on the iPhone.
We basically gave you trim and save.
We have dramatically expanded this inside of AV
Foundation in iOS 4, and we are really proud of our models.
Again, it is professional grade,
frame accurate, access to editing.
And I would like to take you through
a little bit about how that works.
So the first thing you are going to do is
you are going to open an AV composition.
You are going to pick your clips, and then
you are going to set the areas that you like.
Once you do that, you will sequence them into a composition.
Once they are in a composition, you can start working
with them live within the composition, rendering them out.
We have got easy drop-in opacity and affine
transforms for great transitions, super easy to use.
We also love Core Animation.
So if you want to do something a little more custom, titling
or your own transitions, you can do that too; lay those in.
And again, we really like audio here in this group.
So you can do all kinds of things with audio.
It is professional grade, you can lay in a new track, you
can mix down the existing tracks, you can pass them through,
you can overlay an audio track if you like.
It is pretty cool.
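To ground those steps, here is a minimal sketch of sequencing two clips into a mutable composition; the asset URLs and the five-second time range are placeholders, and the transitions and audio mix mentioned above would be layered on separately.

    AVURLAsset *clipA = [AVURLAsset URLAssetWithURL:urlA options:nil];
    AVURLAsset *clipB = [AVURLAsset URLAssetWithURL:urlB options:nil];

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;
    CMTimeRange firstFive = CMTimeRangeMake(kCMTimeZero, CMTimeMake(5, 1));
    // Lay clip A's first five seconds at the start, then clip B's right after it.
    [videoTrack insertTimeRange:firstFive
                        ofTrack:[[clipA tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero
                          error:&error];
    [videoTrack insertTimeRange:firstFive
                        ofTrack:[[clipB tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:CMTimeMake(5, 1)
                          error:&error];
    // Opacity ramps and affine transforms go in an AVMutableVideoComposition;
    // audio mixing goes in an AVMutableAudioMix.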
So once you have that composition, you probably
want to be able to play it back in preview.
We have a high-performance, beautiful
playback engine on the iPhone.
Since iPhone 1.0, we have been compositing beautiful,
animated, blended alpha user interface on top of this.
It sticks to your fingers, operates
completely in real time, does not drop frames.
This is really important to us, and we are very
glad in iOS 4 to extend that capability to you.
Now, how do you do that?
You do that by picking up an AV player.
AV player is the way you inspect your media,
understand its structure, and work with your media.
You can observe and control your playback state,
you can enable and disable tracks, you can set in
and out points, you can even control your playback rate.
So now, you have got a whole lot of information
about your media and the ability to control it.
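As a minimal sketch of that player setup (the composition from the earlier sketch is reused here; the rate and seek values are illustrative):

    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
    AVPlayer *player = [AVPlayer playerWithPlayerItem:item];

    // Observe the playback state, then control it.
    [player addObserver:self forKeyPath:@"status" options:0 context:NULL];
    [player play];
    player.rate = 2.0;                      // playback rate is under your control
    [player seekToTime:CMTimeMake(10, 1)];  // jump to the 10-second mark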
>> The next thing you want to do is
probably lay in a custom control.
Now, the way we do this is that all of our media
plays directly back into a Core Animation layer.
It is a first class citizen in
your Core Animation rendered tree.
So you can draw your own custom controls, tack them on in,
and then you can use something
that we call AV Synchronized layer.
We think AV Synchronized layer is super cool.
What this does-- if you will bear with me a minute--
is that you can take the time base of your movie and use it
to drive the animation of your Core
Animation layer tree or subtree.
And what this means is that no matter what is going on,
whether your user is playing back and forward, backward,
scrubbing, you can peg exactly the animation point
you want in your UI to that spot in the movie.
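Here is a minimal sketch of that wiring, assuming a player from the previous sketch and a hypothetical thumbnailStripLayer holding the custom scrubber art: any layers added to the synchronized layer take their timing from the player item, so their animations stay pegged to the movie whether the user plays, scrubs, or steps backward.

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
    [containerView.layer addSublayer:playerLayer];

    AVSynchronizedLayer *syncLayer =
        [AVSynchronizedLayer synchronizedLayerWithPlayerItem:player.currentItem];
    [syncLayer addSublayer:thumbnailStripLayer];   // e.g. the strip of key-frame thumbnails
    [containerView.layer addSublayer:syncLayer];
    // Animations attached to thumbnailStripLayer now run on the movie's time base.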
We put together a little bit of demo to show you this.
We call it CoverPlay.
We took the movie that Randy made
this morning in the keynote.
That is 720P full frame rate video.
[ Background Music ]
>> We can start it playing.
So what we have done is we have used the AV Asset Image
Generator to pull out all of the key frames from that movie
and lay each one of them into their own layer.
We are using AV player to play
back the main movie, full frame,
and then we are using Core Animation
to animate those layers of thumbnails.
And those layers of thumbnails are
actually our scrubber bar for this movie.
These thumbnails stick to the same spot in the
movie at all times, and the way that works is
that the user input gesture is
controlling the time base of the movie.
The time base of the movie is then controlling
that animation using an AV Synchronized layer.
I think this is pretty cool.
Our users love to share their media, and we want
to make it really easy for you to help them.
So what we have got is a set of AV export sessions.
Again, just like in recording, we have presets.
You can choose high, medium, and low, and we will
keep track of what kind of phone your user has.
We also will allow you to pass through tracks.
We can set trim on pass through tracks, or
enable or disable any particular track you like.
So it is pretty flexible.
We also have fantastic support for
metadata in our export classes.
We support ID3, iTunes, and QuickTime metadata.
We will propagate your data from source to destination,
we will allow you to add or edit metadata as you like,
and you can even do something like take
your Core Location GPS data
and lay that into the QuickTime metadata
location inside of a movie track.
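As a minimal sketch of export with a preset and attached metadata (the output URL and the location string are placeholders):

    AVAssetExportSession *export =
        [[AVAssetExportSession alloc] initWithAsset:composition
                                         presetName:AVAssetExportPresetMediumQuality];
    export.outputURL = outputURL;
    export.outputFileType = AVFileTypeQuickTimeMovie;

    // Attach a QuickTime metadata location item (ISO 6709 latitude/longitude string).
    AVMutableMetadataItem *location = [AVMutableMetadataItem metadataItem];
    location.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
    location.key = AVMetadataQuickTimeMetadataKeyLocationISO6709;
    location.value = @"+37.3318-122.0312/";
    export.metadata = [NSArray arrayWithObject:location];

    [export exportAsynchronouslyWithCompletionHandler:^{
        if (export.status == AVAssetExportSessionStatusCompleted) {
            // Hand the finished file off for sharing, saving, and so on.
        }
    }];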
I challenged the AV Foundation team to make a
demo that uses as many of these classes as we could.
So what they built me was a knock, knock joke generator.
I am not sure if they were trying to tell me something.
But what you have got here is a series of movies.
Each of them has a thumbnail, and a set of themes.
And what we are going to do is [inaudible]--
I have not seen Roger in a couple of days.
Let us get him to tell us a joke.
This is going to use AV composition to edit
these clips together and play them back.
>> Knock, knock.
>> Who's there?
>> Sam.
>> Sam who?
>> Sam Francisco, here I come.
[ Laughter ]
>> You guys like the joke?
[ Cheering ]
>> Okay. Okay.
If we can go back to the beginning, we
will dive into a little more detail.
So what is going on here is when we made these movies, every
engineer read the entire knock, knock joke start to finish.
And then we use the AV Foundation
API to mark the in and out points.
When we use the composition, we are
using the built-in transitions I talked
about to make those cross fades and pushes.
So if you want to do another one,
you guys can watch for that.
>> Knock, knock.
>> Who's there?
>> Interrupting cow.
>> Interrupting cow moo.
[ Laughter ]
>> That sounds like a thumbs down.
[ Booing ]
>> Finally, they really wanted to
show off the power of Core Animation
and how well it integrates with
all of the AV Foundation classes.
So in this middle reel, what you have
got is a series of clip art and a series
of descriptions of the Core Animation transitions.
When you watch this next movie playback,
you will see the badges dancing.
All of those are rendering live at full frame rate
on top of 720P video that we took on iPhone 4.
One more joke.
>> Knock, knock.
>> Who's there?
>> Candice.
>> Candice who?
>> Candice be the last knock, knock joke?
[ Laughter ]
>> Yeah, okay.
[ Cheering ]
>> So those arms are animated in
and out with Core Animation.
We also created some custom playback controls.
We have got a play button, a rewind button,
a scrubber, and even a share button.
Maybe we will save off the metadata about
whether you like the joke before we send it out.
[ Applause ]
>> So I wanna talk for just a moment about multitasking.
It is one of my favorite features in iOS 4.
I think it is hugely powerful.
We built the AV Foundation classes hand
in hand with the multitasking system,
so that your users will get the behaviors
they expect with the media system.
Just like audio sessions, you can set your multitasking
category directly inside the AV Foundation API,
and we will take care of starting and stopping
system services for you as your users expect.
So for instance, we do not allow
you to record in the background,
so we will shut down the camera
preview and bring it back up.
You do not have to keep track of that.
We will maintain a list of what has changed in the state
of your application while your users multitask out,
provide that information back to you
when your user comes back into your app,
so you can update your user interface to match.
We have also enabled export in the
background, which is pretty cool.
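As a minimal sketch of the first step in making your media behave under multitasking: pick an audio session category. (Continuing playback in the background also assumes the app declares the audio background mode in its Info.plist.)

    NSError *error = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Playback category: the right choice for apps whose audio should keep going.
    [audioSession setCategory:AVAudioSessionCategoryPlayback error:&error];
    [audioSession setActive:YES error:&error];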
So AV Foundation, we have more than forty new classes.
We have great access to the camera and control over
it, we have professional grade editing services
that are frame accurate, we have performant
playback, and easy to use export.
We are really excited about AV Foundation.
So I would like to change tracks for a
moment and talk about HTTP Live Streaming,
which is something pretty near and dear to my heart.
Last year, we stood up on stage with our friends
from CNN and MLB as we introduced our open standard
for high-quality, internet-based streaming media.
They showed you an introduction to what
they were planning to ship with iOS 3.
And in the last year, your response has been overwhelming.
These are just a few of the folks who have
fantastic applications up in the store.
Our friends in the broadcast
industry have signed up as well. Inlet
has built-in support for streaming
media in their encoder boxes,
and they are streaming a ton of
data over the network every day.
So to review how HTTP Live Streaming works, I would
like to take you through a few of the features.
At its heart, it is adaptive.
We are serving you the bits you need
for the feed that matches your speed.
We are continuously analyzing your user's bandwidth
and determining the highest quality stream
they can get that matches that bandwidth.
So you get full frame video and seamless audio.
If your network hiccups or your users change state, say they
drop down to 3G, we will move them to a lower stream
without ever interrupting their audio or
If your network comes back, we will bring it back up.
This is really, really beautiful.
I am hoping some of you guys have played with Netflix
on an iPad and seen a little bit about how this works.
Streaming is great for live content, things
like live sporting events and concerts.
It is also great for on-demand content, whether you want
to rebroadcast lectures, watch TV shows, watch a movie.
And we are built on the industry's best standard
codecs for beautiful video and crystal clear audio.
It's H.264 for video and AAC for audio.
We have stream level encryption to keep your
content partners feeling secure and happy.
And on the networking side, we are built on top of HTTP.
And that sounds pretty simple,
but it's really pretty profound,
because what it gives us is the maximum
compatibility with the existing web infrastructure.
This makes it really firewall-friendly.
If your users can download a webpage or look at a
webpage, they can probably get your stream without having
to reconfigure their router or their Mac.
I think this is pretty cool.
We have some new features in iOS
4 that we think are fantastic.
The first one is Stream Insertion.
This is a programmatic way to inject or add content
at the beginning or in the middle of one of your streams.
This allows you to take a series of bumpers or interstitial
content and inject it into a stream live or on-demand
so that you do not have to burn a copy of that content
into every single stream that you stream to each user.
We handle various frame sizes and different aspect ratios with
no problem; just drop in your trailers and we'll stream
them to your users and play them back, and we have
added the ability to include a metadata track.
And you can put whatever data is useful
for your application into that track,
and it will stick to the time base of your movie.
You can put GPS data in, you might want to tie some
slides to a lecture, or inject live player data
or stats if something interesting happens in
the baseball game that you are streaming live.
And we have also got some new tools.
We have given you a file segmenter so that you can work
with your existing on-demand content without having
to re-encode it, we have given you a file validator so
that you can make sure that your streams are compliant
and adhering to our best practices, and we have also given
you a file index generator to help your posting workflow.
So we also have a demo.
I would like to invite up Matt Johnson from Engaged Sports.
He has been working on an application that he plans to
launch next month that uses a bunch of iOS 4 features.
[ Applause ]
>> Thank you.
At Engaged Sports, we are building new applications
so fans can experience live sporting events
in entirely new ways on their iPhone.
We have a really exciting new app that we
want to demonstrate for you today for one
of the world's largest sporting events, the Tour de France.
>> But first, a quick bit of background.
This project started [inaudible] the passion for cycling.
Also, my day job is running one of the teams that's in
the Tour de France, and it's been killing us for years
that we'd be at the race, and we really had no
ability to follow the action live on our iPhones.
We've been in development for the
last couple of months on a 3.0 version
of the application when the 4.0 beta was released.
We had a number of features that we dreamed
up, but we now realized were possible with 4.0.
We're really excited to demonstrate some
of those features for you here today.
Joining me on stage is Jay Graves from
our development partner, Double Encore.
And Tour de France is very different than
the traditional stadium sporting experience,
because it turns the entire country
of France into its stadium.
It happens over three weeks every year in
July, and it travels almost 4,000 kilometers.
One of the first features we developed
with 4.0 was using MapKit.
One of the things that you want to do when you're
looking at the Tour de France is you want to be able
to follow the action along the race course.
We now have live GPS data streaming
to us from the race as it's happening
that we can now show the user on their iPhone.
At the top of the map here, you
see the start town for this stage.
This is the last year's stage 20 of the Tour de France.
The yellow line indicates the progress the stage has made.
With 4.0 and map overlays, we're able to
integrate this data directly into the Map View.
So both the route that the race is following, the
progress the race has made, as well as the position
of the relative leaders in the race are all
integrated into a map overlay layer.
And with Core Animation framework,
we're able to pan and zoom,
and our map overlays will animate directly with the map.
Now maps are great.
But when you have a sport as visually exciting as
cycling, you really want to be able to see it live.
With HTTP Live Streaming, we can do exactly that.
And what's really exciting about 4.0 is we have an
entirely new level of control over the video layer.
So one of the first things we did, and
something we'd always want to do for cycling,
was overlay the elevation profile of the race course.
So here, you can see on the left is the start
of the race, and the right is the finish.
And in between, you can see the
elevation gains, or the climbs and mountains
that the racers will go over during the course.
Now, we can make this come to life.
With synchronized metadata, we're embedding the
live GPS data, the distance the race has traveled,
and the individual position of
the leaders on the race directly
into the video while the video is
still playing live in the background.
Now, the Tour de France is a long race.
It's three weeks long, but each day is also six hours a day.
So there are times I need to leave
the app and go do something else.
Now, you may have all seen multitasking
and the ability to have background audio,
but you've probably never seen it
done with a video-centric application.
So as you can hear, the audio track
from the race is still playing.
If something exciting happens, I can jump right back in,
and the video is perfectly in sync with the audio.
So the last feature--
[ Applause ]
>> So if you thought that was something,
this is even better.
So the last feature is something
we've always wanted to do for cycling,
and it really completely changes the user
experience of how you go and watch a bike race.
We've converted that elevation
profile into a video scrubber.
So in case you missed some action, you know,
you wanted to go back to the start of a climb or you wanted
to make your own highlights and replay the race later, you
can now scrub along the profile, release, and both the video
and the audio will go perfectly in sync with
that geographical location along the race course.
Thank you very much.
[ Applause ]
>> So I guess I shouldn't go too
far because I'm wrapping up.
HTTP Live Streaming, it's our open protocol for
high quality adaptive media over the internet.
In iOS 4, we brought you new tools, stream
insertion, and synchronized metadata.
And if this is any indication of what you guys are
going to do, I'm really excited to see the next year.
With that, I'd like to bring out Mike
to talk about Game Center for a minute
[ Applause ]
>> There are over 50,000 games and
entertainment titles on the App Store today.
There's no doubt that our platform
offers a revolutionary gaming experience.
Games are huge in our platform, and we
want to make them bigger and better.
Therefore, we're introducing Game Center.
At its core, Game Center is a social gaming network that
enables your games to take advantage of a social gaming
infrastructure, and allows your players to enjoy it.
Game Center is actually made up of three pieces.
One piece is a Game Center application.
The Game Center application is used by people to discover
new Game Center games, to make new Game Center friends,
and to compare their progress versus their friends.
Game Center also includes a powerful
and flexible framework called Game Kit.
Now, Game Kit is what your apps use to take advantage
of the features we're going to talk about today.
Game Kit is also tightly coupled
with our Game Center services.
Our Game Center services are a flexible,
scalable set of services that are really fast.
And what they do is they bring the people to your
games, so that your games can enjoy all the people
in our social gaming network and have a great time.
I want to talk about some of the features in Game Center.
So, one of them is Leaderboards.
Now, Leaderboards are a great way for players to
compare how they're doing versus everybody else
in your game, all over the world.
Our servers take care of storing, sorting, and filtering
all the Leaderboard information you send up to us,
and giving it back to you at the appropriate
time and in the appropriate format.
We've made it really easy for you, with a few simple lines
of code, to add one or more Leaderboards to your game.
Whether your game is a single-player game or multiplayer
game, you can take advantage of the Leaderboards.
Every Leaderboard comes with a couple of built-in filters.
There are filters for friends and
everyone, and there are filters for time:
today, this week, and all time.
We also provide a basic UI, standard UI, you can use to
make it really easy to add Leaderboards to your game,
or you can choose to customize the look of your
Leaderboards to make them fully immersive within your game.
I think it's a great feature that
we've made super easy to use.
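As a minimal sketch of those few lines of code (the leaderboard category string and score value are placeholders for ones you would configure yourself): report a score, then show the standard Leaderboard UI.

    // Report a score to a leaderboard category.
    GKScore *score = [[GKScore alloc] initWithCategory:@"com.example.highscores"];
    score.value = 1250;
    [score reportScoreWithCompletionHandler:^(NSError *error) {
        if (error != nil) {
            // Keep the score around and retry the report later.
        }
    }];

    // The standard UI: present the built-in leaderboard view controller.
    // self must conform to GKLeaderboardViewControllerDelegate.
    GKLeaderboardViewController *leaderboardController = [[GKLeaderboardViewController alloc] init];
    leaderboardController.leaderboardDelegate = self;
    [self presentModalViewController:leaderboardController animated:YES];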
I want to talk about the next feature
now, which is Achievements.
Achievements are a way of rewarding a
player for accomplishing a goal in your game.
It adds depth to any game; whether your game is a
racing game, a strategy game, or a first person shooter,
it doesn't really matter, Achievements
add a whole other dimension to the game.
The completionist will want to collect
every single one of your achievements,
and everyone will enjoy comparing their
progress in your game versus their friends.
Just like with the Leaderboards, we provide a standard
UI you can use so that Achievements look the same
within your game as they do in our Game Center app.
Or you can customize the user interface so
that it fits seamlessly within your game.
That's Achievements.
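Here is a minimal sketch of reporting progress toward an achievement; the identifier is a placeholder for one you would define yourself.

    GKAchievement *achievement =
        [[GKAchievement alloc] initWithIdentifier:@"com.example.demonslayer"];
    achievement.percentComplete = 100.0;   // fully earned
    [achievement reportAchievementWithCompletionHandler:^(NSError *error) {
        if (error != nil) {
            // Cache the achievement and re-report when the network comes back.
        }
    }];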
And I want to talk about another feature
I'm really excited about, Multiplayer.
We've taken something that's normally very, very
complicated and difficult to do and made it simple.
So the first part of Multiplayer I want
to talk about are game invitations.
So with iOS 3, we introduced push notifications,
a really powerful system to get a message to any device.
We leveraged that with our game invitations.
So now, when a player invites their friend to play
a game, that friend will get a push notification.
Whatever they're doing, wherever they are,
they get a push notification to play the game
with an optional personalized message,
and from here, they can accept the game.
They accept it, it launches the game,
and they play a multiplayer game.
But we also recognize that not everybody has the same games.
So in this case, if I invite one of my friends
and my friend doesn't have the game I'm inviting
him to, they'll still get a push notification--
[ Applause ]
>> -- but in this case, we replace the
Accept button with an App Store button.
So it allows him to go to the App Store, purchase
the game, and then play with the person inviting him.
This is a great way for your game to
leverage our social gaming network
and increase the distribution and visibility of your game.
We have a lot of features like this built
into Game Center, and this is just one.
But this gets us friends inviting friends.
We want to make it so that any Game Center player
can play against any other Game Center player.
So we introduced this great feature called Auto-matching.
Let me talk about what Auto-matching is.
What Auto-matching does is this: players who want to
play your game in an auto-match let us know,
and that goes up to our server; our server takes all the
people all over the world asking to be matched to your game,
figures out the best possible groupings
of matches out of all those people,
and returns that information back to your game.
>> It's really fast, it's really simple, our
servers take care of all the complex work here.
So Auto-matching is a great way for anybody to
be matched with anybody. We also realize
that even though we think our logic
is really great for putting people together,
every game has its own set of requirements.
So we've also offered some flexible APIs for you
to tell our service exactly how you want
the players to be matched for your game.
That way, we return the exact proper
matching your game requires.
[ Applause ]
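As a minimal sketch of requesting an auto-match (the player counts are illustrative): describe the match you want and let the Game Center service assemble the players.

    GKMatchRequest *request = [[GKMatchRequest alloc] init];
    request.minPlayers = 2;
    request.maxPlayers = 4;

    [[GKMatchmaker sharedMatchmaker] findMatchForRequest:request
        withCompletionHandler:^(GKMatch *match, NSError *error) {
            if (match != nil) {
                match.delegate = self;   // the GKMatch is the peer-to-peer connection
                // Start the game once match.expectedPlayerCount reaches zero.
            }
        }];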
>> So, now we found the people for your game.
Another very difficult thing normally in multiplayer games
is connecting the devices so they can talk to each other.
We also offer peer-to-peer connectivity.
Now, when we find the players, we will connect
them over Wi-Fi or 3G on a peer to peer network,
and we have a really simple API for doing this that
takes all the complexity out of this process.
So now, we got a game.
We've taken the people, we found them, we've
matched them together, and they're talking.
But if you're like me, when I'm playing a Multiplayer
game, I like to talk to the people I'm playing with.
I like to strategize, I like to trash talk, whatever.
And so, we've introduced in-game voice chat.
[ Applause ]
>> This is tightly coupled to our-- all
of our matching and our peer to peer.
So it works almost seamlessly.
With a few simple lines of code, your game can have
everybody in the game talking to each other.
Or if you want, you can have team A talking amongst
themselves and team B talking amongst themselves.
There's a lot of flexibility in here
for your games to take advantage of.
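Here is a minimal sketch of those few lines, layered on top of the match from the earlier sketch; the channel name is arbitrary, so one channel per team works the same way, and voice chat assumes a play-and-record audio session.

    // Voice chat needs an audio session that can both play and record.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:NULL];
    [audioSession setActive:YES error:NULL];

    GKVoiceChat *teamChannel = [match voiceChatWithName:@"teamA"];
    [teamChannel start];
    teamChannel.active = YES;   // begin transmitting this player's microphone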
So I just want to go over: Game
Center's made up of three main pieces.
Our Game Center application, the Game Kit framework, and the
Game Center services, and all those provide these features
and more, Leaderboards, Achievements,
Invites, Auto-matching,
peer to peer connectivity, and in-game voice chat.
Right now, Game Center is available as a developer preview
and will be available to the general public later this year.
Thank you.
[ Applause ]
>> Back to you, John.
>> That's just fantastic stuff, isn't it?
We're really happy about what we've
been able to do in iOS 4.
So we've gone over a little bit of OpenGL ES, AV
Foundation, HTTP Live Streaming, and Game Center,
and the new features and innovations
we're bringing to iOS 4.
And to bring that together, we want to do something fun.
We, at Apple, built a technology demo.
It's a game that brings a bunch of these
technologies together into a really interesting demo.
It's a demo that we're gonna be using through
some of the sessions this week, and teaching you
and showing you how we use these technologies,
and how we incorporate them into an application.
So with that, I'm going to bring Graeme Devine up, and we're
going to look at a demo-- a technology demo called Quest.
>> I'm really, really proud to be here today to
give you, well, the world's first preview of Quest.
We spent the last couple of months at Apple writing
a highly performant game prototype that runs on all
of Apple's embedded platforms at 40 to 60 frames
per second using Apple frameworks and technologies.
So let's jump in and take a look.
One of the things we did is integrate Game Center.
But another thing we did was take advantage
of this iPhone 4 and all of that resolution.
So let's see if we can zoom in and just
take a look at this gorgeous display.
That is full frame OpenGL ES 2.0 on
top of a Cocoa Touch user interface.
It logged in to Game Kit and it uses AV Foundation
to provide a three dimensional sound environment,
because a lot of people play games with the headphones
on, and we wanna be able to provide them environment.
So let's take a look at my Achievements.
You see that is a Core Animation flip.
By using Cocoa Touch to drive all
of my UI, I can easily, in
a few lines of code, just do large flips
like that and flip around the display.
You can see I've not quite won the demon slayer achievement,
so I think you can see where this demo is going.
So let's sit down and let's dive in.
Now, the Quest environment is lit per pixel.
As the character runs around and runs into areas
and out of areas, the world around him affects him.
The world itself is around 50,000 triangles, putting a
few million triangles per second on each of our platforms.
As he runs through the red area, you can see that
he is getting lit by the environment around him,
and the environment around the character is affecting him.
One of the other things that we did--
[ Applause ]
>> Thank you.
But the other thing that we did was
we are using the GPU on our devices
to animate using skeletal animation and skinning.
He is completely running on the GPU.
A lot of talks this week about how we're doing that.
But one of the other things we wanted to do with Cocoa Touch
was really make our environment interactive, touching the
world around you.
So if I touch my character, I can bring up a contextual menu,
and I can do all sorts of things with my contextual menu.
But if I slide to the right, I can call in my pet [noise].
And then if I touch again on top of my contextual menu, the
environment-- that contextual menu has completely changed.
It's changed because of my character.
My character now has his little robot friend.
So the menu is completely changed.
So that's how you dismiss him.
And you can see, I'm still able to run around, I'm
going to [noise]-- I think the demon is pretty hungry.
So maybe we'll have to have a bash on him.
So by interacting with the demon and pressing on top
of the demon, I can actually start the fight [noise],
making the world interactive, I want to interact
with that demon and fight him right now.
The text flying off there is all Cocoa Touch again.
Everything in the user interface is running
on top of OpenGL ES 2.0 using Cocoa Touch.
We didn't write our own UI at all.
[ Applause ]
>> And I got that achievement.
Let's do one more thing.
Let's touch in the character's face.
I used to put him on a drive home.
Notice this Cocoa Touch interface and how
integrated it is with the game environment.
By taking the UI and skinning it in such a way that
it looks like it's part of the game environment,
you can't tell the difference between
the game and the interface.
But to our game players, it is exactly the
familiar interface they use in the hundreds
of thousands of other iPhone apps that are out there.
There is no difference between this
application and the photos application.
We just skinned the UI.
So let's press done and quit on out.
Thank you.
[ Applause ]
>> Thank you, Graeme.
So we talked about just a few technologies
available in the Graphics and Media frameworks.
We have a wealth of technology for you to use.
I encourage you to go learn about all of them.
And we have 26 sessions this week just
on the graphics and media technologies.
And we have 24 labs.
The labs are really useful for you to be able to
go and ask the hard questions of the engineers
who actually wrote these frameworks.
They will be there to answer all your
questions, so we encourage you to go to these.
With that, thank you and enjoy your week.