Transcript
X-TIMESTAMP-MAP=MPEGTS:181083,LOCAL:00:00:00.000
>> Welcome to "Mastering
Modern Media Playback".
My name is Stefan Hafeneger.
I'm an engineer on
the AVKit Team.
And if you are already using
or planning to adopt AVKit
or AVFoundation in your
iOS or OS X applications,
this is the right
session for you.
The goal of this session
is to show you how easy
and powerful media playback
is on iOS 8 and OS X Yosemite.
You have more sessions focused
on other major operations
later today and this week.
And we will refer you to
those at the end of this talk.
In the first part of this talk
I'm going to introduce you
to AVKit for iOS
and show you why
and how you should use
it in your applications.
I'll also give you a
very brief overview
of the most important
API additions
and changes in UI and behavior
we have for you this
year in AVKit for OS X.
In the second part of this talk,
my colleague Shalini Sahoo
will teach you best practices
for using AVKit and
AVFoundation in your iOS
and OS X applications.
But before we dive in,
let's take a quick look
at our current Media
Stack on iOS.
On the lowest level, we have
the Core Media framework,
the core of our Modern
Media Stack.
On top of our Core
Media sits AVFoundation,
a powerful Objective-C framework
that provides you easy
access to our Media Stack.
Finally, on the UIKit level, we
have the Media Player framework
with built-in UI, providing
you playback UI in the form
of MPMoviePlayerController.
This year in iOS 8 we're adding
a new framework called AVKit,
replacing parts of the
Media Player framework.
AVKit is our new high-level
media framework providing you
access to the rest
of the Media Stack.
And I'm going to show you
in this talk what
possibilities this opens
up for your iOS applications.
But to be clear,
we're not deprecating
MPMoviePlayerController
in iOS 8,
but we are strongly
encouraging you to adopt AVKit
for media playback instead.
So here you have it, our
Modern Media Stack on iOS.
For those of you with an OS
X background, you might see
that it's pretty similar
to our Media Stack on OS X.
And you're correct.
In fact, it's the same
now on both iOS and OS X
and will allow you to create
and maintain cross-platform
applications easier than ever.
And now, let me introduce
you to AVKit for iOS.
So up to now we've
provided you two options
for media playback on iOS.
MPMoviePlayerController and its
UIViewController companion,
MPMoviePlayerViewController
and AVFoundation.
So some of you might be
using MPMoviePlayerController
on your applications,
which means you get standardized
playback UI, but you lack access
to the Media Stack, which
means you're limited
to basic media playback.
Or, you might be
using AVFoundation,
and thus have access
to the Media Stack,
and can do things beyond basic
media playback, but you have
to implement your own
playback user interface.
Finally, some of you in the
audience might be in the process
of adding media playback
to your applications.
And you might be torn
between these two options.
We want to make your life
as developers easier.
And that's why we have
something new for all of you.
AVKit provides you both
standardized playback controls
and behaviors while
giving you full access
to the Modern Media Stack
through AVFoundation.
Last year we introduced
AVKit in OS X.
This year we bring
AVKit over to iOS.
Our goal for AVKit in iOS
is to provide you view-
level facilities for
media operations on top
of AVFoundation and UIKit.
Media playback on
iOS is now easier
and more powerful than ever.
AVKit on iOS
provides
AVPlayerViewController,
a state-of-the-art UIViewController
subclass that gives you the same
look and feel as our
video applications
and the existing
MPMoviePlayerController API.
And we made it really
easy for you
to adopt AVPlayerViewController
in your iOS applications.
Let me show you the
necessary steps in a demo.
So in Xcode, we create
a new iOS application
and we select the empty
application template.
And then press the Next button.
As the name, we enter
AVKitPlayer.
Press Next again.
Set the project on the desktop.
Now we select the
AppDelegate implementation.
And the first thing
you have to do is
to import AVFoundation
and AVKit.
Then, in application:
didFinishLaunchingWithOptions:,
We first create an
AVPlayerViewController
by calling
AVPlayerViewController
alloc init.
And then we create
an AVPlayer object
by calling playerWithURL:.
And then we use NSBundle
mainBundle,
URLForResource with extension.
The resource name is going
to be "longboarding",
and the extension .mov.
We set the player object on
the AVPlayerViewController.
And then we set the
AVPlayerViewController
as the rootViewController
of our window.
Finally, we add the
movie to our project.
And if we now build and run,
you can see we have a fully
functional playback application.
And this was just three
lines of code, basically.
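The steps just described amount to a minimal sketch like this; "longboarding.mov" is the bundled movie used in the demo, so substitute your own resource:

```objc
#import <AVFoundation/AVFoundation.h>
#import <AVKit/AVKit.h>

- (BOOL)application:(UIApplication *)application
    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Create the view controller and its player, then make it the root.
    AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"longboarding"
                                              withExtension:@"mov"];
    playerViewController.player = [AVPlayer playerWithURL:movieURL];

    self.window.rootViewController = playerViewController;
    [self.window makeKeyAndVisible];
    return YES;
}
```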
But this example is probably
not what your application would
look like.
So let's switch to
a different project.
Let me first run this
and show you what it is.
So this is a very simple
media playback application,
or media library, movie
library application.
And it's using the master
detailViewController template.
So in the master view here,
we have a list of movies.
And for each of the movies
we have the thumbnail,
the name of the file
and some information.
We will now click
one of the movies.
We see that in the detail
view we have some information
about the movie in
this gray box here.
So in the rest of this
demo, I'm now going
to add AVPlayerViewController
and show you how to hook it up.
So we start by modifying
the main storyboard.
As you can see here, I
already have a container view.
So now in the object library,
we search for the
Player View Controller.
Drag in an instance
to our storyboard.
Move this over here a bit.
And then, using a Control-drag
and choosing Embed,
we can set the
AVPlayerViewController
up for our container view.
Now we need to select the
segue in the inspector,
give it an identifier.
In this case we just
use showMovie.
Since we are using
AVPlayerViewController in a storyboard,
you also have to manually link
against the AVKit
framework.
So for Linked Framework
and Libraries,
we add a new framework.
So we type in AVKit, select it
and then press the Add button.
And now we have to modify
the DetailViewController
implementation.
Again, the first thing we have
to do is import the
header files.
So we import AVFoundation
and then import AVKit.
And then we need to
implement the prepareForSegue:
sender: method.
So first we check
for the identifier.
We want to make sure
that it is showMovie.
Then we get the
AVPlayerViewController,
which is the
destinationViewController
of the segue.
And then finally we create
an AVPlayer the same way
as we did before by
using playerWithURL.
And this detail view
controller has a movie property,
which itself has a URL.
And then we set this
AVPlayer and its player object
on the playerViewController.
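Put together, the segue handling is a short sketch; the `movie` property and its `URL` belong to the demo project, not to AVKit:

```objc
- (void)prepareForSegue:(UIStoryboardSegue *)segue sender:(id)sender {
    // Only handle the embed segue we named in the storyboard.
    if ([segue.identifier isEqualToString:@"showMovie"]) {
        AVPlayerViewController *playerViewController = segue.destinationViewController;
        // Create the player from the selected movie's URL, as before.
        playerViewController.player = [AVPlayer playerWithURL:self.movie.URL];
    }
}
```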
We now build and run
and select a movie.
You can see that the
movie is loaded here.
And we can press Play.
And we press the
fullscreen button.
It goes into fullscreen and
I can rotate to landscape.
And you can see it's
fully working.
All right?
So next I want to show you how
AVPlayerViewController works
in AVFoundation, especially for
those of you in the audience
that are new to AVFoundation.
AVPlayerViewController
has a strong reference
to an AVPlayer object.
This object provides
the content.
And an AVPlayer object manages
an AVPlayerItem, which serves
as the mutable playback state
for an individual AVAsset.
This means in order
to provide content
from the AVPlayerViewController,
you have to do the
following steps.
First, you create an
AVAsset from an NSURL.
This URL can either be a local
file on disk or a remote URL.
With an AVAsset you then
create an AVPlayerItem.
Once you have an
AVPlayerItem you can create an
X-TIMESTAMP-MAP=MPEGTS:181083,LOCAL:00:00:00.000
Once you have an
AVPlayerItem you can create an
AVPlayer object.
And finally, you
associate the AVPlayer object
with the AVPlayerViewController.
But if you don't need to
inspect any properties
of the content yourself, and
just want to play it
with AVPlayerViewController, you
can do all four steps at once.
As I've shown you in the demo,
you can directly create an
AVPlayer object from an NSURL
and then pass it to the
AVPlayerViewController.
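The four steps, and the one-line shortcut, can be sketched like this; `playerViewController` is assumed to exist already, and the URL reuses the demo's bundled movie:

```objc
NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"longboarding"
                                          withExtension:@"mov"];

AVAsset *asset = [AVAsset assetWithURL:movieURL];              // 1. asset from URL
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset]; // 2. item from asset
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];       // 3. player from item
playerViewController.player = player;                          // 4. associate

// Or, if you don't need to inspect the asset yourself,
// all four steps at once:
playerViewController.player = [AVPlayer playerWithURL:movieURL];
```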
And there's a chance that the
only reason why you inspected
the AVAsset so far was in order
to implement your own
playback user interface.
With AVKit you don't
have to do this anymore.
So if you would take a look
at the AVPlayerViewController
API you might notice
that besides the
player property,
there isn't really much more.
But there is so much stuff
you just get automatically
without any further setup.
Here's a list of the most
important AVPlayerViewController
features for your reference.
You might notice that
it is mostly identical
to MPMoviePlayerController.
And there's a reason for that.
We want to make your transition
to AVKit as easy as possible.
For those of you not familiar
with MPMoviePlayerController,
let me show you what these
features actually look like.
AVPlayerViewController has
adaptive player controls.
This is different from
MPMoviePlayerController
and AVPlayerView on OS X.
Instead of setting a
certain control style,
AVPlayerViewController
automatically adapts
the controls for you.
So as you saw in the demo,
when you show a movie embedded
in your application, and your
user taps the fullscreen button,
AVPlayerViewController
automatically switches
to the fullscreen
playback controls.
If you don't want it to
show any controls at all,
you still have the
option to hide them.
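Hiding the controls is a one-liner; `showsPlaybackControls` is the AVPlayerViewController property for this, and `playerViewController` is assumed from the earlier demo:

```objc
// Opt out of the built-in playback controls entirely.
playerViewController.showsPlaybackControls = NO;
```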
AVPlayerViewController also
has dynamic player controls.
So for content with chapters,
tapping the seek backward
or seek forward button seeks to
the previous or next chapter.
For content with additional
languages or subtitles,
AVPlayerViewController adds a
media selection button allowing
the user to select a different
audio or subtitle track.
Dynamic also means
that AVPlayerViewController
automatically switches
to a different set
of playback controls
for live streaming content.
You don't have to do anything.
Finally, AVPlayerViewController
has built-in support
for both AirPlay and HDMI.
So when a user enables AirPlay
or plugs in an HDMI adapter,
the application will
automatically present the
content on the external screen
while keeping the player
controls on the iOS device.
So let's see how
AVPlayerViewController stands
up against
MPMoviePlayerController so far.
Every major feature is
available in AVKit as well.
But if you look closely, you
will notice one difference.
As I mentioned before,
AVPlayerViewController
automatically selects the
control style for you.
So there's less for
you to worry about.
But actually, there's
a lot more.
Let's take a look at an object
diagram from earlier for a bit.
If you want to replace
the current AVPlayerItem,
you typically use
replaceCurrentItemWithPlayerItem:
on AVPlayer.
But if you already know
in advance what the next
item is going to be,
you can help AVFoundation
and AVFoundation can help you
to get smooth playback
when switching
to the next player item
using AVQueuePlayer.
AVQueuePlayer is a subclass
of AVPlayer that allows you
to enqueue a list
of AVPlayerItem objects,
each of which is
backed by an AVAsset.
Everything I'm showing you today
about AVPlayerViewController
works just fine
with AVQueuePlayer as well.
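Enqueuing items looks like this in a minimal sketch; `firstURL` and `secondURL` stand in for your own content, and `playerViewController` is assumed from before:

```objc
// Build the queue of items up front so AVFoundation can prepare
// the next one for smooth transitions.
AVPlayerItem *first  = [AVPlayerItem playerItemWithURL:firstURL];
AVPlayerItem *second = [AVPlayerItem playerItemWithURL:secondURL];
AVQueuePlayer *queuePlayer =
    [AVQueuePlayer queuePlayerWithItems:@[first, second]];

// AVQueuePlayer is an AVPlayer subclass, so it plugs in the same way.
playerViewController.player = queuePlayer;
```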
So far I've just talked
about basic media playback.
With AVFoundation you
can do a lot more.
AVComposition is a subclass
of AVAsset and allows you
to create multi-clip and
multi-track media compositions.
As a result, this is heavily
used in our and possibly some
of your video editing
applications.
It doesn't matter if you provide
AVPlayerViewController an
AVPlayerItem backed
by an AVAsset
or an AVComposition,
either works fine.
AVComposition is
also your entry point
to even more advanced features.
For instance, AVVideoComposition
allows you
to apply affine transforms
and simple transitions
for video content, but you can
even create your own custom
compositors to create all
kinds of video effects.
AVAudioMix provides a similar
functionality for audio tracks.
AVVideoComposition and
AVAudioMix work in combination
with AVComposition.
If you want to analyze,
modify or visualize raw
audio data during playback,
you can use
MTAudioProcessingTap.
These are just a few classes
that AVFoundation provides you
for media operations
beyond basic media playback.
If you want to learn
more about this topic,
check out our previous WWDC
sessions, like this one
from last year, and
related sample code.
Let's take a look
again at our list
of AVPlayerViewController
features.
So as I've shown
you in this talk,
AVPlayerViewController features
the same features - sorry -
has the same features as
MPMoviePlayerController,
but due to the full access
to AVFoundation, you
get so much more.
Let me show you what you can
easily do now with AVFoundation,
AVKit in your own
iOS applications.
So this is an iPad version of
the movie library application
that I used earlier
to show you how
to use AVPlayerViewController.
In this demo I can now add
video effects to movies.
As you can see, I added the
Hue Curve effect to this movie.
Right now the effect
has zero impact.
But once I start dragging
handles up or down,
you can see how I can change
certain hue values in the movie.
Each handle represents a hue
value on the color wheel.
And changing a handle
modifies that color or shifts
that color to a different value.
I can scrub to any frame in
the movie to adjust the values.
So here, the helmet,
for instance.
Well, let's go back to the
beginning and start playing.
So as you can see, the video
effect is applied in real time.
And on the right, I even
added a digital volume meter
that shows the volume
for the left
and right audio channel
during playback.
In order to do all of this,
the application is using the
AVFoundation composition APIs
that I mentioned before.
AVKit continues to manage the
user interface just as it does
for basic media playback cases.
Thank you.
[ Applause ]
So now that I have
shown you how easy it is
to adopt AVPlayerViewController
in a new application,
and what amazing things
you can do in combination
with AVFoundation, I
hope you can't wait
to adopt AVPlayerViewController
in your existing applications.
In order to help you
with the transition,
I'm going to highlight
the necessary steps
for the three common scenarios.
In the first scenario,
we're using AVPlayerLayer
and possibly implemented your
own playback user interface.
Start off by replacing
your AVPlayerLayer-backed
UIView and
your playback user interface
with AVPlayerViewController's
view.
Then set the player property
on AVPlayerViewController
instead of AVPlayerLayer.
And there's a chance that most
of your remaining code
will stay unchanged,
unless you have some
special UI needs.
If you're using
MPMoviePlayerViewController,
just replace
MPMoviePlayerViewController
alloc initWithContentURL:
with AVPlayerViewController
alloc init
and then create an
associated AVPlayer object
like I've shown you before.
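The second scenario is a two-line swap in a sketch; `movieURL` is assumed:

```objc
// Before: the Media Player approach.
MPMoviePlayerViewController *moviePlayer =
    [[MPMoviePlayerViewController alloc] initWithContentURL:movieURL];

// After: the AVKit equivalent shown earlier.
AVPlayerViewController *playerViewController =
    [[AVPlayerViewController alloc] init];
playerViewController.player = [AVPlayer playerWithURL:movieURL];
```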
However, if you are
also accessing the
MPMoviePlayerController
property,
the third scenario
also applies to you.
So if you're using
MPMoviePlayerController
on your applications,
you're dealing
with a couple more
properties and methods.
But don't worry,
the transition's actually
quite straightforward.
MPMoviePlayerController API can
be grouped into two classes,
view-specific API and
controller-specific API.
In the former case, you use
AVPlayerViewController API.
In the latter case, AVPlayer
and AVPlayerItem API.
Things like MPMovieErrorLog
and MPMovieAccessLog are
actually very [inaudible] object
around AVPlayerItemAccessLog
and AVPlayerItemErrorLog,
for instance.
There are two things which you
should watch out for, though.
As I mentioned before,
AVPlayerViewController's
control style is dynamic,
so there's no property to set.
Also, MPMoviePlayerController
auto plays by default.
AVPlayer does not do that.
If there's anything you cannot
do or do not know how to do
with AVKit and AVFoundation
when switching
over from
MPMoviePlayerController,
please talk to us in the labs or
ask us in the Developer Forums.
So let's wrap up AVKit for iOS.
In iOS 8 we're bringing
AVKit over from OS X
as our new UI-level media
framework for AVFoundation.
AVPlayerViewController
provides you
with standardized
playback controls
and behavior while
giving you full access
to a powerful, modern
media stack.
So please consider adopting
AVPlayerViewController
in your iOS applications.
Finally, I want to give you a
brief update on AVKit for OS X.
As you saw yesterday
in the Keynote,
OS X Yosemite received a
UI refresh, and as a result
there's a brand new UI
for AVKit as well.
If you are already
using AVPlayerView,
your applications will receive
the new UI automatically.
You won't have to
change a single line
of code or update any file.
If you're not using
AVPlayerView yet,
this might be a good
time to adopt.
With the introduction of AVKit
for iOS we're also updating some
user interfaces and behaviors
for AVPlayerView to match
AVPlayerViewController.
If you want to learn more
about how to use AVPlayerView,
please check out last
year's "Moving to AVKit
and AVFoundation" session.
And finally, we have
a brand new class
for you this year
in AVKit for OS X.
AVCaptureView provides you
view-level capture facilities
on top of AVFoundation
and AppKit.
I'm not going into any details
in this session, though.
Please come to the Camera
Capture talk tomorrow morning
or ask us in the labs if
this is interesting to you.
And now, let me hand over to
my fellow AVFoundation engineer
to show you how you can
get the most out of AVKit
and AVFoundation in your
iOS and OS X applications.
Thank you.
[ Applause ]
>> Thank you, Stefan.
Good morning.
I'm Shalini Sahoo, an engineer
on the AVFoundation team,
and I'm here to talk to you
today about best practices.
Before we get into the
details of these practices,
for a show of hands, how many
of you have used
AVFoundation before?
Lots of you.
For you, I would hope this
section serves as an update
or refresher on best practices.
As technology evolves with
time, so do best practices.
For those of you who are new to
AVFoundation, I hope this serves
as a guideline on how
to approach our APIs.
Let's get started by talking
about why you would
be interested
in adopting such practices.
What's the motivation?
These practices are designed
to make your application
as efficient and
performant as possible,
making it responsive
to your end-users.
They not only help with the
application's efficiency,
but also help improve
your program's
correctness, making it robust
and preventing any sort
of hangs or crashes.
Media applications
sometimes make use
of expensive resources
like networks.
These best practices
are designed
so that you can make efficient
use of such resources.
And lastly, your
users will be thankful
for the improvement they
see in their battery life
as your application
uses very little power.
Here is our Modern
Media Stack we looked
at earlier in this talk.
AVFoundation sits
on top of Core Media
and its family of frameworks.
And on top is AVKit,
which provides you
with standard playback controls
with just a few lines of code.
The focus for this section today
is best practices, specifically
in the areas of inspection
and playback.
If you would like to gain more
information about AVFoundation
in general, you can look at
this talk from WWDC 2011 called
"Exploring AVFoundation".
Based on the two major
categories for today,
AVFoundation objects
are modeled separately.
For the first category,
which is inspection,
AVAsset is an example.
Within AVAsset, as a client
of this API, you are in charge
of loading values whenever you
need it in your application.
You can initiate I/O.
And when AVFoundation requests
- receives a request for I/O,
we go and do all the loading
work so you can get your value.
The best practice here is to
request values asynchronously.
Why asynchronous loading?
We'll answer that in
just a few slides.
For the other category,
which is playback, AVPlayer
and PlayerItem are the
controller objects.
All you have to do here
is create these items
and initiate playback.
AVFoundation and its underlying
machinery drives the playback
clock so that your users
see video frames on time.
So when you hit Play, we
drive the necessary machinery
so that your user can
experience a smooth playback.
As time progresses,
properties change.
And if you would like to keep
our application state up-to-date
with the state of
the playback engine,
you can use
NSKeyValueObserving
to be notified of such changes.
NSKeyValueObserving
is also called KVO.
Here's a look at our
object diagram from earlier.
Stefan showed you in a demo
how to use these objects
and create a simple
media application.
Now let's use this as
a roadmap for talking
about best practices
class-by-class starting
with AVAsset.
As I mentioned earlier,
with AVAsset you can use
AVAsynchronousKeyValueLoading,
one of our protocol methods,
which lets you load
properties asynchronously before
accessing them.
For example, if you
are interested
in the precise duration of an
asset, you can ask AVFoundation
for the duration and we download
just about the right amount
of data to be able to tell
you what the duration is.
With some file formats
it's straightforward.
And we have that
information right up front.
So we can download just
that piece of data.
But that's not always the case.
Sometimes when you
ask us for duration,
AVFoundation would have to
download the entire contents
of the media file, parse it,
decode it before we
can give you the value
for a precise duration.
You might ask why should
I load asynchronously
if AVFoundation has to do
all this work to load values,
or it takes time to load values?
Well, you really should.
On OS X, if you tried to
access a property synchronously
by accessing your getter
before loading the property,
you would be blocking your
main thread leading to a spin
and making your application
unresponsive to your end-user.
On OS X, however, you
can dispatch this work
to a background queue
to access the getter
and you won't see a spin.
Whereas, on iOS,
loading synchronously
from any thread would
block your application
and could lead to
a hang or crash.
This is because, as
you may already know,
on iOS we have a shared
daemon called mediaserverd,
which services media requests
on behalf of your application.
If you tried to access a
property synchronously,
you would be tying up mediaserverd
or forcing mediaserverd
to load a value, and
this might take time.
This leads to a timeout and
media services termination.
This not only affects
your application,
but every other application
on the system
which relies on media services.
So please don't do that.
Now that we looked
at a good reason
to use
AVAsynchronousKeyValueLoading,
there are two more
things to remember.
Firstly, only load those
properties you anticipate
to use in your application.
Every extra property
means more work.
If you intend to not use
a particular property,
do not incur the wasted work.
And the second thing is,
if you anticipate the use
of a particular property
later in your application,
you can still request all
of them together using
loadValuesAsynchronouslyForKeys:
completionHandler:.
You can pass in
an array of keys
which AVFoundation
will load together
and your completionHandler is
called once we are done loading
these properties.
You no longer have to load
tracks before playback begins.
This has changed since
the last time we talked
about best practices.
In fact, this has
changed since iOS 5.
You really only need to load
those properties you would
directly use in your
application.
Now let's look
at how
AVAsynchronousKeyValueLoading
translates into code.
In this particular
example, I'm interested
in loading the property
"playable".
I pass in an array with
just playable in it
and I provide a completion
handler.
In my completion
handler I first check
to make sure the
property is loaded.
Sometimes things go wrong,
like if you are relying
on the network resource and your
user device goes out of range,
asset loading can fail.
So this is a good place
to check for such errors.
Once you get the status, you
can see if it's already loaded.
And then you can update
your UI for the asset or,
if there's a failure, you can
report an appropriate error
to your end-user.
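The example just described can be sketched like this; `movieURL` is an assumption of the sketch:

```objc
AVAsset *asset = [AVAsset assetWithURL:movieURL];
[asset loadValuesAsynchronouslyForKeys:@[@"playable"] completionHandler:^{
    // Check the status before using the value; loading can fail,
    // e.g. if a network resource went out of range.
    NSError *error = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"playable"
                                                   error:&error];
    if (status == AVKeyValueStatusLoaded) {
        // Safe to read asset.playable now; update the UI.
    } else if (status == AVKeyValueStatusFailed) {
        // Report an appropriate error to your end-user.
    }
}];
```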
One more thing to remember
with completion handlers is,
if another module has
already loaded the keys you're
interested in, your
completion handler is called
back immediately.
Say, in this example,
if AVPlayer has already
loaded the playable property,
your completion handler
is called synchronously
on the same thread.
If you have code which relies
on loadValuesAsynchronously
for keys to return immediately,
that might not happen till the
completion handler has executed.
So that's something to keep
in mind to prevent a deadlock.
To list the best
practices for AVAsset,
load only those keys
you are interested
in using in your application.
You can use
AVAsynchronousKeyValueLoading
to list all these keys
together in an array.
And you can provide
a completion handler,
which is called once we are
done loading these properties.
In your completion handler,
check to make sure the
properties are loaded before you
access them.
And lastly, be prepared
for asynchronous callback.
If someone else has
already loaded the property
on your behalf, your
your completion handler is
called back immediately.
Those were best practices
for AVAsset.
Now let's look at
AVPlayer and AVPlayerItem.
As I mentioned earlier, with
AVPlayer and AVPlayerItem,
the underlying playback engine
drives the machinery necessary
so that your user sees
video frames on time.
So all you have to do is use
NSKeyValueObserving or KVO
to be notified of such changes
so that you can update
your application state.
Here's an example.
If you have a progressive
download item on an HTTP server,
as AVFoundation downloads
some data, you get a callback
with the loaded range.
And as we buffer more data,
you get yet another callback
with the updated value
for the loaded range.
An example of
where KVO might come in handy
is playback interruption.
If your user's device receives
a phone call or a FaceTime call
when your application is
playing a particular file,
your playback is interrupted
so that your user can
answer their call.
In this case, if you're using
KVO on the player's rate,
you would see it
transition to zero.
This is a good way for you to be
notified of such interruptions
so that you do not end
up waiting endlessly
for playback to end.
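A sketch of observing the rate for interruptions; the context pointer and `self.player` are assumptions of this example, not part of the talk:

```objc
// Private context pointer to distinguish our observation.
static void *PlayerRateContext = &PlayerRateContext;

- (void)startObservingPlayer {
    [self.player addObserver:self
                  forKeyPath:@"rate"
                     options:NSKeyValueObservingOptionNew
                     context:PlayerRateContext];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
                        change:(NSDictionary *)change context:(void *)context {
    if (context == PlayerRateContext) {
        // An incoming phone or FaceTime call drives the rate to zero.
        if ([change[NSKeyValueChangeNewKey] floatValue] == 0.0f) {
            // Playback paused or was interrupted; update your UI here.
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object
                               change:change context:context];
    }
}
```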
And the last example here
is media services reset.
As I mentioned earlier,
if your application
or some other application
wasn't paying attention
to AVAsynchronousKeyValueLoading
and tried
to access a property
synchronously,
you could have forced
mediaserverd to terminate.
This affects your
application, too.
Your media services are reset.
And when this happens,
your player item status
transitions to failed.
And this is a good place for you
to throw away all your old
objects and rebuild your player
and PlayerItem to
continue playback.
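One way to sketch that rebuild; `movieURL` and the `self.playerItem` / `self.player` properties are assumptions of this example:

```objc
// When media services are reset, the item's status transitions to
// AVPlayerItemStatusFailed; throw away the old objects and rebuild.
if (self.playerItem.status == AVPlayerItemStatusFailed) {
    [self.playerItem removeObserver:self forKeyPath:@"status"];

    AVPlayerItem *newItem = [AVPlayerItem playerItemWithURL:movieURL];
    [newItem addObserver:self forKeyPath:@"status"
                 options:NSKeyValueObservingOptionNew context:NULL];

    self.playerItem = newItem;
    self.player = [AVPlayer playerWithPlayerItem:newItem];
}
```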
Let's look at an example in code
where KVO might come in handy.
In this example I
would like to decide
when to show audio-only
UI in my application.
Here I first create a
player item with an asset.
Then I try to access the
track's property on item asset,
particularly of type video.
And then I check to make
sure that the video tracks -
or there are no video
tracks before I update my UI
to audio-only.
This is not completely correct.
Firstly, we're trying to access
the tracks property on an asset
without loading it first.
This could block
your main thread
and make your application
unresponsive.
And the second thing here
is an assumption that lack
of video tracks means
audio-only.
You can have non-video
visual tracks
like subtitles in your movies.
Now let's look at how I
can do this using KVO.
I add myself as an observer
for the presentation
size on a player item.
And once AVFoundation is done
loading the presentation size
in your observation callback,
first check to make sure
that your presentation
size is zero by zero.
This way you know none
of the tracks have a
non-zero presentation size.
And after that, make sure you
have at least one audio track.
To do that, you can look at
the underlying AVAsset tracks
and make sure that there's
at least one of them
which has an
AVMediaCharacteristicAudible.
This is necessary because
sometimes you can have movie
files which have zero by zero
for the presentation size
but no audio tracks,
like timecode tracks.
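The audio-only check just described, sketched inside the presentationSize observation callback; `playerItem` is assumed to be your observed item:

```objc
if (CGSizeEqualToSize(playerItem.presentationSize, CGSizeZero)) {
    // None of the tracks has a non-zero presentation size.
    // Still require at least one audible track, since movies with only
    // timecode tracks also report a zero-by-zero presentation size.
    NSArray *audibleTracks = [playerItem.asset
        tracksWithMediaCharacteristic:AVMediaCharacteristicAudible];
    if (audibleTracks.count > 0) {
        // Show the audio-only UI.
    }
}
```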
The KVO recipe we just used,
we first create a player
item with an asset.
Then we register for key
value observing a property
of interest.
After that, we associate the
player item with the player.
And in your observation
callback, you can look
at the change dictionary to
know what the new value is.
This is the recipe when you
work with these objects.
But there are a few
more things to remember
when you're interacting with
these objects in general.
First, do not assume the
order in which events occur.
As soon as you associate a
player item with a player,
AVFoundation starts its
underlying machinery
so that your user can
experience a smooth playback.
On iOS 7 we made an optimization
which changes the status
to ready-to-play
much more quickly.
So if you were to add
yourself as an observer
after you create a
player with a player item,
you could miss this
vital notification.
To fix this, you
can add yourself
as an observer before
associating a player item
with a player.
That way you won't
miss any notifications.
Or, sometimes in
your applications,
you might have a particular
scenario where you would
like to add yourself as an
observer only after an event,
say, if a user presses a button.
In such cases, you can use NSKeyValueObservingOptionInitial. This flag lets you access the initial value as well as subsequent new values.
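As a sketch, registering late (say, after a button press) but still receiving the current value, with `StatusContext` standing in for whatever context pointer you use:

```objc
// NSKeyValueObservingOptionInitial delivers the current value immediately
// on registration, so a ready-to-play transition that happened before you
// registered is not missed.
[self.player.currentItem addObserver:self
                          forKeyPath:@"status"
                             options:NSKeyValueObservingOptionInitial |
                                     NSKeyValueObservingOptionNew
                             context:StatusContext];
```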
The second thing to remember is
that AVFoundation serializes
the access to AVPlayer
and PlayerItem on
the main queue.
It is safe to access these objects, and to register and unregister observers for them, on the main queue.
This was done to prevent
any possible race conditions
as multiple modules are
interacting with these objects.
And the main queue
was a natural choice,
as most of these
observation callbacks lead
to your application's
interaction with UIKit objects,
which also happen
on the main queue.
However, this does not mean that we are doing our work on the main queue, or that we are affecting end-user responsiveness.
All our loading and
playback-related work happens
on the background queue.
We only serialize the access to
these objects on the main queue.
If in your application you
have a particular scenario
for which you think this
requirement is a hindrance,
please come talk
to us in our labs
or file enhancement requests.
We are really interested
in listening
to your feedback on this.
And the last thing to
remember is wherever possible,
set up your player item before
associating it with the player.
As I mentioned earlier, as
soon as you create a player,
AVFoundation starts
driving its machinery
so that we can get
ready for playback.
For example, if you
have a streaming item,
as soon as you create a
player, we go over the network
and start downloading data
from the default time.
And after that, if you were
to issue a seek to time,
we would have to throw away
all the data we downloaded
and start reloading our
caches to begin playback.
In order to prevent
that wherever possible,
you can always configure your
player item before associating
it with the player.
Here's a few examples
of the kind
of configurations you can do.
This is definitely not an exhaustive list, just a few examples.
You can add outputs.
You can select media options
like audible and legible.
You can set forward and
reverse playback end times
or seek to time.
And after doing all
those changes,
I can associate my player
item with the player.
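The configuration steps above might look like this sketch, where `asset`, `videoOutput`, and the chosen times are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>

AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];

// Configure the item first...
[item addOutput:videoOutput]; // e.g. an AVPlayerItemVideoOutput
AVMediaSelectionGroup *legible = [asset
    mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible];
if (legible) {
    [item selectMediaOption:legible.options.firstObject
      inMediaSelectionGroup:legible];
}
item.forwardPlaybackEndTime = CMTimeMakeWithSeconds(60.0, 600);
[item seekToTime:CMTimeMakeWithSeconds(30.0, 600)];

// ...and only then associate it with the player, so AVFoundation
// starts loading from the right place with the right configuration.
self.player = [AVPlayer playerWithPlayerItem:item];
```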
In summary, for AVPlayer
and PlayerItem,
use NSKeyValueObserving
to be notified of changes
as the playback engine
is driving the clock
so that your user can
experience a smooth playback.
Do not rely on the order
in which events occur.
If you really need
a particular value,
you can always use NSKeyValueObservingOptionInitial or add yourself as an observer before you create the player.
Please serialize your access
to Player and PlayerItem
on the main queue to avoid
any possible race conditions.
And lastly, wherever possible,
set up your player item
with all configurations
before creating a player.
Those were the best practices
for AVPlayer and PlayerItem.
Most of the things we talked
about AVPlayer also
apply to AVQueuePlayer.
As you've seen earlier
in this talk,
AVQueuePlayer takes a
list of player items.
If in your application you
have an inspector window
for which you would like to
access a set of keys on each
of the player items, you can use AVPlayerItem's convenience initializer, which lets you pass a set of keys that AVQueuePlayer or AVPlayer will load on your behalf.
You have an AVQueuePlayer
with a list of items.
And if you initialize each of
these items with a set of keys,
as AVQueuePlayer is getting
ready to initiate playback
for the particular item, we
load these keys in combination
with the keys we
require for playback.
And as playback progresses
and we reach the next item,
we load the second set
of keys you requested
for that particular item.
This is valid for
AVPlayer as well
if you are using
replaceCurrentItem
with playerItem.
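A sketch of that convenience initializer, with `mediaURLs` as a placeholder array of file or stream URLs:

```objc
#import <AVFoundation/AVFoundation.h>

NSArray *keys = @[@"duration", @"commonMetadata"];
NSMutableArray *items = [NSMutableArray array];
for (NSURL *url in mediaURLs) {
    AVAsset *asset = [AVAsset assetWithURL:url];
    // The queue player loads these keys, combined with the keys it
    // needs for playback, as each item comes up for playback.
    [items addObject:[AVPlayerItem playerItemWithAsset:asset
                          automaticallyLoadedAssetKeys:keys]];
}
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];
```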
You can use this as an alternative to AVAsynchronousKeyValueLoading. Instead of using AVAsynchronousKeyValueLoading for each of your assets, you can initialize each of the player items with the keys you need, and we'll take care of loading the properties.
And if you are using this instead of AVAsynchronousKeyValueLoading, the best place to access your properties is in a KVO callback for the player item's status.
When you receive an observation callback for the status, and the status is ready to play, your asset keys will either have loaded or failed.
So then you can use
statusOfValueForKey
to access these properties.
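Inside the status observation callback, that check might look like this sketch:

```objc
#import <AVFoundation/AVFoundation.h>

if (playerItem.status == AVPlayerItemStatusReadyToPlay) {
    NSError *error = nil;
    // Once ready to play, each requested asset key is either loaded or failed.
    AVKeyValueStatus keyStatus =
        [playerItem.asset statusOfValueForKey:@"duration" error:&error];
    if (keyStatus == AVKeyValueStatusLoaded) {
        CMTime duration = playerItem.asset.duration;
        // ...update the inspector UI with the duration.
    } else if (keyStatus == AVKeyValueStatusFailed) {
        // Inspect error to decide what to show instead.
    }
}
```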
And the last thing to
remember with AVQueuePlayer is
if AVFoundation encounters
an error
with a particular player item,
we'll skip that item and move
on so that your user experiences
uninterrupted playback.
However, if you would like to be notified of such errors, you can key-value observe the status of AVQueuePlayer's current item.
Those were some tips
for using AVQueuePlayer.
Lastly, let's talk about
AVPlayerViewController.
With AVPlayerViewController on
iOS or AVPlayerView on OS X,
it's a good idea to
animate your view
into your view hierarchy only
when there's some viable
content to present.
Only if there's some
video frames you can show.
And to do that, you can key-value observe readyForDisplay.
On iOS, you would observe on
the player view controller,
whereas on OS X you
observe on the player view.
You add yourself as an observer for readyForDisplay. And in your observation callback, you check that readyForDisplay is YES before animating in or showing your view.
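A minimal iOS sketch, assuming the player view controller's view starts out with alpha 0 and `ReadyForDisplayContext` is a placeholder context pointer:

```objc
#import <AVKit/AVKit.h>
#import <UIKit/UIKit.h>

[playerViewController addObserver:self
                       forKeyPath:@"readyForDisplay"
                          options:NSKeyValueObservingOptionInitial |
                                  NSKeyValueObservingOptionNew
                          context:ReadyForDisplayContext];

// Then, in the observation callback:
if (playerViewController.readyForDisplay) {
    // There is viable video content, so fade the view in.
    [UIView animateWithDuration:0.25 animations:^{
        playerViewController.view.alpha = 1.0;
    }];
}
```

On OS X the same recipe applies, but you observe readyForDisplay on the AVPlayerView instead.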
New in iOS 8 and OS X 10.10 is contentOverlayView.
With contentOverlayView, you
can do your custom drawings
and renderings over
the player view.
The contentOverlayView is guaranteed to have the same dimensions as the player view.
And if you would like to
place your drawings relative
to the video frame, you can
access the videoBounds property
on the player view controller.
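As a sketch, placing a hypothetical badge label relative to the video frame (note that videoBounds can be empty until video is ready for display, so do this once readyForDisplay is YES):

```objc
#import <AVKit/AVKit.h>
#import <UIKit/UIKit.h>

UILabel *badge = [[UILabel alloc] initWithFrame:CGRectZero];
badge.text = @"LIVE"; // placeholder overlay content
[badge sizeToFit];
[playerViewController.contentOverlayView addSubview:badge];

// Position the badge in the top-right corner of the video frame itself,
// not of the whole player view.
CGRect video = playerViewController.videoBounds;
badge.frame = CGRectMake(CGRectGetMaxX(video) - badge.bounds.size.width - 8.0,
                         CGRectGetMinY(video) + 8.0,
                         badge.bounds.size.width,
                         badge.bounds.size.height);
```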
Here is a screenshot of
the demo we saw earlier.
The digital volume meter
on the bottom right was rendered
using contentOverlayView.
And lastly, for chapter
navigation, as you may have seen
in QuickTime Player,
when you seek
through chapters using the
appropriate keyboard shortcuts
or menu items, QuickTime Player
briefly flashes the chapter
number and title.
And if you would like
to get similar behavior
in your applications,
which we highly recommend,
you can use AVPlayerView's flashChapterNumber:chapterTitle: API (this is only on OS X) to flash the chapter number and title.
Ideally, you would do this
after the seek has completed
so you can place this code
in your completion
handler for seek to time.
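On OS X, that might look like this sketch, where `chapterTime`, `chapterIndex`, and `chapterTitle` are placeholders taken from your own chapter model:

```objc
#import <AVKit/AVKit.h>

[self.playerView.player seekToTime:chapterTime
                 completionHandler:^(BOOL finished) {
    if (finished) {
        // Flash the chapter indicator only after the seek has completed.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.playerView flashChapterNumber:chapterIndex + 1
                                   chapterTitle:chapterTitle];
        });
    }
}];
```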
So the best practices for AVPlayerViewController are to observe readyForDisplay to know when to present your view onscreen, so that your user doesn't have to look at a black screen.
Second, we saw how to use the content overlay view to do custom rendering over the player view.
In Z-order, the player view is at the bottom, the content overlay view sits above it, and the controls sit on top.
And lastly, we saw how
to flash chapter number
and title during chapter
navigation using AVPlayerView.
That brings us to the end of
our roadmap for best practices.
Let's wrap up.
We looked at why it's important to load values asynchronously and how to do that with AVAsynchronousKeyValueLoading for AVAsset.
With AVPlayer and
PlayerItem, all you have
to do is use NSKeyValueObserving
to be notified of changes.
And we looked at some tips
for using AVQueuePlayer.
We talked about how to
observe readyForDisplay to know
when to present your view
into your view hierarchy.
And then we looked at how to
customize your player view
by custom drawings in
content overlay view
and displaying chapter
numbers during navigation.
That was best practices
for AVFoundation and AVKit,
our last topic for today.
In summary, AVKit
is now available
on iOS along with OS X.
AVKit provides you with
standard playback controls
with just a few lines of code.
If you've been using
AVKit on OS X,
you get the UI refresh for free.
You wouldn't have
to change anything.
We looked at the demo,
which shows how powerful
AVFoundation can be
in combination with AVKit to
provide you with an application
with standard playback controls
and to add effects
and visualizations.
We highly recommend that you adopt our Modern Media frameworks and audit your current applications to see if they stand to benefit from adopting these best practices.
For more information, you can
contact our Evangelism Team
or check out our programming guide on our developer website,
which covers both
AVFoundation and AVKit.
You can also consult us
on our developer forums.
We have some related sessions
lined up for you for the rest
of this week starting with
one this afternoon called
"Harnessing Metadata
in Audiovisual Media".
Tomorrow morning we
show you Camera Capture
with all its new features
and AVCaptureView,
we mentioned earlier
in this talk.
And on Thursday, we show you
how you can directly access our
media encoders and decoders.
That's it.
Thank you for coming.
Enjoy the rest of your week.
[ Applause ]