WWDC2016 Session 504

Transcript

[ Music ]
>> Is this thing on?
[ Applause ]
Good afternoon.
I'm Roger Pantos.
This is what's new in
HTTP Live Streaming.
We have been talking about
HLS now for 7 to 8 years.
And yet, every time I do one
of these, marketing says,
"You've got to tell the
people what's going on!"
And so here we go.
Here's what's new.
But, first, HLS in 20 seconds.
So HLS is about playing
playlists.
What's a playlist?
It's one of these things.
It's a text file.
It's got tags, which are these blue things here.
And it's got segments, which
are those white things.
Each one is 10 seconds of media.
If you want to play one,
then you go rummaging
through the tags looking for
something interesting like --
I don't know -- this one says there's an init segment.
So, load it first.
So you do that.
Then you load segment1.
You play it.
You load segment2.
You play it.
Maybe it's a live playlist.
So you refresh the playlist.
You load segment3.
You play that.
And that is basically
HLS in a nutshell.
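For reference, a playlist along the lines of the one on the slide might look roughly like this; the segment names, version number, and init-segment URI are illustrative, not from the slide itself:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:10
#EXT-X-MAP:URI="init.ts"
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
#EXTINF:10.0,
segment3.ts
```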
So now that we're all on
the same page about that,
let's go on to what's new.
Sweet. OK.
MPEG-4 Fragment Support.
Who is this MPEG-4 guy
and why is he fragmented?
Actually, most of you
are probably familiar
with MPEG-4 files.
If you've got a movie on
your hard disk, it's probably
in MP4 format or one
of its close cousins.
And if you've ever
looked into it,
it consists of one
sample table --
usually it's at the beginning --
and then all the sample
data following that.
And we did that originally.
Then, a little bit later, we introduced a new concept called fragments.
And fragments are basically a
way to take your MPEG-4 file
and divide it up into these
little islands of playable data.
A little sample table,
a little sample data.
A little sample table, a
little more sample data.
And these little islands
are all called "Fragments".
So what we're announcing
today is essentially
that what you can do now is
you'll be able to take one
of these things, these
MPEG-4 fragments,
and use it as a segment
in your HLS playlist just
like you can do with
transport stream files
or elementary audio streams.
And we're updating
the spec to sort
of give you the details of that.
There should be a link to
the session I think now.
But what it allows you to do is
essentially use fMP4 segments,
and you can do anything
you can do
with transport streams
with fMP4.
That includes I-frame playlists.
That includes discontinuities.
That includes live streams.
All that stuff.
Same set of features.
And of course, we're delivering
it on all of our HLS platforms
so you can choose to encode
all your streams as fMP4
and deliver them
everywhere we are.
Why would you do that?
Well, in a word,
interoperability.
If you're in a situation
where you have
to encode your entire media
library one way to deliver it
to Apple's platforms,
and then maybe you have
to encode it again in another
way to deliver it to Android
for instance, that
kind of sucks.
But if you could have
a single library,
then your delivery
costs would go down,
your storage costs
would go down,
you would get better cache utilization because you're no longer competing with yourself on the edge.
Also, if you've got a single production chain, you can use the same set of tools and validation across all of your ecosystems.
And as a fringe benefit, because
fMP4 doesn't use the same kind
of packetization that
transport streams do,
you get slightly more
efficiency at lower bit rates.
So that's sort of a gimme.
So how does that change HLS
for you, the playlist author?
Well, as it turns
out, not very much.
If we take our playlist here
that we were looking at earlier
and we convert it to fMP4,
all we have to do is this.
We change the initialization
segment to point
to a movie box instead, and
then each segment becomes an
MPEG-4 fragment.
And that's it.
Everything else is pretty much
the same at the playlist level.
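As a hedged sketch of that change, the fMP4 version of the same playlist might look something like this, with the EXT-X-MAP tag now pointing at a movie box (the names are illustrative):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:10
#EXT-X-MAP:URI="init.mp4"
#EXTINF:10.0,
fragment1.mp4
#EXTINF:10.0,
fragment2.mp4
#EXTINF:10.0,
fragment3.mp4
```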
On the segment level, there's
one other thing I was going
to talk about, which
is encryption.
Now, as you know, HLS
has two different ways
of encrypting your stuff.
You can either just encrypt the entire segment, AES-CBC the entire thing.
And that's the same whether it's transport streams or fMP4.
For sample encryption,
like if you want
to use FairPlay Streaming, it's
a little bit more complicated
because you have
to say what part
of the files are encrypted
and what part aren't.
Last time, when we did it for transport streams, we essentially made up our own format.
This time around, we decided to use an existing one.
And so this is what MPEG
calls "Common Encryption".
It's a standard.
It came out a couple years ago.
The newest edition of the
standard has a mode called cbcs,
which is compatible
with FairPlay Streaming.
And so if you're doing
sample encryption in HLS,
you'll be using common
encryption cbcs mode for that.
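As a rough sketch, the key tag for FairPlay-style sample encryption in a playlist looks something like this; the key URI is illustrative:

```
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://key-id-42",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
```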
And so with encryption, and
playlist, and everything else,
we pretty much have a whole
story put together for HLS
around fragmented MPEG-4.
But if we're going to get to
the happy, shiny, bouncy world
where everybody can just have
a single copy of their catalog
and deliver it everywhere,
we have to go beyond HLS.
And one of the things we need
to do is we need to write
down a set of rules that
says, "Here's how you're going
to author your fMP4 segments
or fragments, or what have you,
so that they play everywhere."
And we've been working
on that as well.
It started as sort
of a what-if exercise
between us and Microsoft.
And once we got to the
point where it seemed
like it was practical, it seemed
like it would work, we took it
and proposed it to MPEG.
And a lot of folks there seemed
to think it was a
really good idea.
So what it comes down to is
it's a set of constraints
for how you construct
your fMP4 segments.
So you've got to deliver your
audio and your video separately.
You've got to put an I-frame at the start of every video segment.
You have to make sure you
rigidly align your segments
across your different bit rates.
You don't have to
do any of this stuff
if all you want to
do is target HLS.
But if you want the broadest
audience for that copy
of your media library, then
we're putting together a set
of recommendations for you.
And so I think that
that will end
up benefiting everyone
-- at least I hope so.
So now I'm going to switch
gears a little bit and talk
about something else --
everyone's favorite
topic -- metadata.
So to sort of frame this,
I'm going to start by talking
about some of the existing
things we already do
for you in HLS for metadata.
Static metadata and ID3.
Static metadata is used
for static metadata.
No surprise.
So it'll be used for something like a content title.
Most of it is text, and that
makes it really easy to put
into the playlist file
either directly as a tag,
or you can package it as a
JSON and refer to it by URL.
Now, you only get one title, or one author, or one copyright per presentation, because it's static.
But the benefit of that is
that it's available
whenever you want it.
It's available right away.
So let's contrast that to ID3.
Because timed metadata
is linked to the timeline
of the media presentation,
it's often used
for signaling events
in the stream.
For instance, the start of an
ad, the start of a program.
It lives inside the
media format itself,
so it's a binary
format called ID3.
And that makes it compact.
It also makes it kind
of difficult to author
because it requires
specialized tools.
It's kind of a pain in the
butt if you're debugging it.
It's hard to read ID3.
But you can have as
many as you want,
and you can put them anywhere
on the timeline you like.
And so that's powerful.
The flipside of that however
is because it's in the media,
we can't give it to you until
it's actually been played.
So where does the new guy fit?
What we're adding is a way
to author timed metadata
inside the playlist itself.
And so you can use it
for the same things
that you're using ID3 for today.
You can author them as text.
So that's pretty easy.
You can easily put
them into a playlist,
and you can have as
many as you want.
They can be overlapping.
They can nest.
They can do whatever.
And on the other hand, because
they're in the playlist,
we can give you the
entire set as soon
as we read the playlist
no matter
where the playhead
is in the media.
And so that makes it pretty nice for doing things like navigation control.
So what does it look like?
How do we make it sort of fit?
Well, we already had the
idea of dates in HLS.
We have the PROGRAM-DATE-TIME
tag that allows you
to precisely signal
times inside a playlist.
So we built on top
of that and we said,
each bit of metadata is going to
be expressed as a range of time
that has a set of
attributes attached to it.
And some of those are ours.
Some of those are yours.
We set up the syntax so
it'll be really clean to add
and remove these
things for live streams.
So it meshes really well
with the live workflow.
So what does it look like?
Let's take a look at
one of these guys.
We called the new tag
the DATERANGE tag.
Here's another playlist.
The first guy says
basically what time it is.
The second one is
a DATERANGE tag.
And if you look at it, what
you can see is it's composed
of attributes.
These guys here, these
blue ones, are the ones
that are defined by the spec --
the ID, the START-DATE,
the DURATION.
For these, we reserve the X- namespace, similar to HTTP, for all you guys.
So you can use that to put in whatever attribute/value pairs you want.
In this case, someone
decided to put
in an AD-ID and a beacon URL.
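Reconstructing that slide as a hedged sketch, with illustrative dates and X- attribute names:

```
#EXT-X-PROGRAM-DATE-TIME:2016-06-14T21:00:00.000Z
#EXT-X-DATERANGE:ID="ad-break-1",START-DATE="2016-06-14T21:02:30.000Z",DURATION=30.0,X-AD-ID="1234",X-BEACON-URI="https://example.com/beacon"
```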
So that's an example.
That's what one of these guys
looks like in a playlist.
When you're doing your
content authoring,
you can use it either
live or VOD.
It just has to have a
DATERANGE tag and a date.
You can obviously
put these things
in when you're creating
your playlist.
But the other nice thing is,
because the playlists are easy
to manipulate, if
you've got something
like a post-production workflow
that's doing ad insertion
or something like
that, it's really easy
to augment the playlist
and presentation
with metadata at the same time.
By popular request, we
also included rules for how
to map SCTE-35 that you might
find in your source media
into the DATERANGE tags.
So you can carry SCTE-35
losslessly in HLS.
And finally, we've added support
for the media stream validator.
And I actually wanted to take
a moment to give a shout-out
to the new set of tools we have.
We don't have time to talk about it right now in this session, but we put together a talk about the new tools.
And so I'd encourage
you to go watch that.
It's on demand on your apps
or whatever you guys have.
And you should check it out and see what kinds
of new tricks the validator
in particular has learned.
So for playback, when
you want to actually act
on the metadata that's
in the stream,
you have a set of
AVFoundation APIs.
And so you have to be
in an app at this point.
And the APIs are essentially: give me all the metadata you've currently got.
And then if you've
got a live playlist,
tell me when something
new shows up.
And it's centered around
a new object called the
AVPlayerItemMetadataCollector.
And so we've got a
little sample up here.
You can see it doing
your typical things.
You create your asset.
You create your PlayerItem.
And then you say, "Hey, I want
to collect some metadata."
So you create the
MetadataCollector.
You set yourself
up as a delegate
so we can tell you stuff,
and then you add it
to the PlayerItem.
And that's it.
Then you get all
the metadata items.
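A minimal sketch of that sample in Swift; the class name and setup details here are illustrative, not the code from the slide:

```swift
import AVFoundation

// Observes in-playlist (DATERANGE) metadata for a player item.
class MetadataObserver: NSObject, AVPlayerItemMetadataCollectorPushDelegate {
    let player: AVPlayer
    let collector = AVPlayerItemMetadataCollector()

    init(url: URL) {
        let playerItem = AVPlayerItem(asset: AVURLAsset(url: url))
        player = AVPlayer(playerItem: playerItem)
        super.init()
        // Set ourselves up as the delegate, then add the collector to the item.
        collector.setDelegate(self, queue: DispatchQueue.main)
        playerItem.add(collector)
    }

    // Called with the full set of date-range groups as soon as the playlist
    // has been read, and again when a live playlist adds new ones.
    func metadataCollector(_ metadataCollector: AVPlayerItemMetadataCollector,
                           didCollect metadataGroups: [AVDateRangeMetadataGroup],
                           indexesOfNewGroups: IndexSet,
                           indexesOfModifiedGroups: IndexSet) {
        for group in metadataGroups {
            print(group.startDate, group.items)
        }
    }
}
```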
And so it's really easy to use.
It's really powerful.
And I think that it's going
to be the preferred way
to carry metadata
in HLS pretty soon.
So the next thing
we're going to talk
about today is another feature
that's come by popular request.
And that is the ability to
play HLS when you're offline.
[ Applause ]
And to talk about that, I'm
going to ask my colleague,
Jordan Schneider, to come
up and explain it to you.
[ Applause ]
>> Hey. So, let's see
if those slides click.
There we go.
In iOS 10, we are bringing
you the ability to download
and then play HLS content
without a network connection.
So now you can bring
offline media playback
to your users using
your existing streaming
video library.
So as part of this feature,
we are extending the
FairPlay Streaming support
that we introduced to you last
year to work with offline HLS,
and we are providing you a way
to download your HLS
content even while your app
is backgrounded.
And finally, we are
exposing the ability
to play partially-downloaded
content even while your download
might still be in progress.
So the first question: when should you use offline HLS?
Well, you want to use this feature primarily when your user might want to play content when they expect that they might not have a network connection, such as when they go into airplane mode.
However, this feature is not
targeted toward preemptively
loading media.
We have other ways of doing
that using AVPlayerItem.
So why use HLS for
offline content rather
than just downloading
movie files?
Well, for one, your content
might already be authored
as HLS.
So adopting HLS for offline
content might be a really
convenient story for you from
an engineering perspective.
But also, in addition
to a video track,
movie files can contain
many different audio
and subtitle tracks that your
user might not really need
or want to persist offline.
So in HLS, because all these tracks are contained separately on the server in different playlists, we can have a little bit more flexibility.
Specifically, we can
have precise control
over which media
selections are downloaded,
saving your user
time, network data,
and storage space
on their device.
So, for example, by default we download a user's default media selection, which is likely all they're really going to care about when they play offline.
What we're also providing
you is a way
to configure exactly what you
and your user want to download
to have available
for offline playback.
So another cool thing about
playing HLS offline is that,
using the same asset,
we can still download
media selection options
that have not been downloaded so
long as they are still available
on your server, which is nice.
So how do you do this?
In iOS 10, we're
introducing a new class
to download AVAssets
called AVAssetDownloadTask.
This class inherits features of URLSession -- most importantly, the ability to download assets in the background, even while your app isn't running, using all of its mechanisms.
As I mentioned before,
we're giving you the ability
to control which media
selections you want to download.
And because this is HLS,
we have the ability for you
to select the quality of
your asset that you want
to download for offline as well.
So the interface
of AVAssetDownloadTask
looks like this.
It inherits from URLSessionTask.
I want to mention here
that it does not inherit
from URLSessionDownloadTask.
It behaves similarly, but there are a few differences that I'll point out as the presentation goes along.
So to create one
of these things,
you want to set up a URLSession.
Now we have a specific subclass
of URLSession called
AVAssetDownloadURLSession
that you have to use
for one of these things.
And so you call
makeAssetDownloadTask
to create an asset
download task.
And then for your
selection of quality,
we have a minimum required
bit rate options key,
as well as a media
selection key.
Now I should note here that each
AVAssetDownloadTask corresponds
to a single set of
media selections.
So if you want to download
multiple media selections
on the same asset, then
you're going to want to set
up multiple AVAssetDownloadTasks
to do so.
I'll show you how to do
that in a few slides.
So to use one of these
things, here's what we do.
The first thing is to set
up an AVAssetDownloadTask,
configure it how you want
it, and start the download.
Next, we want to
respond to any events
that might occur
during the download,
such as monitoring the
progress of the download.
Then once the download
finishes, we're going to want
to store the location
for playback
when we're actually offline.
And then you might want
to download additional media
selections for your user to use.
And finally, you're going
to want to play this thing.
So to set up one of these downloads, the first thing after you have your asset set up is you're going to want to create a background URLSessionConfiguration.
Then you're going to create
your AssetDownloadURLSession,
and then create your
download task.
Here, I've configured the download task to download a quality with a media bit rate of about two megabits per second.
But by default, we'll
download the highest-quality
video available.
Then once you have that set up,
because this is a
URLSessionTask,
you just call resume to start your download.
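Roughly, in Swift, that setup might look like this; the URL, session identifier, and title are illustrative, and this is assumed to live in a class that conforms to AVAssetDownloadDelegate:

```swift
import AVFoundation

func startDownload() {
    let asset = AVURLAsset(url: URL(string: "https://example.com/master.m3u8")!)

    // A background configuration, so the download continues when the app isn't running.
    let configuration = URLSessionConfiguration.background(withIdentifier: "assetDownloadSession")

    // The URLSession subclass required for asset downloads.
    let downloadSession = AVAssetDownloadURLSession(configuration: configuration,
                                                    assetDownloadDelegate: self,
                                                    delegateQueue: OperationQueue.main)

    // Ask for a variant of roughly 2 Mbit/s; omit the option to get
    // the highest quality by default.
    let task = downloadSession.makeAssetDownloadTask(
        asset: asset,
        assetTitle: "My Movie",
        assetArtworkData: nil,
        options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: 2_000_000])

    // AVAssetDownloadTask is a URLSessionTask, so resume() starts it.
    task?.resume()
}
```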
So once you have your download going, then you're going to want to monitor it.
To do this, we have
a new protocol
for you called
AVAssetDownloadDelegate.
So this inherits the
same delegate methods
that URLSessionTaskDelegate has.
But I want to point out two
things that we have on here
that we are introducing to you.
One is our method that we use
to monitor the progress
of downloads.
In this method, we express the progress of the download in time ranges, as opposed to bytes, which is a better fit for media interfaces.
And then we have a didFinishDownloadingTo location delegate method, which informs you where the download gets written.
So this is similar
to URLSessionDownloadDelegate's
protocol method here.
But your expectation for what you need to do when you get this delegate method is a little bit different.
So I'll point that
out in a second.
So here's an example of our
progress delegate method.
Here, I'm just converting time ranges to a percent-complete value.
I'm not going to go
into it in detail,
but that's how you
would do that.
Another thing that might happen
during your download is your app
might be killed.
The process might be terminated.
And then what do you do?
Your download continues.
Well, to respond to that, you
respond exactly how you respond
to URLSessionDownloadTasks.
And that's by, when your app
launches, you're going to want
to set up another background
configuration using the same
identifier you used to set up
your download in the first place
and then create a
URLSession from that.
And then from there, call
the getAllTasks method.
And then here, you can restore
your AVAssetDownloadTask
which will have the
current progress of it.
And you can use this
to update any UI
that you might have in the app.
And you can even use it to
grab the original AVAsset
that you used to set
up your download.
So once your download finishes,
the first thing you're going
to want to do is store the
location of the download asset.
So this method is called whenever anything is deposited at this location, including when partially-downloaded content is left behind after the download is canceled by your user.
Now you can use this
partially-downloaded,
cancelled download to resume
the download at a future time.
Or you can even use it to play back whatever is partially there.
But if you really don't
want that download there
or that content there on
your app storage anymore,
then here's where you
should probably delete it.
So, unlike
URLSessionDownloadDelegate,
we ask that you do not move
the asset from this location.
And we do this because
it's really important
that the system be able
to find these files
as the system may want
to reclaim disk space
in low disk space
conditions, and might actually
at some point go out
and delete this asset
from your app container.
So what you're going to
want to save is going
to be the relative
path of the location.
And then from there, you will
be able to restore your asset
in the future for playback.
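A minimal sketch of that; the UserDefaults key is illustrative:

```swift
import AVFoundation

func urlSession(_ session: URLSession,
                assetDownloadTask: AVAssetDownloadTask,
                didFinishDownloadingTo location: URL) {
    // Do not move the file. The app container path can change between
    // launches, so store only the relative path.
    UserDefaults.standard.set(location.relativePath, forKey: "assetPath")
}
```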
So, now that you have a version
of your asset downloaded,
let's say you want to download
additional media selections.
A good place to do this would be
in our didCompleteWithError
delegate method.
This is the same one
that URLSessionTask has.
And the reason why it
might be good to do here is
because media selections
on the same asset are
downloaded serially.
And this is generally the
right place to do any cleanup
of the complete download.
So, to augment with an
additional media selection
option -- say, the
Spanish audio --
we would first grab
our spanishOption
that we want to download.
Then we would mutate
the AVMediaSelection
that we originally passed
in to select the
spanishOption from there.
And then once we have that
selected, we're going to want
to create a new
AVAssetDownloadTask preferably
on the same session with passing
the additional media selection
we want to download.
And then we call resume,
and then the whole
process repeats itself.
And then we could download
additional media selection
after that if we wanted to.
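A sketch of that flow; `savedMediaSelection` (the AVMediaSelection originally passed in) and `downloadSession` are assumed to be properties kept on your delegate class, and all names are illustrative:

```swift
import AVFoundation

func urlSession(_ session: URLSession, task: URLSessionTask,
                didCompleteWithError error: Error?) {
    guard error == nil,
        let assetDownloadTask = task as? AVAssetDownloadTask else { return }

    let asset = assetDownloadTask.urlAsset
    guard let audioGroup = asset.mediaSelectionGroup(
            forMediaCharacteristic: AVMediaCharacteristicAudible),
        let spanishOption = audioGroup.options.first(
            where: { $0.locale?.languageCode == "es" }),
        let mediaSelection = savedMediaSelection.mutableCopy()
            as? AVMutableMediaSelection else { return }

    // Mutate the selection to pick Spanish, then queue a follow-on task
    // on the same session carrying that selection.
    mediaSelection.select(spanishOption, in: audioGroup)
    let nextTask = downloadSession.makeAssetDownloadTask(
        asset: asset, assetTitle: "My Movie", assetArtworkData: nil,
        options: [AVAssetDownloadTaskMediaSelectionKey: mediaSelection])
    nextTask?.resume()
}
```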
So let's talk about
playing this thing now.
So, say we had a --
here's an example
of how we could create
a download task.
And notice that I pass in an
asset with a networkURL to it.
When we play back offline HLS, if we still have that original urlAsset available, then we should use that AVAsset instance to play.
And we can grab this from
the urlAsset property off
the AssetDownloadTask.
If you pass in a new
asset with networkURL,
AVFoundation will not know where to find the downloaded content.
So you should use
that same asset,
and then you will
have playback offline.
However, if you no
longer have any references
to that original AVAsset --
for example, a long
period of time later
when your app might not
have been running --
then what you're going to want
to do is you're going to want
to create a new AVAsset
from the download location
that we supplied to you
in the didFinishDownloadingTo delegate method, and create a PlayerItem with that.
Now even in this case, if
you want to do an operation
on the asset -- for example,
augment with an additional
media selection download --
you really should
reuse that same asset.
You shouldn't have a bunch
of AVAssets pointing toward
the same fileURL on disk.
This helps AVFoundation
be efficient
about reusing already-downloaded
media data.
So we also have an additional
class to help you be stringent
about what you let your
users play offline,
and this is called AVAssetCache.
You grab this from the asset
cache property on an AVAsset.
And it can do two things.
One is that it can tell
you if any rendition
of this asset is
playable offline.
And the other thing it can do
is it can tell you whether a
specific media selection
option is available offline.
Now, remember, if it's
not available offline,
it's still playable so long as
you're connected to the network
and that media selection option
is still available on the server
that you downloaded it from.
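A sketch of those two checks, assuming `asset` from the earlier examples:

```swift
import AVFoundation

if let cache = asset.assetCache {
    // True if some rendition of the asset can play with no network at all.
    let canPlayOffline = cache.isPlayableOffline

    // Which options of a given group (here, audio) are actually on disk.
    if let audioGroup = asset.mediaSelectionGroup(
            forMediaCharacteristic: AVMediaCharacteristicAudible) {
        let offlineAudio = cache.mediaSelectionOptions(in: audioGroup)
        print(canPlayOffline, offlineAudio)
    }
}
```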
So, that's how you use
AVAssetDownloadTask
and play the subsequent
downloaded asset.
And I should note that number five here, playing the downloaded asset from disk, can happen really at any point in this process.
It can even happen before
you start the download,
or even during the download.
And AVFoundation will be
efficient about trying
to reuse what has already been
downloaded from the network.
So I want to switch gears
a little bit here and talk
about securing the
offline content.
So, last year, we
introduced FairPlay Streaming
to bring strong content
protection to the HLS ecosystem.
However, in iOS 9, FairPlay
Streaming requires a live
connection to a key server,
which doesn't really work
for offline playback.
So, this year, we've extended
FairPlay Streaming to work
without an internet connection
for the offline HLS case.
Now we do this by packaging your
keys in a manner that is safe
to store on disk and reuse
for offline playback.
But your app is still expected to store this key itself and respond to key requests from AVFoundation on every single playback, without hitting the network once it has its key cached.
So support for these keys
does require a change
in your key servers,
specifically a new TLV value, so your key servers have to explicitly opt keys in to be eligible for offline playback.
And finally, we have
a caveat for FPS Keys.
They must be declared in your
master playlist as session keys
if you want to make sure
that they are downloaded.
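A hedged sketch of what that master-playlist declaration looks like; the key URI is illustrative:

```
#EXT-X-SESSION-KEY:METHOD=SAMPLE-AES,URI="skd://key-id-42",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
```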
So, storing keys for offline
FairPlay Streaming really builds
off the request flow of
online FairPlay Streaming.
And so what you have to
change really just has to do
with how you modify
this request flow.
I'll give a quick recap
of it here to point
out what you need to do.
But for more information, you
should watch last year's talk
where we go into this
in a lot of detail.
So just like in FairPlay
Streaming for playback,
AVFoundation will
download your playlist
and trigger a key request
if it finds a key tag,
which will give your
app a chance
to save the key for
offline playback.
So the first thing your app
would do is it would call back
to AVFoundation to get
a streaming key request.
Then it would talk to your
server, which would take
that streaming key request
and give you a content
key context or a CKC.
And then in online
FairPlay Streaming,
this is where you would finish
a request flow and simply return
to AVFoundation with that CKC.
However, in offline FairPlay
Streaming, here's where you need
to give that CKC to AVFoundation
so AVFoundation can freeze-dry
it into a format that is safe
for you to store into
your app storage.
And this is something
that you can also reuse
on subsequent playbacks.
So once you have this thing
written into your app storage,
you want to return that
freeze-dried key to AVFoundation
to complete the request.
Now if your app were offline,
then all you would have
to do is read that freeze-dried
key from your app storage
and return it to AVFoundation
so you could do this
without touching the network.
So let's look at how this
actually changes your code.
So in online FairPlay Streaming,
these key requests happen
as a part of AVAssetResourceLoader, specifically through a delegate method that hands you an AVAssetResourceLoadingRequest.
So the first thing you would do once you get one of these requests is ask AVFoundation to create a streaming content key request, or an SPC, and send that to your server.
And then you get a CKC
back, and then you respond
to AVFoundation with that.
Now in offline FairPlay
Streaming, we have a new method.
This is the method that you
use to freeze-dry the key.
You pass in the CKC you
get back from the server,
and it returns back a data blob that is safe for you to store offline.
We also have a new content type
as part of this request flow --
the persistentContentKey type --
and a new option to pass into our streamingContentKeyRequestData(forApp:) method.
So here is how we would modify
that key request flow for a key
that we want to save offline.
The first thing that we want
to do is ask AVFoundation
to create an SPC.
The difference here is
that we need to pass
in the required persistentKey
option.
That way, AVFoundation and your key server know to request a key that will be eligible for that freeze-dry process.
So you send this SPC to your server to get the CKC.
And then once you
get the CKC back,
you call your
persistentContentKey method
to create this freeze-dried
thing that you can save to disk.
Then, you want to actually
write that thing to your disk.
And then, you want to
set your content type
to a persistentContentKey type
and then finish the loading request with that persistent CKC that you saved to your disk.
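A sketch of that online flow; `requestCKC` stands in for your key-server round trip, and the function and parameter names are illustrative:

```swift
import AVFoundation

// Stands in for your key-server round trip; purely illustrative.
func requestCKC(spc: Data) -> Data {
    // POST the SPC to your key server; it returns the CKC.
    return Data()
}

// Handling a key request so the key can be persisted for offline use.
func handlePersistableKeyRequest(_ loadingRequest: AVAssetResourceLoadingRequest,
                                 appCertificate: Data,
                                 contentID: Data,
                                 keyFileURL: URL) throws {
    // Ask AVFoundation for an SPC that is eligible for persistence.
    let spc = try loadingRequest.streamingContentKeyRequestData(
        forApp: appCertificate,
        contentIdentifier: contentID,
        options: [AVAssetResourceLoadingRequestStreamingContentKeyRequestRequiresPersistentKey: true])

    // Exchange the SPC for a CKC with your key server.
    let ckc = requestCKC(spc: spc)

    // Freeze-dry the CKC into a key that is safe to keep in app storage.
    let persistentKey = try loadingRequest.persistentContentKey(
        fromKeyVendorResponse: ckc, options: nil)
    try persistentKey.write(to: keyFileURL)

    // Finish the request, marking the response as a persistent key.
    loadingRequest.contentInformationRequest?.contentType =
        AVStreamingKeyDeliveryPersistentContentKeyType
    loadingRequest.dataRequest?.respond(with: persistentKey)
    loadingRequest.finishLoading()
}
```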
So now if you are offline,
or say you already satisfied
this key request previously,
then this is what you will do.
You get your resource
loadingRequest
and you just simply read
your freeze-dried key
from your app storage.
Then you set your
persistentContentKey type,
and you respond to the
key loadingRequest all
without hitting the network,
which is obviously required
as you may potentially
be in airplane mode
and don't have any
network connectivity during
this process.
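And a sketch of that offline path, assuming the key was written to `keyFileURL` as in the earlier sketch:

```swift
import AVFoundation

// Satisfying a key request offline from the previously saved key.
func satisfyKeyRequestOffline(_ loadingRequest: AVAssetResourceLoadingRequest,
                              keyFileURL: URL) {
    do {
        // Read the freeze-dried key back from app storage; no network needed.
        let persistentKey = try Data(contentsOf: keyFileURL)
        loadingRequest.contentInformationRequest?.contentType =
            AVStreamingKeyDeliveryPersistentContentKeyType
        loadingRequest.dataRequest?.respond(with: persistentKey)
        loadingRequest.finishLoading()
    } catch {
        loadingRequest.finishLoading(with: error)
    }
}
```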
So that's how you modify
FairPlay Streaming to work
with your offline HLS content.
So now I want to talk
about best practices
for managing your assets.
So it's important to note that
these downloads do contribute
to your app's disk usage,
so you want to be a
little bit mindful here.
You should really clean up
any unneeded assets on disk.
Specifically, you should definitely provide a way for your users to see everything that you have downloaded
and allow them to delete things
that they don't want any more.
Remember that downloads that
users cancel do remain on disk.
Now you can use those to
resume downloads in the future
at some point if you like.
But if that's not what you
want, then you should take care
to go ahead and delete those
assets once they are cancelled.
We're asking that you
keep downloads driven
by explicit user actions.
So you shouldn't just go ahead
and download a bunch of stuff
that your user might not
actually want offline.
We've gone ahead and taken the liberty of opting these downloads out of iCloud backup.
As I mentioned earlier,
you should be prepared
for the system to delete your
assets to reclaim disk space
in low disk space conditions.
Now this won't happen while
your app is running, ever.
This will only happen while
your app is terminated.
After your app is launched, you should be a little bit defensive about assuming that you still have the assets that you had previously downloaded on disk.
Do not move your assets from the location where we provide them.
Again, it is very important
that the system be able
to find these assets.
And finally, because these downloaded assets may contain media selections that have not been downloaded, be careful not to mutate the asset that you have on your servers.
If you really must change, say, the location of a media playlist or something like that, and mutate your master playlist, then you should just host that modified asset at a new URL rather than changing the one that's already there.
So, that's offline HLS.
And we're really happy
to bring this to you
and excited to see you adopt it.
So, in summary, we
have new features
for you to use this year.
We have our fragmented
MP4 support
which will bring a
common media format
across different platforms
that you may be supporting.
Remember that this is
compatible with all HLS features
and requires minimal changes
to your HLS playlist to adopt.
Next, we have our new
in-playlist metadata using the
DATERANGE tag.
And this is just really good to use with any live content that might have dynamically updating metadata, like ad boundaries and stuff like that.
And finally, we're giving you offline HLS playback.
As part of this, we have a powerful media downloading engine that lets you configure which media selections and quality to download.
And we're also extending our
FairPlay Streaming support
to work without an
Internet connection.
So for more information,
including some sample code
on how to use
AVAssetDownloadTask
and more documentation on these new additions, go see our page on the developer site for this session.
I want to point out the session
that Roger mentioned earlier,
our authoring and validation talk.
You can watch this video
in the WWDC app right now.
I highly recommend watching it.
So, thanks for coming.
And I hope you enjoy the
rest of the conference.
[ Applause ]