WWDC2004 Session 208

Transcript

Kind: captions
Language: en
time that you're seeing revealed for the first time at this conference. We're going to describe what those developments are, review some of the things that you've seen in previous sessions and amplify on them, and we're also going to preview some of the more in-depth technical sessions that follow, mostly tomorrow; I'll be directing you to what those sessions are and what they'll contain. So this is the omnibus for QuickTime in Tiger. What you'll learn is what QuickTime offers to you, the application developer, in media services that you can use. We'll give you some examples of applications that already use QuickTime, so you can get some ideas of how you can use it as well; we'll talk in depth about what's new and improved; and, for the first time on any stage, we'll talk in depth about how you can take advantage of new ways to use QuickTime in both Carbon and Cocoa apps, starting in Tiger.
where does QuickTime fit in on Mac OS 10
platform you've seen several versions of
the architecture diagram for the entire
OS this week already and you've seen
quicktime mostly among the imaging
architect among the imaging modules in
that diagram here i've blown up a
portion of that architecture diagram to
show you that quicktime sits on top of
imaging video and audio services on the
platform it's an integration layer that
gives you access to digital media
services and it may in turn makes use of
these lower level services that also
have AP is that you can use where
clients as well as you to amplify what
does quicktime offer not just playback
most of the time when you come to a
conference like this and you see
quicktime we're going to play you really
cool content this session goes beyond
that to show that the services that
quicktime offers are complete it's a
full media architecture creation of
media editing of media capture from
external devices and delivery by the web
live streaming via CD or dvd rom
quicktime offers all of that now from
Now, from time to time we get into discussions about what we mean when we say QuickTime, because QuickTime has actually been around for a while. It was introduced in early 1991, originally available for Mac OS System 6.0.7 and System 7. I had to check with Jim, because I'm never sure which six it was, but in fact it was 6.0.7. Over the years, the term QuickTime has been applied to several different things; I've got a list of some of them. QuickTime is in fact an end-user product, QuickTime Pro, which is available from Apple's website; you can buy it there. QuickTime is also a media container format, known as the QuickTime movie file format, and in fact it's achieved great success in the industry as the basis for the MPEG-4 file format and related file formats, 3GPP as well. As a container format, we can put video and audio in there, and some other things too. QuickTime is also a media solution: there are several products that work together, QuickTime Pro, QuickTime Broadcaster, QuickTime Streaming Server, and the QuickTime plug-in for internet browsers, that allow you to produce and deliver media. But what we're talking about specifically in this session is the technology that you can use in your applications to put media services to work for you.
There are several sets of APIs, C APIs, that QuickTime has offered over the years, and you see some of their names here: the Movie Toolbox, a set of high-level APIs for playing and editing media; the Image Compression Manager, which allows you to compress and decompress still images or sequences of images; and a number of other component APIs as well, some of which I'll talk about later in the session. That'll be a theme that I'll come back to. Actually, the themes of this session, if you must know, are classification and reordering. So every time you hear something that relates to classification and reordering, if you could make some sort of sign that you're still awake, that would be good for me, because it is five o'clock on Wednesday and this is a tough slot to fill. But anyway, this session is about the APIs, so with no further ado, let's talk about QuickTime APIs at work in existing applications
today. So, on demo one, I have one of my favorite applications, particularly when I'm away from home: Mail. Mail is a mail application. It does wonderful stuff: it allows you to find your email messages, file them, and read them. But one of the other things Mail does is allow you to exchange media from one email server to another. If I could have demo one up, please, that would be handy. Thank you. So here I have, what if I got my email? Now, I just launched this app a minute ago: something from home, in Mail. And in fact, in a number of applications, you can put digital media inline with other content. Mail uses the QuickTime APIs in order to play video and audio. Here I have, let's see what we've got, okay: "Hi, Dad." That's nice, thank you. I'll be home soon, kids, don't worry. Well, that's just one example of an application that allows you to embed media together with other content. If you have a document-based application and you want to use digital media together with other content, use QuickTime. It's a perfect
solution. Another application that uses digital media is Keynote. Keynote goes a few steps beyond what an application like Mail does, however. Sure enough, I can create a new presentation in Keynote, and I can take some digital media that I happen to have handy, drop it in there, and play my digital media content inside my presentation. But this is actually not all that remarkable; this has been done in a number of applications for a number of years. Keynote takes it quite a few steps beyond. What I would actually like to do here is create a new presentation, and this time I want to choose one of those interesting themes, the blackboard theme. I want to take that same movie that I put on my other slide, and let's make this a blank master. I think what I want to do here is make my movie occupy the entire slide; let's see, that would be 1024 x 768, and I want that positioned at 0, 0.
All right, so there I now have a full-screen slide with a movie in the background. But that's not quite what I want. I really want it to be in the background, with my text on top of it. The title is: this is my fifth-grade science project, and it is about the red-tailed hawk. This red-tailed hawk has been showing up in our backyard for the last week or so, and one morning while I was drinking my coffee, there it was, so I just grabbed my DV camera, and there you go: live from my backyard, actually stored, the red-tailed hawk. Now, what I really want to do here is have some bullets. I'm going to cover the habitat of the red-tailed hawk; I'm going to cover the markings (I did very well in the fifth grade, by the way, as you can tell); and I want to cover the call of the red-tailed hawk. I want to give these items, well, I want to give them some builds; I'm going to have some drop-in of my text. This is all very nice, and this is all going to happen over my video. But the video, you know, it's kind of bright; the sun was shining the day I shot it. So what I really want to do is dial down the opacity of that video, because I really want it to be in the background. Then when I play my video, it's going to show up like this; you can see in the lower left there a preview of my presentation. Then when I click, my bullets are just going to come in right over my video, which plays in the background. But you really want to see this full screen, don't you?
I knew I was going to get Terminal into this presentation. All right, this is what happens when it's five o'clock on a Wednesday; it doesn't have to be at WWDC. All right, now you've got it. Now when I play my presentation, oh, I got it wrong. I'll show it to you downstairs in the QuickTime lab, but this movie will play full screen, with the bullet points dropping down, at full resolution and full frame rate. So Keynote goes further than an application that simply embeds media: it combines digital media with its other features, its support for transparency, its support for builds and transitions. Digital video and audio work seamlessly with all that in Keynote, and it's all done by means of QuickTime. Okay, back to slides, please.
So I hope I've convinced you that it's worthwhile to use QuickTime APIs in your applications to gain the sort of features I demonstrated just now. If I had managed to convince you of that only two years ago at WWDC, your application would have gained value without you having to do anything between then and now, because the APIs you would have put to work in your application then would have given you access to an increased number of media container formats, video formats, and audio formats as new versions of QuickTime were released by Apple. In just the QuickTime 6 series of releases, we've added support for MPEG-4, for 3GPP, for JPEG 2000. We've added support for external devices: better support for DV, and support for IIDC cameras, that's the Instrumentation and Industrial Digital Camera standard (glad I got that right after the failure of the Keynote demo). And we've improved our APIs in the QuickTime 6 series of releases to make them more accessible to you. Our C APIs now use modern constructs to refer to files and to other things that are common with other APIs
on the platform. So that's what we've been doing. While we've been doing that, what's been happening in the industry? Well, we've noticed that encodings of digital video, not DV digital video, but the various formats of video that are delivered digitally, are becoming richer. The quality is getting better without the data rate going up, as new and more complex means of coding video are applied. What are we going to do to respond to that? You've already seen that we're responding by adopting H.264 for QuickTime. What about audio? Audio resolution is becoming ever greater in production and also in delivery to your listeners; higher-resolution audio is becoming more common. Graphics processing: we've mentioned several times at the show that the graphics processing unit, the GPU that is in every machine we sell, and also many of the machines sold by other vendors, is valuable not just in the original domain for which it was developed, but also for other types of applications. Finally, what else is going on in the industry? We're noticing that you guys writing your applications are using application frameworks and getting a lot of leverage from them, and we've noticed that too. What are we going to do in response? I'll tell you in a moment. Okay,
tough crowd. All right, so let's go through those topics one by one. What's happening with video? By way of analogy, I intend to explain how video encoding has become more complex and what we need to do in QuickTime to support more complicated encoding. Since I can't get you to respond and I can't make it interactive, let's play a game. We're going to play "which one of these is different?" Okay: a flip book, by which I mean a book each page of which has an image, either printed or drawn, so that as you flip through it you get the effect of animation. A reel of motion-picture film, actual celluloid; they still use it in some theaters; each frame is on the strip of film. DV, what's that? Well, that's the most common format of digital video, supported by the DV camera that you can buy down at Good Guys, or whatever guys you happen to have in the area where you live. Finally, step-by-step directions to your local library. You may think that this is the one that's different, because it doesn't involve movies or video. But wait: the step-by-step directions could be delivered as a series of images, even as MapQuest can provide. So which one really is different? Well, it should be obvious by now that three of these things are sequences in which the items are each independent of the other items, and those happen to be the first three. The flip book has pictures which are independent of the other pictures. DV is the digital equivalent of a flip book: each frame is independent of every other frame. The one that's different is in fact the step-by-step directions, because every direction depends on what came before. You already have to be on the corner of Main and Third before you turn left and go two blocks.
Okay, now we're going to play the advanced version: which one is different this time? Narratives with analepses and prolepses, which of course are the terms for narrative flashback and flash-forward (oh, thank you; did you just scratch your head, or did you mean to?). A schedule of automobile maintenance. The Romantic poets, by which of course I am referring to the Romantic poets in the poetic tradition as described by Harold Bloom's 1973 book The Anxiety of Influence, in which he posited that a young poet can actually seek to gain priority over his predecessor by means of several tropes which he describes, one of which is the trope of substitution, substituting an earliness for a lateness, but I'm going to give it away if I keep going. MPEG-2 video, another type of digital video coding. Finally, a presentation based on Keynote. Well, I'm sure it's all clear, since I belabored the point on the Romantic poets, that what's different here is the schedule of automobile maintenance, because in that case you have a sequence of things each of which is only dependent on the things that have come before. You only go have the hundred-thousand-mile maintenance after you've already done the 75,000-mile maintenance, right? But these other things are sequences in which items may have dependencies on things that come before or things that come after, or at least the Romantic poets hoped that this was true.
So if none of these analogies was effective, I will appeal to your stomach, since it's getting close to dinnertime. After you learn where all the sharp implements in the kitchen are at the Culinary Institute, they eventually reveal to you that you do not necessarily prepare the courses for a high-class multi-course meal in the same order in which you serve them. And why is this true? They have to, I don't know, I never went to the Culinary Institute. But I do know about this: the same thing is true of modern video coding. Video frames are encoded and decoded in not necessarily the same order in which they are intended to be displayed. And why is this a good thing? Well, it enables us to get better quality out of comparable data rates.
Let's talk about an example. Suppose I have a sequence of video which starts with one frame and ends with another frame, and all of the intermediate frames are a wipe effect: it just wipes from left to right from one image to the other. Well, if I need to encode that sequence of frames, each frame revealing more and more of the final image, and I can only depend on images that have come before, then with each frame I have to waste some of my bits; I have to spend bits on the portion of the final image that has been revealed as the wipe progresses. But if I have a video coding which allows me to depend on both the first and the last frame, well, then that's much easier: each intermediate frame will require many fewer bits, because it only needs to code its differences from the first and final frames. That's basically how we're achieving higher quality at comparable data rates in video.
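The core idea here, frames whose decode order differs from their display order, can be sketched in a few lines. This is a concept illustration only, not QuickTime API code; the `Frame` type and the particular I/B/P pattern are invented for the example.

```python
# Hypothetical sketch (not QuickTime API code): in a B-frame stream,
# frames carry distinct decode and display orderings. The decoder
# receives frames in decode order; they are then reordered so the
# viewer sees them in display order.

from dataclasses import dataclass

@dataclass
class Frame:
    kind: str          # "I", "P", or "B"
    decode_order: int  # the order the frame arrives at the decoder
    display_order: int # the order the frame is shown to the viewer

# A typical pattern: the P frame is decoded *before* the two B frames
# that are displayed before it, because the B frames predict from it
# (they depend on a frame that comes after them in display order).
stream_in_decode_order = [
    Frame("I", 0, 0),
    Frame("P", 1, 3),  # decoded early, displayed last
    Frame("B", 2, 1),  # depends on both the I and the P frame
    Frame("B", 3, 2),
]

def display_sequence(frames):
    """Reorder decoded frames into the order they should be shown."""
    return [f.kind for f in sorted(frames, key=lambda f: f.display_order)]

print(display_sequence(stream_in_decode_order))  # ['I', 'B', 'B', 'P']
```

Under QuickTime 1.0's original model, `decode_order` and `display_order` were forced to be identical; the point of the new support is precisely that they may differ.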
The bad news is, when we shipped QuickTime 1.0 in 1991, for System 6.0.7 and System 7.0, we did not accommodate this model of video coding in our video support. What I like to say is that there were no master chefs working on QuickTime 1.0; they only came along later. But in QuickTime 1.0, as you can see, we had this limitation, and the limitation was reflected by the APIs: you could only compress video whose frames were completely independent, or which only had dependencies on frames that you had previously presented for compression. But in QuickTime 6.6, the version that you have in your Tiger preview, the SDK that I think they gave out earlier this week, we finally have support for video frames whose decode order and display order can be distinct. Now, this is a great thing, because it allows us to play modern video formats such as H.264, but it allows us to do more than that. I will say we support I, P, and B frames, for those of you who are technical. But the reason this is wonderful and amazing is that we have applied all of the richness of the QuickTime media architecture to this type of video encoding.
So let me show you over here on demo one what I mean. Let me open up some of my hawk movies again. I'm going to open them with QuickTime Player 6.5.1, both because I want to make a point and also because the new Player that's in the Tiger seed doesn't quite run the AppleScript that I'm about to run for you. I have some H.264-encoded video clips of the hawk in my backyard, and what I can do with these clips, since I've got QuickTime Player Pro, is copy them and combine them together into another movie. I've got different encoding rates on these different clips, but copy and paste of these various clips works just as you would expect it to with QuickTime, and so now I have a single movie that plays those three clips back-to-back.
Well, that's pretty good, but it doesn't quite go far enough to demonstrate the point, so let's go back to that AppleScript I mentioned. It's called Movie Mincer, and what does it do? Movie Mincer is probably the most fearsome thing to a QuickTime engineer, right here at five o'clock on a Wednesday afternoon at WWDC, because what I'm going to do with this script is select, at random, and I mean at random, two-second segments from a movie which is itself pasted together from three separate clips, and I'm going to take those two-second segments and put them into another movie. I do not know what will happen, ladies and gentlemen; I require complete silence. It's kind of fast on a G5; I like that. So now that I have pasted together, well, AppleScript has pasted together, these twenty-two segments, segments chosen at random. Remember now, these video frames can have dependencies on frames that come after them or frames that come before them, and there's a very complicated graph of dependencies as you go through the sequence of video frames in these various H.264 movies. Now, I've chosen at random where these clips should start and stop as I pasted from one movie to another. I don't know if I started on an I frame or a B frame; I don't know what frames I started at when I pasted them all together. How is QuickTime going to figure out how to decode these frames, which may have dependencies on other frames which may not have gotten pasted into the destination? I don't know how it works; I'm just going to play the movie. Every two seconds I should see another clip from those various movies. This, I think, is probably the most impressive technology demo that you're going to see at the show, and that's not a joke; this is really remarkable. This is live, in-place editing of this very complex video stream. And a nice-looking hawk, too. All right, back to slides.
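What the Movie Mincer script is described as doing, choosing fixed-length segments at random and pasting them back-to-back, can be sketched as follows. This is a stand-in illustration in Python; the function name and times are invented, and the real demo used AppleScript driving QuickTime Player.

```python
# Concept sketch of the "Movie Mincer" behavior described above:
# pick fixed-length segments at random from a source movie and
# concatenate them into a new movie. Times are in seconds.
import random

def mince(source_duration, segment_length, count, rng=random):
    """Choose `count` random [start, end) segments of fixed length."""
    segments = []
    for _ in range(count):
        start = rng.uniform(0.0, source_duration - segment_length)
        segments.append((start, start + segment_length))
    return segments

# Twenty-two two-second segments from a 60-second source movie
# (the 60-second duration is assumed for the example) yield a
# 44-second result, regardless of where the cuts land.
segments = mince(source_duration=60.0, segment_length=2.0, count=22)
total = sum(end - start for start, end in segments)
print(round(total, 6))  # 44.0
```

The hard part, of course, is not choosing the segments; it is that each cut may land on a frame whose references were not copied, which is exactly the dependency bookkeeping QuickTime resolves during playback.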
So we don't just support playback of B-frame video; we support all that other cool stuff I talked about earlier. Now, I was copying and pasting with, what do they call it, the pasteboard in Mac OS X, but you can also insert media via low-level APIs. An application like iMovie combines media from multiple sources by means of the very same technology that you saw me demonstrate via copy and paste. This is really pretty cool stuff. Now, in order to accomplish this, we did have to introduce new APIs, because of the limitations I mentioned earlier. In the Image Compression Manager, we have new compression and decompression APIs to support this notion of video frames with distinct display and decode orderings. Also, at the level of media creation and media access, we needed new APIs to carry the data about decode and display order, so at those levels there are new APIs available. The very good news, and here's the other reason why I used QuickTime Player 6.5.1 to demonstrate, is that the high-level APIs for movie playback and movie editing are unchanged, even though we have support for this new stuff. So the old Player, the one built with the currently shipping QuickTime, works with the new media content, and the same thing will be true in your applications as well.
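One piece of the extra bookkeeping the new compression APIs imply is deciding decode order for a display-ordered frame pattern: reference frames must be decoded before the B frames that depend on them. The sketch below is an invented illustration of that rule, not the Image Compression Manager API.

```python
# Sketch (invented, not the ICM API): given frame kinds in display
# order, compute a valid decode order. The rule illustrated: I and P
# reference frames must be decoded before the run of B frames that
# precede them in display order, because those B frames predict from
# both surrounding references.

def assign_decode_order(display_kinds):
    """Return indices of display-ordered frames in decode order."""
    decode_order = []
    pending_b = []
    for i, kind in enumerate(display_kinds):
        if kind == "B":
            pending_b.append(i)       # hold B frames back
        else:
            decode_order.append(i)    # emit the reference frame first
            decode_order.extend(pending_b)
            pending_b = []
    decode_order.extend(pending_b)
    return decode_order

print(assign_decode_order(["I", "B", "B", "P", "B", "P"]))
# [0, 3, 1, 2, 5, 4]
```

Without B frames, decode order and display order coincide, which is why QuickTime's pre-6.6 model never needed to distinguish them.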
But it's not just the APIs that we're changing to accommodate this new cool video stuff; we're also expanding the QuickTime movie file format to accommodate it as well. If you are doing reading of the QuickTime movie file format, if you're opening up movie files, reading the bytes in there, and interpreting them, you will definitely want to get on the QuickTime seed. I believe the email address for the seed is not what I have up here, but they both work; you can send to this address if you want to be part of the QuickTime seeding program. In addition to the Tiger seeding that you have received, I recommend that you send to this email address. Now, another point I want to make about the QuickTime movie file format: in the preview of Tiger that you possess, not all of the details that are changing about the QuickTime movie file format are final, so this preview is not for production use. Don't go out and start making movies that you want to put on the web and have playable indefinitely; use Panther for that. We will finalize those details, I'm sure, by the time Tiger ships.
All right, what about more information? I've given you some of the technology details, but only at a very high level, of what's new in video in QuickTime. Where do you go to find out more? I'm recommending two places. There is a document available to you in your Tiger seed, and it's also available via the Apple Developer Connection; it's called "What's New in QuickTime 6.6 Developer Preview." There are several sections of that document that I've listed here, B-frame video, the H.264 codec, and the QT sample table API, that will give you all of the nitty-gritty details about the APIs I mentioned. But if you want to see it explained by living people, actual engineers, come to Session 217, Friday at three-thirty, "Supporting New Video Formats in QuickTime," and you can get the information there as well.
All right, that's what's new in video. What's happening in the industry that's influencing us; well, what's happening in the area of graphics? I have a graphic; see how I tied that in? This timeline reveals in graphical fashion that over the past 25 years, video games have improved in their ability to engross the consumer. Yeah, I think that's what I meant. In any case, you can see the progression here (we couldn't actually use the images because they're copyrighted), from the line drawings of Asteroids to the 3D models of modern video games. You guys must really love this stuff, because you're buying it. Now, it's been mentioned several times during the conference that Moore's Law drives the ability of the processor to shrink, to consume less power, and to go faster. Well, we found that there is actually an ancillary law, together with Moore's Law, that has driven the evolution of the graphics processing unit. It's called Atari's Law. Atari's Law states that the game you produced six months ago isn't good enough; it has to be better, or they won't buy it. And this has driven the evolution of the graphics processing unit, which is now, in one form or another, part of every hardware unit we sell and is also available from other hardware vendors in their PCs. The GPU is now ubiquitous. Several of you application developers have sat back, taken a look, and said: you know, I'm not developing a video game, but every pixel I draw winds up going through that GPU on the way to the screen, and I know there's a lot of processing power there; maybe I can use it too. Well, that's exactly what happened in the development of Keynote. Keynote made use of the power of the GPU to distinguish itself from its competitors in its field, with all these great transitions, all these great effects, and great integration of graphics. And for an example, my hawk friend is back again. You know, you can't tell if it's a red-tailed hawk by its tail if it's immature.
What is QuickTime going to be doing about all of this graphics processing power? Now, these machines have evolved an enormous organ, the GPU, that was originally intended for one purpose; but as you know from evolutionary theory, these organs often wind up being applied to other purposes as well, and the organism gains the ability to do new things by this evolutionary process. Well, QuickTime is going to take advantage of the GPU too, and how are we going to do it? We're going to get out of its way. When QuickTime was first introduced in 1991, there wasn't a lot of help among the services available on the OS for playing video to the screen. There was some for sound, but there certainly wasn't a lot of support for synchronizing video with audio, so QuickTime did it all. It was necessary for QuickTime to do it all, and it was very successful at it. For years, when we talked to application developers at conferences like this, we said: you focus on your area of expertise when building your app, you let us handle the digital media, and you can integrate digital media with other content that you manage. And that story was very good, but something happened along the way. We've noticed that you application developers have actually gained expertise in the area of digital media, and you want to do interesting things to process digital media in your applications, and you don't want to turn all of that over to us. You want what QuickTime offers to work in combination with services that are now available on the platform for really great graphics processing and really great audio processing. Well, we're going to make that possible. Now you can have everything that's good about QuickTime video, and even more.
So how are we going to achieve that? For video, we had to find a new abstraction for the destination to which video will be rendered. We're calling it the visual context. It's an abstraction, and we intend there to be several types of visual context. For Tiger, we are making it available concretely for OpenGL textures, and this means really great stuff: you can get all of that great processing power that's available in the GPU applied to video. You've seen some demonstrations of it. At a high level, however, you don't have to know anything about OpenGL textures to gain this advantage. If you use (and this, by the way, is a narrative prolepsis) our high-level modules, the HIMovieView and the QTMovieView, you have full support from them for visual contexts without you having to do a thing. And that basically means that if you use those modules and you deploy your application on a machine that has Quartz Extreme, then you are eligible for lots of great stuff that the GPU can do with video. However, if you have the kind of application that's doing manipulations at the OpenGL level, we have low-level APIs for you to use as well. Where to get more information? The session is 215, Thursday at two o'clock, "Improved Video Rendering and Playback," and in the what's-new document that's part of your seed, look at the visual context section, and the information will be there for you.
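The shape of the visual-context idea, a renderer that pushes frames to a destination it knows nothing about, with concrete context types plugged in underneath, can be illustrated abstractly. All class and function names below are invented for the sketch; the real Tiger APIs are C and OpenGL based and differ in every detail.

```python
# Illustration of the "visual context" abstraction described above:
# the movie player only talks to an abstract destination; a concrete
# context type (standing in for the OpenGL-texture context in Tiger)
# decides what a delivered frame becomes. Names are invented.

class VisualContext:
    """Abstract destination for rendered video frames."""
    def render(self, frame):
        raise NotImplementedError

class TextureContext(VisualContext):
    """Stands in for an OpenGL-texture-backed context: each frame
    becomes a 'texture' the app can composite however it likes."""
    def __init__(self):
        self.textures = []
    def render(self, frame):
        self.textures.append(("texture", frame))

def play(movie_frames, context):
    # The player is ignorant of the destination's concrete type.
    for frame in movie_frames:
        context.render(frame)

ctx = TextureContext()
play(["frame0", "frame1"], ctx)
print(len(ctx.textures))  # 2
```

The design point is the indirection itself: because playback targets the abstraction, other context types can be added later without changing the player, which is what "several types of visual context" promises.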
All right, I said this was an omnibus: I'm going to talk about video and audio in the same session. I can get away with that; QuickTime is a media integration layer. You don't see the Core Audio guys talking about video, and you won't see the Core Video guys talking about audio, but QuickTime sits on top of those layers, and we can talk about both. Sound, of course, is spatial. We have had support for mono and stereo audio for, I believe, Jim, wait, that's good, a long time. And is that sufficient for being able to detect where sound is coming from? Well, I have two ears. And I knew it was Jim; he's the only person laughing at that. But we're finding that it's actually interesting to have more than two audio sources available, to create an enveloping experience. This, of course, was originally available in the movie theater. Does anybody remember the first movie that had support for surround sound? Fantasia. It was called Fantasound, and there was an array of speakers all around, all done in analog, and it actually all worked; it was kind of cool. And do you know whose idea it was? Leopold Stokowski. Anyway, now we can do this in the digital domain. So I've name-dropped Leopold Stokowski and Harold Bloom, and I'm only 30 minutes in; that's pretty good.
So naturally we want to support multiple channels, and we want to support higher-resolution audio, with the same richness I just mentioned for video. So let's step back over to demo one, and I will show you that you can in fact combine some of this rich audio, high-resolution, multi-channel audio, using QuickTime, the media integration architecture. Let's see, let me take some video. I'm going to take some video that's near and dear to my heart. This, by the way, is "Cherry Juice." Those of you who have the QuickTime 1.0 CD-ROM from 1991 may have seen "Cherry Juice"; it used one of the original video codecs that shipped with QuickTime 1.0, known familiarly as Road Pizza, and this video was produced laboriously by means of a series of still images. I'm going to take this very old video, and I'm going to combine it with some very new audio I happen to have. Well, I'm actually going to open it up in this QuickTime Player first. I have some sound effects here that are actually in surround sound, like the one playing for you now. This demonstrates one of the key things about support for multi-channel audio in QuickTime: did you know that QuickTime 6.6 automatically mixes down to the abilities of the output device that you happen to have attached? This is 5.1 audio; the room is only stereo. So I'm going to copy that segment of 5.1 audio that I happen to have, and I'm going to add it to my "Cherry Juice" video, and now we have 1991-vintage video and audio together.
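The automatic mixdown just demonstrated, folding 5.1 audio into the stereo output the room actually has, can be sketched as a small matrix mix. The coefficients below are a common textbook-style downmix (center and surrounds attenuated by about 3 dB, LFE dropped); they are illustrative only, not QuickTime's actual mixdown values.

```python
# Illustrative 5.1 -> stereo downmix, assumed coefficients: center
# and surround channels are folded into left/right at roughly -3 dB;
# the LFE channel is dropped. Not QuickTime's internal mix matrix.
import math

ATT = 1.0 / math.sqrt(2.0)  # about -3 dB

def downmix_51_to_stereo(l, r, c, lfe, ls, rs):
    """Fold six channel samples (L, R, C, LFE, Ls, Rs) into stereo."""
    left = l + ATT * c + ATT * ls
    right = r + ATT * c + ATT * rs
    return left, right

left, right = downmix_51_to_stereo(1.0, 0.0, 0.5, 0.2, 0.25, 0.0)
print(round(left, 3))   # 1.53
print(round(right, 3))  # 0.354
```

The point of QuickTime doing this automatically is that content authored for six speakers still plays sensibly on whatever output device happens to be attached.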
But should we stop there? I don't think so. I ran my Movie Mincer script earlier today; in another session I ran my Movie Mincer script on this very thing, and the same copy-and-paste, combine, integrate, complete digital media technology was able to produce this. Oh man, I hate it when that happens. Was able to, what did I say wrong?, was able to produce this. Now, in order to do that, I actually changed some of the parameters to my script: instead of copying two-second segments, I copied segments of one-fifth of a second, and I did 40 of them to make eight seconds of media. But the same thing works, that integration technology, that power that's available in QuickTime. When we do multi-channel audio, we do multi-channel audio, not just playback. Now, that was a demonstration of some very basic combination of audio and video together, but we thought it would be valuable for you to survey the current state of production tools. What if you really wanted to produce something, and you don't want Road Pizza video in 2004, that's not what you're interested in, you want H.264 video, and what you want is some full-screen, really cool things, such as the demonstrations of H.264 that you saw at the keynote on Monday? The question is: how do you do it? Well, fortunately, we have the answer to that question right here. Ladies and gentlemen, introducing the QuickTime Stupid Movie of 2004.
[Music]
They gave us three more tapes. I'm going to project what may be perceived as a lateness as a timeliness: I believe that video production requires a digital media architecture as rich as QuickTime to streamline this process, and now that we're rolling out support for these modern video formats and these modern audio formats, we hope to work with you, the tools developers and the hardware developers, to make this kind of production accessible to people like me. Thanks to those of you who prepared the videos that we showed at the conference, and to those of you who prepared the stupid movie; that was wonderful. Slides, please. What was I talking about?
High-resolution audio. OK, we had to introduce some APIs for high-resolution audio too — well, some of them because we wanted to. If you have multiple channels per sound track, or if you have sampling rates that are higher than our previous limitations — we can now go up to 192 kilohertz, and we can also go up to true 24-bit integer samples — we require a new sound description in order to describe those things. But it's only when you have sound with those greater parameters that the new sound description is necessary; if you don't, you can use the old data structures just as well. We have a new sound media handler that's capable of handling these new formats, and it's based squarely on Core Audio for everything that it does, for decoding and for output. We've introduced some APIs which finally rationalize the ability to get decompressed audio out of QuickTime movies; it's called Movie Audio Extraction, and this is the way we're recommending: if you need to get audio out of movies, use those APIs. We're using them ourselves when we export, in our movie exporters that can re-encode movies to other formats. And we also have new support in the Sequence Grabber — which is QuickTime's module for capturing media from external devices — for high-resolution, multi-channel audio; it's called the SGAudioChannel.
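[Editor's sketch] The extraction flow described here — open a session on a movie, pull decoded PCM into an AudioBufferList, close the session — looks roughly like this in the C API, using the names these calls later shipped under in the QuickTime headers; the Tiger-preview spellings may have differed, so treat this as a sketch rather than the definitive interface:

```c
#include <QuickTime/QuickTime.h>  /* Movie Toolbox + Movie Audio Extraction */

/* Pull decompressed, mixed PCM out of an already-open movie. */
static OSStatus ExtractSomeAudio(Movie movie)
{
    MovieAudioExtractionRef session = NULL;
    OSStatus err = MovieAudioExtractionBegin(movie, 0, &session);
    if (err != noErr) return err;

    /* Ask what format the extractor will hand us (channels, rate, ...). */
    AudioStreamBasicDescription asbd;
    err = MovieAudioExtractionGetProperty(session,
              kQTPropertyClass_MovieAudioExtraction_Audio,
              kQTMovieAudioExtractionAudioPropertyID_AudioStreamBasicDescription,
              sizeof(asbd), &asbd, NULL);

    if (err == noErr) {
        UInt32 frames = 4096;
        void  *pcm    = malloc(frames * asbd.mBytesPerFrame);

        AudioBufferList bufferList;
        bufferList.mNumberBuffers = 1;
        bufferList.mBuffers[0].mNumberChannels = asbd.mChannelsPerFrame;
        bufferList.mBuffers[0].mDataByteSize   = frames * asbd.mBytesPerFrame;
        bufferList.mBuffers[0].mData           = pcm;

        UInt32 flags = 0;
        /* Fills the buffer with decoded audio; flags reports "extraction complete". */
        err = MovieAudioExtractionFillBuffer(session, &frames, &bufferList, &flags);
        free(pcm);
    }

    MovieAudioExtractionEnd(session);
    return err;
}
```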
All of those things will be demonstrated and explained in session 213, which is tomorrow at three-thirty, and I think that you really want to go to this one, because I understand that there's going to be a live performance — multiple percussionists, a narrator, and a wandering troupe of minstrels — and they're going to be recording multi-channel while they're performing live. So please go; there's no extra charge for admission. Also, there's information about the new APIs in the very same document I continue to reference, the QuickTime 6.6 Developer Preview; the section is Audio Enhancements. All right, so we talked about video, we talked about graphics processing, we talked about audio.
We've noticed that things have been happening to make your lives easier in developing applications. One of them, as was said: you don't call WaitNextEvent anymore. (Thank you — a long time in coming, wasn't it?) But we've also noticed that you are gaining in your ability to develop rapidly by using the leverage of application frameworks. Gee, it's kind of great that we notice this stuff — we do usually have our heads down producing great video and audio like that, and it may have taken a few extra years, but now we're aware that you're using high-level modules in Cocoa, and also in Carbon, to implement your applications. And what we are doing in Tiger is supplying modules that fit with those frameworks, to manage most of your common QuickTime needs. For you Carbon developers, we're introducing in Tiger the HIMovieView. HIView is an object-oriented view system for use by Carbon-based applications; HIMovieView allows QuickTime media to play as a first-class citizen in that view framework. (I'm sorry — I have to go around to the door in the front.) Let me show you HIMovieView in action.
Now, there's something interesting about the demos that I give in this session. This is sort of a session that's sandwiched in between the sessions at the beginning of the week — the ones that officers of the company deliver, people with titles, people whose pictures you see in trade publications and even in Newsweek — and the sessions that come towards the end of the week, which are very highly technical, given by people who really know what they're talking about. If my demos are too flashy, they get moved to the beginning of the week, and if my demos are too technical, they get moved to the end of the week. So I have to do demos that nobody else wants, such as introducing an element of randomness into copy and paste. This demo partakes somewhat of this and that; it's a very elementary introduction to the technology that I just mentioned. The predecessor to the HIMovieView was the carbon movie control that is used by the Carbon-based application iTunes to display videos in the Music Store. So if I go to the Music Store — let's see, I will go to the Bob Dylan page, and I will go to the main place for that artist — I know that the Music Store has video available of a performance by Bob Dylan. iTunes uses the carbon movie control, the predecessor to the HIMovieView, to display its video, and it supports the ability to display video that's downloading from the web. (Could you please get off AirPort if you're on? Check your mail later.) And it can do everything that the Finder can do when it displays a movie as well. The Finder is using — not the predecessor carbon movie control — in Tiger, the Finder is using HIMovieView to display previews of QuickTime media, right here in the Finder. You can see in iTunes, too, support for scrolling a video back and forth; we can clip, we can revisit any moment and add cool stuff. (Good song.) The Finder, using HIMovieView, also supports all that stuff as well: I can get a large preview, it will resize, I can scroll. The HIMovieView can be used in several different contexts within the same application. So for example, if I do a Get Info — where'd they move it? — now, if I do a Get Info on a media file, the Finder also can display a preview inside of the preview pane, and that can also be a playable video as well. (In this particular case, unfortunately, it's not.) This also uses the HIMovieView, in line with these other user interface elements. If you have a Carbon-based app, you're using a view system; HIMovieView supports rich composition — movies can be composed with other elements. Back to slides.
Not just Carbon, but also Cocoa. Oh, by the way — there will be more information about the HIMovieView in tomorrow's session 215, that's Improved Video Rendering and Playback, and several original demos of HIMovieView will be available there.
But as I mentioned, not only Carbon but Cocoa too: we are introducing in Tiger the Cocoa QT Kit, and we are making available to you Cocoa developers all of the richness of the manipulations that you can perform on media, in a form that should be very familiar to you — using all of the same constructs and all of the same programming techniques that you have learned to develop your Cocoa applications. Cocoa, of course, already has basic QuickTime support in NSMovie and NSMovieView; with the Cocoa QT Kit we're going quite a bit further beyond that. We have classified QuickTime behaviors and introduced Cocoa classes based on those behaviors, which make manipulations of movies, tracks, and media available to you in Objective-C, using the Cocoa framework. And of course we have a class for your GUI, the QTMovieView, that's very rich — it goes quite a bit beyond what you could do in NSMovieView to date. Now, Cocoa not only supports modern applications with graphical user interfaces, it also supports other types of modules as well: bundles, plugins, command-line tools. But don't take my word for it — let's look.
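[Editor's sketch] As a rough picture of how those classes fit together in code — this is written against QT Kit as it eventually shipped, so names like `movieWithFile:error:` and `QTMovieDidEndNotification` are best-effort reconstructions, not quotes from the Tiger preview:

```objc
#import <QTKit/QTKit.h>

// Minimal QT Kit usage: open a movie, hand it to a view, watch for the end.
- (void)showMovieAtPath:(NSString *)path inView:(QTMovieView *)movieView
{
    NSError *error = nil;
    QTMovie *movie = [QTMovie movieWithFile:path error:&error];
    if (!movie) { NSLog(@"open failed: %@", error); return; }

    [movieView setMovie:movie];   // QTMovieView renders with a standard controller
    [movieView play:self];

    // The same notification pattern the playlist demo below uses to advance
    // to the next item when playback finishes.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(movieDidEnd:)
                                                 name:QTMovieDidEndNotification
                                               object:movie];
}
```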
Now, when I went through Cocoa training — when I learned how to write an application in Cocoa — several of you might have had exactly the same experience I had: I learned with the Expenses application. Does this ring a bell for anyone? The Expenses application is the application that you build if you're working through the Learning Cocoa book from O'Reilly, or if you go to any number of training sessions that show you how to do basic application development in Cocoa. This is a pretty cool app. It allows me to keep track of my expenses — like, let's see, when I crossed the Golden Gate Bridge today, that cost me five bucks; and when I parked in the Mission Street garage, I guess that's going to be about 16 bucks. Well, that's kind of good. I can also attach notes — textual notes; let's see, "I'm on level six" — to each of my expenses. So that's nice. I can browse my expenses, it'll keep a running total, and I can save this document to edit it later. The NSTableView displaying this set of expenses, and the NSTextView which displays these textual notes, are subviews of an NSSplitView, which allows me to do this kind of cool stuff here — I can resize the different subviews if I want. When I look at that app as a multimedia developer, I don't see an Expenses app; I see a playlist editor.
So what I did was, I took the Expenses application exactly as I completed it when I completed my Cocoa training — which I did almost as well as I did in 5th grade — and I made from that Expenses app, using the same constructs, in fact the same lines of code, a playlist editor. I added a little bit to it, and I changed it around a little bit so I can add multiple movies at a time. So I'm going to add some items to my playlist, and they show up in the upper pane — which is, once again, an NSTableView — and I've renamed my categories to file name, display name (which may be different — that's a narrative prolepsis), and duration, displayed using an NSFormatter to show the number in seconds with two decimal places. I could have written a better NSFormatter to show minutes and seconds, but it was Monday afternoon — what do you want? Now, the view below has changed from an NSTextView to a QTMovieView, and when I select any one of these items in my playlist, the movie that represents the playable media for the file that that playlist item refers to is displayed in the QTMovieView below, and the NSSplitView works exactly as you would expect. Gee, maybe it'll even work if I play it — so I can do the resizing while I'm playing: resizing of the window, resizing of the view. I also did a fancy little thing: I'm listening for the notification that comes from the class QTMovie that tells me when the movie is done, so that I can listen for that notification in my document controller and tell the NSTableView above to display the next item in the list, and the QTMovieView below to display the next movie. Now, this NSSlider thing, this control on the bottom — that's actually functional. It's not just displaying the current time of the movie; I can also click on different parts of it to navigate through the movie, and I can drag it, live, to navigate through it as well. Well, how much code is here? There are places on the web that allow you to download the version of Expenses that's similar to the one that I showed you; there's very, very little additional code here beyond what's there. We will make this — unfortunately it's not available in your preview of Tiger, but we'll make the code for this available in the coming weeks via the Apple Developer Connection. You'll see that all of the same things that allow you to do rapid application development in Cocoa, you can now do with QT Kit, using QuickTime.
But it's not just about applications that have a GUI; it's also about other types of executable modules as well. With QT Kit you can develop command-line tools that you can use in batch processing for media production. And because it's very difficult to type at this time of day, I have my canned command lines right here. A member of our team developed a command-line tool called recompressor that takes several command-line parameters: you tell it, well, here's the movie exporter that I want to use — in this case 3GPP — here's the source movie that I want you to recompress, and here's the destination movie that I want you to recompress to.
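[Editor's sketch] The invocation being run is along these lines — the tool name comes from the talk, but the flag spellings here are guesses for illustration, since recompressor was an internal demo and not something that shipped:

```shell
# Hypothetical: re-encode a DV clip to 3GPP with a QT Kit command-line tool.
./recompressor --exporter 3gpp \
               --source   clip.dv \
               --dest     clip.3gp
```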
So if I execute this line in Terminal — I told you there was going to be a Terminal demo; you saw the graphic earlier on; I didn't lie — it's going to recompress this DV video clip to 3GPP, and it's now available for me. I can send it out to my 3G handset, and it'll play something like this, using MPEG-4 video and AMR audio.
Thank you. So, I wish I could tell you more about QT Kit in this session, but this is the omnibus QuickTime presentation — back to slides, please. There are more details tomorrow morning at nine o'clock; if you can be here at five o'clock on a Wednesday, you can be here at nine o'clock on a Thursday. New Directions in Using QuickTime with Cocoa — go to that session; there will be lots of great stuff, and I promise the demos there will be very, very rich, because you can do really great, real-life production stuff with these tools that we're making available to you in Tiger. The section of your What's New in QuickTime 6.6 document that covers this material is called QT Kit Framework for QuickTime. We have other documents that are available as well — a reference and a tutorial — that you'll have to come to the session tomorrow to get the pointers to.

OK, so you've seen — I've convinced you now: QuickTime is a cross-platform media architecture. I demoed only the things that are available in Mac OS X, and focused on the things that are new in Tiger, but the same media services for playback, editing, creation, and so forth are also available on Windows — all the Win32 series of releases. By means of QuickTime you can simplify your use of all these variegated media container formats, video and audio formats, external devices for capture, and so forth. But have we done enough for you? Our answer was no, we haven't. We want to take it another step further, because we realize that people don't just produce media — they classify it as well.
How much did you pay to come to this conference? Remember to get your money's worth. People attach classifying information to media in the course of production — in order to identify media that's going to be used in later steps in production, or to identify media when it's delivered to the end user — and they tag this media, or classify it, with several different formats of metadata. There are as many formats for metadata as there are formats for media itself, and maybe more; several of you in the back of the room might be developing your own formats even now. We've noticed that, for example, for MP3 files, there are many more formats for tagging than there are players in the world. What is going on? You can tag your MP3 files with ID3 version 1, ID3 version 2, Lyrics3, Lyrics1, APE — I don't know what these things mean; it's just confusing. But what we want to do is to make sense of all these metadata formats, because hey, we're QuickTime — we are the media integration architecture, and we can do it. So what we are doing in Tiger is introducing APIs that give you access to media metadata. You can be concerned about the specific metadata format if you need to be, or you can simply leave the details of the format up to us and get the information out that you're interested in.

How does it work? When you open a movie, you ask QuickTime for a reference to metadata attached either to the movie, to a specific track, or to a specific media. You can tell us if you're interested in a specific format — such as iTunes Music Store metadata, or classic QuickTime user data — or you can just say: give me a reference to whatever it's got. Then, once you have that reference, you can iterate through the items in that metadata by means of keys that are specific to a metadata format, or a set of keys that we've defined that we intend to work in common among all the metadata formats that we support. So if you want to get, for example, the display name of the media, you merely have to say: give me a reference to the metadata; find me the display name, please.
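[Editor's sketch] In the C API — using the names these calls later shipped under in QuickTime 7; the preview-era spellings may have differed — that "find me the display name" request looks roughly like this:

```c
#include <QuickTime/QuickTime.h>

/* Fetch a display name from whatever metadata container the movie carries. */
static void PrintDisplayName(Movie movie)
{
    QTMetaDataRef metaData = NULL;
    if (QTCopyMovieMetaData(movie, &metaData) != noErr) return;

    /* A "common key": QuickTime maps it onto user data, iTunes tags, etc. */
    OSType key = kQTMetaDataCommonKeyDisplayName;
    QTMetaDataItem item = kQTMetaDataItemUninitialized;

    if (QTMetaDataGetNextItem(metaData,
                              kQTMetaDataStorageFormatWildcard,  /* any backing format */
                              kQTMetaDataItemUninitialized,      /* start from the top */
                              kQTMetaDataKeyFormatCommon,
                              (const UInt8 *)&key, sizeof(key),
                              &item) == noErr) {
        ByteCount size = 0;
        QTMetaDataGetItemValue(metaData, item, NULL, 0, &size);  /* size first */
        UInt8 *value = malloc(size);
        QTMetaDataGetItemValue(metaData, item, value, size, &size);
        /* value now holds the name; its text encoding depends on the source. */
        free(value);
    }
    QTMetaDataRelease(metaData);
}
```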
And give it to me. And whether that display name is stored in classic QuickTime user data, in iTunes Music Store metadata, or in new QuickTime metadata — a new format we've defined to overcome the limitations of classic user data — we will find it for you, and we will deliver it. And I have a demonstration of that right over here. (Do I? Do I have a demonstration of it?) I'm going to back up over here — on demo 2, please.
All right. So what I want to show you here is that it's possible now, for the very first time, for QuickTime Player to display — and this demo, it's not the most impressive demo that you'll see at the show, but it's probably the nearest and dearest to my heart, so I'm going to make sure that it works. All right — when it's Wednesday and the beach ball spins, where do you go, what do you do? Yes, it's the late night at the piano bar. OK, here we go on demo 1. I've opened up an MP3 file which contains an ID3 version 2 tag that includes the name — the display name — of the file, and that display name is stored as UTF-16; it's Unicode. And for the very first time, QuickTime Player can extract that name from the metadata in the format ID3 version 2 and display it in the title bar of the window. And what is QuickTime Player doing in order to do that? All it's doing — a preview of tomorrow's session — is calling the method QTMovie displayName, and underneath the covers the QT Kit is calling upon the QuickTime metadata services that I described to fetch that piece of information. It doesn't worry about the format of the metadata; it just says, give me what I want, and it delivers it.
But we can go a little bit further than that as well with metadata — and in fact I have, and I did it in the application that I showed you earlier, my simple playlist editor. I complicated it just a little bit by adding the following feature. This is a Cocoa application, so what I did was extend the Objective-C class QTMovie by implementing a category on that class, with the following method: artwork. What I want to do when I instantiate a QTMovie is to fetch the artwork out of that movie, wherever it may be, and I implemented that method, artwork, by using the QuickTime metadata APIs that I described earlier. What I then do with that artwork is, I actually attach it to the movie as a video track.
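[Editor's sketch] A category of the kind described might be shaped like this — the talk doesn't show the actual code, so the metadata lookup is stood in for by a hypothetical helper, and the method names are illustrative only:

```objc
#import <QTKit/QTKit.h>

// Sketch of the category described in the talk. The metadata fetch is a
// hypothetical helper; the talk doesn't say which artwork key it queried.
@interface QTMovie (Artwork)
- (NSImage *)artwork;
@end

@implementation QTMovie (Artwork)
- (NSImage *)artwork
{
    // Hypothetical: pull the cover-art bytes via the QuickTime metadata
    // services (QTCopyMovieMetaData / QTMetaDataGetItemValue in the C API).
    NSData *artData = [self copyArtworkDataFromMetaData];  // assumed helper
    if (!artData) return nil;
    return [[[NSImage alloc] initWithData:artData] autorelease];
}
@end
```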
This application is highly visually oriented — what if I want to play some music files? If I only preview these very same music files for you in the Finder — if I click on one of them — you'll see the Finder will display the standard QuickTime movie controller, because the only media in there is sound media. But in my application, which is taking advantage of support for QuickTime metadata, if I select these audio-only files — which nevertheless contain cover art in their metadata — I can do the following trick: as I described earlier, I can actually display the cover art as a video track, in parallel with the audio. So if I go through these music files from the iTunes Music Store, I can display the cover art as part of my movie. You can do stuff like this too. We know that there are many metadata formats that you will be interested in — as part of your production tools, or as part of delivery if you're displaying media — and what we want to hear from you is: what metadata formats interest you? What kind of stuff do you want to read in the course of preparing media, or delivering media and presenting it to the user, or making it available in the production process? As I mentioned, in Tiger we're going to be supporting QuickTime user data; QuickTime metadata, a new format that overcomes its limitations; and iTunes Music Store metadata. This MP3 stuff I did by way of demonstration — we're not actually planning to roll out support for MP3 cover art in Tiger, unless you all email us and say that you can't live without it. Where to obtain more information about QuickTime metadata?
I should have done this much earlier in the session: the document What's New in QuickTime 6.6 does contain information about the QuickTime metadata API. We will be posting sample code as the release process moves along towards the eventual delivery of Tiger, and anyway, you've come to the right place to find all the information that will be covered in the show's sessions about QuickTime metadata.
Now let's see — so here I have 11 minutes; I don't have to talk fast yet, I'm OK, don't worry. But I do have more to say, and that is: we've shown you that QuickTime is a digital media architecture, an integration architecture, that gives you all these great services, and we're extending it — new media formats, support for new stuff, metadata, not just video. But you don't have to wait for us to extend the QuickTime media architecture, because in fact the architecture is itself modular and extensible, and the very same coding techniques that we use to add new support to QuickTime, you can use too. We haven't made this point at developer conferences over the last couple of years, and I wanted to be sure that we mentioned it this year — several of you are new to the platform and new to QuickTime. You don't have to email Apple and say, you know, if only you supported my media container format, I'd like you a lot; or, if only you implemented the following video codec, you would be my friend. You can do it yourself, because QuickTime defines the component APIs that allow you to implement those things, and sample code for this stuff is available at the Apple Developer Connection websites, on the sample code pages.
Many different component types are available; I have an example of a movie import component. Now, one of my favorite places in the file system — I hate to talk faster, but I can't pause any longer — one of my favorite places in the file system is (I think I spun up the drive; I clicked on the wrong one) right here, in /Library/QuickTime. There's cool stuff in there. What's in this particular directory, my /Library/QuickTime, is my modified MP3 import component that implements all that cool ID3 stuff that I just showed you — that's how I extended QuickTime for the purposes of my demo. I will also extend it slightly further. Now, you know that on the platform we do have support for decoding AAC audio; we rolled it out with our MPEG-4 support in QuickTime 6. However, we have not to date had support for the container format .aac — ADTS, I believe it is — a format that stores AAC audio packets; several encoders produce a stream of AAC audio in this format. If I were to drag this AAC file on top of Keynote, however, it would sadly reject it. If only I had a QuickTime movie import component that knew how to read the audio packets from that container format and create a QuickTime movie... Oh wait — I did that. I implemented an import audio file component, and those of you who are familiar with Core Audio know exactly how I did this. Core Audio — in fact, a service on Mac OS X — already knows how to parse these .aac files; I didn't actually have to write any hard code to do this. All I had to do was make use of Core Audio's ability to do that, and to make a movie out of the AAC file. So all I need to do here is to move that cool thing — there it is — into /Library/QuickTime, and then I do have to quit Keynote and relaunch it, so that Keynote and QuickTime can match up together again — Keynote can register all of the movie import components that are available in the system. But now that I've added that simple component to /Library/QuickTime, if I now drag the very same AAC file onto Keynote, I get a plus sign. (And my dental work.) But wait, there's more.
Now, QuickTime uses what's known as the Core Services Component Manager in order to load and to invoke these extensible modules — these modules that do movie import and video and image decompression, all that stuff; we call them components. But there's nothing really fancy going on in the runtime: QuickTime uses the Component Manager on both Mac OS X and on Windows, but the Component Manager merely uses the native dynamic library loading mechanism that's available on the platform, and the format of dynamically loadable libraries that's defined by the platform. So the question is: can I implement these extensibility modules for QuickTime in Objective-C? And the answer is: of course you can — now. There were some problems involving language runtimes and stuff that I don't understand too well that made it dangerous to do this with previous releases. The good news is that we've ironed all of this out, and starting with Tiger you can deploy movie import components of the sort that I just demonstrated — or any of the other types of components that you saw listed earlier — implemented in Objective-C, taking advantage of all of the great stuff in the Cocoa frameworks. Let me show you what I mean, because I don't just say: I'll actually launch Xcode. I've got five minutes and 40 seconds left; I can launch Xcode.
OK, so the component that I chose to implement here — once again, it was Saturday; I didn't really have a lot of time to implement support for a media format, so I stole support that existed somewhere else. You may already be aware that in AppKit there's a class called NSBitmapImageRep, and this class knows how to write a number of still image formats. One of the still image formats that it knows how to write is GIF — that's a still image format, by the way, that QuickTime doesn't know how to write. So all that I did — this is literally all that I did; we'll post the sample code shortly — is define a class for a graphics exporter, and I defined the following methods in it, with an image rep. So what's going to happen here is that an instance of this class is going to be presented with an image rep; then the method that's going to be invoked is export; and the information about the image is going to be — by whatever means necessary — turned into the format that this exporter knows how to write, and it returns that exported image in an instance of NSData. And all of the other work — writing the file out, or writing to wherever the thing has to go for the purposes of however it's being used in QuickTime — is taken care of at a higher level. And this is all the code that I needed to write in Objective-C in order to implement this exporter.
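[Editor's sketch] The shape of that class, as described, might look like this — the class and method names are illustrative guesses (the sample code hadn't been posted yet), but `representationUsingType:properties:` with `NSGIFFileType` is the real AppKit call that does the GIF encoding:

```objc
#import <AppKit/AppKit.h>

// Sketch of the Objective-C graphics exporter described in the talk.
// A higher-level glue layer hands us an NSBitmapImageRep and writes the
// NSData we return to wherever QuickTime wants it to go.
@interface ExportGIFComponent : NSObject
{
    NSBitmapImageRep *imageRep;
}
- (id)initWithImageRep:(NSBitmapImageRep *)rep;
- (NSData *)export;   // encode into the format this exporter owns: GIF
@end

@implementation ExportGIFComponent
- (id)initWithImageRep:(NSBitmapImageRep *)rep
{
    if ((self = [super init]))
        imageRep = [rep retain];
    return self;
}

- (NSData *)export
{
    // AppKit does the hard work: NSBitmapImageRep already knows how to
    // encode GIF, which QuickTime itself doesn't.
    return [imageRep representationUsingType:NSGIFFileType properties:nil];
}

- (void)dealloc
{
    [imageRep release];
    [super dealloc];
}
@end
```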
This still image exporter, which is in Secret Sauce — or actually it got moved — ExportGIFComponent: move that, once again, to /Library/QuickTime, and this time, when I launch QuickTime Player, it's going to gain the ability to write GIF files. What I'll do is open one of my favorite movies once again, and I will export, using QuickTime Player, this movie to a sequence of still images. Now, you see that we have this support for exporting video — or any visual media — to a sequence of images, and we have several presets:
it knows how to do JPEG, and BMP — what's that, some competitor's image format? But what I want to do is export to GIF, and only after my graphics exporter that supports GIF was installed was this ability added to the media architecture. What I want to do is export this movie at two frames per second, and I'm going to put them in my Pictures directory, and here I go, writing frames of this movie to GIF files. And sure enough, if I open one of them at random, there it is: a GIF file, in 256 glorious colors. I think that's the first time in a while that GIF has gotten a round of applause. All right, we have reached the end. I have tried as hard as I could to jump around, to make use of narrative prolepses and analepses; I even tried to define for you poetic metalepsis. But now we're done.
Our summary is that QuickTime offers high-level integration of media services — playback, capture, editing, cool stuff you want to use in your applications. We are continually extending the reach of QuickTime, to new formats and to new services as well, but you don't have to wait for us: you can extend it as well. What else are we doing? I hope, by means of demonstrating to you — both here in this session and in other sessions in the show — QuickTime's use of Core Audio, QuickTime's use of Core Video, QuickTime's use of Core Image, to show that this also marks a major milestone for QuickTime: we no longer consider QuickTime the only piece of code on the platform that's going to process digital media. It's a great set of services that integrates a lot of wonderful stuff that you're going to want to use, but you — with your intelligence and your need to process digital media in applications — are going to want to use QuickTime in combination with other services, that you provide or that are available on the platform. QuickTime and Core Audio; QuickTime and Core Video — get the video frames out and spin them using OpenGL; get the audio frames out by means of Movie Audio Extraction and do what you like with Core Audio. That is our model for the future: rich, high-level services at the integration level, and powerful, specific services at lower levels. And you'll see us doing more and more of this as time goes on. That is the best narrative prolepsis of all, and I will end there. Thank you very much.
[Applause]