WWDC2004 Session 722

Transcript

Kind: captions
Language: en
Good morning, everybody. I'm Glen Bullock on the QuickTime team. Thank you for getting up and getting in here; there's definitely a worm for these early birds. I'd like to remind you all: tomorrow morning at nine o'clock, Session 724, QuickTime in the Professional Media Workflow, is a tremendous session; we've been working on it quite a bit.

I designed today's session so that developers and engineers could get a sense of what's going on in Hollywood, in the motion picture industry, in post-production, and get a sense of sort of the bloom that's happening. Clearly, personal computers have had significant impact in print, and then in graphics and television, but over the last six, seven, eight years they've gone into motion picture creation quite a bit, and so we're really lucky to have these two speakers. Anton is at Technicolor, with technical services there, and Scott Simmons runs a company down in Los Angeles as well that's doing some incredible work. We'll hold questions to the end, and we'll start immediately so we can get through this material. I hope you enjoy it.
My name is Anton Linacre. I'm the Technical Operations Supervisor at Technicolor Creative Services in Hollywood. What we're going to talk about today is how Final Cut Pro and QuickTime are revolutionising the HD workflow in Hollywood. With this seminar we'll go over how high definition works, what the workflow is for it, and how to work with a 23.98 project. In the motion picture industry we're basically talking about high definition as 24p, or what some people call 23.98, so that's what we're going to be dealing with in this particular seminar. We're going to be talking about how to offline cut your show and then go into an online in Final Cut Pro, and how to think differently about high definition. What you're not going to learn in this session is how to code anything; I'm a video guy, I'm sorry.
So, a little bit about myself: I give seminars at Technicolor to directors, producers, cinematographers, and editors, talking about how to shoot and edit high-definition footage for eventual film-out purposes or for finishing in high definition. In the process of doing these seminars, inevitably I talk about the traditional HD workflow. People have been doing HD for a while, and they've been limited to a certain extent by the technologies that have been available; high-definition equipment is very expensive. People used to use Avids almost exclusively, and they set up their whole workflow based on the film model. That means having standard-definition down-conversion tapes: you have your high-definition media, but because you don't have a D5 deck or an HD deck in your edit suite, you have a facility down-convert it for you to a standard-definition NTSC cassette. In that process you take 23.98 media and have to somehow get it to 29.97 NTSC, and to do that you actually introduce something called 3:2 pulldown, where you take the fields of the video and repeat them in a 3-2 pattern so that you add the six frames that are missing between that 24 and 30. (I'm going to use 24 and 30 interchangeably here with 23.98 and 29.97; they're in essence the same thing.) So you have this 3:2 pulldown that's added to your tapes for the standard-definition workflow, and then they basically take those standard-definition tapes and either work in 29.97, or remove the 3:2 and work in a 23.98 workflow, and that's a couple of steps in a workflow right there.
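The 3:2 cadence he describes can be sketched in a few lines of Python: each group of four 24p frames is spread across ten video fields (2 + 3 + 2 + 3), which is exactly how 4 film frames become 5 interlaced frames and 24 fps becomes 30. This is an illustrative sketch, not production code:

```python
def add_32_pulldown(frames):
    """Spread 24p frames across NTSC fields using the 2:3:2:3 cadence.

    Each group of four film frames (A, B, C, D) becomes ten fields:
    A gets 2 fields, B gets 3, C gets 2, D gets 3 -- so four frames
    become five interlaced frames, turning 24 fps into 30 fps
    (or 23.98 into 29.97).
    """
    cadence = [2, 3, 2, 3]  # fields contributed by each frame in a group of 4
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields

# 24 film frames become 60 fields, i.e. 30 interlaced frames
fields = add_32_pulldown(list(range(24)))
print(len(fields))        # 60
print(len(fields) // 2)   # 30
```

Removing the pulldown for a 23.98 workflow is the inverse: detect the cadence and discard the repeated fields.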
You have the EDL, the edit decision list; we'll talk a little bit more about that shortly, but you have to somehow get the information from your offline cut, what you've basically been editing for four or five months until your feature is finished, to the online room. The online room is where you do your mastering, where you take that standard-definition project and make it into a high-definition finished master that you can air or broadcast, or take out to film, whatever your project entails. And then another thing about standard-definition down conversions is that you have a change in aspect ratio. Your high-definition tapes are widescreen, they're sixteen by nine, yet when you do a down conversion to a standard-definition tape you're working in four by three, normal television size. You can work it a couple of ways: you can do a squeezed standard-definition tape, where if you look at it on a normal monitor people are tall and skinny, and then it gets unsqueezed to do a widescreen; or you have a center extraction, where you only take the center information and cut off the sides; or you can do a letterbox extraction, where you shrink everything down and you have the black on the top and the bottom. These particular methods have ramifications for when you go to your mastering, which we'll talk about shortly.
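The three framings he lists are just arithmetic. A quick sketch in Python, using square-pixel numbers for simplicity (the 720 by 540 SD raster here is an illustration only; real NTSC uses non-square pixels):

```python
# Down-converting 16:9 HD into a 4:3 SD raster, three ways (square-pixel
# arithmetic for illustration; real NTSC uses non-square pixels).
HD_W, HD_H = 1920, 1080
SD_W, SD_H = 720, 540   # a 4:3 frame, square-pixel for illustration

# 1. Anamorphic squeeze: the full 16:9 image is squeezed to fill 4:3,
#    so people look tall and skinny until it is unsqueezed on playback.
squeeze = (SD_W, SD_H)

# 2. Center extraction: keep full height, crop the sides to 4:3.
crop_w = int(HD_H * 4 / 3)            # 1440 of the 1920 columns survive
lost_per_side = (HD_W - crop_w) // 2  # 240 columns cut off each side

# 3. Letterbox: shrink the whole 16:9 image to fit the 4:3 width,
#    leaving black bars above and below.
lb_h = int(SD_W * 9 / 16)             # 405 active lines
bar = (SD_H - lb_h) // 2              # black bar height, top and bottom

print(lost_per_side, lb_h, bar)       # 240 405 67
```

The center extraction numbers show why the light stand, or an actor's exit, can vanish: a full 240 columns are thrown away on each side.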
So, in talking and giving these seminars, and explaining the traditional workflow for the two hundred and fortieth time, something occurred to me: the traditional HD workflow is hard, and it's unnecessarily so. The entire idea that you work at a different timecode rate than what your master tapes are, that you have to change the aspect ratio, all these particular parts of the workflow make it unnecessarily hard. So I end up having people asking me, why do we have to work with these 29.97 cassettes when our masters are 23.98? Why are we dealing with 4 by 3 when it's actually 16 by 9 that we shot? And then you have the EDL, the edit decision list, which is basically 30-year-old technology. Ironically, it's still something we use today, and a lot of facilities depend on it, because it has been that kind of Rosetta stone that we use to talk between editing products. But an EDL is just a text file. It's a text file that has timecode values for your source tapes, timecode values for your master tape, and little descriptions, but it's all text, and that can be pretty limiting. Depending on the format that you have, you either have two levels of video and four levels of audio, as one example; some have two levels of video and two levels of audio. Titles and effects transitions are only partially supported: you get a little note saying that you have a dissolve there of a certain type, but you don't know if you have an iris dissolve or a fade-up or something like that; that's not actually in there. And motion effects, where you take your video and blow it up slightly or move it around, are not supported at all.
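An EDL really is just text. Here is a single CMX 3600-style event and a minimal parse of it; the field layout is the standard one, but the reel name and timecode values are invented for illustration:

```python
# A single CMX 3600-style EDL event: event number, source reel, track (V),
# transition (C = cut), then source in/out and record in/out timecodes.
# The reel name and timecode values here are invented for illustration.
event = "001  TAPE01   V  C        01:02:03:04 01:02:05:00 01:00:00:00 01:00:01:20"

fields = event.split()
edit = {
    "event":      fields[0],
    "reel":       fields[1],
    "track":      fields[2],
    "transition": fields[3],
    "src_in":     fields[4],
    "src_out":    fields[5],
    "rec_in":     fields[6],
    "rec_out":    fields[7],
}
print(edit["reel"], edit["src_in"])  # TAPE01 01:02:03:04
```

Notice how little survives: a dissolve gets a one-line note at best, and a motion effect or a title has no representation at all, which is exactly why a project file that carries effects, titles, and speed changes is such a step up.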
If you're working on modern equipment, modern edit systems, you have a lot more control. Take a Final Cut Pro project, for example: you have 99 video tracks, you have 99 audio tracks, and if you take a Final Cut Pro project from one system into another, all of the special effects, the transitions, the titles, the speed changes, all of those things come across. So if you're working in a Final Cut Pro environment and you want to make an EDL, you actually have to hold yourself back; you have to hold yourself back and edit in a way so that the EDL will work nicely, so that it will play nice with whatever system you end up going to for your online. So basically I came to this little conclusion that there should be a better way to do high definition, and so with the help of the QuickTime team, the Final Cut Pro team, and hardware support from AJA and Pinnacle, we've come up with a different way to do high definition. I think it's actually a better workflow, and better for me means simple, because I don't want it to be a complicated thing. This particular workflow is used now at Warner Brothers and Twentieth Century Fox and, where HD is really prevalent, in the independent community; we have several independent filmmakers that are using it, and Showtime is going to be doing three shows, starting in about two weeks for their next season, using this particular system.
Before I go too much into what the actual system is, I'll tell you a little bit about the basics of HD and why this change, this paradigm shift, is so important. With HD, pretty much everybody knows the reason why you want to work with it: you have great visual quality. If you have an NTSC signal you're working with 720 by 486 lines of resolution if you're talking Digi Beta, or 480 if you're DV. Compare that with a high-definition signal, where you're at 1920 by 1080, so you have drastically more pixels and lines of resolution to work with, and the picture is so much nicer. But you also have the fact that NTSC is an interlaced format, so it draws a single frame with two fields of video that's clocked to our electrical system. We've got 60 Hertz electrical, and so we have 60 fields that come together to make the 30 frames of video (29.97, but that's kind of an old story). So you have these interlaced fields that come together, and what ends up happening is that the first field draws half of the information, and then the second field comes in like a zipper and completes the picture for you. With high definition, if you're talking about 24p, the P is for progressive, and you have a true whole image, from top to bottom, drawn at once, so it's more film-like in that regard, and that's the whole appeal of working in 24p HD. That
quality comes at a price, and the price is data rate. Depending on what kind of format you're doing, you're talking anywhere from 90 megabytes a second to 160 megabytes a second, so that's a lot of data to get on and off your drives. This is not the type of thing you can do off of FireWire; it's not the type of thing you can do off of a single IDE drive. At our facility, to get a speed like this, we use an Xserve RAID, maxed out, 14 drives striped together in a RAID 50, to get the data rate fast enough to handle HD. And not only that, since the data rate is so high, you use up a lot of hard drive space very quickly. So take a look at this difference: you have DV (it's rounded up), which is a compressed SD signal, the native format as it's recorded on tape; then you have SD 10-bit; if you take a Digi Beta tape and do a 10-bit uncompressed capture, you're working at 27 megabytes a second. When you make the jump to HD, uncompressed HD actually jumps quite a bit: you have 126 megabytes a second for 1080p, and then for 1080i 29.97 you're talking 160 megabytes a second, so you're chewing through hard drive space. Uncompressed HD is not really very useful or practical for offline editing, because if you're a traditional feature motion picture in Hollywood, you'll shoot about 60 hours, 70 hours of footage, and at uncompressed HD data rates you're talking somewhere on the lines of 60 to 70 terabytes of storage that you need for five, six months while your editor and your director are whittling down that 60 hours to an hour and a half. That's not a practical thing to keep track of, and it's very expensive, because you need to buy Xserve RAIDs or large SCSI arrays.
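The uncompressed figures he quotes fall straight out of the frame geometry. A sketch, assuming 4:2:2 sampling at 10 bits (roughly 20 bits per pixel on average); the small differences from the quoted 126, 160, and 27 come from blanking, bit packing, and how you define a megabyte:

```python
def uncompressed_rate_mb_s(width, height, fps, bits_per_pixel=20):
    """Data rate in MB/s for uncompressed 4:2:2 10-bit video.

    4:2:2 at 10 bits averages about 20 bits per pixel (10 for luma
    plus 10 for chroma shared across two pixels).
    """
    bytes_per_frame = width * height * bits_per_pixel / 8
    return bytes_per_frame * fps / 1_000_000

print(round(uncompressed_rate_mb_s(1920, 1080, 24)))     # ~124 MB/s for 1080p24
print(round(uncompressed_rate_mb_s(1920, 1080, 29.97)))  # ~155 MB/s for 1080i29.97
print(round(uncompressed_rate_mb_s(720, 486, 29.97)))    # ~26 MB/s for 10-bit SD

# At rates like these, a feature's worth of dailies runs to tens of
# terabytes, which is why offline at uncompressed HD is impractical.
storage_tb = uncompressed_rate_mb_s(1920, 1080, 24) * 3600 * 60 / 1_000_000
```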
So working in uncompressed HD is not practical at all, and because of that, the traditional workflow had this down-conversion mentality: since working in uncompressed HD isn't practical, and there was no other solution available before, down conversions were the only way to go, but you have all the hang-ups that come along with that standard-definition workflow. At Technicolor we were thinking that we wanted to stay digital. We didn't want to go down to a Beta SP, we didn't want to go down to DVCAM; we wanted to give them QuickTime movies delivered on FireWire, and those QuickTime movies are direct descendants of the HD. We'll take D5 HD or HDCAM, bring it in, and transcode it on the fly to an offline format that the company requests. That offline format can be Photo JPEG, which is very good quality at a very low data rate, very nice for offline editing, particularly if you have a lot of hours of footage; we can go to DV, and we'll do it at 16 by 9 DV; we'll go DV50; or, since about two or three months ago, we can go to DVCPRO HD, with the new Final Cut Pro HD version. We'll talk a little bit more about that and show why that's an interesting workflow. So with the Final Cut Pro HD
media, we are giving high-quality QuickTime movies. The timecode that you see in your Final Cut Pro system corresponds exactly to the master tapes, so there's no difference between the timecodes you're dealing with, and when you go to do your mastering, all the timecode fields line up and it simplifies things. You don't have to change the lists or go through any kind of conversion process; in Cinema Tools right now we have something where you can go from a 29.97 list to a 23.98 list, and you don't need to do any of that. The timecode comes across exactly. The data rate is low: it can be as low as one fiftieth of the original data rate, in particular with the Photo JPEG. And it preserves the 16 by 9 aspect ratio. What's nice about that is that you see everything that you have on the HD tape. You can see if there's a light stand in the shot on this side, whereas if you do a four-by-three extraction you might have missed that. You can time things so that they happen properly: when someone walks out of frame, you know exactly when they leave, whereas with a four-by-three center extraction they would seem to leave the frame far earlier. So the first show that we did
this with was a Warner Brothers independent movie called Around the Bend, which is coming out, I believe, in August. It was shot on 35 millimeter, four-perf, and we telecined it, meaning transferred it, to high definition, to HD D5; we picked D5 because it's a 10-bit medium. We then captured it on the fly from the D5 and went to half HD res, so the resolution they were working at was greater than standard definition: it was 960 by 540, a perfect 16 by 9. The average data rate was two megabytes a second, so we went from 126 megabytes to 2, and that 2 megabytes is less than DV, by the way; it's not quite half, but it's close. They edited the whole show on Final Cut Pro 4.1, so it is possible to use this workflow in 4.1; right now we are on Final Cut Pro HD, but that's where this was a couple of months back. To view their footage, they had multiple options. When they were editing in Albuquerque, they had a DVI projector hooked up to their edit system, so when they looked at their footage they actually went through the DVI and projected onto a 20-foot screen; they actually edited the movie and it looked like they were in the theater, which is very nice. They were also able to go out to their 23-inch Cinema Display, which was quite nice, high resolution, and the look is far better than what people traditionally have worked with in an offline situation. Traditionally, offline media looks very blocky, very low-res; this was far better. They could see focus, and very good color saturation. So with that
particular project, we've now finished it. By the way, we also gave them a Cinema Tools database, which has all the keykode information of their film, so we're able to do a keykode cut list. When we went to online for their preview screenings, they did HD preview screenings; I don't know if many of you are familiar with Los Angeles, but there's a fantastic theater at the Grove in the Fairfax District, and they had preview screenings there in HD. To do the preview screenings, the editors gave us a FireWire hard drive. On that FireWire hard drive we had a project file with each of their reels media-managed, so that all of the excess media was clipped off, and when we go in to digitize at high definition, we're only capturing the stuff that we need. The timecode of that media was exactly as it was on the HD tapes. We edited this on a Final Cut Pro HD system with an AJA Kona card at 10-bit uncompressed, so it would take in from the D5s exactly the clips that it needed. It then also took in all of the transitions, filters, color effects, and speed changes; even titles came in perfectly, placed in exactly the proper space on the screen. We also had them give us a QuickTime movie of each reel, and that QuickTime movie had a timecode burn of the original D5 burned into it, and we then did a picture-in-picture while we were editing. So you see right here, we have one screenshot from the movie where you can see that we have exactly the right spot visually. If anything should go wrong (nothing did, by the way), but if there was a discrepancy, for example, sometimes we have productions where they shoot and they have two tape ones and didn't name them differently, and so maybe we put in the wrong tape one and digitized from that, we would have seen a difference in the top right-hand corner, and we would have been able to see: okay, the timecode that we captured was right, but perhaps the tape was wrong, and so we'd be able to go in and address the problem. So if we switch to number two, I'll show a little example of having the picture-in-picture, and we can have it playing, so you can see each and every edit comes in properly. This is some San Francisco footage that the QuickTime people gave me.
And so this is a really nice way to just go through, and you know that your edit is proper. You can also go through and just step through each shot, go to the edit points, and go back and check, so for making sure that each and every edit is correct, it's a very nice way to work. Previously, what you would have is a chase cassette: a Beta SP, very low-res, that you would run in parallel and watch on two screens to see if everything matched up. This was nice because it let us see on the edit machine right away if we had any problems. So I'll just advance a little bit here... okay, we can go back to slides. In terms of a workflow, it was very streamlined. They hardly ever had to make tapes, and when they needed to make a tape, it was quite easy. They bought no extra hardware, by the way; they just used a dual G5 and a Cinema Display. They did have an extra drive, but they had no video cards, nothing. So in order to make tapes, what they did was take the Photo JPEG sequence and dump it into a DV sequence; it resized automatically as a letterbox image; they did a render, which was actually faster than real time; and with Final Cut Pro 4, when you play out a 23.98 sequence, it adds in the 3:2 again as necessary to go out to tape when you're on DV. So they were able to make cassettes for their sound editors, for their negative cutters, all that, quite easily.
So in terms of this workflow, how could we make it better? Well, one thing that's really nice and makes it better is that with the introduction of Final Cut Pro HD we now have access to DVCPRO HD, and DVCPRO HD is really revolutionary in terms of how we work with HD. It has the same impact as DV had four or five, six years ago, because before DV was introduced, editing video was really hard, and you would always have kind of so-so quality: you would have a little capture card with RCA inputs, you'd capture in, you'd get your little QuickTime movie, and it was low-res and all that. Then DV came along: you had FireWire, you were able to plug that into your computer, you didn't need a video card or anything like that, and you could edit in iMovie, Final Cut Pro, or Final Cut Express, and that changed a lot of things. You now had something that looked pretty much like Beta SP quality, which had been a professional video product, and you had easy accessibility to it; you could have it on your internal IDE drives and all that. With DVCPRO HD you have the same thing. DVCPRO HD allows you to edit in true HD; this is not half-resolution HD, it's true HD, but you don't have to get hit with the uncompressed HD data rate. Basically, at 24p, DVCPRO HD runs at five point eight megabytes a second, compared to uncompressed 10-bit, which runs at 126, so you see a massive data rate difference between the uncompressed and the DVCPRO HD. The reason for that is that DVCPRO HD is the same format the camera writes to tape, and the heads on the DVCPRO HD camera can't write 126 megabytes a second, so they had to find a way to kind of shoehorn it down to a more manageable data rate. In its normal state you have this DV100, where it's a hundred megabits (that's the small b, not the big B: megabits), and of that we then only take 24 frames. It's set up for 60; we take 24, so the other 36 frames get locked off, and we save all that data.
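The arithmetic behind the 24-of-60 saving is simple; a sketch follows. Note that the pure-video number lands at 5.0 MB/s, a bit under the 5.8 MB/s he quotes; the gap is presumably audio and format overhead on top of the nominal 100 Mbit video payload, which is an assumption on my part:

```python
# DVCPRO HD records a nominal 100 Mbit/s of video ("small b" megabits).
# In 24p mode only 24 of the 60 recorded frames carry unique images,
# so only that fraction needs to be captured and stored.
video_mbit_s = 100
unique = 24 / 60

rate_mbit_s = video_mbit_s * unique  # 40 Mbit/s of unique video
rate_mb_s = rate_mbit_s / 8          # 5.0 MB/s, before audio/overhead
print(rate_mb_s)  # 5.0 -- the quoted 5.8 MB/s presumably adds audio/overhead
```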
So if you take a look at the chart of the differences between the HD formats, DVCPRO HD at 24p is only slightly more than DV. Now, for an offline format, to traditionalists, having an offline format that's five point eight megabytes a second is still high, okay, that's granted, because you are still going to use quite a bit of hard drive space, but the advantages are so great. You're now editing in complete HD res; this is not some fake HD, it's real HD. You have real-time effects. You have HD playback through a Cinema Display or a DVI projector. Okay, if we go back to the computer (I've got a little bit of black before this), what I was showing before was actually the DVCPRO HD, and with this format I'm actually playing off of a FireWire drive, a little LaCie bus-powered drive, so it really changes the rules for HD. I didn't pull out an Xserve RAID; it's just a tiny little FireWire drive. And we can go back to the slides again. If you have that ability, you can go out with a DVI projector, project onto a 20-foot screen, and edit in perfect HD. You can lay back to the new Panasonic 1200A; it has FireWire in, so you don't need any special cards, video cards or anything; you literally plug a FireWire 400 into it and you can lay back your tape with the same efficiency as you do DV currently. You can use Compressor to make standard-def DVDs, which is very nice; they're pristine in quality. And the minimum system requirement is one PowerBook or iBook that has a gigahertz processor and a gig of RAM; that's it. I was editing HD flying back from NAB at 10,000 feet on Southwest, on battery power; that's really cool. And so that's the future of HD editing, now. So that's it for me.
[Applause]
Can everybody hear me? I guess so. Good morning, I'm Scott Simmons, the visual effects supervisor for Livewire Productions; we do feature films and large-format films as well. I'm just going to poll the audience real quick: who has heard of the term digital intermediate? Okay, there's a few. This is Roar: Lions of the Kalahari, which is an IMAX feature that we worked on last year; it's been released this year and it's doing very well. To give you a little bit of background: it's a giant-screen film produced by National Geographic with acclaimed director Tim Liversedge, a very well known wildlife documentary filmmaker. It was shot on location in Botswana, which is where the Kalahari is, or part of it. Nine months of post-production; that's not all the DI process, that's also music and the final part of it. The digital work, the DI work, was done on desktop systems; these weren't put through very expensive proprietary black boxes. So
what is a digital intermediate? As a producer asked me, why isn't it called digital advanced? Well, it's an intermediate step, a middle step, before you go to the release print. It means that every frame of the picture has been touched by the computer: things that used to be done optically or in the lab, such as dissolves or color timing and whatnot, are all being done on the computer now, and basically what you end up with is a new color negative, which gets output to an answer print; music gets added, obviously. Previous IMAX pictures (everybody's heard of IMAX movies, I'm sure) were conversions; they weren't done, you know, from frame one to the end credits as a digital intermediate; they're conversions of other pictures that were finished ahead of time. So this is pretty cool, because this is the first time that something this large has been put through this kind of pipeline. So let's talk about the workflow a little bit. The film was scanned by Imagica; Imagica is a large-format film service bureau. Each frame was 4096 by 3112
resolution, so just call it 4K, and each frame is a 10-bit Cineon log file. For those of you that don't know what Cineon is: Cineon is a format that Kodak developed that basically describes the gamut, the luminance and color range, of motion picture negative. Motion picture film has a much deeper and higher range than video does, so this format was important to start with. We got those in, and each frame was 50 megabytes; that's each frame, so you can imagine, by the end, I think for a portion of the show there was about 3 terabytes worth of data. We get that data, the scans, into a database, so we know where they are, where they came from, and what the lengths of the shots are, which was really important. And now here's the interesting part: we took those really huge files (obviously you've got a throughput issue when each frame is 50 megabytes), and what we did is convert them to 8-bit log QuickTime, so there's a logarithmic conversion of the 10-bit Cineons to the QuickTime file format.
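A 10-bit-log-to-8-bit step like the one he describes can be sketched using the published Cineon conventions (code 95 for black, 685 for 90% white, 0.002 printing density per code value, 0.6 negative gamma). This is a minimal per-code-value illustration, certainly not the studio's actual transform:

```python
def cineon_to_8bit(code):
    """Convert a 10-bit Cineon log code value to an 8-bit level.

    Uses the conventional Cineon reference points: code 685 is 90%
    white, code 95 is black, 0.002 printing density per code value,
    0.6 negative gamma. A simplified sketch of the real transform.
    """
    to_linear = lambda c: 10 ** ((c - 685) * 0.002 / 0.6)
    black = to_linear(95)
    linear = (to_linear(code) - black) / (1.0 - black)
    linear = min(max(linear, 0.0), 1.0)   # clamp over/under-range codes
    return round(linear * 255)

print(cineon_to_8bit(95), cineon_to_8bit(685))  # 0 255
```

The point of doing it logarithmically is exactly what he says: the 8 bits you keep are the best subsample of the 10, spent where film density actually carries picture information.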
Obviously, the QuickTime file format gives us the ability to scrub through images; we can output proxies and get things reviewed fairly quickly, as opposed to a sequence of Cineon frames, which you can't really scrub through very easily. Then we had to look at the frames, clean them and degrain them, getting them to look a little bit better, and then we examined the shots to see if the pipeline we were putting them through was actually going to benefit them. In all but four cases we were using the QuickTime movies. So imagine that: we're going from a file format, the QuickTime movie standard, that's traditionally used in video or even high-def, to something that's much, much bigger, something that's projected on a 70-by-a-hundred-foot screen, without artifacts. We were able to do that with QuickTime because we've used it with feature films; we know it well from working with it, and we know how to get the best results out of it. So the QuickTime movie is basically a very good subsample of those Cineon files; it's the best of those 10 bits, now in 8. After examining it, we then had to do the color grade and effects (we'll get into that a little bit more); we'd render to FireWire drives, and the FireWire drives would get shuttled off to CFI, which is the film recording lab (it's now part of Technicolor), and we'd get those back and project them in dailies. The dailies would go to an IMAX theater, so you got to see an IMAX movie in an IMAX theater, as it looks. And before we really got into doing takes, or final shots, we did some tests: we would do Cineon-versus-QuickTime tests, comparing what we were doing to the motion picture against the original scan, and we knew right away we were on the right track; everything was looking really great. So there are a lot
of challenges in doing a giant-screen picture. Obviously you're dealing with throughput, you're dealing with file sizes, you're dealing with labs, possibly, that aren't used to working with QuickTime. We had to do color grading, and color grading is more than the tint knob on your television set: color grading is fixing the shots where the shots need to be fixed, and then creating a look for those shots that makes them look warm or cool, or that has something to do with telling the story, time-of-day kind of things. Then we added a lot of visual effects, believe it or not, and we'll get into the technology too. A challenge on this particular motion picture is that the director didn't know it was going to be an IMAX feature. We thought it might be, so he started shooting 70-millimeter, which is a traditional format for IMAX, and then realized: man, these cameras are heavy, these cameras are loud, and I can't go within 20 feet of the lions with a 70-millimeter camera, because they just basically don't like it; they will move away. So he ended up shooting, oh, 75 to 80 percent of the motion picture on 35 millimeter.
So there's a problem right there: how do we get the 35 millimeter to look like the 70 millimeter, because it's going to end up in IMAX? He also shot with different film stocks; don't ask me what they were, he doesn't know, and we couldn't figure it out, so we had to kind of balance things and make things come to a center. Then we had to conform a motion picture that was not necessarily shot for an IMAX screen to the IMAX format. And because of the filming conditions, a lot of what he did was shooting in the middle of the day in the summer, and summer in the Kalahari can get up to 150 degrees; half his film was sitting in a cooler, basically, and he just grabbed whatever he could get. If he was running out of film, a TA would go get some more, and who knows what it was, and stick it in there. So he's not shooting in the best of conditions, and there were a few cases where we had to do some restoration work, because the film had been damaged, or the three layers, the red, green, and blue layers of the emulsion, were actually starting to separate. Fortunately, that didn't happen too much. So let's get into
the challenges. 35-millimeter film is four perforations tall and runs up and down through the projector; let's say that's a 35-millimeter still. And at 70 millimeter, that's a big difference. I like that a lot. Let's go back: that's 16 times the area of 35 millimeter. That is a 50-megabyte frame versus a 5-megabyte frame; 4 to 5 megabytes is usually what 2K is. And we have to make that look like that; that's a big challenge. So how do we do something like that? First of all, we have to reduce the 35-millimeter grain, and we have to identify what's grain versus picture. We don't want to get rid of grain and all of a sudden the lions' coats or the antelopes' coats just disappear; you don't want it to look like solid browns, you still have to have some texture. Then we also have to preserve edge detail, which is the real trick, because if you can determine what the edges are, then you're preserving the resolution of the image versus the noise or grain, as in this example.
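The grain-versus-detail idea he describes (smooth flat, grainy areas like sky, but leave strong edges like antlers alone) can be sketched as a contrast-gated average. This is a 1-D toy with an invented threshold; real degraining tools are far more sophisticated:

```python
def degrain_row(pixels, threshold=8):
    """Average each pixel with its neighbors only where local contrast
    is below a threshold: flat, grainy areas get smoothed, while strong
    edges (antlers against sky) are left untouched. A 1-D toy sketch of
    the idea, not a production degrainer; the threshold is invented.
    """
    out = list(pixels)
    for i in range(1, len(pixels) - 1):
        left, mid, right = pixels[i - 1], pixels[i], pixels[i + 1]
        if max(left, mid, right) - min(left, mid, right) < threshold:
            out[i] = (left + mid + right) // 3   # grain: smooth it
        # else: an edge -- preserve the original value
    return out

sky = [120, 123, 119, 122, 121, 200, 201]  # grainy sky, then a hard edge
print(degrain_row(sky))
```

The flat run of values gets pulled toward its local average, while the 121-to-200 jump survives intact, which is the whole trick: you want the sky to stop crawling without the antelope turning into solid brown.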
So here's an image, cut in very, very close, of one of the shots. You can see the grain, you can see the red, green, and blue, and believe it or not, this has been color corrected as a first step. What we had to do is determine what the antlers of the antelope are, and clear up that sky, because with a solid sky you're really going to see the grain. And this is what we ended up with: it's a much cleaner picture; you can see edge detail, you can see the antelope there. This one has actually been color corrected and color graded. The difference: color correction, in this example, is that if we got film that was sort of blue, because he was shooting without a color correction filter on the camera, we have to bring it to gray; we have to create what's called a gray-balanced look. That's important, because now we show it to the director and the producers and say, okay, this is what you shot, this is what you saw when you were on location, and they say, yes, it is. Then it's color graded as a second step, and color grading is basically, well, we want it to help you tell the story, so it doesn't look like Mutual of Omaha's Wild Kingdom, necessarily; we want it to look like a feature film, so everything is sweetened a little bit. Conforming is
another issue this is a what's called a
sacred master the sacred master
basically says this is the area you've
got to play with within your fill
so if you look at this this is a field
chart there's two things that area in
the red is credit safe so if you imagine
going to a movie like next door and you
watch the credit you're going up and
down or dissolving they're filling the
frame you can't do that with an iron act
picture because where that plus sign is
is the viewing center of viewing it's
not the middle it's the center of
viewing so the center of viewing in an
IMAX picture is about a third of the way
up so that's where your eyes are looking
and got all this extra space over your
head that just makes you feel like
you're there just sort of surrounds you
but where the audience is looking is
about through the way up the credits
obviously can't go all the way to the
top so we're looking for that sweet but
sweet spot in the center like I said
Like I said before, he's not necessarily shooting for that; he's not necessarily framing all of the lions, or whatever the action is, for that sweet spot. So the center of interest in an IMAX or large-format picture is not necessarily the center of the frame. So what do we do? Here's a couple of examples. On the left we have the guinea hens, and their focus spot is actually about halfway up; and in this helicopter shot, obviously, the center of interest is way up at the top. So what do we do? Let's switch to the demo and play it; there we go. For the guineas we have to lower the shot in the frame, and here's where the effects come in: we create a digital extension. We had to do that a lot. Same thing for the helicopter shot; in this case we're completely replacing the sky. Everything has to be tracked, so it becomes not just a reframing issue or a composition issue, it becomes a full-blown effect.
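The reframing arithmetic behind these fixes can be sketched. Assuming a viewing center about a third of the way up the screen, as he describes, this hypothetical helper computes how far to shift a shot so its center of interest lands there, and how many rows of digital extension the shift exposes. The frame height and interest positions in the usage are made-up values.

```python
def reframe_shift(frame_h, interest_y_from_top, viewing_center_from_bottom=1/3):
    """Rows to shift the image (positive = down) so the centre of
    interest lands on the viewing centre, which sits about a third
    of the way up the screen. A downward shift exposes that many
    blank rows at the top, which need a digital extension."""
    target_from_top = round(frame_h * (1 - viewing_center_from_bottom))
    shift = target_from_top - interest_y_from_top
    return shift, max(shift, 0)   # (shift, rows of extension to paint in)
```

For the guinea-hen case (interest about halfway up a hypothetical 3000-row frame), the shot drops 500 rows; for a helicopter shot with the interest near the top, it drops much further, which is why that one became a full sky replacement.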
Next: restoration. We have a lot of dirt and scratches; I think you can see a little bit of it up there. I'll go ahead and play the movie. This is a pretty good example, and again, this is just a little part of that image; the full frame is much, much bigger, probably eight times larger than the part that I'm showing you right now. Like I said, you have dust and scratches, you've got the emulsion starting to separate, and we have to correct all of that. This is part of the digital intermediate process: we're not just coloring it, we're not just correcting it, we're making the frames as perfect as possible, because that's what you're going to see. It's not happening in any other place; it's happening in the computers, and it's happening to fix these kinds of things.
And just for fun, when we had the wrap party for the crew, we had a little contest to see who could figure out how many pieces of dirt got cleaned; we had a little gift bag of stuff like that. The software actually keeps track of that, and because I'm sort of masochistic I wanted to know how many pieces of dirt we cleaned up: five hundred and eighty-three thousand pieces of dirt. Some of that, let's say a hundred thousand, was removed procedurally. Procedurally basically means we had the computer analyze the image and sort of fix it; it's not that simple, but that's what we did. The rest of it had to be done by hand: over 400,000 pieces of dirt had to be done by hand.
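A procedural dust-busting pass like the one he mentions can be sketched with a temporal median: a dirt speck usually lives on a single frame, so a pixel that differs sharply from the median of the same pixel in the previous, current, and next frames is likely dirt. This is a toy version under those assumptions; the threshold is illustrative, and real tools also have to avoid eating fast motion, which is exactly where the human eye still wins.

```python
def median3(a, b, c):
    """Middle value of three."""
    return sorted((a, b, c))[1]

def debust(prev, cur, nxt, thresh=40):
    """Temporal-median dust-bust sketch: when a pixel differs sharply
    from the median of its temporal neighbours, assume it is dirt and
    replace it with that median. Frames are 2D lists of gray values."""
    h, w = len(cur), len(cur[0])
    out = [row[:] for row in cur]
    for y in range(h):
        for x in range(w):
            m = median3(prev[y][x], cur[y][x], nxt[y][x])
            if abs(cur[y][x] - m) > thresh:   # looks like dirt, not motion
                out[y][x] = m
    return out
```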
And here's the reason it was done by hand: there is software, yeah, but it doesn't work like the human eye does. It'll make mistakes, and if you make a mistake at 4K it's pretty obvious; mistakes are going to be magnified pretty largely. There's also dust in the shot, there are also heat waves and mirages, there are distortions going on, so you have to use your eye to get rid of that stuff. And that was the major production part for us. Let's go back to the other part of color grading. Okay, part of color grading:
I put stabilizing in there because I'm not sure where else to put it. He's shooting on location in difficult situations; not every time is the camera locked down, and you've also got heat distortion going on in the image, so the image is moving around a lot. If you're flying the helicopter, the camera's strapped underneath it with bungee cords; sometimes the bungee cords aren't tight enough and you get some jitter going on in the frame. So that has to be stabilized, because you want the shot to look as rock-steady as possible. Color grading, like I said: we're balancing for gray first and then we're creating a look. Okay, here's a good example. This is a frame from the original scan, and it's a little hard to see, but it is dark, it is green, it is muddy. And this is what we did to it: we gray-balanced it, created the color look for it, and now you can tell it's a lioness and her cub.
Excuse me. We also output to film quite a lot, and National Geographic came in to see what we were doing. They saw the befores and afters and just gasped; they had thought, we can't salvage this, there's no way you could do it. The reason we could is that we started off with the 10-bit Cineon's full range of the film negative, and we selected the best part of that to work with in QuickTime. We also created color boards, color panels. This is basically what you have to do when you're working in a sequence: you have to create a color look that will work from shot to shot. You can't work on a single shot and say it looks good, the client says yeah, and you're signed off; it doesn't work like that. Every shot has to cut to one another. So in this case we're trying to preserve the color of the animals more so than the sky. The sky is pretty close, but if you turn around and look at the sky behind you, it's going to be a different color. What we have to nail down is what the texture and color of the animals look like. Then on top of that we do visual effects shot enhancements; you saw a little bit of that before, where we have to create extensions or whatnot. And we also did some all-CG shots. The first shot of the movie, which was a little disconcerting to us, was a complete CG shot of a star field. Star fields are really hard to do: you have the landscape and the stars above, and we're tilting down to the Southern Cross. Stars are hard to do because it's really dark, and if you're going to see compression at all, you would see it in something like that, but we worked it out so you don't see it. We did a lot of texture maps, and we have a cosmic zoom. Let's go back to the demo, please.
Here's another reason why digital media is great: this is an entire sequence, but it's one shot dissolving into the next, so all the color grading has to work from one shot to the next. And here's our cosmic zoom; again, the center of interest is down towards the middle. This looks amazing on a hundred-foot screen; it feels like you're flying in for a landing. And speaking of large, we had to get satellite imagery for that part of the sequence, and we ended up with texture maps 60,000 pixels wide. That's the Okavango. We're orienting ourselves with the night sky; look at that sky, that sky is all digital. We do a match dissolve into another portion of the satellite image, and we fly past the salt pans and land right at the watering holes. Let's go back to the slides, please.
Here's another special effect. If you notice, the lion is facing right; don't know why, but the lion is supposed to be facing left. The lion wasn't cooperating, so we flopped the image, and life is good. But this is right after the opening shot of the stars, so it should be dawn, and it's not dawn; it should be dawn, so let's combine it with something like that, let's put them both together. Let's go back to the demo. That's a cool shot, I like that one. So again, this is part of color grading, and it's an effect that's also part of color grading; it's all part of the digital intermediate process. You've got a director who says, I want a shot that I couldn't get, and so we created it. Let's go back to the slides, please.
So let's talk about technology a little bit, some surprising things we learned on this show in particular. We've done IMAX feature work before, we've even done stereo work before, so we weren't really surprised that QuickTime could hold up; we've done slit-scan stereographic work, and if you're going to see compression, you're going to see the 8-by-8 grid of typical QuickTime compression in something like that, or in something like the star field shot. But we were surprised by some simple things. The Apple 17-inch Studio Displays were the best monitors we've ever used; they blew everything away, don't know why. They were better than the Cinema Displays we were using, don't know why, but what we saw on those little 17-inch displays, we saw on the IMAX screen. So we're looking at a 17-inch display, they're looking at a 70-foot-tall screen, and they matched. Obviously we did a lookup table, a hardware lookup table, for the monitors so that they all stayed consistent; and also kudos to CFI for making things look like they're supposed to, and our film outputs were pretty good.
FireWire drives: absolutely a necessity. We were rendering to the FireWire drives and checking them, and boom, they go out the door, go to the lab, they get recorded, they get filmed out. These large FireWire drives were just a godsend, because this show was about throughput.
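Some back-of-the-envelope numbers show why throughput dominated. Assuming Cineon-style frames (three 10-bit channels packed into one 32-bit word per pixel) at an illustrative 4K-ish resolution, the arithmetic sketches out like this:

```python
def frame_mb(width, height, bytes_per_pixel=4):
    """Uncompressed frame size in MB; Cineon packs 3 x 10-bit
    channels into one 32-bit word per pixel."""
    return width * height * bytes_per_pixel / 2**20

def realtime_mb_per_s(per_frame_mb, fps=24):
    """Sustained data rate needed to play the sequence in real time."""
    return per_frame_mb * fps

big = frame_mb(4096, 3072)        # roughly the "50 megabyte frame" mentioned
rate = realtime_mb_per_s(big)     # over a gigabyte per second for real time
```

A FireWire drive sustains on the order of tens of MB/s, nowhere near that rate, which is why real-time 4K needs the expensive hardware mentioned later, and why rendering to drives and walking them to the lab was the practical pipeline.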
Hardware: dual-processor machines. Software: After Effects and, believe it or not, Combustion. There were two other companies that worked on this as well, and they used Shake and procedural methods. I can't stress enough how important procedural methods are, because procedural methods make you take the tools that you have and come up with a new solution. Procedural methods are what we used to define what is grain versus texture, what is an edge, what is an antler, stuff like that. You have to work outside the tools: not just using one tool, not just using one layer, not just using one composition; you're using several that are analyzing and doing different things to create a whole. Now, can software be written that does this? Absolutely, and I'd love to see those advances. I'd love to see a plugin that would do all these steps, to find the grain, to find the edge detail, that kind of thing. But there's always going to be a place where you get to a shot and you see, like I said before, the emulsion starting to separate; those are pretty esoteric and bizarre things, and you have to use a procedural method to correct them.
Okay, the expense: obviously we're on desktop systems, so the expense wasn't astronomical, and that makes it pretty attractive to producers. If the client wanted us to do real-time 4K color grading, digital intermediate work, with the Cineon files you're talking about terabytes of throughput. Is it possible? Yeah, actually a couple of systems have been made that can do that. The cost for the hardware is around three or four hundred thousand, software is on top of that, and you start getting a very big bill very quickly. Input/output is critical, so we have to figure out how to move stuff around and work with different file formats. And then, as I mentioned before, understanding the medium: if you work smart, you can work faster, not just dealing with the problem.
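Part of understanding the medium is the file format itself. Cineon's 10-bit code values are logarithmic printing density, not linear light; a commonly cited conversion (white point 685, black point 95, 0.002 density units per code value, through a 0.6 negative gamma) looks roughly like this sketch:

```python
def cineon_to_linear(code, ref_white=685, ref_black=95, neg_gamma=0.6):
    """Sketch of the Cineon 10-bit log-to-linear conversion:
    0.002 density units per code value, white at code 685,
    black at code 95, through the negative's 0.6 gamma."""
    lin = 10 ** ((code - ref_white) * 0.002 / neg_gamma)
    black = 10 ** ((ref_black - ref_white) * 0.002 / neg_gamma)
    # normalise so ref_black maps to 0.0 and ref_white to 1.0
    return (lin - black) / (1 - black)
```

The point of "selecting the best part" of that range is that the log code values above 685 still carry highlight detail that a naive linear conversion would throw away.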
part of the digital intermediate issue
is who is owning it it's before it was
different steps
it's like labs doing this part color
timer is doing this part DP is doing
this part DP used to be using
traditional chemical methodology and
who's really owning it and that's a big
question right now because right now di
it's just a big shotgun okay we got
color if you do not do nothing but color
the film create a color look that don't
stabilize the shot don't dust but they
could but they're real-time 2k usually
playback and color correction while the
clients their clients not going to sit
there and watch you dust bust it's just
not going to happen but that's part of
it so the current technology that's out
there and million dollar suites is only
doing a half of what we had to do to
create a digital intermediate okay so
Okay, so the future, what we need: obviously, large files require a large pipeline. 4K is becoming a new standard in digital effects work; on Spider-Man, all the effects shots were done with 4K plates, not 2K. Right now the architecture with the best file throughput is a threaded architecture. That means if you're using Cineon scans, because you want to stay with the full dynamic range of the film, you're working with a sequence of frames; they are not a digital video format, they are not QuickTime. I would like to see QuickTime become threaded, which is basically why a sequence of frames is necessary: a sequence of frames can be addressed over the network in parallel, so the more power you throw at it, the more real-time effects and the more throughput you're getting. Possibly the new high-def standard that is emerging with QuickTime can skip around quite a bit; I would love to see it used in production in a parallel pipeline, so you really are getting low data-rate consumption with high throughput, and multiple effects and color grading applied to those things. That would be fantastic.
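The parallel-pipeline idea is easy to sketch: because a frame sequence is one independent file per frame, any number of workers (or machines) can each take a frame. This toy version uses an in-process thread pool and a stand-in per-frame operation; the "grade" here is just an invented brightness bump, not a real grading step.

```python
from concurrent.futures import ThreadPoolExecutor

def grade_frame(frame):
    """Stand-in per-frame operation (grain reduction, grading, ...).
    `frame` is a (frame_number, pixels) pair."""
    number, pixels = frame
    return number, [min(255, p + 10) for p in pixels]   # toy brightness bump

def grade_sequence(frames, workers=4):
    """Because each frame is an independent unit, workers can grab
    frames in parallel; sort by frame number to reassemble the shot."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sorted(pool.map(grade_frame, frames))
```

A real pipeline would spread frames across machines on the network rather than threads in one process, but the independence of the frames is what makes either possible.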
Standards: 3D color lookup tables. What's a 3D color lookup table? A color lookup table basically says, all right, this is the range of colors you can use to display an image; there are software and hardware lookup tables. A 3D color lookup table says that all the values are interrelated, and that's how film works. Getting it to a high-def finishing station is great, but if you really want to work with the data the way that film works, you have to work with a 3D lookup table. It just means that red, green, and blue affect each other: if you affect the brightness of red in a film 3D lookup table, something happens to green and blue, because that's how film emulsion works. Kodak has created a 3D lookup table fairly recently, but it's only available for Kodak stocks; there's no 3D lookup table that I know of for Fuji or other film stocks. Also, because it's new, it's not a standard; it's not being adopted by a lot of color grading systems yet. And then you're faced with arcane chemical technology, which is how I started: I started off in an optical lab, and I know about optical printers and keeping the right temperature of the film soups and all of that stuff. There are no direct correlations between traditional color correcting methods in film and digital.
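The cross-coupling he describes is exactly what trilinear interpolation in a 3D LUT gives you: every output channel is a function of all three inputs. A minimal sketch follows; a real film LUT would be a much larger cube (17^3 or bigger), and the 2x2x2 identity cube in the usage is only for illustration.

```python
def apply_3d_lut(lut, r, g, b):
    """Trilinear interpolation in a 3D LUT: each output channel depends
    on all three inputs, the way film's coupled emulsion layers do.
    `lut[i][j][k]` holds an (r, g, b) triple; inputs are in 0..1."""
    n = len(lut) - 1
    def lerp(a, b, t):
        return tuple(a[c] + (b[c] - a[c]) * t for c in range(3))
    fr, fg, fb = r * n, g * n, b * n
    i, j, k = min(int(fr), n - 1), min(int(fg), n - 1), min(int(fb), n - 1)
    tr, tg, tb = fr - i, fg - j, fb - k
    # interpolate along b, then g, then r
    c00 = lerp(lut[i][j][k],         lut[i][j][k + 1],         tb)
    c01 = lerp(lut[i][j + 1][k],     lut[i][j + 1][k + 1],     tb)
    c10 = lerp(lut[i + 1][j][k],     lut[i + 1][j][k + 1],     tb)
    c11 = lerp(lut[i + 1][j + 1][k], lut[i + 1][j + 1][k + 1], tb)
    c0 = lerp(c00, c01, tg)
    c1 = lerp(c10, c11, tg)
    return lerp(c0, c1, tr)
```

Because each lattice entry stores a full (r, g, b) output triple, a LUT built from film measurements can push green and blue when you brighten red, which a trio of independent 1D curves cannot do.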
Now, some software will say, we have printer lights, we can match what's going on in the lab. That's not necessarily true, because in traditional color correction, when you say you're adding 10 points of red, you're not actually cranking the red channel; that's not how film works. But most of the software that's available now says, well, the DP wants you to add red, so we'll add red to it. That's not correct; it's not how film works, and you're overdriving the image. Traditionally the DP says, I want more red, add a neutral density, do this or that, and you can add a whole list of criteria to his recipe and the film will still hold up; it'll still look good. But digitally you can seriously overdrive the image; if the look that's being asked for is pretty severe, you can overdrive the image.
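The overdrive problem can be illustrated with a toy model. Both functions below are invented for illustration, not how any real grading tool or lab printer works: a flat digital offset clips the highlights, while an exposure-style change through the negative's gamma with a soft shoulder compresses them instead.

```python
def add_red_offset(r, amount):
    """Naive digital 'add red': a flat offset that clips at 1.0."""
    return min(1.0, r + amount)

def printer_light_red(r, points, step=0.025, neg_gamma=0.6):
    """Printer-light-style red: an exposure change on the negative,
    i.e. a gain in linear light, with a toy soft-shoulder roll-off
    so highlights compress instead of clipping. The step per printer
    point is an illustrative value."""
    gain = 10 ** (points * step / neg_gamma)
    return r * gain / (1 + r * (gain - 1))
```

With a bright value like 0.9, the flat offset slams into 1.0 and the highlight detail is gone, while the exposure-style version stays just under 1.0; that headroom is the difference between film "holding up" and a digitally overdriven image.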
There are a lot of things that need to be correlated, film biases; how do we do that? There are many issues. So, to wrap up, there are a few new technologies that are desktop systems rather than black boxes that just do color work. Lustre, by Discreet, is kind of a black box: it's proprietary, it's a licensed technology, so do they really own it? It only runs on Windows XP. Lustre is sort of an all-in-one color grading system; you do your color grading and you get the result, you twist the dial and get a result, but you can't get in backwards and see what the dials were set to. FinalTouch is a new program, a couple of years old; it runs on Mac G5s with incredible throughput, and they're adding more and more tools to it. It renders in floating-point processing instead of integer, which is really important; again, you don't want to overdrive or clip the image. And SpeedGrade, which will run on a Mac and is currently running on PCs: SpeedGrade is interesting because what it does is create a script of what its color correction just did. Ostensibly that's being used to send out to other SpeedGrade systems, so they can say, I can help you render, I can help you make the color correction. But I think the real advantage comes once color standards are set: if scripts written by these color grading applications can be sent to other applications that do more effects kind of work, such as Shake, that would be fantastic. That way the effects people or the DP can decide where things happen: do we do effects first and then color grade, or vice versa? Which is really important, because if you're doing an effects shot on something that's been color graded and made really spooky and dark, well, you got what you got; you don't necessarily have tracking points anymore because the image is too dark. And I think that's it for me.
[Applause]