WWDC2003 Session 006
Transcript
Kind: captions
Language: en
I'm Brett Halle, I'm the director of
Pro Video engineering, and we're going to
spend a few minutes this afternoon and
talk about how you can be extending
the Pro applications with plugins this
is going to be a pretty busy session
we've got a lot of different plugins we
want to cover so we'll get right to it
very quickly, just so you have a bit of
context, we have a number of applications
in our pro application suite: Final Cut Pro 4,
which is, I think, almost two
weeks old (we launched a week ago last
Saturday), Shake 3, which launched last
weekend, Final Cut Express, which came
out earlier this year, and Logic, which is
an acquisition that we did last year
we're gonna cover a number of different
plug-in models that are supported by all
these applications and to kick us off
and to talk about Audio Units I'd like
to invite Roger Powell the lead engineer
for Final Cut audio. Thanks very much for
coming today we're going to start off
this session by talking about Audio
Units and how they're deployed in these
three Apple Pro media applications Final
Cut Pro, Soundtrack, and Logic. We're not
going to teach you how to write an audio
unit today for that you should go to the
developer site, developer.apple.com/audio,
if you're new to this technology.
each of these applications has
individual usage scenarios and
requirements for Audio Units
and we thought it would be beneficial to
you if we could go over those points so
that you could effectively deploy your
Audio Units in any or all of these
applications these points that we're
going to cover fall into three general
categories the first is the user
interface the parameter user interface
that's supported either custom or
generic the second is the input and
output channel configurations that your
plug-in should support in order to work
in these applications, and then there are
some
issues with Audio Unit property
implementations. Let's start off with
Final Cut Pro Final Cut Pro at this
stage puts up its own filter viewer for
your user interface for your plugin so
this is what we refer to as the
generic user interface view. A custom UI
is not supported at this time we do
realize the importance of this to
vendors and it's a very high priority
for us in a future release your
parameter descriptions type and range
must be complete to support our UI
controls you should publish all of your
parameters and provide complete accurate
descriptions so that our UI controls can
be built in an effective manner so that the
user can interact with your plug-in. Your
plugins will appear in an audio effects
folder with your manufacturer's name now
in order for this to work you do need to
follow the naming convention of
Manufacturer: Effect Name, so that we
can properly sort these and put these in
a folder with your manufacturer's name
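As a rough illustration of how a host might implement that grouping, here's a self-contained sketch; `splitEffectName` is a hypothetical helper, not an actual Final Cut Pro function:

```cpp
#include <string>
#include <utility>

// Split an effect's display name of the form "Manufacturer: Effect Name"
// into the folder (manufacturer) and the effect name shown inside it.
std::pair<std::string, std::string> splitEffectName(const std::string& full) {
    const std::string::size_type colon = full.find(':');
    if (colon == std::string::npos)
        return {"", full};  // no manufacturer prefix: leave it ungrouped
    std::string maker = full.substr(0, colon);
    std::string name  = full.substr(colon + 1);
    // trim surrounding spaces from both halves
    auto trim = [](std::string& s) {
        s.erase(0, s.find_first_not_of(' '));
        s.erase(s.find_last_not_of(' ') + 1);
    };
    trim(maker);
    trim(name);
    return {maker, name};
}
```

A name without the colon convention would end up with an empty manufacturer, which is why the convention matters for sorting.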
let's take a quick look at the filter
viewer for Final Cut Pro. Here you can
see on the left we have the controls
that we've generated by reading your
parameters, and then on the right is
what we call the keyframe interface;
these keyframes can be produced either
by recording gestures
during playback while you're using these
controls on the left, or they can be
edited manually with standard editing
procedures. Going back for a second:
Channel configurations in Final Cut Pro
at this time because of our clip based
and track bussing architecture we
support mono in and mono out you can
think of this more or less as an insert
effect type of model rather than as a
send return type of model it's okay if
you support other configurations in your
plugin but they must support mono in
and mono out in order for them to be
loaded into Final Cut Pro at this time
we're not supporting the audio unit
preset mechanism however users can make
changes to the plug-in and then save a
version of that using our favorites
mechanism.
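To make the insert-versus-send/return distinction concrete, here's a toy sketch; `halve`, `insertProcess`, and `sendReturnProcess` are hypothetical stand-ins for a real effect and host, not SDK functions:

```cpp
#include <vector>

// Toy mono effect standing in for real per-sample processing.
float halve(float s) { return s * 0.5f; }

// Insert model (what Final Cut Pro uses): the effect sits in the signal
// path, and its output replaces the source stream entirely.
std::vector<float> insertProcess(const std::vector<float>& in) {
    std::vector<float> out;
    out.reserve(in.size());
    for (float s : in) out.push_back(halve(s));  // output replaces input
    return out;
}

// Send/return model, for contrast: a wet copy is mixed back with the dry
// signal at some send level.
std::vector<float> sendReturnProcess(const std::vector<float>& in,
                                     float sendLevel) {
    std::vector<float> out;
    out.reserve(in.size());
    for (float s : in) out.push_back(s + sendLevel * halve(s));  // dry + wet
    return out;
}
```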
let's move ahead to another important
item for Final Cut Pro: an Audio Unit
property called tail time, and it's very important
to Final Cut Pro we sort of expect that
you're going to implement this or at
least understand the issues surrounding
it tail time can be thought of as the
decay of a plug-in a good example would
be a reverb: say you set it to
a two-second
decay, so when a signal hits it, it's
going to decay out for about two seconds;
this actually indicates the length of
the sample history to reach steady state
this is how we interpret this we use
this for a pre-roll computation so that
we can maintain sample-accurate seaming
across edits. The example of this, again,
is the reverb if we place the reverb on
a clip in the timeline and then we
render the first half of that clip we
then go back and start playing that clip
the first half of the clip will play
from the render file which has the
computed reverb effect and then suddenly
when we hit the crossover normally we
would start feeding source samples again
to the reverb well the reverb has to get
primed up again so this would produce an
inconsistent sample stream across that
crossover point in order to avoid this
we query the tail time property see how
many samples we have to pre-compute to
reach a steady state point at the
crossover and then when you play back
we've pre computed that and then we
start up with the samples that are
current at the crossover point so this
essentially eliminates the seaming
artifacts. We do expect this to be
implemented however there are cases
where you should actually report not
implemented one example of this would be
if you have an infinite tail time on a
reverb, which would last
forever;
clearly at that point we'll have to
render the entire clip in order to do
that because we can't pre-roll forever
or if the state can be indeterminate and
this is a little interesting but an
indeterminate plug-in would be something
like one where
there's an internal modulation from a
low frequency oscillator that was not
synchronous or if there's some other
random element in the plugin which would
cause its state to be indeterminate at a
given sample position these types of
plugins should report not implemented
and there's some gory details associated
with how you actually handle that which
I won't go into right now but we do have
experts from all of the applications up
here so you can grab one of us later and
we can give you more detail on that also
the latency property which is somewhat
similar in usage: if your plugin has
latency, that is, a delay between when the
input goes in and when samples come
out, we need to know that as well in
order to produce sample-accurate seaming.
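The pre-roll computation described here comes down to simple arithmetic; `preRollFrames` is a hypothetical helper, assuming tail time is reported in seconds and latency in frames:

```cpp
#include <cmath>

// How many frames a host must pre-compute before a crossover point so a
// time-based effect reaches steady state: the reported tail (in seconds)
// converted to frames, plus any processing latency the plug-in reports.
long preRollFrames(double tailSeconds, long latencyFrames, double sampleRate) {
    const long tailFrames =
        static_cast<long>(std::ceil(tailSeconds * sampleRate));
    return tailFrames + latencyFrames;
}
```

So a two-second reverb tail at 48 kHz means the host pre-rolls 96,000 frames of source audio through the effect before the crossover.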
That's it for Final Cut Pro; let's go
on to Soundtrack. Once again, and this is
a theme that you'll hear
throughout all of this, we want you to
produce complete and accurate parameter
descriptions, type, and range; this is so
that automation can be handled
accurately. Soundtrack actually supports
the audio unit custom UI and presets and
in order to do that you would click on
the advanced button in the effect panel
I have a screen shot of that as well on
the left you'll see the generic view
for the plug-in, and you
can see the generic controls, the
names and the controls, listed there; if you hit the
advanced button you'll get a screen like
what's on the right there
where it gets overlaid with the custom
UI that's been embedded in the plug-in
Channel configurations for Soundtrack are
stereo in and stereo out only; again,
you can put other channel
configurations in your plug-in, but they
must support stereo in and stereo out in
order to be used within Soundtrack.
Soundtrack also uses the tail time
property, in a slightly
different way: it's only required for
audio units that would have an audible
tail again something like a reverb and
the property value must be accurate to
avoid artifacts otherwise there'll be
some truncation of output the way this
is used is, if you have a reverb on a
clip and that clip
or track ends, you don't want the
reverb to cut off when the source signal
cuts off, so Soundtrack will query that
property and allow playback to continue
for the length of time required to
process all of the effect's samples.
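As a self-contained illustration of this tail-flushing behavior, here's a toy sketch in which a one-tap feedback delay stands in for a reverb-like effect; the host keeps feeding it silence after the clip ends rather than truncating the output. All names are illustrative, not real SDK calls:

```cpp
#include <cstddef>
#include <vector>

// Toy feedback delay standing in for an effect with an audible tail.
struct ToyDelay {
    std::vector<float> line;  // circular delay line
    std::size_t pos = 0;
    float feedback = 0.5f;
    explicit ToyDelay(std::size_t frames) : line(frames, 0.0f) {}
    float process(float in) {
        float out = line[pos];
        line[pos] = in + out * feedback;  // recirculate with decay
        pos = (pos + 1) % line.size();
        return out;
    }
};

// Host-style rendering: after the source clip ends, keep calling the
// effect with silent input for tailFrames more frames so the decay
// rings out instead of being cut off with the source.
std::vector<float> renderWithTail(const std::vector<float>& clip,
                                  std::size_t tailFrames) {
    ToyDelay fx(8);
    std::vector<float> out;
    out.reserve(clip.size() + tailFrames);
    for (float s : clip) out.push_back(fx.process(s));
    for (std::size_t i = 0; i < tailFrames; ++i)
        out.push_back(fx.process(0.0f));  // silence in, tail out
    return out;
}
```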
Moving on to Logic: the reference
hosts are Logic Platinum 5.5.1 or
higher and Logic Platinum 6.0.1 or
higher; this is where the Audio Unit
support was first introduced you should
put your resource allocations into the
initialization part of the plug-in
rather than the instantiation this is
because logic will instantiate one copy
of each plug-in on startup and so you
don't want to incur startup penalties
the custom UI is supported and the
generic parameter UI is also supported
parameter descriptions must be complete
and accurate we'll take a quick look at
the logic audio unit views the generic
views are on the left and the top right
and the custom UI is displayed in the
bottom-right. For Logic, the channel
configurations that are supported are
mono to mono, mono to stereo, and
stereo to stereo;
so this is a little more flexible than
the other two each of these applications
has you know a different purpose in life
and has different support for these
things there are a couple of other
points for Logic: Logic will use the
standard AU preset format for saving and
loading presets it also pays attention
to the latency and tail time properties
and then the last two points are
regarding automation: your plugin should
send the Audio Unit Carbon view
events, the mouse-down and
mouse-up-in-control messages, to
the host. The reason for this is mostly
to support automation recording for
latch-mode automation, and likewise,
for automation support
your plugin should use the parameter
listener scheme to inform the host of
parameter changes that basically covers
the points that we wanted to make for
these three applications we know it's
probably generated some questions, and
we'll all be available for Q&A at the end
of the session. At this point I would
like to hand things over to Angus
Taggart;
he's from Shake, and he's going to talk
about Shake plug-in development. Thank
you, Roger. Thanks again for joining us
this afternoon as Roger said I'm going
to be presenting an introduction to the
Shake plug-in architecture. Our goal is
to cover a brief introduction to
Shake itself and try to give you a feeling
for what the application does we're also
going to cover what you get with the
shake SDK and also look at some basic
concepts that are involved with building
a shake plugin so let's start out with
an introduction to Shake. Shake has very
rapidly established itself as an
industry-leading compositing and 2d
effects solution an invaluable tool in
many major post production houses so
what does shake do it does things such
as color correction filters grain
removal blurs all kinds of filtering it
ships with two of the industry-leading
color keyers, Photron's Primatte keyer
and the Computer Film Company's Keylight
keyer. It has tools for doing tracking,
masking, rotoscoping, paint, and also retiming
controls, and for those that are
interested there's a huge list of
Shake's capabilities up on the Apple
website. So how does Shake work? As a very
brief introduction: under the
hood there's an advanced, very efficient
node-based compositing engine. The
highlights of it is that the primary
processing within shake occurs within
nodes and that nodes are connected and
data gets in and out of nodes through
plugs and somebody who's using shake
will actually create very complex
compositing trees and effects trees by
connecting nodes up I've got a simple
example where you can see there's a
foreground and a background layer being
composited over each other and you can
see the node representation of this
operation in Shake. Something that's
very important to plugin developers is
who uses Shake: who's my audience going
to be when I build a plugin? Shake is
used at a lot of the top post
production facilities a list includes
Weta, the New Zealand-based company
that's doing the Lord of the Rings work;
ESC, based close by here in Alameda,
who's doing The Matrix; Cinesite;
DreamWorks; Blue Sky; it's pretty much the
A-list of post production houses, and a
lot of others. To its credit, it has been
used in the production pipeline for the
last six Academy Award-winning movies,
it's becoming more commonly used in
film schools, and with Apple's
aggressive pricing and positioning it's
being used more and more for commercial
work and that sort of thing.
who uses the Shake SDK in terms of
third-party commercial plug-in
developers, pretty much, you know, we've
got fairly strong industry support: The
Foundry develops for Shake with
Tinder and also with their Furnace
plug-in suite; also GenArts' Sapphire
plugins support Shake; and
we've also got support from Ultimatte and
RE:Vision Effects' warping and morphing
tools also another very big user of the
Shake SDK is actually the Shake customer,
the large post production houses that
have to integrate Shake into their
production pipelines and also to create
custom effects within shake to get the
kind of creative effects that they need
So, you're a plugin developer and you're
ready to start working with Shake: what
you'll need to do is request the
Shake SDK package through Apple
developer relations
and when you get ahold of that package
you'll see that it not only supports the
Mac platform but also provides
support for the IRIX, or SGI,
platform, as well as Linux. One of the
things that we've put a lot of effort
into over the last year is really
enhancing the Shake docs:
we've got great tutorials really good
reference guides some white paper kind
of things to give you some technical
overviews of Shake's node engine. It
also ships with 18 example plugins for
doing everything from image processing
filters custom overlays custom widgets
there's a lot of examples to help you
get started and we've got an example
Project Builder development environment.
the last point on this slide and this is
something that is both very powerful and
it's something that also requires a
little bit of a learning curve is that
there isn't a separate layer that
Shake provides for its plug-in
development environment. Basically, when
you start working with Shake you're
using the same headers and frameworks
that the Shake internal developers use,
and this means that you get the same
access to the node engine that the Shake
internal development team has and that's
a very cool thing it's a very powerful
thing it means that you can do just
about anything with shake that you want
it also means that you need to spend a
little bit of time making yourself
familiar with some of the concepts of
how the Shake node engine works, and so
what I'm going to do really quickly in
the final few slides is just hit on some
very basic concepts involved with shake
plug-in development. As we said earlier,
the basic building blocks of
Shake's node engine are
just that: nodes and plugs. In
review, a node is where processing
occurs; this is where you're gonna put
your logic for image processing or
whatever you're doing. It's going to be
embedded in the node, and the node is a
C++ class; there's a base
class defined that has a lot of
functionality that you'll derive from
and when you want to get data into your
node or data out of your node you're
gonna do that through a plug; that's the
way the data moves
through the Shake node engine, and
once again it's a C++ object. One
concept with plugs is that they don't
exist on their own:
they're owned by a node, and so,
within a node,
you'll add a plug that provides a
mechanism to get data either in or out
of your node; it has a base
class as well that provides a lot of
functionality so what are the mechanics
of the Shake node engine? How do you
actually produce your
result for Shake? Well, there's a
concept called lazy evaluation. Shake
really only cares about your output
plugs what you can produce it's not
going to ask you for any information
about your input plugs; only
when it comes time to render
and evaluate will it call your node
and ask for information
about your output plugs. At that
point, once a request comes in
for a value out of one of your output
plugs what you'll be doing then is
pulling information in from your input
plugs whether its input parameters that
users define or it could be data
actually coming in from a node that
you're connected to so you'll be pulling
on a plug getting data out of it
processing it and placing that value in
an output plug, and then Shake takes
that data and moves it down to the next
node, or, if it's
image data, it'll maybe even put it up in
a render view.
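The pull-based evaluation just described might be sketched like this; the `Plug` and `GainNode` classes here are illustrative inventions, not the real Shake SDK classes:

```cpp
#include <functional>

// Toy lazy ("pull") evaluation: nothing computes until someone pulls on an
// output plug; the node's eval then pulls its own inputs, which may be
// wired to upstream nodes' outputs.
struct Plug {
    std::function<float()> producer;          // set on output plugs
    Plug* source = nullptr;                   // set when an input is connected
    float constant = 0.0f;                    // parameter value if unconnected
    float pull() {
        if (producer) return producer();      // ask the owning node to eval
        if (source)   return source->pull();  // pull from the upstream node
        return constant;                      // plain parameter value
    }
};

struct GainNode {
    Plug input, gain, output;
    GainNode() {
        gain.constant = 1.0f;
        // the constructor "does a lot": wire the output plug to eval()
        output.producer = [this] { return eval(); };
    }
    // eval(): called only when someone pulls on our output plug
    float eval() { return input.pull() * gain.pull(); }
};
```

Connecting `b.input.source = &a.output` chains two nodes: pulling on `b`'s output transparently evaluates `a` first, which is the pull model in miniature.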
Let's take a look at the code
structure that supports this
calling mechanism. As I said before, a
Shake plug-in is basically a node; it's a
C++ class, in this case
deriving from the NRiNode base
class, and there are three primary methods
that you will be working with
extensively. First of all, even though
it's probably not considered
best practice in C++, Shake
plugins do a heck of a lot in their
constructors so essentially when your
constructor is called in shake you're
gonna be adding all the plugs the input
plugs that you need do any wiring that
you need inside of your node any setup
that you need is going to occur in your
constructor so you'll end up doing quite
a bit of work there another key routine
is your eval routine, and this is where
Shake calls for the value of your
output plugs: for any output plug that
you've registered that Shake wants a
value for, it's going to call that
eval routine with a pointer to the
output plug that it's interested in
getting a value for, so you'll look at
which output plug it's asking for and do
whatever you need to do to
compute an updated value for Shake. And
finally there's a virtual notify method
you can actually register with shake to
be called when one of your input plugs
is somehow modified, and so Shake will
call you, and you might want to do
some kind of
processing, or change the structure
of your node, based upon the notification
that you get. This is the final slide,
and something that's very important for
shake plug-in developers I do quite a
bit of shake SDK support and one of the
things that we find is that Shake
provides quite a rich set of base
classes that you can start from; we
talked about the NRiNode base
class a couple slides back, but as a
plugin developer you will rarely
derive from that base class
directly. There are a number of base
classes depending upon the operation
that you want to do, whether it's a
filter that takes one, two, or n
input images, or a
custom widget or an overlay, and that's
something that we really like to
emphasize with plug-in developers:
become acquainted with the base classes
that are provided by Shake; you can save
yourself a huge amount of work there's a
lot of functionality built into those
base classes so that wraps up my
presentation and at this point I'd like
to introduce Donald Louvre
[Applause]
Hi, how are you doing today? I'll be
showing an overview of FXScript and a
couple of demos at the end. So what is
FXScript? FXScript is a video scripting
language used to create video effects
for Final Cut Pro. It's designed for
basic image processing; it's a procedural
language, parsed and interpreted at
runtime, and since it's built in, it uses
the same rendering engine as Final Cut
Pro. There are over 150 effects in Final
Cut Pro which are written in FXScript. These
are some of the features available in
FXScript: variable types (you can declare
multi-dimensional arrays up to five
levels), built-in functions, standard input
controls, loops
and branches, and subroutines, which support
recursion up to twelve levels.
there are three types of effects in
Final Cut Pro that you can create using
FXScript: filters, for a single video
stream; transitions, for two video
streams; and generators, which are a
a special type of clip such as text or
particle generators
Final Cut Pro has a built-in tool with
which you can edit and preview your
script. There are three windows: you can
edit your script in the text entry
window, you can run and see your effect
in the preview window, and you can adjust
your parameters in the input controls
window.
There are several ways of saving your
script. In FX Builder you can
directly add your effect to your project,
or you can make a favorite effect, which
is saved in your preference file, or you
can save it as a text file, or as
a plug-in, or as an
encrypted plug-in; once you
encrypt your script you cannot view the
script again. All
the effects in Final Cut Pro are saved as
text files, so you can view them in
FX Builder or in any other text
editor. This is an example of an FXScript
effect: there are two sections, the header
and the body in the header you define
the type and the name of the effect you
can also define the input parameters the
main body starts with the keyword code.
So who is this for? Final Cut Pro
developers, post effects houses, and
advanced users.
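Based only on the description above, a minimal FXScript effect might be laid out roughly like this; the exact declaration syntax here is an assumption and may differ from real FXScript:

```
-- header: the effect type and name, plus the input parameters
filter "My Example Filter";
input amount, "Amount", slider, 50, 0, 100;

-- the main body starts with the keyword "code"
code
-- image-processing statements go here; built-in filters can be
-- invoked by name, e.g. (as shown later in the demos):
--   Filter("Find Edges", buf, dest, dur, rate);
```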
and now I'm ready to show you a demo
Okay, this is one of the filters I wrote;
let's open this open here so this is a
rock if I turn off this filter I can see
an object in front of the car so I want
to remove that object so this filter
removes that object and let's open that
up in FX builder
so I'll be customizing this in just a
few minutes for this sequence right here
so this sequence has a diagonal object
and I'll be removing that object by
customizing this effect
First of all, I'll be adding an input
parameter with
a default value and a minimum and
a maximum value, okay,
and down here the rotate function takes
the center point, the amount
of rotation, and the aspect ratio. Let's
just run this make sure there's no
syntax error so you can see right here
there's a new angle controller here okay
so let's just apply this directly to the
clip
and that up in the filter take that just
go right here
yeah
now that a little bit okay
a little higher and I'll set the timer
and let's just adjust a little bit the
source there
let's just get rid of it overlay soften
the edges there you go
Okay, in my second demo I'll be showing
how to make a call to a built-in
filter. This is a generator which
generates a random pattern; let's just
open this in FX Builder
and let's just run it:
it's just a random pattern, and I'd like to
make a function call to one of the
filters in Final Cut Pro. Let me just stop
this for a second, okay.
Okay, in the FX tab we have Find
Edges, so I'll be making a quick call to
Find Edges on this pattern: I'm
going to save the image to a
temporary buffer, and then make a call to
Filter with the name of the filter, which
is Find Edges, then the source, which
is what I copied into the buffer, then
the destination, and then the frame duration
and frame rate. So let's run that.
so this is basically calling another
built-in filter and you don't have to
even type or copy and paste the script
of Find Edges; this should eliminate
such problems as variable name collisions
and typos,
and once you make this change
you can save it as a plug-in, but this
time I'm gonna make a favorite effect
right here; if you look
in here, it will save that as a favorite
effect and you can use it, okay.
So let's close that. In my third demo
I'll be showing you
performance issues in FX Builder. Let's
open this; this is a particle generator.
Let's open this up in FX Builder and
run it.
you have all these parameters you can
control gravity initial velocity speed
decay the size of the particles softness
you can change the color
you can change the radius
offset the vertical position. Now this is
drawing two hundred particles; if I
add any more particles you
will see degradation in performance.
so let's open another generator
open this up
and let's run this; there's
another particle generator. Now this is
drawing 10 times 100, or over 1,000,
particles.
This has some similar
controls to the first one: speed,
decay, size, softness. Now,
I've included a Bézier curve in this so I
can control the path of the particles;
let me move this to the side a little bit,
yeah
so with all the additional calculation
you don't see any performance difference
between these two right here
Okay, that concludes my demonstration of
FXScript, and next Avi
will be up here to show you After Effects
plugins. Thank you. Okay, thank you.
Alright, so my name is Avi Chaplinsky, one
of the engineers on Final Cut Pro and
I'm here to talk about After Effects
plugin support inside Final Cut Pro so
it's a bit of an intro I'm just going to
cover some of the basics because some of
you may be aware of this and some you
may not be so what is After Effects
After Effects is Adobe's package for 2d
and 3d compositing and effects for video
and motion graphics what are After
Effects plugins after plugins are
usually third-party developed modules
that add some functionality to the host
application these are typically things
like effects
Tyler's key years time remap errs but
really there's a wide variety out there
Now why does Final Cut Pro support the
After Effects plug-in model? Well, it is a
public SDK, so it's available for
everyone to download and play with, and
since Final Cut Pro came onto the market,
in the intervening time the After
Effects plug-in has sort of developed
into a de facto standard for
cross-application plugins, so it's
actually supported by several other
applications besides After Effects from
Adobe Premiere has support for them, but
Combustion and Commotion also have
support for After Effects plugins, so for
people developing them in a third-party
setting your audience is a lot wider
than any one specific application
standard because it's embraced by
several competing applications so in
this brief presentation I'm just going
to cover some of what's of interest to
us in Final Cut Pro so that is what
works inside Final Cut Pro any
limitations
and some Final Cut Pro specific API that
we've added in the last release of Final
Cut Pro. So, briefly, to cover what works:
officially, Final Cut Pro supports the
After Effects SDK version 3.1, which, if
any of you are familiar with is a little
bit old at this point but you know it
covers most of the things that we're
really interested in for Final Cut Pro
Those kinds of basic elements are, of
course, frame check-out and check-in: for
the clip that you're applied to, you can
pull any of the frames out of it,
perhaps apply some effect on top of them
and then put them back in, but you could
even just reorder the frames to do a time
remapping or slow-down or speed-up
effect. We support the basic set of After
Effects-defined parameters, so you'll see
layers (or, in Final Cut Pro parlance,
these are clips), or clip input wells
with a little icon beside them, X-Y
points for simple locations, sliders for
values, and pop-ups, and there are several
others in the SDK. Of particular interest
is the custom parameter type which
allows you, in the effect parameter
list, to define your own custom UI to
directly edit, or you can have the user
click on it to bring up your own custom
UI on top if you need to do more
elaborate things than can be defined in
the simple parameter types that you can
directly show. Okay, so, some limitations to
our implementation of the After Effects
standard in Final Cut Pro: we don't allow
plug-in-defined on-frame UI elements, so
if you require the user to directly
interact with the frame in order to you
know place elements or to pick up you
know various kinds of things then you
can't do it directly in the Final Cut
Pro UI typically if this is the kind of
thing you need to do you'll want to do
your own UI on top and you can pull the
frame out and draw whatever you want on
top of it and let the user interact in
your own UI. We support only 8-bit RGB
rendering;
although Final Cut Pro 4 has a brand-new
32-bit float rendering engine, the
greater-than-8-bit format defined in After
Effects is 16-bit integer, which is not
something we have native support for in
Final Cut Pro. Another key thing to keep
in mind especially as you make more
elaborate plugins is we don't allow
dynamic parameter lists; typically,
if the list of parameters you have
wants to depend on, say, the state of
a pop-up or something, then we can't
allow that in Final Cut Pro. All the
parameters that your effect has need
to be static and defined when the effect
is initially read in; basically,
Final Cut Pro doesn't really have an
analogous construct for changing the
number of parameters, so we don't allow
it for the After Effects plugins
themselves. Okay, all right, so a quick
overview of some new API calls
we added: these are useful if
you're trying to better integrate your
plug-in into Final Cut Pro and
deal with some of the differences
between After Effects and Final Cut Pro;
there's also a few that are
particularly useful if you have your own
custom UI that you're drawing and you
want to interact with the application,
and particularly if you'd like to send
frames out to the currently enabled
video out device. So briefly, there are four
new calls in Final Cut Pro 4. The first
two are rather simple: they simply allow
you to retrieve the current version
number of the Final Cut Pro you're
running under: there's the major
version number, things like three or four,
and then the minor version number for
the dot releases. The third one is just a
call to cause Final Cut Pro to redraw
all of its UI this is good if you've
drawn custom UI on top that might have
interfered with or drawn on top of any
of it or otherwise damaged the UI. When
your plugin has completed running, you'll
probably want to call this function to
make sure the UI is in a good state
when the user comes back into Final Cut
Pro. The last one is of particular
interest, again,
if you have custom UI: it allows you
to force the viewer or canvas, the
window that the user is seeing for the clip
that they've applied your effect to, to
move to a particular frame inside the
clip. What this means is that if you've
got your own UI, with maybe your own kind
of timeline, and you want the user to be
able to scrub through the clip, you can
call back into Final Cut Pro and have
Final Cut Pro move to that same
frame, and what that really gets you is
not only showing it in the Final Cut Pro
UI but if there's a video out device
turned on and enabled, Final Cut Pro will
want to do the rendering for that
new frame, and since it'll call you if
you're in the render stack, that will let
you push your frame with all your
effects out to the video out device
directly without having to worry about
writing your own interface to do all
that. So, in order to support those
additional calls, what we've done is
added a single function pointer that you
call; it's in the PF utility callbacks, if
anyone's familiar with that, basically a
set of utility callback function
pointers that the host application
provides to every plugin; in this case
the host is Final Cut Pro. If you call
that function at the bottom to get the
private callbacks, with a block of memory,
then we'll fill in those function
pointers for you, and then you can just
call them directly to do basically what
I outlined before.
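The private-callback mechanism described here follows a common C pattern: the plug-in hands the host a block of memory and the host fills it with function pointers. Here's a self-contained sketch with hypothetical names, not the real SDK symbols:

```cpp
#include <cstring>

// Hypothetical table of host-provided entry points; the real SDK's struct
// and member names differ.
struct PrivateCallbacks {
    int  (*getMajorVersion)();
    void (*refreshUI)();
};

// Host side: the "fill in the block of memory" call.
static int  hostMajorVersion() { return 4; }
static void hostRefreshUI()    { /* redraw all host UI */ }
static void getPrivateCallbacks(PrivateCallbacks* out) {
    out->getMajorVersion = hostMajorVersion;
    out->refreshUI       = hostRefreshUI;
}

// Plug-in side: request the table once, then call through it directly.
int queryHostVersion() {
    PrivateCallbacks cbs;
    std::memset(&cbs, 0, sizeof cbs);
    getPrivateCallbacks(&cbs);   // host fills in the function pointers
    cbs.refreshUI();             // e.g. after drawing custom UI on top
    return cbs.getMajorVersion();
}
```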
Okay, so quickly, in summary: the After Effects API is a very
useful way to add additional image
processing to Final Cut Pro this is
particularly true for those in the third
party setting because since it's a
standard API that's supported by several
other applications it's a good
opportunity to sell to a wider audience
than just one application but of course
keep in mind that we only support a
subset of the full After Effects API
this subset is what we felt was most
important and appropriate for Final Cut
Pro and it's yeah what we felt was what
you would need to best interact but good
thing to keep in mind is that we really
would welcome your fee
back if there's pieces of the SDK that
we don't support or that might have been
added since we have done our last
revision we'd be very interested to hear
from people on what they would they be
looking for to give us an idea of what
the benefits might be because if it you
know if it looks good we'd like to add
any support we can in order to make your
lives a little easier so we'd really
encourage feedback not only now but
later there'll be some email addresses
we'll post at the end to give you a
forum to provide that kind of feedback
to us. OK, well, that is basically everything I wanted to talk about, so I'm going to welcome David Black up to talk about some future plug-in directions.

Thank you, and good afternoon. I'd like to spend a little time talking about where the future lies with plugins and Apple's pro video applications. Plugins, as we've seen, are very important to us, and we want to take things to the next level and increase the opportunities for developers going forward. Really, what our core direction comes down to is
trying to move to a unified plug-in
model, where it makes sense, for Apple's professional applications. It's important to note that this is not something that's going to supplant the normal, current operating-system plug-in models like the Audio Units specification. Really, what it boils down to, for us, is having a common base architecture in place across different types of plugins and different applications, so that it's easier for us to support them and easier for you to develop them. Then, on top of that basic layer, there are function- or task-specific APIs
depending on what you're doing. So there might be one set of APIs for an effect plug-in, or another set of APIs for a data interchange plug-in. And really, again, the key point of all this is giving you a model where you can know that multiple applications will support it: we're certainly looking to commit to supporting these across our applications, and also to provide enough detail so that others can support it as needed. Another key point is trying to build the support in so that you'll
have the choice of development tools to
use for these plugins. Certainly, a lot of models in the past have been very focused on C, or very focused on C++; as much as possible, we really want to give you that choice, because different tasks may require different tools, and developers may be at different technical skill levels. Going into a little bit
more detail about what we're trying to
go for: we're basing this model on Objective-C protocols. If you're familiar with Objective-C protocols, they're a very nice way to pass data and messages between objects without being forced to rely on a common base class, and we found that they actually solved quite a few problems for us in the implementation phase. The files on disk are just stored as standard bundles, nothing really new there; bundles are actually great because they put everything together, even when defining multiple plugins within what is, to the user at least, one object. We will be supporting
static and/or dynamic plug-in loading and registration: your plugin may have requirements that depend on what libraries are installed or what application is running, or it may be very simple, and you can simply declare that in the bundle. Services and data from the host application will be made available via objects and callbacks, so that you'll be able to make those calls; it won't just be a simple one-way model, where an image buffer comes in, you do your thing, and an image buffer goes out; there is some bi-directional communication support in there.
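The loading-and-registration idea above can be sketched, very roughly, as a host-side registry. Every name in this sketch is invented for illustration; a real implementation would discover plug-in bundles on disk and load them statically or dynamically rather than take direct calls.

```c
/* Hypothetical sketch of plug-in registration: each plug-in, once loaded
 * (statically linked or dynamically loaded), announces its entry point to
 * a small host-side registry, keyed by a unique name. */
#include <string.h>

#define kMaxPlugins 32

typedef struct {
    const char *name;        /* unique plug-in identifier  */
    int (*invoke)(void);     /* the plug-in's entry point  */
} PluginRecord;

static PluginRecord gRegistry[kMaxPlugins];
static int gPluginCount = 0;

/* Called by a plug-in at load time to register its entry point. */
int RegisterPlugin(const char *name, int (*invoke)(void))
{
    if (gPluginCount >= kMaxPlugins || name == NULL || invoke == NULL)
        return -1;
    gRegistry[gPluginCount].name = name;
    gRegistry[gPluginCount].invoke = invoke;
    gPluginCount++;
    return 0;
}

/* Called by the host to run a registered plug-in by name. */
int InvokePlugin(const char *name)
{
    for (int i = 0; i < gPluginCount; i++)
        if (strcmp(gRegistry[i].name, name) == 0)
            return gRegistry[i].invoke();
    return -1;               /* not registered */
}
```

The bi-directional part described above would come from the host passing service objects or callback tables to the plug-in at invoke time, rather than just raw buffers.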
You might ask: what about other plug-in models? How does this relate to everything we've been doing, and to new industry standards that may be coming out? Really, this is meant to be a superset of a lot of things. For us, it's very important to support what makes sense; at the same time, providing native engineering support for multiple plug-in models at the core application level is really a lot of work. By building a superset glue layer, we hope to support more plugin APIs, not fewer, and also to leverage those efforts across multiple applications. And to this
across multiple applications and to this
end it actually is designed to support
adapter or host plugins so you can sort
of have that indirection of you have a
per application you'll have a
provocation plug-in and then an adapter
layer that goes off to sum up to some
other vendors code and and we're
certainly intending to provide those
adapters for the most popular plugins
and support developers in providing more
so might ask when will this be available
We're going to begin to roll this out later this year, on top of Final Cut Pro. The first functional API we're going to implement is a data interchange model based around the XML data format that we discussed earlier this week. In brief, we're essentially making the entire contents of a Final Cut Pro project available to developers via XML, in a very clean manner, and this is really giving you a programmatic interface: you define commands within the Final Cut Pro application environment that receive this data, perform the necessary translations, and do something useful with it. We'll be releasing a public beta of
this in August; that's our current intention. If you watch the Final Cut Pro website, details will be up there; you can also get in contact with Developer Relations, and they'll make sure you're in the loop. The intention is also to release this in final form to developers and users by the end of the year. Early this week, one of the
demonstrations at the data interchange session was Automatic Duck providing AAF import and export support for Final Cut Pro, AAF being the Advanced Authoring Format, an industry-standard binary container format intended to take data between editing applications. That plug-in is currently sitting on top of a very early version of this plug-in specification, and it's basically a great proof of concept for the whole idea.
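As a toy illustration of what consuming that project XML might look like on the receiving end, the sketch below scans a fragment for clip-name elements. The element names and the naive string scanning are purely illustrative; a real data-interchange plug-in would use a proper XML parser and Final Cut Pro's actual schema.

```c
/* Toy sketch: pulling clip names out of project XML handed to a
 * data-interchange plug-in. Element names here are made up. */
#include <stdio.h>
#include <string.h>

/* Find each <name>...</name> element and copy its text into out[].
 * Returns the number of names extracted, up to maxNames. */
int ExtractClipNames(const char *xml, char out[][64], int maxNames)
{
    int count = 0;
    const char *p = xml;
    while (count < maxNames && (p = strstr(p, "<name>")) != NULL) {
        p += strlen("<name>");
        const char *end = strstr(p, "</name>");
        if (end == NULL || (size_t)(end - p) >= 64)
            break;                       /* malformed or too long */
        memcpy(out[count], p, (size_t)(end - p));
        out[count][end - p] = '\0';
        count++;
        p = end;
    }
    return count;
}
```

A command registered with the host would receive the project XML, run a translation like this over it, and then either export the result or hand modified XML back.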
To now summarize the entire session here: really, all of our applications support plug-in models, and this is very important to us. Logic and Soundtrack support Audio Units, Shake of course supports Shake plugins, and Final Cut Pro supports Audio Units, After Effects, and FXScript plugins at this time. Of course, we're hoping to make this list grow over time, both with your help and with new technologies internally. And plugins really are important: they're not only important to Apple and important to you, they're really important to end users. No one tool can do it all, and providing plugins allows end users to get the tools they want, unique tools that Apple may not develop, integrated into that space; it really gives end users more control over their product in the end.
It also just helps advance technology: certainly, we're not going to think of everything, and by at least opening that door, it's possible to put new technologies and new workflows together as they come up in the marketplace. And really, we want more extensive support for plugins in the future; all the apps are intended to support more plugins. We're also trying to move toward modern plug-in architectures; as mentioned earlier, this is not a replacement for existing models, but just opens the door wider to more models. We also really want to be able to share plugins across multiple applications; we certainly have this today, with Audio Units supported across Logic, Soundtrack, and Final Cut Pro, and this just adds more value for everyone, so it's really important to us. And we want to add more different types of APIs; again, the data interchange API is an example of this. Certainly, it's very simple to provide import and export functionality with this plug-in API, but it's also there for integrated tools for data management and workflow purposes. And really, it's open-ended enough that input from you is very important to us.
We've certainly done all the research, and we have our own ideas, but we really need to know from you what we're missing and what we're not doing. So please let us know: what sort of APIs do you want? What are the development tools you prefer to work in? Certainly, CodeWarrior and Project Builder, now Xcode, are very popular, but what difference would it make to be running from a Java IDE, or even from AppleScript? And are there tools that you want to use that you just don't know how to fit into the current framework? Certainly, we might be able to suggest approaches with current APIs, or use that feedback to generate new APIs in the future that are just going to open up the platform even more. At this point, I'd like to invite up Brett Halle to do the wrap-up and the Q&A.
Thanks very much. This week is kind of, you know, the applications' introduction to WWDC. We've worked this week to have a number of different sessions available to you, to show that we really like to see developers get more involved in our professional applications, be it with plugins, be it cards and various types of hardware devices, be it content creation. The plugins session here today is intended to be yet another way that you can, you know, participate in our applications, provide new products to our collective customers, and help make this a great platform. If you have questions about any of these things, we strongly encourage you: one, please send us feedback. We do have a feedback address, fcpfeedback@apple.com,
and our industry evangelist for professional film and video is Jeff Low; he's the person for you to get in touch with should you have more questions or interest in discussing opportunities in this space.