WWDC2004 Session 222
Transcript
Kind: captions
Language: en
Good morning everybody. 9am after the party, not too bad; I'm sure we'll get some stragglers coming in. Anyway, I want to make one program note: if you had to choose between this session and the GL Shading Language session, which unfortunately is at the same time because we had a last-minute scramble, after both sessions everyone is going to be back in the Graphics and Media Lab, in case you had questions and had to pick. So anyway, I'm really excited about what we have for you this morning. We're going to be talking about Quartz Composer, which is a new tool on the developer DVD and a pretty exciting way to play around with all the technology we have. So without further ado, I'll bring up Pierre-Olivier Latour. Good morning everyone, welcome to
the Quartz Composer WWDC session. The first thing we are going to look at is obviously: what is Quartz Composer? It's a brand new tool we're introducing in the Tiger developer tools, an exciting visual programming tool. What you do with it is create compositions, which are basically little programs that process and render graphical data. Quartz Composer packages a bunch of technologies into one single environment: technologies like OpenGL and Quartz 2D, or brand new technologies we're introducing in Tiger like Core Video and Core Image. Everything is real time, from editing and preview to debugging, and the compositions you create are really easy to integrate into your applications: we provide an Interface Builder palette, and the system is completely compatible with Cocoa bindings. The first thing I'm going to show you are some simple example compositions. Can we see demo machine number two, please?
Thank you. So here we have a simple but typical composition. It contains a mix of 2D and 3D elements: in that case we have a 3D cube that is rotating with some text on it, and in the background we have some simple text that is being blurred in real time using some Core Image filters. OK, I'm going to let you read to the end of the text. This composition, at its foundation, is a simple 2D image to which we applied a bunch of filters, and at the end you obtain a completely different result from the original image, animated with time. It's interesting to see that the quality of what you generate with Quartz Composer, using the general graphics technologies we have in the operating system, is very, very good, and it's running very fast. You might not see it, but at the bottom left corner we've got a frame rate counter, and we're running at 60 frames per second right now because we're syncing to the display, but it would actually run even faster. This composition is interesting because we don't have any image as its source: it's basically generated from scratch using Core Image filters that are assembled and animated with time, and once again you can see that the quality is very good and it's completely computed per pixel. This release of Quartz Composer also allows you to do some simple 3D animations, which can be interactive and respond to the mouse: I'm using the mouse right now to actually move around in a 3D world. This other composition, here we go, is actually a slideshow, where we have a folder full of images and the composition is responsible for putting them on screen and computing the transitions between the images. The last composition I want to show you is one that gets data from the outside world, the Internet: this composition is getting the RSS feed from the Apple website and showing the hot news with a simple text animation. Back to slides, please. So what are we going to learn today? Basically five points. We're going
to have some theory at the beginning, basically the Quartz Composer concepts; then we're going to look at the application itself; we're going to go through a tutorial so that you can see a typical example of building a composition; and obviously we'll look at how to play back compositions in your applications. At the base of the Quartz Composer concepts we have what we call patches. Patches are like basic processing units: their role in life is simply to execute and produce some results. These results can be sent to their output ports, which we are simply going to call outputs, or to some rendering destinations. Then, to control the patches, you pass input parameters through the input ports of the patches, which we're going to call simply inputs. So we can say that patches are like little functions that produce some results according to a bunch of parameters and the current time. The input ports and output ports of the patches are typed, which defines the kind of data you can pass through them: we can pass simple values like numbers, Boolean values or strings, even colors, and we can also pass around complex objects like bitmaps, OpenGL textures or Core Image images. Let's look at some
example patches. The first one we're going to look at is the LFO patch, which stands for low-frequency oscillator. The goal of this patch is to produce a wave on its output, which is determined by the current time, the period of the wave and the wave type, so that would be for example a square wave or a sinusoidal wave, and so on. This other patch we have here does not have any input parameters; it doesn't need any. Its simple role is to produce on its outputs the current mouse position, the x and y coordinates. And the last patch, the Sprite patch, does not have any output port, because it is rendering to a destination: in that case this patch is actually drawing a quad at a given x and y position, with a given color and texture on it.
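Conceptually, a patch like the LFO is just a function of its parameters and the current time. Here is a minimal sketch in Python; the function name and wave types are illustrative, not Quartz Composer's actual implementation:

```python
import math

def lfo(time, period=1.0, wave="sine"):
    """A toy low-frequency oscillator: maps the current time and a
    period to the value appearing on the patch's output port."""
    phase = (time % period) / period          # position inside one cycle, 0..1
    if wave == "sine":
        return math.sin(2 * math.pi * phase)
    if wave == "square":
        return 1.0 if phase < 0.5 else -1.0
    raise ValueError("unknown wave type: " + wave)
```

At time 0.25 with a one-second period, the sine wave is at its peak, and the whole cycle repeats every period, which is exactly the behavior driving the animations in these demos.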
So how do we put everything together? Well, let's say we have a kind of workspace, and we put on it all the patches we want to use. These patches are going to have some inputs and outputs, and we can build connections between the inputs and outputs, so that for example the patch in the middle is retrieving its input data from the other patches on the left. So we can say that the patch on the far right is actually pulling the data from the patches on the left. What we build here is a kind of simple data-flow model, and this is at the core of the Quartz Composer concept. We're going to put these patches into a kind of macro patch, and where it becomes interesting is that, for example, the patch we have in the middle of the macro patch, the big one, could actually be itself another macro patch made of several patches, and some of these patches may themselves be other macro patches, and so on. So what you end up with is a hierarchical patch tree. Now, if we call the macro patch at the top of the tree the root patch, then the entire patch tree, and the entire data flow it describes, is what we call a Quartz Composer composition. Now I'm going to show you the application itself, because that's enough theory for now.
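The data-flow model just described can be sketched as a graph where each patch pulls values from the patches connected upstream. This is a conceptual Python sketch of the idea, not Quartz Composer's actual engine:

```python
class Patch:
    """A toy patch: a function from named inputs to named outputs."""
    def __init__(self, name, execute):
        self.name = name
        self.execute = execute     # fn(dict of inputs) -> dict of outputs
        self.inputs = {}           # port name -> static value or (patch, port)
        self.outputs = {}

    def connect(self, port, source, source_port):
        self.inputs[port] = (source, source_port)

    def pull(self):
        # Resolve each input, pulling upstream patches first (the pull model)
        resolved = {}
        for port, value in self.inputs.items():
            if isinstance(value, tuple):
                src, src_port = value
                src.pull()
                resolved[port] = src.outputs[src_port]
            else:
                resolved[port] = value
        self.outputs = self.execute(resolved)
        return self.outputs

# A provider with no inputs, a processor, and a consumer chained together
mouse = Patch("mouse", lambda i: {"x": 0.3, "y": 0.7})
double = Patch("double", lambda i: {"result": i["value"] * 2})
sprite = Patch("sprite", lambda i: {"drawn_at": (i["x"], i["y"])})

double.connect("value", mouse, "x")
sprite.connect("x", double, "result")
sprite.inputs["y"] = 0.5   # a statically defined input
```

Calling `sprite.pull()` walks the graph from right to left, exactly as on the slide: the sprite pulls from the doubler, which pulls from the mouse.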
So you will find the application in the Developer folder, under Applications, Graphics Tools, Quartz Composer. It's a kind of standard Cocoa-looking application where you can work on multiple compositions at the same time; in that case we have one document per composition. Let's look in detail at this document window. What we have here is the workspace, where we put the patches and interconnect them. On the right we have the list of patches we can actually use. They are pre-sorted by categories so that it's easier to find them: for example, some of them are going to be controlling the objects by sending data, for the position for example; some other patches are going to be used to import data, like images from files or from a video camera; then we have several numeric patches that are used to perform some mathematical operations; and then we also have rendering patches, because we want to render something eventually. At the bottom of the list you will find several categories that start with a dot, and these are all the Core Image filters which are natively supported by Quartz Composer, so you can find here all the compositing filters, distortion effects, generators and so on. OK, now we're going
to build a simple composition. The first thing we want to do is actually render something on screen, so I'm going to go to the Renderer category and pick up, for example, the Sprite. It has more parameters than the abstract version you've seen on the slide, but it's not very difficult to understand: we have basically a bunch of input parameters to define the position and orientation of the sprite, and then some other parameters to define the width, the height, and also the kind of texture you want to have on it. We can run the composition right away, because as I said at the beginning, this entire system is completely real time. Now that we have our sprite, I would like for example to control the position of the sprite with the mouse, so for that I'm going to use the mouse controller and drag and drop it there. Now I would like the x and y position of the sprite to be determined by the mouse coordinates, so for that I drag a connection from the x and y outputs of the mouse to the x and y inputs of the sprite, and now I've got my sprite following the mouse. At the same time, what's weird is that we have a huge trail. The reason is that we are not clearing the rendering area, so each time, the sprite is drawn over the previous frame, and obviously we want something that looks a lot nicer. For that I'm going to go back to the Renderer category and pick up the Clear patch, which obviously clears the screen. But you will notice that I don't see my sprite anymore. The reason is that, since we're performing drawing operations, we need a way to define in which order the operations are performed, and you will notice on the top right corner of these kinds of patches that they have a number indicating in which order they are executed: we have number one here, number two there, and obviously we want the inverse, which is clearing the rendering area first and then drawing our sprite. For that, you display a contextual menu on the patch you are interested in, and you can change its rendering layer, which basically defines the order, and now it's working fine. For the next step, I'm going to use the third patch we introduced earlier, which is the LFO patch. The LFO is outputting a wave according to the time, and I'm going to drive, let's say, the width of the sprite with that wave. So now we can see something that is animated. I might want to also drive the height of the sprite with that wave, but do some other operations on it first. The way you would do it, for example, is to use a Math patch: I'm going to take the value from the output of the LFO, run it through the Math patch, and get the result and connect it to the height of the sprite. Now, we've seen so far how you
connect patches and have values transmitted dynamically through connections between patches, but of course you might not always want to do that, and instead have some values defined statically. So how do you do that? Well, you can simply double-click on an input, which is going to bring up an editor, and then you can set the value on the input. In that case it's a color input, so we get a color wheel, and I can then pick some green color, for example. Then another way to edit the parameters is to show the Inspector and go to the input parameters pane, where you can see all the parameters at once. You will notice that parameters that are defined because they are connected to some outputs are not editable, obviously. So what I might want to do there is, for example, decide that I don't want the height of the sprite to go below 0.5: I use the Math patch to keep the value above 0.5, and now we've got something like this. Let's go back to the slides, please.
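What the Math patch is doing here can be sketched as a tiny function: take the incoming value and apply one operation to it, such as keeping it from going below a floor. The names and the set of operations are illustrative only:

```python
def math_patch(value, operation="maximum", operand=0.5):
    """A toy Math patch applying one operation and operand to its input."""
    if operation == "maximum":
        return max(value, operand)   # e.g. keep the sprite height >= 0.5
    if operation == "multiply":
        return value * operand
    raise ValueError("unknown operation: " + operation)
```

An LFO output of 0.2 would come out as 0.5, so the sprite's height never collapses, which is the effect shown in the demo.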
OK, we still have some theory to look at; can we go back to the slides, please? Thanks. So we said earlier that we have what we call a hierarchical patch tree. Now we're going to have a rough look at how exactly it is being evaluated by the system. At the top of the tree we have a macro patch, which is basically the root patch, and during evaluation this patch is traversed: the system is going to execute the sub-patches that are contained in this macro patch, and each time one of these sub-patches is a macro patch itself, it's going to be traversed and all its sub-patches are executed, and so on and so on, so the tree is traversed from the top down. Now what
exactly happens when we are inside a macro patch and we need to execute the sub-patches? The first thing to know is that not all patches are born equal. We have consumer patches: these are essentially the most important patches; they are the ones which render something. For right now, they're all rendering to the destination area, which is going to be a screen or a preview rendering window. Each time you render a frame of the composition, they're going to be executed, sequentially, in a defined order, as we've seen earlier, and you can look at the order by looking at the number added on the top right corner. They are actually the ones which pull the data from the other patches. What kinds of other patches do we have? Well, we have the processor patches, which are kind of slave patches: they run on demand, and they simply execute when their inputs have changed and their outputs need to be updated; the system only executes them in a kind of lazy mode. The third type of patches we have are providers, and their role is to get data from outside sources into the system; in that example it would be a mouse patch. They also run on demand, which means only when their
outputs are needed. OK, so now, how is this simple example actually going to execute? As we said earlier, the consumer patches are the most important patches, and they are the ones driving everything, so they get executed first. First we have the Clear patch to execute; then we have the Sprite patch to execute. But this one is interesting, because some of its inputs are defined by the fact that they are connected to outputs of other patches, so before running the Sprite patch, the system must make sure that all the inputs that are connected have up-to-date values. So first it's going to execute the mouse patch, so that it updates its outputs, which get copied to the inputs of the Sprite patch; then we have the LFO patch that is executed, and eventually the Math patch. Now all the values on the Sprite patch are up to date and we can execute it. Let's go back to the demo, please.
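The evaluation rules just walked through (consumers run every frame in layer order; processors and providers run on demand, and only when their inputs changed) can be sketched like this. A conceptual sketch under those assumptions, not the real engine:

```python
class Patch:
    """Toy patch for the evaluation model: consumers always execute;
    processors and providers execute lazily, only when inputs changed."""
    def __init__(self, execute, lazy=True, layer=0):
        self.execute = execute
        self.lazy = lazy
        self.layer = layer          # rendering order, for consumers
        self.inputs = {}            # port -> static value or (patch, port)
        self.outputs = {}
        self._last = None           # inputs seen at the previous execution
        self.run_count = 0

    def pull(self):
        resolved = {}
        for port, v in self.inputs.items():
            if isinstance(v, tuple):        # connected input: pull upstream first
                src, src_port = v
                src.pull()
                resolved[port] = src.outputs[src_port]
            else:
                resolved[port] = v          # static input
        if not self.lazy or resolved != self._last:
            self.outputs = self.execute(resolved)
            self._last = dict(resolved)
            self.run_count += 1

def render_frame(consumers):
    # Consumers drive everything, in their rendering-layer order
    for consumer in sorted(consumers, key=lambda c: c.layer):
        consumer.pull()

# Clear (layer 1) then Sprite (layer 2); the sprite pulls a processor chain
mouse      = Patch(lambda i: {"x": 0.3})                      # provider
math_patch = Patch(lambda i: {"out": i["x"] * i["factor"]})   # processor
clear      = Patch(lambda i: {}, lazy=False, layer=1)         # consumer
sprite     = Patch(lambda i: {}, lazy=False, layer=2)         # consumer
math_patch.inputs["x"] = (mouse, "x")
math_patch.inputs["factor"] = 2.0
sprite.inputs["width"] = (math_patch, "out")
```

Running `render_frame([sprite, clear])` twice executes both consumers twice but the processor only once, because its inputs never changed; changing the factor makes it run again. This is the same behavior the debug colors visualize in the next demo.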
I'm going to show you a little more about the evaluation system and how you can look at it inside the application. Here we have a composition that is far more advanced than the one we looked at earlier. Let's look at it for a second, full screen. It's a kind of DVD scene, if you want, that was built using Quartz Composer: we have live video playing, being masked in real time, run through real-time Core Image filters, some flying hearts, and everything is completely editable. For example, if I want to change the title, I just double-click on a string input and I type WWDC 2004. OK. Now, as we said earlier, the composition is a patch tree, and the way you can look at it is by displaying the Browser. Here you can see we have the root patch, and I can look at the background patch, the flying hearts patch, the title and menu; some of them are going to contain other sub-patches, and so on. Back to the root patch: let's look at the background patch, which is a macro that was created to simplify the composition. So that you don't end up with thousands of patches at the root level, you can create macros and use them to clean up your workspace; we're going to see that later in the tutorial. So let's look at the exact evaluation of the system. In that case we have some movie coming in, which is standard DV, progressive, 24 frames per second. It's going through this patch here, the movie positioning, which is actually not its real name, but I renamed it, simply to put the movie at the correct position on screen. Then it goes through a series of filters to correct the gamma, to adjust the color controls, and also to mask out the part of the movie we're not interested in, using an image that was imported from the hard drive, and we end up with the part of the movie that we're interested in, positioned at the correct place and masked out. Then we have our original background image, which is some more color corrected, and on top of that we add a kind of white mask that is going to be used to generate a halo, a kind of shining ring, and eventually, when we combine everything, we end up with our final background image. Now, you can see that this tooltip system is very powerful, because it allows you to see exactly what's happening inside the data flow. But we can even do better: if you press the Debug button here, what's going to happen is that it's going to colorize the patches depending on whether they are currently being executed. We have three colors. Green means the patch is currently activated and is running. Red means a patch is not being activated or running at all: for example, if I just drag and drop a patch here, nobody's using it, so it's just red; it's useless. Then the orange patches are the ones that are activated but not running. Why is that? Because the data here is never changing, and the system is obviously smart enough to only re-execute the parts of the data flow that are changing, which is the case here, because the movie image that arrives at the entry of the patch is changing every frame, so this entire pipeline here needs to be re-evaluated. Now let's look at what happens if I change the color here, for example pick some green: as you can see, this part of the pipeline just blinks, indicating that because some parameter changed somewhere, it's re-evaluating all the necessary parts. Back to slides, please.
So, on to the tutorial. This simple tutorial is going to take us through building a composition from scratch, and what we are going to build is a composition rendering a simple real-time glow effect. We're going to learn through that how to build such an effect using Core Image filters, and also how to render a simple animated cube that we're going to feed through that glow effect, because we're going to be rendering it to create a texture out of it. We're going to obtain the result that you see at the bottom of the screen, with the original cube, and then the cube with the glow effect on top of it. So how would we do a simple glow effect? It's only a two-step process: we have the original image, and then on top of it we add a blurred version, and we obtain this nice glowing effect. So let's get started. I'm going to launch Quartz Composer and start up with a brand new composition. The first thing I'm going to do
is import an image to work with. So I go to the Generators and I use Bitmap With File. In the Inspector, if I go to the Settings pane, I can click on the Import File button and go and pick an image file, in that case this one. The next thing I want to do is actually render this image on screen to get started. Here we go: for that I'm going to use the Image Renderer, which gives us extensive control over rendering a simple image on screen, and I'm simply going to connect the bitmap output to the image input of the Image Renderer. The connection is red, meaning there is some conversion going on: the Bitmap With File patch is generating a kind of bitmap object, and the Image Renderer expects a Core Image image. But the system is smart enough to do conversions between the various types of objects you may be transmitting whenever it's possible; the red is just there to indicate that a conversion is happening, otherwise you would get a green connection. We're not displaying the entire image, because the Image Renderer gives specific control over which part of the image we want to display. The reason is that, because it's rendering Core Image images, and Core Image images may be infinite (you might often end up with infinite images when you apply some filters), you really need to specify: I want to render this area of the image. So we can specify manually, here in the input parameters, that our original image is 1024 x 768, and that will do for now. Then, to achieve the glow effect, we said we needed to use a blur and to add it on top of the original rendering. For that I'm going to use the Gaussian Blur, which is in the Core Image filters, and the Addition compositing patch. All I have to do is connect the bitmap to the input of the Gaussian Blur, connect the result of the Gaussian Blur to one of the inputs of the Addition patch (OK, that's going to be better, there we go), and the result is what we want to display. And once I'm done with my blur, I add it to the original image. Now we've got our nice glow, except that if we look at it full screen, what I notice is that it's very bright. One way to fix that is simply to insert a color adjustment and play on the gamma before we apply the blur, so I'm going to insert a Gamma Adjust before the blur, and then I can play with the value of the power of the gamma correction to fix the brightness of the glow effect. Now it's much better. So that will be our
first part of this tutorial. The second part is going to be to create an animated background, because it's obviously more interesting to have our glow effect applied to something that is animating than to something that's completely static. First thing, we're going to just create a brand new composition, as usual clear the screen, and then render a cube on top of it. The cube has several parameters: you can control position, orientation, the dimensions on every axis, and so on. The first thing we're going to do is simply change the dimensions to make it a little smaller; say 0.75 will do the trick. Then we want to have this cube simply animate with time. One way to do that would be to use the Interpolation patch, which simply interpolates between two values over a given duration, and I'm going to control with that the X and Y rotation. OK, I want the interpolation to start at zero (in that case we're going to be in degrees), finish at 360 degrees over, say, ten seconds, and I want the interpolation to loop. Now we've got our nice rotating cube. The next step is to actually add a texture on the faces of the cube, so I'm going to go back to Generators, use Bitmap With File, and import an image. I should have one for this; here we go, a nice brick picture. To display this brand new image on the faces of the cube, I simply have to connect the bitmap to the various faces of the cube. There we go, and the last one. So it's getting better, but one thing is still missing: our cube is not even lit, so it's not very realistic. Let's look at the environment patches we have here, and one of them is the Lighting, so we're going to drag and drop it there. We don't see any difference yet. The reason is that you need to specify how the lighting is going to influence the patches, I mean which patches on the workspace are going to be influenced by the lighting; you might not want all the patches to be affected. The way we do it is by using the fact that the Lighting patch is a macro patch itself, and it's only going to affect the patches that are inside this macro patch. One way to navigate through the patch tree, as you've seen earlier, is to use the Browser; another way is simply to double-click on the title bar of a patch, and then you can go inside it, and if you want to go up one level you just click on the Edit Parent button. So what we're going to do is just cut this, go into the Lighting patch, and paste it. Here we go: now we've got our cube, which is lit and rotating and everything. That's going to be part two.
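The looping Interpolation patch used for the rotation above can be sketched as a simple function of the current time; the signature is illustrative, not the actual patch:

```python
def interpolate(time, start=0.0, end=360.0, duration=10.0, loop=True):
    """Toy interpolation: go from start to end over duration seconds,
    wrapping around when looping is enabled."""
    if loop:
        t = (time % duration) / duration     # fraction of the current cycle
    else:
        t = min(time / duration, 1.0)        # clamp at the end value
    return start + (end - start) * t
```

At five seconds the rotation is halfway through, 180 degrees; at fifteen seconds the loop has wrapped and it is at 180 degrees again, which is what makes the cube spin forever.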
All right, now how do we put these two compositions together? What we need to do is produce some kind of image with the rotating cube that we're going to feed through the glow effect, and it's convenient because we have a patch specifically for that: a generator patch which generates a texture from what is rendered inside itself. So I just added it to the workspace; let's look at the settings. For the OpenGL folks, you're going to notice that you have access to all the underlying OpenGL settings: in that case it's a texture, so I can specify the target, the mipmapping, the filtering mode, everything. It's important to notice that this tool, even if it's attempting to abstract all the various technologies it encompasses, is not hiding everything, so you can still have access to the low-level settings most of the time. In that case I'm going to use a rectangle texture, and I also want to specify what the size of the texture is. We can either hard-code it, or you can just put 0 x 0, which tells the system: generate a texture that is whatever the size of the rendering destination is. So if you're rendering at, I don't know, 320 x 240, which is the case for this preview area, this texture is going to be 320 x 240, and so on. Then what I want to do is double-click here to go inside the texture patch, cut the cube I just created, and paste it there. So now onto this texture I'm going to have my rotating cube, and the next step is obviously to feed that texture through the glow effect instead of the original image. OK, let's save that. Remember, we set the dimensions manually here, which is kind of annoying, because if you change some images at the beginning, or anywhere in the chain of operations, you might want these values to be defined automatically. One way to do that is to use a tool patch we have around here, for example Texture Dimensions: as the name implies, you simply pass in a texture and you can get the width and height in pixels of that texture. So now it's set automatically, and we have our nice glowing effect applied on the cube, and by playing with the power of the gamma you can see the effect of the glow. If I go full screen, you're going to notice it's not running extremely smoothly. The reason is that, remember, our original texture on which we applied the Gaussian blur is the same size as the screen, so in that case that's going to be 1280 x 960 or something, and that's a lot of data to be processed by the GPU, because computing a blur is a very expensive operation. There are ways to optimize that; I'm not going to go into details for this example, and I'm simply going to show you the end result of that optimization. This is the version you're actually going to find in the example compositions provided on the Tiger DVD. This one is the final article, and it's very smooth, and we've got a nice glow effect. Basically, the difference from what we just built is that after the step where we generate the texture with the cube, we downsample the texture to one that is always 256 x 256, and we feed that downsampled version of the texture through the Gaussian blur and the glow effect, and because it's only 256 x 256 it's going to be very fast to compute. Back to the slides, please.
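The glow and the downsampling optimization can both be sketched on a tiny grayscale grid: blur the image, add it back onto the original with clamping, and note that blurring a downsampled image touches far fewer pixels. A pure-Python sketch just to illustrate the math (a box blur stands in for the Gaussian):

```python
def box_blur(img):
    """3x3 box blur with zero padding, standing in for the Gaussian blur."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
            out[y][x] = total / 9.0
    return out

def add_clamped(a, b):
    """Additive compositing, clamped at full brightness."""
    return [[min(1.0, pa + pb) for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def downsample(img, factor):
    """Average factor x factor blocks: a smaller image is far cheaper to blur."""
    h, w = len(img) // factor, len(img[0]) // factor
    return [[sum(img[y * factor + dy][x * factor + dx]
                 for dy in range(factor) for dx in range(factor)) / factor ** 2
             for x in range(w)] for y in range(h)]

# Glow = original + blurred original
original = [[0.0] * 4 for _ in range(4)]
original[1][1] = 0.9
glow = add_clamped(original, box_blur(original))
```

The bright pixel clamps at full brightness and its neighbors pick up the spill, which is the glow. As for the optimization: a 1280 x 960 buffer has 18.75 times the pixels of a 256 x 256 one, which is roughly the amount of blur work the optimized composition saves.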
So now it's time to look at how we are going to play back compositions. There are three possibilities. You can use the QCView, which is a customized view we provide with Quartz Composer, and you can use it directly in Interface Builder. You can do more advanced playback, still in Interface Builder, using the QCPatchController to use the bindings. And the final way of playing back a composition is kind of the hard way, where you have more control over the composition: that's going to be using the low-level QCRenderer class, which you have to use programmatically. OK, back to the demo, please. So let's launch Interface Builder. The first time you launch it, you're going to have to add the Quartz Composer palette, which is not loaded by default. For that you show the preferences of Interface Builder, you go to the Palettes area, you simply click on Add, and you will find the Quartz Composer palette in Developer, Extras, Palettes. OK, now we're up and running, so let's create an empty application. I'm simply going to drag and drop an instance of the Quartz Composer view onto my window, and then display the Interface Builder Inspector. In the Attributes pane, you're going to find a Load button, which allows us to specify the composition we're going to be playing back, so I'm going to take the optimized version of the glow effect we just created, and now I can just test the interface directly, and it's up and running. So far, we haven't even typed or seen a line of code to do all of this, which is worth noticing. OK, that was very simple playback; now we're going to do something that is a little more advanced and use the Cocoa bindings.
So let's go back for a minute to the Quartz Composer application (I'm going to hide Interface Builder, there we go) and let's open our optimized glow; that's this one. We said at the very beginning that the system was completely compliant with the Cocoa bindings, which means key-value coding and key-value observing, and to use bindings you need keys to specify the objects. When you look at the tooltips, you will notice (you may not actually see it clearly on the screen) that the third item is the key: each patch in the patch tree has a unique key to identify it, and each input port or output port also has a key, and you can see these keys by simply displaying the tooltips. In that case, the key for the patch is rendering_texture_1 and the key for its output is the output texture key. So we have a way to identify ports and patches in the tree. What we are going to do now, in order to use that composition with bindings, is transform it into a parameterized composition. For example, we might want to parameterize the background color or the intensity of the glow effect. How would we do that? Well, let's go inside the rendering texture. The color we're interested in changing is defined by the Clear patch, so what we are going to do is bring that input up to its parent patch. For that, I display the contextual menu on the patch, where I have an option, Publish Inputs, and I can select the input I'm interested in, which in that case is the clear color, and specify a name for the new input, simply "color". Now, if I go back to the parent patch, you will notice it now has a color input, which corresponds to the clear color input of the Clear patch that is inside this macro patch.
So that is called publishing inputs and outputs. You know an input is published to its upper level because it's drawn as a filled dot instead of an empty dot. But remember, we said at the beginning that we have a patch tree, and at the top of the patch tree we have a macro patch like any other macro patch, which means we can also publish inputs and outputs to this macro patch. So let's do it: we're going to publish the new color input we just created to its parent patch, which is the root patch, and we're going to keep the same name. Now I have an input that is at the very top level; it's kind of an input of the composition, and it's the color input. You can use the display pane here to actually see all the top-level inputs of the composition, which can be considered the parameters of the composition itself. Let's publish another input, like the intensity of the glow effect, and let's save the result as our part 4. OK, now I'm going to go back to Interface Builder, and this time we're going to do playback using bindings.
To use bindings we need a controller, and we provide one, which is the QCPatchController, and which you instantiate simply by drag-and-dropping it onto the document window. Remember the way bindings work: you have a model — in that case that's going to be the composition — then you have several views that are displaying and interacting with that model, and the interaction is made through the controller that is standing in the middle and doing the intermediation. On the controller I can obviously load the composition, and I'm going to load the one I just created. Then I'm going to go back to my QCView, unload the composition that is on it, and use bindings now. So I display the bindings area of the attributes inspector, and we have a property that is patch, which obviously determines the patch that is displayed by the view. I'm going to bind that to the patch controller I just created, and I want to retrieve the root patch of the composition and display it in the view; the way you do that is simply by using the patch controller key, which is going to return the root patch object — remember, the top patch corresponds to the top level of the composition. So now I can test the interface, and it's the same result as before; nothing has changed, except we're going through bindings, which gives us a lot of flexibility. Because what I can do now is, for example, pick up a slider and use that to control the power input we just created on the composition — remember, it's this one.
So the way we do that is simply by binding the value of the slider to the patch controller, with the root patch object, and then we need to specify the key path to access that power input. If we look at Quartz Composer, you can look at the tooltips and see the key, and you can also see the key for the published version of the input, and in that case it is automatically computed from the name. Because I used the name color, the key is conveniently set to color; that's always going to be the case, unless you already have a color input, and then the system is going to pick another name for you. So we know that input is identified by its key — now that gives me the input, and what I'm interested in on the input is the value itself. Oh, I'm sorry, it's not actually color, it's power in that case — so it's power, and what I'm interested in is the value of the power input, so I add value. OK, now I can adjust the attributes of the slider so that it goes from 0 to 1 only and is in continuous mode.
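To recap the binding assembled above — the controller key and model key path use the names from this demo, and the published-input key is whatever Quartz Composer computed from the name you gave it:

```
Bind to:         QCPatchController
Controller key:  patch          (returns the composition's root patch)
Model key path:  power.value    (the published "power" input, then its value)
```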
OK, that's there. So now I got live interaction, without typing a single line of code, between the composition we created and the controls I have in my window. Let's go a step further and drive the color now. Same principle: we go to the bindings area, bind the value to the patch controller, retrieve the patch object itself, and then we say color, which is the color input, then value, to reach the value on that input.
OK, we can do better than that. If we go back to the composition — let's get it running — I might want to change the image I use as a texture on the cube. The way to do that would be to go back to the cube: we would like to have the same texture put on all the faces of the cube, and here we have like six connections going on. If you want to simplify your life — and this happens quite often, when you want the same value set on several inputs at the same time, and when you change one of these values you don't want to change, for example, six of them each time — for that we provide a little tool, a little utility patch as we'd say, which is the input splitter. If you display the inspector you can select the type of data it's transmitting — in that case we want to transmit a texture — and the input splitter simply mirrors the value that arrives on its input to its output. So I'm going to connect the six faces to that single output, and now I can set that value from one single input port instead of setting it on six. Obviously I can do this and I get the same result as before. And I'm going to publish that input and name it texture; go back to the parent patch, get rid of that one — it's useless now — go back to the parent patch, it's right there, the texture input we just created; publish it again to the upper level, and publish it one last time to the very root — there we go, texture, keeping the same name. And now our composition has three input parameters: the background color, the power of the glow effect, and the texture on the cube. I'm going to save that and stop it, and go back to IB. We reload the composition on the controller, for the composition is actually stored inside the controller and set at init time, so you need to reload it when you change it. Now it has my brand-new composition, still working the same. OK.
And in that case I'm going to drag an NSImageView and bind it as usual: I go to the bindings panel, display the value, pick up the patch controller with the patch key, and type texture to get the texture input port, then value to retrieve the value itself. One last thing I need to do is make sure the — here we go — NSImageView is actually editable. So now it's up and running, and what I can do is just drop any kind of image file there, you know, and

[Applause]

you see, the image shows up on the cube with the full effect. Back to the slides please. All right, so
that was playing back a composition without even typing a single line of code. But what if you really want to type some code? Well, you've got to use the QCRenderer class; fortunately, it's still very simple to do. Only three steps: you need to have an NSOpenGLContext around — because this entire Quartz Composer system is basically running on top of OpenGL as the primary backbone — then we need a QCRenderer instance, and we simply render frames using the renderAtTime: method.
Let's look at some sample code — what does it look like? Well, I'm going to assume I have an NSOpenGLView standing around, which is called myNSOpenGLView, and I retrieve its NSOpenGLContext from it. Then you create an instance of the QCRenderer using that context and using the path to a composition file somewhere on your hard drive. Now, if I want to render ten seconds of that composition at, like, 25 frames each second, I can do it in a very ugly manner with that simple for loop: all you have to do is call the renderer, pass the appropriate time, and then you need to flush the buffers on the OpenGL context to display what was just drawn on screen — more in a minute about why we need to do that. And eventually, when you're done, you just release the renderer, and all the cleanup is going to be done automatically. Now, why exactly is the renderer not flushing the OpenGL buffers itself, so that just after a frame is rendered it would be seen on screen? Well, the goal is that this allows a lot of flexibility, where you can interleave the composition rendering with your own OpenGL code: as it is, I can do some OpenGL drawing before, then render the composition, and do more OpenGL drawing afterwards, to create kinds of underlays and overlays. So it's very easy to integrate the Quartz Composer system into your already existing OpenGL application: if you want to add, for example, flying logos all around the screen, or stuff like that, you can design them easily as compositions and then import them into your applications. We've
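Putting those steps together, a minimal sketch of the loop just described might look like this in Objective-C (the view outlet name, composition path, and frame count are assumptions for illustration; see QCRenderer.h for the exact API):

```objc
// Sketch: render 10 seconds of a composition at 25 fps with QCRenderer.
NSOpenGLContext *context = [myNSOpenGLView openGLContext];
QCRenderer *renderer = [[QCRenderer alloc]
        initWithOpenGLContext:context
                  pixelFormat:[myNSOpenGLView pixelFormat]
                         file:@"/path/to/composition.qtz"];  // placeholder path

for (NSUInteger frame = 0; frame < 250; ++frame) {
    // Optional: your own OpenGL underlay drawing could go here...
    [renderer renderAtTime:((NSTimeInterval)frame / 25.0) arguments:nil];
    // ...and overlay drawing here, before the flush.
    [context flushBuffer];  // QCRenderer never flushes the buffers itself
}
[renderer release];  // cleanup happens automatically on release
```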
seen that one way to communicate with the composition is by doing everything using bindings in Interface Builder. Now, you can also do that programmatically, and the equivalent calls would be setValue:forInputKey:, or valueForOutputKey: to retrieve data from an output of the composition. The way you would be doing that is simply: you pass an object corresponding to the value, and you specify the key corresponding to the input port of the composition, or the output port. You may have noticed earlier that renderAtTime: takes an optional dictionary of arguments, and what we may pass here — because this is completely optional — is the current NSEvent that is being processed by the application, and you can also pass the mouse location in normalized coordinates. You'll find more details in the header corresponding to this QCRenderer class.
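As a sketch of that optional arguments dictionary — assuming an existing renderer, view, current time, and the NSEvent being processed; the two keys are the ones declared in QCRenderer.h:

```objc
// Forward the current event and a normalized mouse location so that
// event-driven patches (e.g. the Mouse patch) keep working.
NSPoint p = [view convertPoint:[event locationInWindow] fromView:nil];
NSSize s = [view bounds].size;
NSDictionary *arguments = [NSDictionary dictionaryWithObjectsAndKeys:
        event, QCRendererEventKey,
        [NSValue valueWithPoint:NSMakePoint(p.x / s.width, p.y / s.height)],
        QCRendererMouseLocationKey,
        nil];
[renderer renderAtTime:time arguments:arguments];
```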
You may wonder why you have to actually pass this NSEvent stuff and the mouse location — once again, it's optional — but the real reason is that you might want to have the Quartz Composer system running in a command-line tool or some application where you don't have any UI, so this data is simply not available and the system cannot retrieve it. Another reason might be that you're running it in your application but you don't want the QCRenderer to actually steal events from your system. So that's the downside of using a low-level API like this: if you're playing back a composition that expects user events — and in that case only the Mouse patch, I think, is actually using them, to obviously detect mouse up and mouse down — you will need to pass these events manually. So what kind of
value should you pass when you use setValue:forInputKey:? Well, to each type of port there is a corresponding NSObject you can pass: if it's a boolean, index, or number port, you simply pass an NSNumber; you pass an NSColor object for a color port; and so on. What you need to pass for now for the texture, bitmap, or image ports is NSImages, or in the case of the image port you can directly pass a CIImage you obtained from some other place. So what if I want to control programmatically the power of my glow effect? Well, the way you would do it is simply by calling setValue:forInputKey: on the renderer. The power input is a number input, so as we said earlier we're going to use a simple NSNumber for that: we build an NSNumber with the value we're interested in, and we pass that to the renderer for the correct input key. The system is kind of smart enough that you don't even need to pass an NSNumber — any Objective-C object that responds to floatValue, doubleValue, or intValue, this kind of stuff, is going to work. So, back to the demo please.
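The call just described might be sketched as follows — the key "power" is the published input from earlier, and "someOutputKey" is a hypothetical key shown only for symmetry:

```objc
// Drive the published "power" input programmatically; a number port
// accepts an NSNumber (or anything responding to -doubleValue).
[renderer setValue:[NSNumber numberWithDouble:0.5] forInputKey:@"power"];

// Reading a value back from an output port works symmetrically:
id result = [renderer valueForOutputKey:@"someOutputKey"];  // hypothetical key
```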
I'm going to show you a very simple application of the QCRenderer class, and that is going to be the simple playback application you've seen at the very beginning of this presentation. The simple application is basically playing compositions full screen: it creates a full-screen NSOpenGLContext, creates a QCRenderer on it, loads the composition, and plays it continuously. It's like two pages of code, and the main portion of it is actually setting up the OpenGL context, capturing the screen, detecting the fact that the user has dragged and dropped a file on the application icon, and so on — the Quartz Composer part is like a couple of lines of code. Another example is this screensaver that is now provided with Tiger; you might have seen it during the keynote, I think.
So what exactly is inside that screensaver? Well, let's have a look at it. I can show you the original, which is somewhere in the system folder, and if you look at the package contents and go to Resources, you will find a Quartz Composer composition file — it's right there. So all that animation — there, it changed the corner or something — all that kind of complex-looking screensaver was done without typing a line of code, except for the simple playback part in the screensaver bundle. Back to the slides please. OK.
Can we see the slide please? OK, thanks. The last thing I would like to show you kind of opens some doors on what exactly you can do with Quartz Composer, and I will insist on the point that it's really an open system: you can bring data in from many, many places. You've seen earlier that there were some interactive compositions retrieving data from the mouse, for example, or from an RSS feed, but we can also use MIDI controller devices, video cameras, and even more — the compositions we build are able to respond to their execution environment somehow, such as the capabilities of the OpenGL renderer or the dimensions of the rendering area. So, demo machine please.
OK, for the last demo I'm going to once more modify our now-famous glow effect composition and add more interactivity to it. OK, so let's see — let's get a picture first; that's not there anymore. OK.

OK, so for example here I have a MIDI device controller, which simply has a bunch of sliders and knobs, and on each of these knobs and each of the sliders there is a corresponding MIDI controller ID, and when you touch them it simply sends the current value of that control to the MIDI system. And we have here a MIDI controller patch I can drag and drop, and by looking at the inspector, specifically at the settings panel, you can configure it completely. I want to listen to my MIDI input here; you can select several sources, you can filter the MIDI channels you want to listen to, and obviously select the controllers you want to observe. In that case I want to observe controller 32, actually, so I'm going to select 32 in the list, and now we have a new output that was generated for the 32 controller, and I'm going to delete the one for controller number one. It's outputting a value that is normalized between zero and one, because that's more convenient than the 0 to 127 values you usually get from MIDI equipment. So all I have to do is connect it there, and now I can use my slider here, and as you can see, I control the glow effect simply with this. All right, let's do a little more.
So instead of using a static image, or something that is generated from some other composition, let's get some video in. I just use the video texture patch — there we go — and I'm going to feed that to, let's say, well, the cube, the faces of the cube. So all I have to do is connect the texture output to the texture input, which is eventually going to end up on my cube itself. You can see there are several settings you can set for video capture, and I need to turn on the video camera, I guess — let's try this. OK, I might have to restart the composition so it detects the video input. OK, so now I have a video input there, still controlled via the slider.
Now, wouldn't it be cool if this composition was still working when there is no video input? What happens if I actually turn off the video camera? So the image is frozen, and when I restart there is still some image remaining — let's try again. OK, so if no camera is connected, this patch is not going to output anything. So wouldn't it be nice if we were able to display some text saying "connect a video camera" or something, but only if there is no video camera connected? The way we detect if there is a video camera connected or not is by the fact that some object is defined on the output. For that we could use a tool that
is the multiplexer. The multiplexer simply has a number of inputs, and by using the source index input you select which one of these inputs is going to be forwarded to the output. So in that case I'm going to set the multiplexer to manipulate texture objects, reduce the number of inputs to only two, connect the output of the multiplexer to the input of my rendering texture macro, connect, for example, the output of my video texture to source number one, and on source number zero let's just put an image with some text on it. There is a specific generator for that, which creates a bitmap with some text, so I'm going to connect it there. OK, and I can change the text, like "no video". OK, I have "no video" on my cube. So it's easy, you know, you just turn the video camera back on — there, it's running, OK, and I restart — and now by changing the source index I'm going to get either the video input or the text. Now, you see, we would like that to be done automatically. There is one little hack you can use to do that: it's simply using — let's fetch it; it is under Conditional or Logic — here we go.
Because you can actually connect an object, like a texture or something like that, to a boolean input, and basically if there is an object the boolean input is going to be set to true, and if there is no object it's going to be set to false. So here is the simple patch which is doing a logic comparison between two booleans, and I'm going to use that: I set it to an OR comparison, set the second input to true so it always passes, and connect the texture object to the input there. So what we have now is, on that output, I'm going to have true if there is a texture coming, or false if there is no texture, and I can take that boolean and connect it to an index input: true is simply going to translate to one, and false is going to translate to zero, which is exactly what I need. So now I got a system that is automatically driven: if I have a video camera connected I'm going to get the video camera image, and if I don't have a video camera connected I'm going to get the "no video" picture. Here we go — I kind of saturated the thing; oh, it is wrong. Oh, that's right, that's right — that true there is incorrect — here we go. Oh, that's right, yeah, the connection there — there we go, that's better. So that
was kind of an interactive composition. The last thing I'm going to show you is how to have that composition respond to the OpenGL renderer capabilities, because all that part here is basically using Core Image, and it's using the video hardware, so it's only going to work if you have a video card that supports the proper extension set. The composition is still going to run anyway — it's not going to crash or anything — but all this part here is going to do strictly nothing, so you might want to display something else instead. So let's go back to the tools: I have a convenient OpenGL Info patch, which returns, for example, the renderer vendor, the renderer name of the video card, and the version, and it can also check for the existence of various OpenGL extensions. The way you use it is that you display the settings pane and you can add some extension names here — for example, I don't know, my_fancy_extension. OK, add it there, and now you get a boolean output: if the extension is supported you get true, and if it's not supported you get false. So you see, GL_ARB_vertex_program is supported on this computer, fragment_program is also supported, but my_fancy_extension, well, is not supported, for some reason. OK, now what I'm going to do is
do a simple comparison on the renderer version — so that would be under Numeric, and then Conditional. The renderer version is a number, and in that case I think it's OpenGL — let's see — 1.4, and what I want to test is: do we have an OpenGL renderer that supports OpenGL 1.4 or later? So I just do "is greater than or equal to 1.4". And you may have noticed that all the consumer patches automatically have an enable input added by the system, which is a simple boolean, and you can turn the consumer patches on and off from that input — that enable input is at the top of the patch. Now what I'm going to do is connect my result to this enable input, and as you can see, it doesn't change anything, because — you see — this video card supports OpenGL 1.4. Now, what if for some reason I wanted to test for the support of OpenGL 1.5? Well, I would simply set — oops, I didn't even put the decimal point in the right place, here we go — I would simply put 1.5 instead of 1.4 in the greater-or-equal, and — OK, I must have done something wrong here; oh, that's right, that's pretty bad; here we go. So now you can see that simply changing the version you want to check for is going to dynamically turn off part of the data flow. So it's important — and I'm going to conclude on that — that the compositions you create are absolutely not something that is closed: you can really interact with them, you can feed data in, you can retrieve data from them, and they can dynamically respond to the environment. And it's a great system where you can experiment with the brand-new technologies we are adding in Tiger without typing a line of code, and you can add them easily to your applications. Back to the slides please.
So, who you might want to contact about this: there is our graphics and imaging evangelist, or possibly myself. For more information you may want to look at two pieces of documentation. We have "Visual Computing with Quartz Composer", which is a simple introduction followed by a tutorial — pretty much the tutorial we just did right there — and the other documentation is "Image Processing with the Core Image Framework", and it actually contains an extensive description of all the features provided by Core Image, which are natively supported by the Quartz Composer system. You will also find the sample code for the Quartz Composer player I demoed today, which is going to be on the ADC website, and you will also find some demo compositions located in Developer/Examples/Quartz Composer, and you may obviously look at the screensavers.