WWDC2001 Session 210

Transcript

Kind: captions
Language: en
Well, last year I got up here, across the street, and showed you what I'd been working on for the last few months: MIDI on 10. It all was kind of basically working, but there were a lot of underlying things that we were kind of unsure about, in the kernel, scheduling, and all that. I'm really happy now, because not only did we finish everything that we were showing last year, but it's really performing well, and towards the end of the session I have some demos that show you exactly how well MIDI is performing for us on 10. Before I get there, I'm going to give you some introduction to the system services for MIDI and some of our design thoughts and the challenges we had in getting MIDI running. I'm going to go through the basic concepts of the API and some of the objects you'll use in that API, along with a lot of examples as I go through, and at the end I'll have some demos.
So, in the large picture of things, MIDI is a fairly low-level service. It's not in the kernel like I/O Kit is; MIDI, through its drivers, talks to I/O Kit to do MIDI I/O. But as we saw in the previous session, there are some higher-level services, like the Audio Toolbox, which gives you ways to edit MIDI sequences and send them through to the MIDI hardware layer. QuickTime is another example of a higher-level service: on OS 9 it's layered on top of MIDI, and we've done some work to get it layered on top of MIDI on 10. There's a component in the current build, and I'm not sure if it actually works yet, but it will at some point.
So, some examples of the kinds of MIDI hardware that we expect to be seeing drivers for. People are mostly using USB MIDI interfaces these days. We're also seeing some USB synthesizers, like this Roland device I have here; it's connected directly to my PowerBook with USB. We're still seeing PCI cards that do MIDI, and we're starting to see, with devices from Yamaha and others, some FireWire MIDI devices. For the purposes of the MIDI API, we don't really concern ourselves with the things that connect to MIDI interfaces, like traditional MIDI devices: drum machines, keyboards, samplers, things that have traditional MIDI cables on them. So when I talk about a MIDI device in the API, I'm talking about one of these things that connects directly to the computer and is controlled by a driver on the computer. When I talk about an external MIDI device, that's a traditional MIDI device with a five-pin connector on it.
So here are the goals that went into our design for the MIDI services. On Mac OS 9 we started to see these trends towards hardware and software developers kind of doing their own thing to make their own products work together, in a kind of closed-system way. And while that's OK, there's some innovation that goes on when people do that, it's also a bit antithetical to the spirit of MIDI in the first place, which was that you could take a bunch of gear from different manufacturers, plug it all in together, and it would all work. So what we want to do on 10 is try to support this kind of interoperability that we've had in the past, with support for things like timestamps and MIDI interfaces, and to get people to actually use our services instead of trying to hack down to the lowest levels of the hardware or the OS like we've had to as MIDI developers in the past. Our goals are to really get performance where it needs to be, with good, highly accurate timing, both on recording of input and performing outgoing MIDI. Our goals are to have really low latency, with through times of under a millisecond, and low jitter, in the hundreds-of-microseconds range, and I think we're pretty close to those numbers, as I'll show you at the end of the session.

We want to present to the user a single system-wide state. We don't necessarily want to dictate the user interface, because some MIDI users might be using a fairly simple program; they don't want a user interface designed for helping them name their 55 synthesizers. But then again, there are people with 55 synthesizers, and they want to name them all. So we've taken the middle path, by providing a central system database but not imposing a user interface on the way. Sorry, I'm skipping ahead rather quickly here. A single system-wide state, so that all your MIDI programs will see the same devices with the same properties, and developers can add their own properties to the devices to make the system extensible.
So, towards those goals, we have a driver model, so that the hardware manufacturers can write drivers; the MIDI server, which we'll get to in a moment, loads those drivers, and then all the applications can share their access to that hardware. Everybody can send to the same destination at once, everybody can receive from a source at once, and, like I just mentioned, we have a central device database, and we provide for time-stamped input and scheduled MIDI output. We also have some features for inter-process communication, so if you have, for instance, a software synthesizer, you could create it as a virtual MIDI destination and have it show up to other applications as something you could send MIDI to. Some other uses of that might be MIDI effects, and I'll get to that towards the end of the talk. So those are the major features.
Here's a picture that gives you an overview of how things are actually implemented. The horizontal gray lines are address space boundaries, which are kind of challenges for the implementation: that's where we have to move data between address spaces in a really efficient manner, and I'm happy to say I think we're doing that really well right now. At the lowest level we've got the kernel with I/O Kit. Above that we've got MIDI drivers, which are typically I/O Kit user clients, and they're loaded and managed by a MIDI server process, which gets loaded automatically by the Core MIDI framework, which applications link with in their own address spaces. On the right I've got QuickTime here linking with Core MIDI, just as an example of how QuickTime is just another application in this model. So yeah, the Core MIDI framework is the purple boxes there, and that manages communication with the MIDI server, using Mach messaging to very efficiently move your data back and forth between your application and the server.
So, the MIDI frameworks, such as Core MIDI: there are actually two of them, and they're implemented as Mach-O libraries. There's the Core MIDI framework, which is what applications link with; as I just mentioned, that framework implements the API for your clients, using Mach messaging to the server process. There's also a second framework, called Core MIDI Server, which is for the benefit of driver writers. This framework actually contains the entire implementation of the MIDI server; the MIDI server itself is a main function that jumps into this framework, the framework loads drivers, and then the drivers can link to the framework to make callbacks into it. That way, drivers have access to almost the entire API.
I'm not going to go deeply into the process of how to create a driver; it's a little more esoteric, and there aren't as many of you who are going to be interested in writing drivers. But it's helpful for application writers to know what the MIDI drivers are doing, because they kind of set up all the information that you end up seeing in your application, and you'll have to figure out about installing them and that kind of stuff as you start developing. Apple is going to provide, and does provide in 10.0, a MIDI driver for the USB MIDI class standard. That's in the build now; unfortunately, no devices that I know of are shipping that use it, but someone's got to do something first, so we're providing that driver. For other pieces of MIDI hardware, we expect manufacturers to be creating drivers; I know that several of you hardware manufacturers are doing that. And I would ask you application developers who are eager for drivers to get in touch with the people who make hardware and ask them to please give you a beta driver or something so you can develop, and, you know, then you'll buy their hardware.

In any case, drivers get installed into the system Library/Extensions folder, and they're managed by the MIDI server, as I mentioned. They use the CFPlugIn mechanism, which is a little daunting at first, but we've got some example code that makes it fairly easy to get your first driver up and running. And for most drivers, no kernel extensions are necessary; a USB MIDI driver, for instance, works entirely as a USB user client, so there's no kernel extension needed there.
The basic functions of a MIDI driver: there aren't very many. All it really has to do is look for hardware, and once it's found it, send and receive MIDI messages, typically using I/O Kit. When it finds hardware, it creates these objects, the MIDI device, MIDI entity, and MIDI endpoint objects, and it sets their properties, and those are objects that you'll see in your application. So let's look at what those look like, from the bottom up. There's a MIDI endpoint, and it's simply a MIDI source or destination. It's a single 16-channel stream, so you don't have to worry about channel 17 on something; you basically speak the standard MIDI protocol over that stream.
The next layer up is the MIDI entity, which groups endpoints together. This is useful when applications want to get an idea of which endpoints go with each other. From the point of view of, for instance, a patch librarian program, it's nice to be able to send messages to a device and get messages back. In a totally flexible world you could have the user say, yeah, the out goes out port one, but I'm getting the in back in on port eight; but to provide useful defaults for people, it's nice to be able to group the endpoints like that, so that an application can make reasonable default assumptions about how to talk bi-directionally to a device. So an entity, and this is a term we borrowed from the USB MIDI class spec, is really just one sub-component of a device. Some examples: an 8-in, 8-out multiport interface like Emagic's Unitor8 could be seen as having eight entities, each with one source endpoint and one destination endpoint. Another example would be a hypothetical device that had a pair of MIDI ports in it and also a General MIDI synth in it; conceptually those are two distinct entities, and your software might want to present them as such. The next level in the hierarchy, above the entity, is the device, which is something that you would represent by an icon if you were going to try to draw a graphical view of what's there. Devices are created and controlled by drivers, and they contain the entity objects.
So, now that we've seen those basic objects that the driver populates the system with, we can start to look at how you begin to sign into the system and build the objects through which you communicate with the MIDI sources and destinations in the system. With OMS and MIDI Manager, you actually had to say OMSSignIn or MIDISignIn before anything else would work. That's not totally true here; you can actually interrogate the system before making these calls. But pretty early on in your program, because you won't be able to do any I/O until you've done this, you'll want to call MIDIClientCreate, passing it a name and a function pointer, in this case called MyNotifyProc, and this function will be called back to tell you when things change in the system. The last argument to MIDIClientCreate is the client ref, which you'll store somewhere in your program and use in other calls.
Sorry, I'm skipping ahead of myself. A little more about the notify proc that you passed to MIDIClientCreate: it gets called back in the same thread which called MIDIClientCreate, which should ideally be your program's main thread. We may have some more fine-grained notifications in the future, but right now there's only one, which says something changed. That may be a device arrived, a device disappeared, some endpoints on a device disappeared or appeared, or the name of something changed. That's what this message here is: kMIDIMsgSetupChanged, something about the system has changed. So if you're caching, in your program's variables, a picture of what's in the MIDI system, this is your message to say, OK, it's time for you to resynchronize your variables with what's in the MIDI system.
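In code, a minimal sketch of that sign-in sequence (the client name and the callback body here are my own, not from the session's slide):

    #include <CoreMIDI/CoreMIDI.h>

    static MIDIClientRef gClient = 0;

    // Called back on the thread that called MIDIClientCreate
    // whenever the system setup changes.
    static void MyNotifyProc(const MIDINotification *message, void *refCon)
    {
        if (message->messageID == kMIDIMsgSetupChanged) {
            // Resynchronize any cached picture of the MIDI system here.
        }
    }

    void SignIn(void)
    {
        MIDIClientCreate(CFSTR("My MIDI App"), MyNotifyProc, NULL, &gClient);
    }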
Once you've created your MIDI client, then you create MIDI port objects, and these are the objects through which your client actually sends and receives MIDI. Not to be confused with MIDI hardware ports, like you would see on a MIDI interface; these are more like, if you remember MIDI Manager, the little virtual triangles on the ins and outs of your program. Another analogy is Mach ports: those are your program's communication receptacles, and you can think of these the same way. To create an output port, you call MIDIOutputPortCreate. One thing to know about output ports is that you only need one to be able to send to all of the MIDI destinations in the system; one port can send to several different destinations, because one of the arguments when you send is which destination you're sending to. The only time you need to create multiple ports is if you have a kind of component-oriented program: you know, maybe you've got five different, separately coded parts of your program, and they're all acting very independently of each other. You could in that case perhaps have five output ports, each sending in a separate thread even, and what would happen is that the MIDI server would then merge the output of those five ports. So in any case where you're sending in multiple threads, especially where system exclusive messages are involved, you need to be using separate ports. As a really common example, you might have a MIDI through process in your program that's taking everything that's coming in and sending it right back out, and elsewhere in your program you might have a MIDI sequence player that's sending its own stream of MIDI out. Those streams need to be merged, in case there are MIDI system exclusive messages; so by sending them through separate ports, the server will perform that merging for you.
Similarly, MIDI input ports may receive input from all the sources in the system. A port is an object that contains a connection point for input, and it gets bound to something called a MIDI read proc, which is a function that will get called when input arrives at that port. So the arguments to MIDICreate... I'm sorry, MIDIInputPortCreate, are a name, a function pointer (called MyReadProc in this case), a NULL refCon, which is a user data pointer for your client, and then you get back the MIDIPortRef, which is your input port, and you would stash that away for use in other calls.
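A rough sketch of both port creations; the names are mine, and the read proc signature is the one whose prototype comes up a little later in the session:

    #include <CoreMIDI/CoreMIDI.h>

    static MIDIPortRef gOutPort = 0;
    static MIDIPortRef gInPort  = 0;

    // Called in a high-priority thread created by Core MIDI; be careful
    // about synchronization with data shared with other threads.
    static void MyReadProc(const MIDIPacketList *pktlist,
                           void *readProcRefCon, void *srcConnRefCon)
    {
        // Handle the incoming packet list here.
    }

    void CreatePorts(MIDIClientRef client)
    {
        // One output port is enough to send to every destination.
        MIDIOutputPortCreate(client, CFSTR("Out"), &gOutPort);
        // The input port is bound to the read proc; sources are connected later.
        MIDIInputPortCreate(client, CFSTR("In"), MyReadProc, NULL, &gInPort);
    }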
OK, before we look at how to actually perform MIDI I/O, let's look at the data structures in which MIDI messages are sent and received. We have this thing called a MIDI packet list, which provides a list of time-stamped packets to or from one endpoint, and you'll use this both in receiving MIDI and sending MIDI. It's a variable-length structure which in turn contains multiple variable-length structures. The first member is simply the number of packets, and then it's followed by MIDI packet structures, which are variable-length, and there can be any number of those. The MIDI packet structure contains one or more simultaneous MIDI events, events to be played at the same time. It is pretty close to being a standard MIDI stream, with a few exceptions. We don't allow running status, because that just makes it hard for anybody in the chain to parse through it efficiently. You know, it's OK to use running status when you're sending to the hardware and you're trying to get the most from your bandwidth, but there's no reason, when somebody might be walking through your packets looking for note-offs, to use running status; it just makes things harder. So in the MIDI packet we don't want to see running status messages. The other limitation is that although a MIDI packet can contain multiple MIDI messages, like you could put, you know, 17 note-offs into one packet, when system exclusive messages are involved, that packet needs to contain either one complete system exclusive message, or one part of it and nothing else. You can split system exclusive messages across multiple packets, if they're large, for instance. But for similar reasons, involving making it easy for anyone who has to parse that packet to look at it, there's a limitation that we don't want you to mix system exclusive messages with other MIDI messages within a packet.
OK, about the variable-length nature of these packets: the first half of the slide shows an incorrect example of how to read through a MIDI packet list, and what's incorrect about it is that it's treating the packet member of the structure as a fixed-length object; it's treating the packet list as an array of packets. But that doesn't work, because the packets are variable-length. The right way to do it is to first get a pointer to the first packet in the packet list, walk through each of the packets in the list, and use a very efficient helper macro called MIDIPacketNext to get to the next packet in the list.
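A sketch of that correct walk (the function name is mine):

    #include <CoreMIDI/CoreMIDI.h>

    void HandlePacketList(const MIDIPacketList *pktlist)
    {
        // MIDIPacket is variable-length, so we must step with
        // MIDIPacketNext rather than index the list like an array.
        const MIDIPacket *packet = &pktlist->packet[0];
        for (UInt32 i = 0; i < pktlist->numPackets; ++i) {
            // Use packet->timeStamp, packet->length,
            // and packet->data[0 .. length-1] here.
            packet = MIDIPacketNext(packet);
        }
    }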
Because these variable-length structures are a little annoying to deal with, we've also provided some convenience functions for when you're building them up: MIDIPacketListInit and MIDIPacketListAdd. The way they work, in this example: we're creating a 1K buffer on the stack and casting it to a MIDI packet list, so basically we're saying here's a MIDI packet list that can be up to 1K in size. Then, with MIDIPacketListInit, we're setting it up so that it contains no packets, and we're getting back a pointer to the first empty packet in the list. Then, when we call MIDIPacketListAdd to add a note-on event, that note-on event, with its timestamp, gets appended to the packet list. And if we have a list of, you know, 50 notes or other events that we want to add to this packet list, we can successively call MIDIPacketListAdd until it returns NULL in curPacket, and that's our clue that the packet list has become full and it's time to send it. So that's a useful way to build up a packet list dynamically with correct syntax, and this is just a summary of the last two slides, the convenience functions.
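Putting the convenience calls together, a sketch using a note-on as the example event:

    #include <CoreMIDI/CoreMIDI.h>

    void SendANote(MIDIPortRef outPort, MIDIEndpointRef dest)
    {
        Byte buffer[1024];                        // room for a 1K packet list
        MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
        MIDIPacket *curPacket = MIDIPacketListInit(pktlist);

        const Byte noteOn[] = { 0x90, 60, 100 };  // note-on, middle C, velocity 100
        curPacket = MIDIPacketListAdd(pktlist, sizeof(buffer), curPacket,
                                      0,          // timestamp 0 = right now
                                      sizeof(noteOn), noteOn);
        if (curPacket == NULL) {
            // The list is full: send what we have, re-init, and add again.
        }
        MIDISend(outPort, dest, pktlist);
    }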
OK, now that we've looked at the actual format of our MIDI data, and we know about the MIDI endpoints, the sources and destinations that we see in the system, we can look at the functions for getting information about those sources and destinations and actually communicating with them. This example here shows the two functions for iterating through all of the MIDI destinations in the system: there's MIDIGetNumberOfDestinations, and MIDIGetDestination, which just takes an index as its argument, a zero-based index. And in this example we're calling MIDISend, which is the basic "I want to send MIDI" function. MIDISend's first argument is the output port that you created at the beginning of your program, the second argument is a destination, and the third argument is a MIDI packet list. So in this example we're sending some arbitrary MIDI packet list to all the destinations in the system.
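A sketch of that destination loop:

    #include <CoreMIDI/CoreMIDI.h>

    void SendToAllDestinations(MIDIPortRef outPort, const MIDIPacketList *pktlist)
    {
        ItemCount i, n = MIDIGetNumberOfDestinations();
        for (i = 0; i < n; ++i) {
            MIDIEndpointRef dest = MIDIGetDestination(i);  // zero-based index
            MIDISend(outPort, dest, pktlist);
        }
    }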
Here's a pretty parallel example of how to find all the MIDI sources in the system and establish input connections to them. The calls to iterate through the sources are MIDIGetNumberOfSources and MIDIGetSource. Input is a little different from output, in that we can always send to any destination, there's no big deal about that, but when we want to get input from a source, we have to tell the system: I want to listen to that source. Because otherwise we might have a situation where three or four MIDI programs are running and there are five MIDI controllers connected, and someone's banging on all those controllers, and yet each program only wants to be listening to one of them. It's best for system overhead if clients are only delivered messages from the sources that they're actually interested in listening to. So we require, before a client gets any input, that it explicitly ask for input from that source, and that's what that call MIDIPortConnectSource does. There's a parallel call, MIDIPortDisconnectSource. The last argument to MIDIPortConnectSource is a reference constant which will come back to your read proc, and if you'll recall, the read proc was an argument when you set up your input port. At the bottom there's the prototype for the MIDI read proc, which is your callback function to receive MIDI. It gets called back in a very high-priority thread that gets created on your behalf by Core MIDI, so be careful about synchronization issues with the data that you're accessing in that thread.
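And a sketch of the parallel source loop; passing the source index as the connection refCon is just one arbitrary choice:

    #include <CoreMIDI/CoreMIDI.h>

    void ConnectAllSources(MIDIPortRef inPort)
    {
        ItemCount i, n = MIDIGetNumberOfSources();
        for (i = 0; i < n; ++i) {
            MIDIEndpointRef src = MIDIGetSource(i);
            // Without this call, the server won't deliver this source's data
            // to us at all. The last argument comes back to the read proc.
            MIDIPortConnectSource(inPort, src, (void *)(unsigned long)i);
        }
    }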
OK, so we've looked at how to walk through the sources and destinations in the system and send and receive MIDI with them. We can also look at the higher-level structures in the system, which are the devices and entities that the drivers created, using MIDIGetNumberOfDevices, MIDIGetDevice, MIDIDeviceGetNumberOfEntities, and MIDIDeviceGetEntity. That's all really pretty straightforward. So why would you want to do that? There are times when you want to walk through the endpoints, the actual MIDI sources and destinations, and those will exclude the endpoints of any device which is temporarily absent from the system, which is good, and will include virtual endpoints created by other applications, which is also good. That's what you want to do when you're trying to figure out what sources and destinations you can talk to. Now, there are other times when you might want to draw the user a picture of what's out there: you know, I see this device, I see that device, it's got these entities in it, and so forth, and that's when you would walk through the devices and entities in the system. You will see the devices which might be temporarily absent; you won't see any virtual endpoints, because they're not really associated with any devices at all. So again, that's useful if you want to present some sort of configuration view of the system.
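A sketch of walking that hierarchy, say for a configuration view; I'm assuming the per-entity endpoint-count calls (MIDIEntityGetNumberOfSources and friends) from the same API family:

    #include <CoreMIDI/CoreMIDI.h>
    #include <stdio.h>

    void DumpDevices(void)
    {
        ItemCount nDev = MIDIGetNumberOfDevices();
        for (ItemCount d = 0; d < nDev; ++d) {
            MIDIDeviceRef dev = MIDIGetDevice(d);
            ItemCount nEnt = MIDIDeviceGetNumberOfEntities(dev);
            for (ItemCount e = 0; e < nEnt; ++e) {
                MIDIEntityRef ent = MIDIDeviceGetEntity(dev, e);
                // Each entity in turn has source and destination endpoints.
                printf("device %lu entity %lu: %lu sources, %lu destinations\n",
                       (unsigned long)d, (unsigned long)e,
                       (unsigned long)MIDIEntityGetNumberOfSources(ent),
                       (unsigned long)MIDIEntityGetNumberOfDestinations(ent));
            }
        }
    }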
So, speaking of devices, entities, and endpoints: they all have these properties. The Core Audio framework has properties on devices, as we saw; the Audio Units have properties. This is a concept that we've used rather pervasively, as a way to extensibly add information about the objects in the system. Typically, drivers will set attributes, or properties, on their objects when they create them, and typically applications will just read these attributes, but the system is extensible in that applications can add their own custom properties to devices if they want to do that. An important feature of the property system is that properties are inherited down the hierarchy, from devices to entities to endpoints. So in this example you can see that the device and the entity and the endpoint all have different names: the device's name is X999, the entity's name is Port 1, and the endpoint's name is Port 1 In. But you can see that the manufacturer and model name are defined by the device, and neither the entity nor the endpoint overrides those properties. So you could ask the endpoint, what is your manufacturer, and it would say, well, I don't know, but I know that my device's manufacturer is X Corp, so I guess that's mine. And similarly, the endpoint is inheriting the sysex ID of 17 from the entity.
So, some of the common properties that we define: there's obviously the name of the object; devices have manufacturer and model names, and sysex IDs, as I just mentioned in the previous example. Some other slightly obscure but actually kind of important properties, which I'll get into a little later, are the maximum transmission speed to a device, and this is important when you're sending sysex, because MIDI is a one-way protocol. In a lot of cases there are devices that you send, you know, 100K of samples to, and you just expect them to catch it all; you're not going to have any way of knowing from the sending end whether it actually got it or not. Before we had high-speed transport media other than the MIDI cable, this wasn't really an issue, because the MIDI cable could only go at its speed. But now that we have things like USB devices in between our computer and our MIDI cables, it's important that the computer not send more than 3,125 bytes per second, the speed of MIDI, to an old MIDI device. So that's a property of a device that we can interrogate, and I'll show you some examples of using that a bit later. Another property that you may see on some devices is a request from its driver that you schedule its events a little bit ahead of time for it, and that's something else I'll get into in a moment.
So, continuing just on properties in general, here's an example of how to get a string property of a device. We use MIDIObjectGetStringProperty, and that works for any of the objects in the hierarchy: devices, endpoints, or entities. We're passing the kMIDIPropertyName constant to say which property we want; we're getting back a Core Foundation string, cfName in this example, converting it to a C string, and using printf to put it on the console. One important thing that's illustrated here is that a number of the MIDI calls, I think it's all in the property world, will return Core Foundation objects, and when you get a Core Foundation object back from a MIDI API, it's your responsibility to release it, because you're being given a new reference to it.
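A sketch of that example; the choice of CFStringGetCString and the encoding are mine:

    #include <CoreMIDI/CoreMIDI.h>
    #include <stdio.h>

    void PrintName(MIDIObjectRef obj)  // works for devices, entities, endpoints
    {
        CFStringRef cfName = NULL;
        if (MIDIObjectGetStringProperty(obj, kMIDIPropertyName, &cfName) == noErr) {
            char name[256];
            if (CFStringGetCString(cfName, name, sizeof(name),
                                   kCFStringEncodingMacRoman))
                printf("name: %s\n", name);
            CFRelease(cfName);  // we were given a new reference; release it
        }
    }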
But fortunately, things are a little easier with numerical properties, because we just simply return a signed integer, for things like this property here, which is the advance scheduling time in microseconds for a device. This is what I referred to a moment ago, about how some drivers wish, when possible, for you to schedule their output a little bit into the future; I'll talk about that in a moment.
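For instance, a sketch of reading that property, assuming the constant is kMIDIPropertyAdvanceScheduleTimeMuSec:

    #include <CoreMIDI/CoreMIDI.h>

    SInt32 AdvanceScheduleMicros(MIDIObjectRef obj)
    {
        SInt32 micros = 0;  // integer properties need no CFRelease
        MIDIObjectGetIntegerProperty(obj, kMIDIPropertyAdvanceScheduleTimeMuSec,
                                     &micros);
        return micros;      // stays 0 if the driver didn't set the property
    }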
But first we have to understand how the MIDI system expresses time. We have a type called a MIDITimeStamp, which is simply an unsigned 64-bit integer. It's equivalent to host time... I'm sorry, UpTime, except that UpTime returns a structure containing two 32-bit values that you have to do some shenanigans with to get into a 64-bit integer. We've chosen to use 64-bit integers because you end up doing a lot of math with these numbers, and it's no fun converting them to structures and back. So we've got our own versions of these calls, which used to be in Driver Services, to get the current host time and to convert it back and forth into nanoseconds. Again, the host time is what in the old days we called UpTime, and I guess you can still call it UpTime, but we call it host time. That's the basic timestamp that we use everywhere in the MIDI services.
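A sketch using those conversion calls, assuming the AudioGetCurrentHostTime family in CoreAudio's HostTime.h:

    #include <CoreAudio/HostTime.h>
    #include <CoreMIDI/CoreMIDI.h>

    // Compute a MIDITimeStamp a given number of milliseconds from now.
    MIDITimeStamp TimeStampMsFromNow(UInt32 ms)
    {
        UInt64 nowNanos = AudioConvertHostTimeToNanos(AudioGetCurrentHostTime());
        return AudioConvertNanosToHostTime(nowNanos + (UInt64)ms * 1000000);
    }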
So, when we schedule our MIDI output, we can either say send it right now, by passing a timestamp of zero, or we can say, I want to schedule this at some time in the future, using a MIDITimeStamp. What that will do is, in the server process, it will add the event or events to a schedule, and this schedule runs in a Mach real-time priority thread, which means it wakes up really darn close to when it's supposed to, and will propagate your outgoing MIDI message to the driver to be sent. One thing to be aware of is that you shouldn't schedule further in advance than you're willing to really commit to, because at the moment there isn't a way to unschedule anything. So if the user clicks stop and you've scheduled two minutes of MIDI to be played into the future, it's going to play, unless you shut the whole system down. This is intended just to give you a tiny bit of breathing room; I would say 100 milliseconds is the outer bound of how far ahead you'd want to schedule. I'm aware of developers scheduling at smaller intervals, and, you know, anything over a couple of milliseconds will take a little bit of strain off the system and is helpful. It's not essential to do, but this is all in the interests of getting really highly precise timing out of MIDI hardware that supports scheduling in advance. And such devices, that do have that feature of being able to accept scheduled output in advance, will put that property for a minimum advance scheduling time on their devices. So you as an application writer can check that property and say, oh, OK, this guy wants his MIDI, you know, fifteen hundred microseconds in advance, or whatever his number is, and that's your hint that you can make that piece of hardware perform better by giving it its data that much further in advance.
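Tying that together, a sketch of a scheduled send a few milliseconds ahead:

    #include <CoreAudio/HostTime.h>
    #include <CoreMIDI/CoreMIDI.h>

    // Schedule a note-on 5 ms ahead, as suggested above. Remember:
    // once scheduled, there is currently no way to unschedule it.
    void SendNoteSlightlyAhead(MIDIPortRef outPort, MIDIEndpointRef dest)
    {
        Byte buffer[256];
        MIDIPacketList *pktlist = (MIDIPacketList *)buffer;
        MIDIPacket *cur = MIDIPacketListInit(pktlist);

        MIDITimeStamp when = AudioConvertNanosToHostTime(
            AudioConvertHostTimeToNanos(AudioGetCurrentHostTime()) + 5000000ULL);

        const Byte noteOn[] = { 0x90, 60, 100 };
        cur = MIDIPacketListAdd(pktlist, sizeof(buffer), cur, when,
                                sizeof(noteOn), noteOn);
        MIDISend(outPort, dest, pktlist);
    }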
Similarly, our incoming messages get time-stamped with the same host clock, AudioGetCurrentHostTime. If you want to schedule your own timing tasks, you can use the Multiprocessing Services in Carbon. And I touched on this a couple of slides ago: it's best to schedule your output a few milliseconds in advance, and to combine multiple MIDI events that happen fairly close together in time into a single call to MIDISend. You don't have to do this; you're still going to be able to get pretty good performance without doing this. But when you do this, you are reducing the system load, and there's yet more CPU time available for other things, like, you know, intense DSP operations. We are getting really good latencies, as I'm going to show you later on, in moving the data around from place to place, but it is more efficient when you can bunch up your messages just the tiniest bit. And these are some of the figures we're starting to see in some of our tests, just in the software stack: the MIDI through time is usually well under one millisecond, and our scheduler wakeup jitter is in the realm of 100 microseconds. So if you say, I want the scheduler to wake up at such-and-such a time... these are tests I've run on my Titanium PowerBook here, but that's around the time I'm seeing right now.
Before I actually show you some demos that illustrate some of our timing, I'd like to touch on a couple of other things. We have some inter-process communication features, so that your app can create virtual sources and destinations, which other apps, including your own, will see just as if they were regular sources and destinations. Here's an example of how to create a virtual source. You need that client ref that you created at the beginning of the program, myClient; you give your source a name, and you get back an endpoint reference to it. When you want to emanate data from your virtual source, you make a call called MIDIReceived, which might seem like a strange name at first; but if you realize, OK, I'm mimicking what happens in a driver when it receives data from a real source, you're saying, OK, I'm pretending I'm receiving data, but I'm a virtual source. That's why it's called MIDIReceived: it's the same function a driver calls when it gets data from a real source. So you just pass it the virtual source endpoint and the packet list of data you want to send, and any clients who are listening to that virtual source will receive that data.
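A sketch of a virtual source (the endpoint name is mine):

    #include <CoreMIDI/CoreMIDI.h>

    static MIDIEndpointRef gVirtualSource = 0;

    void EmitFromVirtualSource(MIDIClientRef client, const MIDIPacketList *pktlist)
    {
        if (gVirtualSource == 0)
            MIDISourceCreate(client, CFSTR("My Virtual Source"), &gVirtualSource);
        // Same call a driver makes when data arrives from real hardware;
        // any clients listening to this source receive the packet list.
        MIDIReceived(gVirtualSource, pktlist);
    }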
Virtual destinations are the same, but backwards. You create a virtual destination, passing it your client, giving it a name, and you pass it a read proc, which will get called when other clients send data to your virtual destination. We saw earlier in the talk how a read proc looks and how it gets called.
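And the mirror image, a sketch of a virtual destination:

    #include <CoreMIDI/CoreMIDI.h>

    // Fires when other clients send to our virtual destination,
    // e.g. the input stage of a software synthesizer.
    static void MyDestReadProc(const MIDIPacketList *pktlist,
                               void *readProcRefCon, void *srcConnRefCon)
    {
        // Render or process the incoming MIDI here.
    }

    void CreateVirtualDestination(MIDIClientRef client, MIDIEndpointRef *outDest)
    {
        MIDIDestinationCreate(client, CFSTR("My Virtual Destination"),
                              MyDestReadProc, NULL, outDest);
    }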
The other slightly obscure but kind of important thing to go over here is what happens when you need to send large system exclusive messages, as is common in patch librarian and sample transfer applications. You basically need to slow down how fast you're sending the data from the computer, and there are two ways to do that. One is to check that property on the device, kMIDIPropertyMaxSysExSpeed, and do your own math to break up the message into chunks, so that over time you say, OK, every second I'm not going to send more than 3,125 bytes. That's one way to do it. Another way to do it is to call MIDISendSysex, which runs its own little thread and does that for you.
Here's a brief example of how to do that. This function is an example of how to call MIDISendSysex: you fill out a MIDISysexSendRequest structure with your destination, a pointer to your system exclusive message, its length, and a pointer to a completion function that will get called when the last bit of that message has been sent. Then you call MIDISendSysex, passing it your request, and it will go off and asynchronously send that data. As with all asynchronous functions like this, and those of you who have been programming Macintosh for a long time all know about the problems with asynchronous calls, you want to keep that sysex send request around until it's completed. This was a bad example, in that it's a local variable, and I'm only vindicated by the fact that I'm actually polling at the end of the function, to see if the request is complete before I allow that request to fall off the stack. More typically, you might put the send request in a global variable, or somewhere else where it's going to persist beyond the function in which you call MIDISendSysex. As you saw when I was polling, at the end of the function, on the complete member of that structure, you can look at that to tell when the function's complete. You can also look at the number of bytes to send, because if you initially said I want to send a thousand bytes, as those bytes actually get sent, that number in the structure will decrement, so you can watch the progress if you want to put up a progress bar. Going back to the complete flag: you can set that to true, and the system will say, OK, I'm not going to send any more of this; you can abort the request that way. The Core MIDI framework implements this by running a medium-priority thread within your app; it's a little higher priority than your user interface, but it's not a Mach real-time thread by any means.
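A sketch of that flow, with the request kept in a global so it persists past the function, as he recommends:

    #include <CoreMIDI/CoreMIDI.h>

    static MIDISysexSendRequest gRequest;  // must persist until complete

    static void SysexDone(MIDISysexSendRequest *request)
    {
        // Called when the last byte has gone out (or the send was aborted).
    }

    void SendBigSysex(MIDIEndpointRef dest, const Byte *msg, UInt32 length)
    {
        gRequest.destination      = dest;
        gRequest.data             = msg;
        gRequest.bytesToSend      = length;  // counts down as bytes go out
        gRequest.complete         = false;   // set to true later to abort
        gRequest.completionProc   = SysexDone;
        gRequest.completionRefCon = NULL;
        MIDISendSysex(&gRequest);  // paced to the device's max sysex speed
    }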
OK, let's go over to the demo machine.
I have two or three things I'd like to show you here. This is a program that plays audio, and I can have it play directly to the audio HAL, using the hardware timing characteristics, and play that sound file as it really should be. But I can also play this audio file synchronized to MIDI timecode. In this example program, since I don't have a MIDI timecode source piece of hardware that was easy to carry here (I mean, this is the least gear I've ever taken to a gig), we have in this window here a virtual source which is a MIDI timecode generator. And as we see here in the playback controller, we have two choices of sync source: we have the SK-500, which won't send us any MIDI timecode, it won't let us sync, but we also have this virtual destination sync source. So this is stopped; I can start the file player, and it's not going to start playing, because I haven't started the MIDI timecode yet. This is unplugged; I'm going to plug it in right now. OK, now I can vary the speed, speed up, slow down the rate...

[Music]

...slow down 50%. Yeah, that's my first demo.

[Applause]
Some of you may have seen this program on Monday morning, in Avie Tevanian's keynote; I've been adding features to it for fun. We've got several components here: at the top we just have a simple MIDI through generator, and here we have a MIDI file player. It can send through to the Mac OS X music synth, which is the DLS, the downloadable-sample synth, that Chris mentioned in his talk. So let's just open a MIDI file and send it to the internal synth.

What's interesting about this is that this MIDI file player was designed for playing to external hardware, so it's waking up and saying "play this now", and the software synth is responding that quickly. We've got it programmed to be processing in 64-sample-frame chunks, which is every one and a half milliseconds. Another little thing I'd like to show you, and this is my new feature in the program: I wrote a little MIDI arpeggiator that will play if I play a chord here.

[Music]

The kind of impressive thing about this, to me, is if I set it up with some drum sounds, to get a sense of how precisely the Mac is spinning out this sound.

[Music]

[Applause]
OK, one other thing I want to show in this program. Here I'd like to trigger some sounds being locally played; let me make sure I have local control on.

[Music]

Can you tell for sure whether it's the computer or not, when both the computer and the keyboard are getting played? Right now this is the computer alone; I'm going to add in the keyboard. I found a sound this morning that's pretty similar on both of them and is really percussive. And I'd like to go back to the slides and show you what I found.
Oh, this is the other test I did. OK, well, they're both impressive. On this test, I'm playing a sound on the piano keyboard. The lower graph there is the note just being triggered from the keyboard. In the upper graph, that note is traveling over USB from the keyboard to the computer, into I/O Kit, up to the MIDI server process, up to a MIDI through application, back down to the MIDI server, back down through I/O Kit in the kernel, back over USB to the keyboard, and we're getting one to two milliseconds of delay between those two notes.

And this is the one I meant to show you first. Here I was triggering... actually, this is a slightly different test, I lied; this is a different test I did yesterday. Here I'm triggering both a square wave, being synthesized through the audio HAL, and the Roland keyboard playing a rimshot, and I'm taking excruciating steps to make sure that they're being triggered at the exact same time. So from that, we're only hearing one millisecond of difference between when the sound comes out of the Macintosh and when the sound comes out of the synthesizer. The synthesizer, being triggered by MIDI, is getting it first, and, you know, it's optimized for this kind of thing, but it's still only in the realm of under two milliseconds between the time we're telling the computer to play this bit of audio and the time it comes out the speaker. I think that's pretty impressive. I think it's a testimony to the guys in the kernel and on the I/O Kit team; it's just an amazing system, and I'm really proud of what they've done. It's made it all possible for us.
So, to sum up here: the MIDI services are available in 10.0.x. There is some existing documentation in the framework header files; I believe one of them, the application one, is currently HeaderDoc'ed. The driver one is a little sketchier, but all that's about to change; at least on the application side, we're going to have some really extensive documentation. Those of you who are working on hardware: there's an example driver, and you can get in touch with Developer Relations and us, and we can help you with problems if you have questions about driver documentation. There are also some examples in Developer/Examples/CoreAudio/MIDI. And as Bill has been mentioning, we are getting an SDK out soon, and we're hoping to improve our documentation; there should be some more out really soon now. OK, thank you very much.
[Applause]
If we can have the slides machine up, that'll be good. I have just a brief walkthrough of some of the Java code that does a similar thing to what Doug's demo did, just to sort of see the MIDI side of what I showed last session, and then we'll do some Q&A; it'll just be a couple of minutes. So, this was where we left off in the last session: setting up the graph using an AUGraph object. We created a couple of nodes in the graph, one for the software synthesizer and one for the output, and we told the graph to use a SoundFont file, a 12-string SoundFont. What I want to do is just give you some idea of how the code looks in setting up a real application, but using the Java API. I'm cheating in a couple of places here, but I'll explain them as I go.
The first thing I'm going to do is ask the MIDI setup to get me the number of sources that are in the system, and then I just print them out; with just the single keyboard here, it'll just give me one source. If there are no sources, then I'm actually just going to throw a Java exception, to say, well, I don't have any input devices, so what are you doing? Then I'm going to have a look at these input sources, and I'm going to get the MIDI endpoint. I'm just going to get the first one; I'm not going to worry about iterating through looking for particular devices and names of things like that, I'm just going to get the first source endpoint. And I'm going to get the string property, which just gives me back the name as a CFString, which is a Core Foundation string; you can convert that into a Java String object, which I do down here. I probably should have released that CFString; I don't think we wanted to keep it around. Then I'm going to create a MIDI client; I'm going to use a CFString to just call it "MIDI setup client".
And then I've got this Java object here, which is my read proc. This is a bit easier if I look at another monitor, so let's have a look at what this looks like. We've got a very long disclaimer about how we'll make no guarantees about any of this code. Anyway, this just implements a Java interface, which is the MIDI read proc, and as you see, it looks very similar to the C version: the execute method that's defined in this interface looks very similar to what the function call looks like in C. You've got your MIDI input port, you've got your endpoint, and you've got a MIDI packet list. Then in Java we can just get the number of MIDI packets that are in this packet list, and I can get a packet at a particular index in that list. And then I decided to sort of abstract this a little bit further in the Java API, by actually having another object called music data, or MIDI data. So my process-MIDI-event method in my other class, which we'll go back to, actually just takes the MIDI data itself, and I can pass in timing information and stuff if I want to, but I'm not using it in this particular program.
So, if I go back to where I was: I'm going to create an input port, and this is going to be my callback function, and I'm calling it readProc. Then I'm going to connect the source, so this input will listen to this endpoint, being a good citizen like Doug told me to be.
Then my process-MIDI-event method is really quite simple. I get the input port, and what I want to do here is to get the MIDI data offset. The structure that we use here is actually used in a number of different places within the Core Audio Java framework; it's used to deal with some of the music events that are related to the music sequence object that Chris discussed in his previous session. You can have different types of events in there, and you can have the MIDI data actually offset in different places in this structure, so you should always find out where that data is offset, where the first part of the MIDI data is, and get its start and length. Then, well, that's some debug stuff that isn't enabled here, and then there's a whole bunch of code here to just go through and do the right thing if it was a sysex message: it may be split out over a number of different MIDI packets, so I'm going to make sure that if I am doing sysex, I actually parse all of that data and get through it. Then I'm going to look to see if there's a note-on or a note-off command, and basically just send that to the synth. If it's not a note-on or note-off command, then I'm going to do some parsing, based on the MIDI spec, of whether it's going to have a two- or three-byte data segment, and then just send that. And then all I do is send that MIDI event to the music synth. So it's fairly simple code: you just pull the MIDI data out of that packet, and there could be more than one MIDI message in the packet, so there's a little bit of work you have to do to just parse it, and then I just send that MIDI data. If you look at the interface of the program, it lets you do some alternate stuff on channels, it lets you do some stuff with transposing the data, and all that kind of stuff. This example is a little bit revised from the version in the developer section of your CD, and we'll put it up on the website as part of the SDK next week, to help you along with the Java stuff. It's actually pretty similar to the C stuff. Anyway, we can just go back to slides very quickly.
[Music]
So it's just the same thing as I went through last session. There's Javadoc available for this as well, and it's really architectural, rather than language-specific, documentation that we're generating; that will be available on the website, and the Java API presents the same functionality as the C API. Resources: we've got a mailing list at lists.apple.com, and there's also the developer website, developer.apple.com/audio. We're still in the process of getting that website up, so if you look at it today or over the weekend, it may not be the same as what it will be next week; you might want to check next week as well, and we'll be getting stuff out. There's some related session information here, and for the DVD people, you should look at it: freeze, pause that frame.
If you're doing any hardware development, FireWire, USB, or any sort of PCI development that's got to do with audio, you can contact Craig Keithley; he's the Developer Relations person for that. If you're interested in getting access for seeding, you can contact us at audio@apple.com. And I'd like to thank you all very much for coming, especially late on a Friday afternoon.