Transcript
>> This morning we're going to talk to you about
the many opportunities that iPhone OS provides
for connecting applications to accessories.
We're also going to specifically point out
the new opportunities that you have in iOS 4.
There are three ways that accessories
can connect to iPhone OS devices.
They can use the 30-pin dock connector,
Bluetooth, and Wi-Fi.
Your application sits on the iPhone OS, and has
access to frameworks and services offered by that OS
such as the External Accessory framework and Audio Services.
Accessories then connect through the hardware interfaces,
the 30-pin, which includes UART and USB, Bluetooth, and Wi-Fi.
Over the 30-pin, the iPod Accessory Protocol
or iAP is used to communicate with the OS.
iAP can also be used over Bluetooth.
And then of course, over Bluetooth, we
also offer standard Bluetooth profiles.
Because Wi-Fi uses TCP/IP and standard network services,
we won't be covering Wi-Fi in detail in this session.
But there is a session on Bonjour that you
can attend if you're interested in that.
I'm going to start by talking about the 30-pin and those
hardware interfaces, and then talk about iAP in some detail.
Brian is then going to come up and
talk to you more about Bluetooth
on iPhone OS devices and the profiles that we support.
And then Paul is going to wrap up at the end and talk
in some detail about the External Accessory framework.
I know that we have a lot of application developers in the
audience, and I know we also have some accessory developers.
There are some of you overachievers that do both.
We're going to have the information for all of you today.
We'll try to be specific about who
we're talking to at which point.
But again, this is information for both
accessory developers and application developers.
So, first, the dock connector.
The dock connector has been standard on iPod since 2003.
We've sold a lot of iPods with dock connectors.
It's also on every iPhone, iPod touch, and iPad.
The available interfaces on that 30-pin are UART and USB
for communication, as I explained, and actually, on USB,
the iPod or the iPhone OS device
can be a USB host or a USB device.
We also offer, of course, audio and
video hardware interfaces and power.
Accessories can charge the iPhone OS device.
So, they can charge it and give it power to run off of,
or they can be charged by the iPhone OS device.
So, an accessory that doesn't have its own power source,
for example, an FM transmitter or some other small
dongle like maybe a blood glucose monitor,
can get power from the iPhone OS device.
For those of you who might be new to accessory development
and who are interested in accessory development,
we do have several reference designs available from some
of the MFi partners that can help to get you started.
Now, I'm going to talk a little bit
about the iPod Accessory Protocol or iAP.
iAP was put in place to allow accessories to
communicate with and control the iPhone OS device.
I'm going to talk about iAP at a high level
and about some of the features
that I think might be interesting for this audience.
But the full details, all hundreds and hundreds of pages
of spec, are available through the MFi program.
And if you are interested in learning about joining
the MFi program, we have a lab session this afternoon,
and there'll be a lot of MFi representatives
there who can help you.
So first, I'm going to talk about standard features
that accessories can use to communicate
with iPhone OS devices, which
then make data available to your apps.
These include Audio, Video,
and Location information, and then some new things
for iOS 4, including Multimedia remote
control, Keyboards, and Accessibility.
Then I'll talk about the possibility of sending
custom protocols over iAP so that an app
and an accessory can communicate
arbitrary data between themselves.
And we'll talk more about that at the end.
So first, audio.
Audio is routed to and from your application and your
accessory via the most appropriate path by Core Audio.
There are several hardware interfaces available for audio.
They're listed here.
You can see that there are analog and
digital forms for both input and output.
I thought you might be interested in seeing
which devices support which audio interfaces.
They're listed here.
You can see that line out and the headset and USB
audio out are supported on all of our platforms.
And the Bluetooth audio was added with the 2nd
generation iPod touch and with the iPhone 3G.
Audio input is in a bit of a transition right now.
So, some of our devices support analog line in:
all of the iPod touches do, and
iPhones through the iPhone 3GS.
iPhone 4, however, does not support an
analog line in, and neither does iPad.
Those two devices support USB audio in,
a digital audio input path.
And we were able to bring that feature back
to the 3rd generation iPod touch and the iPhone 3GS as well.
We've had some exciting developments in the
video space with iPhone OS 3.2 and iOS 4.
We have the possibility for applications
to send video to an external display.
I think this creates a whole lot of opportunities for you.
Applications can find out about attached
displays, including the bounding rectangle,
and when they're attached
and detached, via UIScreen.
And really thinking about it, I think there are a lot of
things that applications might want to do when attached
to a large monitor such as a television, or
to other monitors in cars and airplanes, you name it.
So, here are the hardware interfaces
that are supported on iPhone OS devices.
You can see that analog is supported on all the
devices, including composite, S-Video, and component,
and that we're also able to send video out
through our VGA Adapter on iPhone 4 and on iPad.
Now, I'm going to talk a little bit about location.
Accessories can communicate with iPhone OS
devices and provide them location information.
So, you can imagine a car, a Bluetooth GPS puck,
or a hand-held GPS device
with a nice big GPS antenna
providing very valuable,
very accurate GPS and heading information
to an iPhone OS device.
This includes the iPod touch.
So, what that means for you as an app developer is that
your app shouldn't assume that because it's running
on an iPod touch, it can't use location services.
These accessories are getting more popular.
We're very excited to see what
developers are getting into the market.
And your device may very well have an attached
accessory that gives it location information.
New for iPhone OS 4 is the ability for your app
to receive multimedia remote control commands
such as play and pause, next and previous.
These commands were previously reserved for only the
iPod application, but now, if you have a multimedia app
that registers for these commands,
you can also receive them
from standard accessories that are already in the field,
and of course from new accessories as well.
So, again, speaker docks, cars, and a very popular
accessory, the headphone remote and mic
that so many people already own, send these next and
previous commands, and they can go straight to your app.
So, you can imagine listening to music in your app
and having a user just be able to use their headset
to press pause and having the music pause in your app.
I think this is a great addition.
And I'm excited to see a lot of apps
support this API in the near future.
Next are keyboards.
With iPhone OS 4, we expect to see a lot
of new keyboards, both 30-pin keyboards
and standard Bluetooth keyboards in the market.
For application developers, this
doesn't mean much on the surface.
You're still going to get keyboard events
the same way that you've gotten them before.
Now, you'll get UIKeyInput and UITextInput events.
So, it doesn't mean that you'll get a new
kind of event, but what it does mean is
that accessory developers might be coming up with
some great new form factors for keyboards or keyboards
that are especially suited for particular
applications or particular use cases.
I look forward to you guys using your imaginations
on this one and seeing what you can come up with.
I'm excited to talk about accessibility.
We just had some discussion of accessibility
in the session right before this one.
Accessories can now leverage VoiceOver
technology to communicate with the phone
and completely control the iPhone OS device.
They can send commands such as moving to a particular X,Y
coordinate or moving to the next or previous UI element.
And they can send those commands remotely,
and VoiceOver will perform those actions,
and of course, use VoiceOver to provide feedback.
This is exciting because you can imagine an accessory like
a joystick or a trackpad sending these commands remotely,
and the device responding appropriately.
Especially coupled with the video output
capability of apps, I think that there's a lot here.
This is revolutionary for our users that can't see
or don't have the dexterity to manipulate the device.
But I think that there are a lot of opportunities outside
of the traditional accessibility space with this feature.
Now, what can you as an app developer do?
You can make sure that your app is accessible.
So, this means using the UIAccessibility API.
For most of you, all that means is filling in a
few missing labels for UI elements that you might have.
This should be a fairly easy task.
And then whatever accessibility accessories come to
market, your app will automatically work with them.
Now, we're going to move on to custom protocols over iAP.
In iPhone OS 4, we introduced the opportunity for apps
and accessories to communicate directly with each other,
again, to send arbitrary data between the app and accessory.
New to iPhone OS 4 is the possibility for
multiple apps to talk to the same accessory
or multiple accessories to talk to the same app.
We've gotten this request a lot of times,
and we're really excited about it.
We think that there will be a lot of new innovation in this
space as you guys mix it up in the app and accessory worlds.
There's also the possibility, under very limited
circumstances, for an accessory to launch an application.
You can imagine a specific, you know, blood pressure cuff
or something that gets plugged into an iPhone OS device.
And the reason it's getting plugged in is
because the user wants to use it immediately.
So, it's appropriate for an app to be launched.
And I think that this can be very helpful
for the user and eliminates some steps, especially
for those who have a lot of apps on their device.
The last feature is the capability for an accessory to
declare that it doesn't need an application to function,
so that if your accessory works well without an application,
we don't pester users by asking them too many times
if they want to go to the App Store.
Now, Paul is going to talk in detail
about these custom protocols over iAP
and what we call the External Accessory framework
in iPhone OS that allows these protocols to work.
And he'll talk about them in a lot more
detail in the third section of this session.
So, in summary, we've added a lot of features for you
to iOS 4 to allow accessories and apps to communicate.
We're excited in particular about multimedia
remote control for multimedia applications.
We're excited about seeing new form
factors of keyboards, 30-pin keyboards,
and Bluetooth keyboards that your app can then use.
We're definitely looking forward to seeing what happens
in this accessibility space which
I think is wide open right now.
And we have several new features for you in what has proven
to be a very popular area: using
custom protocols over iAP.
So with that, let me first mention
some sessions that might be of interest to you,
if you're interested in some of the specific
technologies that I talked about: the first for Wi-Fi,
the next for accessibility, and the third if
you're interested in learning more about Core Audio.
I'll now hand over to Brian Tucker who's
going to talk about Bluetooth on iPhone OS.
[ Applause ]
>> Good job.
So, I'm Brian Tucker, and I'm the Senior Engineering
Manager for Bluetooth Technologies on all iOS platforms.
That includes iPod, iPhone, and of course, the iPad.
So, what are we going to get out of this session?
Well, fundamentally, two groups will hopefully
benefit from these 20 minutes or so.
First is accessory manufacturers.
Obviously, Bluetooth is really not very interesting
if we can't communicate with a particular accessory.
And then application developers.
So, we have some technologies in the iOS
platform that give you the ability to communicate
with your accessory directly or with another iPhone.
So, for example, GameKit.
We're going to talk about some things that you can do to
improve, hopefully, your game and the network throughput
that you're getting in your game in these 20 minutes.
So, I kind of briefly talked about this, but we're going to
talk a little bit about the OS implementation of Bluetooth,
and then I have some tips and tricks in
a few areas that hopefully will benefit you
as an accessory manufacturer or
as an application developer.
And ultimately, I think the big point
here is that we as a community,
the accessory manufacturers, the application developers,
and our engineering group,
have to come together to
create great Bluetooth platforms.
We're only as good as the product we're interacting
with, or that the customer is interacting
with, at a particular time.
So, we need to work together.
So, what are we going to be covering today?
So first off, I'm going to cover what's
new in iPhone OS 4, specifically,
I'll go through a few of the key
things that we're excited about.
Then I'm going to give you kind of a state
of the union of what we have in iOS 4 right now.
And then we'll go into my tips and tricks.
And I have three things that we'll be covering
specifically with general Bluetooth, with GameKit,
and with some coexistence discussions,
and we'll get into that more.
So, what's new with iOS 4?
So, Emily mentioned keyboards.
And of course, we added HID keyboards with iPad.
But now with iOS 4, we support Bluetooth keyboards
in pretty much any iOS application.
And this is using HID, the HID profile.
And pretty much anything you can do with a HID
keyboard, you can do with an iOS device.
So, last year, we introduced A2DP or
Advanced Audio Distribution Profile.
And this was the ability for you to stream
audio to an accessory, a car or a headset.
Now, when you do A2DP, you have to use
a codec called SBC, or Subband Codec.
And it's OK.
But to get really decent audio quality, it takes about
a 330 kbps data rate to achieve an audio quality
that we feel is acceptable for the customer.
So, we thought, well, how do we do a better
implementation, one with lower throughput usage,
a better coexistence model, and a better power
consumption footprint? And we're using AAC for this.
So, with iOS 4.0, we're now providing
an endpoint that does AAC encoding.
And as you can see here, it's 44.1 kHz encoding at 128 kbps,
and it is VBR, so we can and will update the data rate
dynamically depending upon coexistence.
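As a back-of-the-envelope sketch, using only the figures quoted in this session (roughly 330 kbps for acceptable-quality SBC versus 128 kbps for the AAC endpoint), here is how much A2DP airtime the switch frees up:

```python
# Rough airtime comparison of the two A2DP codec options mentioned
# in the session: SBC at the ~330 kbps "acceptable quality" rate
# versus AAC at 128 kbps. Both figures are the ones quoted on stage.

SBC_KBPS = 330   # SBC bitrate the talk cites for acceptable quality
AAC_KBPS = 128   # AAC endpoint bitrate in iOS 4.0

def airtime_saving(sbc_kbps: float, aac_kbps: float) -> float:
    """Fraction of Bluetooth airtime freed up by switching to AAC."""
    return 1.0 - aac_kbps / sbc_kbps

saving = airtime_saving(SBC_KBPS, AAC_KBPS)
print(f"AAC frees up about {saving:.0%} of the A2DP airtime")
```

That leftover airtime is exactly what goes back to Wi-Fi coexistence and power savings, which is the theme of the rest of this section.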
We've also added voice commands over Bluetooth.
Now, we kind of had this in the past with
the Bluetooth voice activation command.
But with iOS 4.0, we're now providing a way for you to press
the Home button on your device and start the voice command,
and if you're connected to a device that doesn't necessarily
support voice commands, we'll go ahead and enable that;
we do some trickery on the phone
to enable voice commands in car kits
that wouldn't otherwise be able to do this.
We've added in-band ringtones.
So, that awesome industrial death metal
ringtone that you have on your phone,
you can now hear in your car probably
when your mom's in the car.
Braille keyboards.
So, what's interesting about Braille keyboards
is that they don't use HID; they're not HID devices.
Typically, they're based on the Serial Port Profile, or SPP.
And we've implemented Braille support for a lot
of the popular Braille keyboards on the market,
and there are quite a few of them.
And we've added support for multiple handsfree and A2DP
devices to be connected simultaneously to the device.
A kind of cool factor here is that you can have
a headset in your ear, you can get into the car,
the headset won't disconnect, the car will connect, and
now you can automatically route the audio between the two.
If you want to have a private conversation, you can
have it in the car, or you can have it in your ear,
but you're still going to have to pick up the phone.
In California, I hear holding the phone up is illegal,
so this allows you to touch the phone to route the call
without bringing it up to your ear.
iPhone volume control.
So, this is kind of a minor thing, but this
is huge for some accessory manufacturers
that do not have an industrial design
that allows for localized volume control.
What this gives you is a slider on the phone itself,
just like you would see for any other accessory,
as well as the physical buttons.
So, it allows you to control the volume of the
localized output gain being sent to the accessory.
So, if you have a little desktop speaker, you
can now control the volume of that on the phone.
Or a headset: some headsets are so small that they
just physically cannot have a volume control,
and you can now do that on the phone.
So, where are we at?
So, here are the Bluetooth profiles that we have today:
the Hands-Free Profile; the Phone Book Access
Profile, which is how we sync phone books
to your car or your Bluetooth device;
the Advanced Audio Distribution Profile; HID, or the
Human Interface Device profile; Personal Area Networking,
which is what we use for GameKit as well as
for tethering; and then the Device ID Profile.
I want to go into a little bit
more about DID here in a minute.
And then we have some custom protocols.
Emily touched on iAP.
And what's cool about iAP over Bluetooth is that
it's essentially the entire iPod Accessory
Protocol implementation, wireless.
So, you get all of the advantages of iAP in terms
of browsing the content of the phone, or album art,
or any of the other events, and custom protocols, for example.
We see some implementations where people are adding value
to their traditional hands-free device by implementing iAP
and doing some really interesting things
with the phone or with applications
on the phone to interact with their device.
It's pretty cool.
And up to this point, if we had just done a basic interface,
we wouldn't be able to provide you with the ability
to do things like bring up apps
and all those kinds of things.
So, that's iAP over Bluetooth.
Alright. So, now, I'd like to go to three areas
where hopefully we can improve Bluetooth together.
So, we're going to cover some general
overall Bluetooth improvements.
We're going to go through GameKit and
some of the things that you can do
in GameKit hopefully to improve your network throughput.
And then we're going to talk about Wi-Fi Coexistence.
And if you don't know what that
is, you will in about 10 minutes.
So, let's go over overall Bluetooth improvements.
So, first-off, right now we're at Bluetooth 2.1 plus EDR.
Now, this one is specifically for accessory manufacturers.
We see a lot of devices that are
still being implemented at a 1.1 level
or at a 2.0 level, but not implementing Enhanced Data Rate.
And for us to interact with those, because we're
having to coexist with Wi-Fi, and we're having to coexist
with multiple profiles connected simultaneously,
either to your device or to another device,
if you're not implementing EDR, it doesn't
leave a whole lot of time for the radio
to do anything else other than communicate with your device.
Implement EDR.
Support Secure Simple Pairing.
So, with SSP, the user no longer has to enter a PIN code.
This is a huge thing, right?
All they have to do is find your device, click on that
device, get a little confirmation that says do you want
to pair with that device, say
yes, and now they're authenticated.
It's that simple.
They don't have to go find the manual
and figure out what PIN code to enter.
It's also a lot more secure.
It keeps man-in-the-middle attacks
from occurring, for example.
And for car kits especially, a lot of the time the car
kit implementation is that it'll generate a PIN code,
and then the user has to go to the
phone and type in the 6-digit PIN code.
But with Secure Simple Pairing,
you don't have to do that anymore.
Big, big thing.
Extended Inquiry Response.
So, with 2.1 we get EIR: we get
extended responses from accessories.
Now, normally, you just provide us with your MAC address.
But if you put your Friendly Name in the EIR, we
no longer have to go back and keep asking you
what your name is, what you're
capable of, and those kinds of things.
This is a big win for the phone because it
doesn't have to keep talking to your device
and to the next device and to the next device.
Implement Friendly Name in EIR.
Device ID Profile.
So, this is a profile that some
people implement and some people don't.
I know that this is kind of a religious
debate in the Bluetooth SIG.
But we implement DID, which means that when you
connect to the phone, in the SDP inquiry response,
we can tell you what you're connected to.
We'll tell you the name of the device, we'll tell
you the version, we'll tell you the hardware version,
so that you can add value to your products
based on what you're connected to.
Well, inversely, we like to do the same with your products.
So, if you implement DID in your products, we can say, oh,
we're connected to that new 2012
BMW 325 or whatever,
or any car like that.
And we can now add value with you, and we're more
than willing to work with you, because for example,
maybe you're doing iAP in the background as well.
And if we know what device we're connected to,
we can again together add value to that product.
Implement DID, it's a good thing.
Role switch.
So, with Bluetooth, there's a master-slave concept.
And the master is responsible for a lot of things.
It's responsible for power.
It's responsible for frequency selection.
It's responsible for many things in relation to the
piconet that's created between these two devices.
We still run into devices that do not allow role switch.
And this is not a good thing when the phone is
trying to connect to, again, multiple profiles
and multiple devices as well as coexist with Wi-Fi.
So, embrace slave mode, it's a great thing, trust me.
Now, if you can't do it, in other words,
you're connected to another phone or
whatever, that's fine, we understand.
But 90 percent of the time, or even more than that,
you don't have to be master; let
the phone be master, and it'll be great.
Support sniff.
So, sniff intervals, for those who don't know,
are a way for us to kind of deterministically talk
to your device in a very controlled method.
And a lot of times we use sniff
for power saving, for example.
Keyboards work in sniff mode.
And some keyboards implement
a variable sniff interval where, for example,
they may use a relatively short sniff interval
when you're operating with the device.
That's about a 12-millisecond duty cycle.
But then when you're asleep, or when the phone screen
goes to sleep, the phone's actually going to ask you,
"Hey, by the way, can you go to a longer sniff interval?
Let's say, 500 milliseconds, or even
longer than that in some cases?"
That's a huge win for two reasons:
power, because we don't want to have
to turn the antenna on very often, and coexistence.
The less we're using the Bluetooth side of the
radio or antenna, the more time we can give to Wi-Fi.
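A rough calculation shows why the longer sniff interval matters so much. The per-wake radio-on time below is my own illustrative assumption (about one 1.25 ms anchor window per sniff interval), not a figure from the talk; only the 12 ms and 500 ms intervals come from the session:

```python
# Back-of-the-envelope look at why a longer sniff interval helps
# power and coexistence. Assumption (mine, for illustration only):
# the radio is on for roughly one 1.25 ms anchor window per interval.

WAKE_MS = 1.25  # assumed radio-on time per sniff anchor point

def radio_on_fraction(sniff_interval_ms: float,
                      wake_ms: float = WAKE_MS) -> float:
    """Fraction of airtime the Bluetooth radio spends awake."""
    return wake_ms / sniff_interval_ms

active = radio_on_fraction(12)   # ~12 ms interval while typing
idle = radio_on_fraction(500)    # ~500 ms interval when asleep
print(f"active: {active:.1%} of airtime, idle: {idle:.2%}")
```

Under that assumption, moving from the 12 ms interval to the 500 ms one cuts the radio's duty cycle by more than a factor of 40, airtime that can go straight to Wi-Fi.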
I think you're seeing a theme here, right, with coexistence?
And then a little thing that we think is a big win for some
of our customers: if somebody reaches for the power button
on your keyboard, send us a disconnect
before you physically turn off.
There's a supervision time-out that will kick in after
a while, but if you can turn off right away rather
than waiting for a supervision time-out, it's a big
win, because we don't have to sit around waiting
to see if your device is going to talk to us.
Again, power savings and coexistence improvements.
So, profiles, just two quick things: A2DP.
I briefly touched on a couple of things here.
But if you can't do AAC, try to
do a higher data rate with SBC.
We see a lot of devices that are still using data rates
in the 200 to 220 kbps range, and that doesn't sound good.
And a lot of the negative implications
of A2DP, or Bluetooth audio, have been
because people have implemented low data
rates in their SBC encoding or decoding.
So, if you say to us, "I only support 180 kbps," which
is an SBC bitpool value somewhere in the 20s, well,
the user is going to hear just
massive sound compression artifacts, right?
We find that at a bitpool value of around 53, you're
going to produce an audio quality that really sounds great.
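The bitpool-to-bitrate relationship isn't spelled out in the talk, but it follows from the SBC frame-length formula in the A2DP specification. The sketch below assumes the typical high-quality settings (joint stereo, 8 subbands, 16 blocks, 44.1 kHz), and shows why a bitpool of about 53 lands near the ~330 kbps target while a bitpool in the 20s lands near the "180 kbps" case:

```python
import math

# SBC bitrate from the bitpool value, per the A2DP spec's
# joint-stereo frame-length formula. Assumed settings: 8 subbands,
# 16 blocks, 44.1 kHz, joint stereo (the usual high-quality config).

def sbc_bitrate_kbps(bitpool: int, fs: int = 44100,
                     subbands: int = 8, blocks: int = 16) -> float:
    # Frame length in bytes for joint-stereo mode.
    frame_len = (4 + (4 * subbands * 2) // 8
                 + math.ceil((subbands + blocks * bitpool) / 8))
    # One frame covers subbands * blocks audio samples.
    return 8 * frame_len * fs / (subbands * blocks) / 1000

print(sbc_bitrate_kbps(53))  # ~328 kbps, near the recommended target
print(sbc_bitrate_kbps(26))  # ~179 kbps, the "sounds compressed" zone
```

So "shoot for 330" and "bitpool around 53" are two ways of saying the same thing.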
However, if you can implement AAC, we
really, really want you to do this.
This is a huge win for us, again, because
we save throughput; it's only 128 kbps.
We save power, because we
have a highly optimized encoder
that reduces the system usage for the AAC encode.
Again that coexistence word.
Oh, and then finally, AVDTP 1.3.
So, when we're sending audio to your device,
we have no idea how long it takes your device
to render that frame of audio, right?
So, from the time that audio comes in to your
Bluetooth receiver, to the point at which it's decoded,
to the point at which it goes to the DAC, to
the point at which it goes to the transducer
or speaker, we don't know how long that takes.
So, we have to actually measure multiple devices
and try to figure out realistically
how long we think your device takes to render audio.
And the reason we have to do that is because in most cases,
people are playing a game or people are watching video
or watching a podcast or whatever they're doing.
And if we don't keep those in sync, what happens?
Lip-sync problems, right?
They're talking and it looks like a
badly dubbed monster movie.
So, what we have to do is compensate
for that latency by doing all these tricks.
Well, actually, in Bluetooth, there's a way
for you to give this information back to us.
We'd really like to see you implement that.
And with HFP, I know I spoke about voice commands as
a feature that we support, but we'd really like you
to implement the real way of doing it, which is BVRA,
or the Bluetooth Voice Recognition Activation command.
Even if your car has the ability to do
its own voice recognition, totally cool,
but if the user wants to use the
phone's voice recognition activation,
we like to be able to send the +BVRA:1
command to you to say we want to do voice commands.
And everybody wins, because you now understand
the state that the phone is going into.
And then eSCO, which again goes back to 2.1 + EDR.
OK. So that's general Bluetooth
or overall Bluetooth improvements.
So, let's get into GameKit.
So, first off, GameKit is an interesting
implementation that we did last year.
And a lot of people started by doing this kind
of peer-to-peer model where both sides are running
browsing and advertising simultaneously.
By the way, this talk is really
focused towards the application developer that's working
on specific applications or uses of GameKit.
So, when you implement a game, and
both devices come up advertising,
and they also are coming up in the browse, what happens?
Well, contention happens, right?
Because one device is advertising, another
device is advertising, one device is browsing,
and the other device is browsing, and
there are times when they both browse,
and when you're browsing, you're
not advertising, and guess what?
It takes a long time to find each other.
So, if you can, it's not always appropriate, but
if you can, try to use a client-server model.
In other words, one side says, "Oh,
you want to play a game of checkers?"
"Sure." So, you host a game.
And the other side says, let's browse for a game of checkers.
And then you find that game of
checkers and you connect to it.
It finds it much, much, much faster.
And that's traditionally how Bluetooth is
supposed to work: a client-server model.
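GameKit itself is an Objective-C API, but the host-versus-browse pattern being recommended here can be sketched with plain TCP sockets; everything in this snippet (the loopback address, the "checkers" payload) is illustrative, not GameKit:

```python
import socket
import threading

# Sketch of the client-server pattern: one side hosts (advertises a
# known rendezvous point) and the other browses (connects to it),
# instead of both peers advertising and browsing at the same time.

def host_game(ready: threading.Event, port_box: list) -> None:
    # The "host" side: advertise by listening on a port.
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", 0))          # OS picks a free port
        srv.listen(1)
        port_box.append(srv.getsockname()[1])
        ready.set()                          # "advertising" is live
        conn, _ = srv.accept()
        with conn:
            conn.sendall(b"checkers")        # the game being hosted

ready = threading.Event()
port_box: list = []
t = threading.Thread(target=host_game, args=(ready, port_box))
t.start()
ready.wait()

# The "browser" side: find the advertised game and connect to it.
with socket.socket() as cli:
    cli.connect(("127.0.0.1", port_box[0]))
    game = cli.recv(16)
t.join()
print("found game:", game.decode())
```

The point is the asymmetry: only one radio has to answer discovery traffic, so the two sides find each other quickly instead of colliding in symmetric advertise-and-browse cycles.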
So, in relation to that, only browse
when you want to find something, right?
So, if you find that game of checkers and you start
playing the game of checkers, there's really no use for you
to sit there and look for another
game of checkers, unless you want
to play five games of checkers, which I guess is OK.
But a large percentage of the time,
you're not going to want to do that.
And the inverse of that: only advertise
when you need to advertise.
So for that game of checkers, if you only support one
simultaneous active game, as soon as the game starts,
stop advertising with Bonjour or with GameKit.
It's a big win.
Because what's happening is, when you're
advertising or when you're browsing,
in the background, the radio is pinging out, right?
In advertise mode, other
devices can discover you,
which means the radio has to respond to those inquiries.
So, only advertise and browse when
you need to advertise and browse.
So, to understand how to
maximize the Bluetooth network,
I wanted to kind of go into a brief explanation of
how Bluetooth packetizes its data across the network.
You'll never, ever have to
deal with this as a game developer.
But understanding the packet model hopefully
will help you maximize the throughput,
or minimize the latency, of your games.
And the key thing to remember here is that a
wireless network is not equal to a wired network.
In a wireless network, we have this constraint of airtime:
we can only transmit a certain amount
of data over the shared airspace at a time.
So, it's not just this dedicated wire that's sitting
there, plus the data rates tend to be a little bit slower.
So, let's look at the packet format.
So, on the left-hand side, we have Time to Transmit
and on the bottom we have the amount of data
that you can send at any particular time.
I'm going to throw the packets up there.
So, there's a lot of gobbledygook titles up there.
The green are Basic Data Rate packets,
the blue are Enhanced Data Rate packets.
And you can already see the advantage of using EDR
over BDR, Basic Data Rate, or classic Bluetooth.
So, we're not going to use BDR, so I'm going
to get rid of those packets right away.
So, on the far right-hand side,
or left-hand side to you, is a 3-DH5 packet,
and that can contain about 1,000 bytes of data, a little
bit more, and it can send in about 3.125 milliseconds.
On the other side, we have a 2-DH1 and a 3-DH1; we'll
focus on the 3-DH1, and that can send roughly, I think,
about 80 bytes every 625 microseconds.
So, we're going to get rid of the other ones, and
we're going to just focus on these particular extremes.
And why is this important?
I'll get to it in a second.
So, let's look at a particular packet,
a 3-DH1 packet, and the anatomy of it.
So, this is how a typical packet looks
as it's flowing over the network.
We have the baseband header, which is about 9 bytes
of data; L2CAP takes about 4 bytes;
BNEP, which is our networking layer, Bluetooth's way of
sending IP traffic over the link, is about 9 bytes.
In this case IPv4, about 13 bytes, and IPv6 can be
used as well, which would be even more.
We're using UDP, so this is a UDP header that
we're using for this particular transmit,
and then the CRC, which is about 2 bytes.
So, the overhead of this packet is 45 bytes of information.
Well, why is this important?
Well, a 3-DH1 packet is only 83 bytes, which
means we're only able to send 38 bytes of information
in any one radio cycle.
Every 625 microseconds, we're sending, basically,
38 bytes of the payload that you're sending to us.
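The byte counts above can be sanity-checked with a quick sketch. The per-layer sizes are the ones from the slide (IPv4 appears compressed to 13 bytes there; the UDP header is the standard 8 bytes):

```python
# Per-layer overhead for a 3-DH1 packet carrying UDP/IPv4 over BNEP,
# using the byte counts from the slide.
overhead = {
    "baseband": 9,   # Bluetooth baseband header
    "l2cap": 4,      # L2CAP header
    "bnep": 9,       # BNEP, Bluetooth's IP-encapsulation layer
    "ipv4": 13,      # IPv4, as shown on the slide
    "udp": 8,        # standard UDP header
    "crc": 2,        # trailing CRC
}

PACKET_3DH1 = 83                       # total bytes in a 3-DH1 packet
total_overhead = sum(overhead.values())
payload = PACKET_3DH1 - total_overhead

print(total_overhead)   # 45 bytes of overhead
print(payload)          # 38 bytes left for your data every 625 microseconds
```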
So, there's some key things that we can learn from this.
If you're sending high-bandwidth data-- in other words,
if you're sending a file or sending a contact
to another phone using the GameKit model-- try to keep
your MTU values, your payload values, at about 980 bytes
of data, and try to keep your duty
cycles at about 5 milliseconds.
It's really about 3.125.
But if you keep it 5, then we can deal with
contention and retries that are going to happen.
So, what happens if you send more
than that? We end up with fragmentation.
If you send MTUs that are larger than this--
let's say you send an MTU of about 1,100 bytes-- well,
what's going to end up happening is we're going
to send 1,021 bytes and then about 100 bytes.
So, the efficiency of the network becomes extremely low,
because we're having to send a large packet, a small packet,
a large packet, a small packet, and so on.
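The fragmentation effect can be sketched like this. 1,021 bytes is the 3-DH5 maximum payload, so the exact remainder for an 1,100-byte MTU is 79 bytes (the "about 100" above is rounded):

```python
# Illustrative fragmentation: a 3-DH5 packet carries at most 1021 bytes,
# so an MTU larger than that splits into one full packet plus a remainder.
MAX_3DH5_PAYLOAD = 1021

def fragment(mtu: int) -> list:
    """Split an MTU into 3-DH5-sized fragments."""
    fragments = []
    while mtu > 0:
        chunk = min(mtu, MAX_3DH5_PAYLOAD)
        fragments.append(chunk)
        mtu -= chunk
    return fragments

print(fragment(980))    # [980] -- fits in one packet, efficient
print(fragment(1100))   # [1021, 79] -- a large packet, then a tiny one
```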
And inverse to that, if you need to send packets
really fast and you're not sending a whole lot of data,
keep it to about 30 bytes of data, with a duty
cycle of about 2.5 milliseconds.
Again, it's really 625 microseconds.
But in relation to the large packet data, you can see
that if you keep your packets small and your duty cycles
within a cadence that's appropriate to Bluetooth,
you're really, really improving your efficiency,
which means your overall latency, your
round trip, is going to drop.
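To make the trade-off concrete, here's a rough effective-throughput comparison using the recommended payloads and duty cycles from this section (these are the section's rules of thumb, not guaranteed figures):

```python
# Rough effective throughput under the two recommended cadences:
# bulk transfers at ~980 bytes every 5 ms, versus small frequent
# packets at ~30 bytes every 2.5 ms.
def throughput_bps(payload_bytes: int, duty_cycle_s: float) -> float:
    """Bits per second achieved if one payload is sent per duty cycle."""
    return payload_bytes * 8 / duty_cycle_s

bulk = throughput_bps(980, 0.005)     # big packets, 5 ms cadence
small = throughput_bps(30, 0.0025)    # small packets, 2.5 ms cadence

print(f"bulk:  {bulk / 1000:.0f} kbps")
print(f"small: {small / 1000:.0f} kbps, but far lower per-message latency")
```

The point isn't the absolute numbers; it's that the bulk cadence maximizes bytes moved, while the small cadence minimizes the time any one message waits.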
So, a couple other things.
Be a good wireless citizen.
What does that mean?
Only use the bandwidth you need.
Don't slam the Bluetooth network if you don't
need to be slamming the Bluetooth network.
Again, this is for coexistence.
Only transmit when necessary.
We kind of briefly talked about this.
The performance of both Bluetooth and Wi-Fi is affected.
And what I mean by this is if somebody's playing
your game, you're directly affecting the performance
of everything else that's going on on the radio.
You have to keep that in mind.
And then avoid multicast, not everybody
needs to get the same packet of information
if you're dealing with a multi-device game.
So, that's working with GameKit.
So, finally, I want to briefly talk about Wi-Fi Coexistence.
So, for those of you that don't know, Bluetooth
and Wi-Fi exist within the same ISM band,
and this is more for accessory
manufacturers and application developers.
We exist between the 2.4 and 2.5 gigahertz band.
Wi-Fi consists of thirteen 22-megahertz channels
that start just above 2.4 and move all the way up to
channel 11-- in some parts of the world, channel 13.
Bluetooth uses seventy-nine 1-megahertz
channels that do not overlap.
One thing to keep in mind: Wi-Fi channels do overlap.
So, channel 1 and channel 2 overlap.
So, typically, when you are selecting Wi-Fi channels, you
see 1, then the next non-overlapping Wi-Fi channel at 6,
and the next non-overlapping Wi-Fi channel at 11.
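The channel arithmetic behind "1, 6, 11" can be sketched as follows: 2.4 GHz channel centers are 5 MHz apart starting at 2,412 MHz, and each channel is 22 MHz wide, so two channels interfere when their centers are closer than 22 MHz:

```python
# 2.4 GHz Wi-Fi channels: centers are 5 MHz apart but each channel is
# 22 MHz wide, so neighbors overlap. Channels 1, 6, 11 are the classic
# non-overlapping set.
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4 GHz Wi-Fi channel, in MHz."""
    return 2412 + 5 * (channel - 1)

def overlaps(a: int, b: int) -> bool:
    """Two channels overlap when their centers are under 22 MHz apart."""
    return abs(center_mhz(a) - center_mhz(b)) < 22

print(overlaps(1, 2))    # True  -- neighbors collide
print(overlaps(1, 6))    # False -- 25 MHz apart
print(overlaps(6, 11))   # False
```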
So in the case of yesterday in the
keynote, you guys were using every channel.
It was crazy.
So, let's look at what Bluetooth looks like maybe visually.
So, this is 79 channels across the entire spectrum.
And one of the things Bluetooth does
to try to avoid interruption
or interference is use adaptive frequency hopping.
So, it doesn't transmit on all of
these frequencies simultaneously;
it's algorithmically choosing the
appropriate frequency to transmit on.
So, it would look something like
this, where it's hopping around
and picking frequencies depending
upon what is open or available to it.
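Adaptive frequency hopping can be modeled with a toy sketch like this. Real AFH is defined by the Bluetooth specification; this only illustrates the idea of "hop anywhere except the channels marked bad":

```python
import random

# Toy model of adaptive frequency hopping: Bluetooth has 79 one-megahertz
# channels, and the radio hops among them, skipping channels it has marked
# as bad (for example, because a Wi-Fi network is active there).
ALL_CHANNELS = set(range(79))

def hop_sequence(blocked: set, hops: int, seed: int = 0) -> list:
    """Pick `hops` channels pseudo-randomly from the non-blocked set."""
    rng = random.Random(seed)
    usable = sorted(ALL_CHANNELS - blocked)
    return [rng.choice(usable) for _ in range(hops)]

# Pretend Wi-Fi channel 1 occupies the lowest 22 Bluetooth channels:
seq = hop_sequence(blocked=set(range(22)), hops=10)
print(seq)  # every hop lands on a channel numbered 22 or higher
```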
So, now, let's overlay channels 1, 6, and 11 on top of this.
So, now you can see when Wi-Fi is using a bunch of frequency
or spectrum, Bluetooth doesn't have a whole lot to live in.
So, if we're hopping in this, you can
see we're really, really under duress.
Our Bluetooth is really trying to
make it work for whoever is using it
at that particular time, whether
it be Hands-Free or GameKit.
So, briefly I wanted to mention how we transmit.
So, Bluetooth and Wi-Fi use the same antenna in our devices.
In all of our iOS devices, this is the case.
We use the same exact antenna.
So, if they both want to transmit and Bluetooth
has priority, what happens is Wi-Fi can't transmit,
and Bluetooth gets the antenna.
And inverse to that, if Wi-Fi wants
to transmit and it has higher priority,
then Bluetooth can't transmit, and Wi-Fi has the antenna.
So, we have technology that we've
developed, and we have worked
with industry standards, to do antenna arbitration so that
we smartly select the appropriate side of the radio,
if you will, Wi-Fi or Bluetooth, and we try to fit
Bluetooth transmits in line with Wi-Fi transmits
so that Wi-Fi doesn't step on Bluetooth
and Bluetooth doesn't step on Wi-Fi.
But there are some things that you can do
to make this easier for us in our products.
Guess what?
Implement Bluetooth 2.1.
This is a big win for us.
If you can transmit three times as
much data in the same amount of time,
then you don't have to be transmitting as often, right?
Or if you're transmitting 30 bytes of data every 625
microseconds rather than 10 bytes of data, again,
it's a big win for coexistence, because
we don't have to do it as often, right?
Support sniff.
Sniff is deterministic.
If we have a deterministic interval
and transmit, we know how to arbitrate
or how to integrate Wi-Fi and BT
from a coexistence perspective.
So, sniff is a big win for us.
Optimize packet usage.
Only send, and this goes back to my previous
section, only send what you need to send, right?
Support lower bandwidth codecs.
We talked about AAC.
It's a huge win.
The difference between 330 kbps and 128
kbps-- or, depending upon what we're doing,
it could even be less than that-- is
just a huge win for coexistence.
So, that's Wi-Fi Coexistence.
And ultimately, those are the three areas that hopefully will
help you improve your products, both in application development
as well as Bluetooth accessories, to
hopefully create a great user experience
between the iPhone OS platforms and
your software and your accessories.
So, with that, I'm going to hand it over to
Paul, who's going to go into the External Accessory framework.
Thank you.
[ Applause ]
>> Good morning, everybody.
My name is Paul Holden, working
in the iPhone Application Team.
And I'm really excited to be here to talk to
you guys about the External Accessory framework.
And particularly, I actually-- I saw some faces
that were here last year when we gave this talk.
So, I'd like to briefly go over the architecture of--
I'll call it EA from now on instead of External Accessory,
the EA architecture, and quickly go over the API.
OK. And go over some of the things that people have
sent bugs in about and talked to us about in the lab,
so that we can try and give you guys some hints.
Next, we'd like to go over what's new in iOS 4
with respect to the External Accessory framework.
So, multitasking, and something that is very, very
popular with you guys-- a lot of the requests that we got
were for different types of App Store interactions.
So, we'll talk about those as well.
So, going over the architecture: you've got a
physical accessory, and you've got your application.
OK. So, here you've got a physical
accessory, you've got an application,
and your application will link
against the EA framework, right?
So that's the blue box that you see here.
Now, when you connect your physical
accessory to the iPhone,
your accessory will enumerate the
protocols that it supports.
Now, this is what Emily was talking about, about the
custom protocols where we don't really get in the way,
we just give you guys channels to send input
and output to and from your accessory, right?
And all of these-- for example, here
we've got three-- you name them, you're in control of them,
and your application is free to
use any of them that it wants.
So, once that happens, once your accessory
enumerates with the protocols that it supports,
in your application, you'll get a notification.
And part of that notification will be an object
that will represent the physical accessory, OK.
So, if you've got three accessories that are currently
connected, you'll get three of these kinds of blue boxes
that will show up in your application, and they'll each
list all the protocols that are supported by your accessory.
Next, when you want to actually send data-- we've
got the gray boxes around here, the protocols.
So, for example, here we've got two; you'll
open what we call EASessions with these protocols,
and then you'll be able to send input and
output to and from your accessory, OK.
So that's the architecture really quickly.
So next, let's talk about the API.
So, very basic small API here.
We can fit it all in just a few lines.
So, we've got three classes: EAAccessoryManager,
EAAccessory, which also has a delegate, and EASession.
So, let's start with EAAccessoryManager.
So, when you guys look at this header file, you'll
see two notifications that pretty much every single
one of your applications that uses the External
Accessory framework will have to use:
the DidDisconnect and the DidConnectNotification.
So, you'll get these whenever
a physical accessory connects to your device
and enumerates the protocols that it supports, OK.
And anyone who's familiar with Foundation notifications
will see that there's a user info dictionary,
and in the user info dictionary will be
that object that represents your accessory,
the EAAccessory object which we'll get to the next slide.
So, it is a singleton, so you use the
sharedAccessoryManager to get to it.
And next, like you've seen in a lot of the other frameworks,
we ask you to register and unregister for notifications.
That helps us do a little bit of authorization.
And then next, when you actually start up
your application, we won't send you a DidConnectNotification
for every single accessory that's already connected.
What you'll do instead is come
up, register for notifications,
and read the connectedAccessories property, and
that'll give you the list of accessories
that are already connected to your iPhone.
So next, the object that pretty much everyone will use here.
When you get the DidConnectNotification or when
you call the property connectedAccessories,
in there you'll see these EAAccessory objects.
So, if this is the accessory that you care about,
you'll check it to see if it supports your protocol string.
So here, we've got three important properties: the
connectionID, the protocolStrings, and the delegate.
So, if it is an accessory that you care about,
you'll go ahead and retain it and use it.
And then you'll probably use these three properties.
So now, there are a lot of properties in
there, such as name, manufacturer,
serial number, model number, all sorts of stuff.
But generally, you'll use connectionID--
this is actually a unique ID given
to that accessory for that connection.
So, let's say you have a certain accessory
that connects, disconnects and reconnects,
although the accessory will look the
same, the connectionID will be different.
And then protocolStrings is just an array
of all those custom protocols that it supports.
And generally, we ask people to
name them in reverse-DNS notation.
So you'll get them here; it's just a list of strings.
And finally, the delegate.
So, what's the delegate used for?
Two slides ago, we talked about
the DidDisconnectNotification.
OK, what's great about the DidDisconnectNotification
is that you get it anytime an accessory disconnects.
The bad thing about that notification is that
you get it every time an accessory disconnects.
So, if you only care about one particular one that you've
retained, then you can go ahead and set the delegate
on that EA object, on that EAAccessory object and you'll
get this called only when that accessory disconnects.
So, hope you guys have all that.
Next, where you'll spend most of
your time is with the EASession, OK.
So, EASessions are what we'll use when
we actually want to talk over a protocol
for a given accessory.
So, the application developer--
you guys-- will create it, right?
You'll use initWithAccessory:forProtocol:.
So, you create one session for one protocol for one accessory.
Next, how are you going to talk to your accessory?
Well you'll use these two properties.
You'll use the inputStream and the outputStream.
So, these are NSInputStream and NSOutputStream
classes which are both subclasses of NSStream.
We'll talk about those in a little bit.
But basically, these are the main
objects that you'll use to communicate.
Finally, you have one EASession
per EA Protocol per EAAccessory.
That's kind of a mouthful, but even with
multitasking, this is still the case.
So for example, if you, the application developer, hold
the session to a certain protocol for a certain accessory,
even if you go in the background, even if you
suspend, you will still own that protocol, right?
So, no other application can come in and just
start talking to it until you release it.
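The ownership rule can be modeled with a toy sketch. To be clear, this is not the real API; the class, the app names, and the `com.example.heartrate` protocol string are all made up for illustration:

```python
# Toy model of the EASession ownership rule: one session per protocol per
# accessory, and a protocol stays claimed until its session is closed --
# even if the owning app is backgrounded.
class Accessory:
    def __init__(self, connection_id, protocol_strings):
        self.connection_id = connection_id        # unique per connection
        self.protocol_strings = protocol_strings  # protocols it enumerates
        self._open = {}                           # protocol -> owning app

    def open_session(self, app, protocol):
        """Return a session handle, or None if another app holds it."""
        if protocol not in self.protocol_strings:
            raise ValueError("accessory does not enumerate this protocol")
        if protocol in self._open:
            return None                           # still owned elsewhere
        self._open[protocol] = app
        return (self.connection_id, protocol)

    def close_session(self, protocol):
        """Releasing the session frees the protocol for other apps."""
        self._open.pop(protocol, None)

acc = Accessory(42, ["com.example.heartrate"])    # hypothetical protocol
print(acc.open_session("AppA", "com.example.heartrate"))  # AppA owns it
print(acc.open_session("AppB", "com.example.heartrate"))  # None: still owned
acc.close_session("com.example.heartrate")
print(acc.open_session("AppB", "com.example.heartrate"))  # now AppB can open
```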
So, we talked about communicating
using NSInputStream and NSOutputStream.
So, I'm bringing this up again because this was definitely
one of the most difficult points: when people had a lot
of questions, or had issues
writing their applications and trying to use the External
Accessory framework, a lot of them came down to using this class.
So, it's a subclass of NSStream.
And I would say at least 80 percent of the
questions were answered somewhere in this document.
So, the "Introduction to Stream Programming
Guide for Cocoa," I encourage anyone who plans
to use this framework to go read that, very, very useful.
It describes it in a lot of detail, gives a lot of examples.
And if I were to describe it very, very quickly,
I would tell people that it's delegate-based.
Whether you have the inputStream or the outputStream,
both are subclasses of NSStream, and
NSStream basically just uses one delegate method
to communicate, and that's stream:handleEvent:.
And the types of events that you'll get are stream
open, stream close, stream had an error, stream ended,
stream has data available, and stream has space
available for you to put data in it, right.
So you can't just assume that when you open
a stream that there's enough space for you
to put, you know, 25 megabytes in there, right.
You send data, and it'll come back and
tell you how much room is in it--
how much data you can put in it.
And then when the data is consumed,
you'll get a notification
and then you're free to put more data in there, right.
So definitely, check that out.
It'll be really, really useful.
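The space-available pattern described above can be modeled with a toy sketch. This is not NSStream; it only illustrates "write what fits, wait for space, write the rest":

```python
from collections import deque

# Toy model of the NSStream "space available" pattern: you can't dump an
# arbitrary amount of data into the output stream; you write what fits,
# then wait for the next space-available event before writing more.
class OutputStream:
    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = deque()

    def space_available(self):
        return self.capacity - len(self.buffer)

    def write(self, data: bytes) -> int:
        """Write as much as fits; return the number of bytes accepted."""
        n = min(len(data), self.space_available())
        self.buffer.extend(data[:n])
        return n

    def drain(self, n):
        """The accessory consuming data frees space in the buffer."""
        for _ in range(min(n, len(self.buffer))):
            self.buffer.popleft()

stream = OutputStream(capacity=64)
pending = b"x" * 100
sent = stream.write(pending)    # only 64 bytes accepted on the first write
pending = pending[sent:]
stream.drain(64)                # accessory consumed the data...
sent += stream.write(pending)   # ...so the space-available event lets us finish
print(sent)                     # 100
```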
So, what's new?
We talked about the API and how
it's kind of stayed the same.
But generally, multitasking has made
a few changes with respect to your accessory.
So, one is there are no EA events in the background.
So, how do we accomplish this?
When your application gets the
UIApplicationDidEnterBackgroundNotification,
we read that too.
So, the External Accessory framework will read
that and what it'll do is it sends a disconnect
for every single accessory that's
currently connected to your system.
So you'd get this when you go into the background.
Then when you come in to the foreground,
you'll get a DidConnectNotification
for everything that's connected
when you come into the foreground.
Now, why that's important is because this will
allow your programs to work seamlessly with iOS 4.0.
So, let's say someone disconnects
and reconnects an accessory a hundred times
while you're in the background, or when you
come back up out of suspension or out of the background
that accessory is no longer there-- you won't even notice it.
It'll look just as if that accessory disconnected, OK.
So, this will allow people to seamlessly
have their applications work
between iPhone OS 3 and iOS 4-- with one exception, of course.
So, if you don't release your session,
other clients aren't free to use it.
So, it's very, very important that if you don't intend on
keeping a session around, that when you're done with it,
you close it or when the accessory
disconnects, that you close it.
Now, in iPhone OS 3.0, this wasn't as important, right?
Because when the user pressed the
Home button, your application went down
and we did all the cleanup for you, right?
Looking at most applications today,
most people did this,
so most of them will just get all this behavior for free.
But if you didn't, it's very
important that you go ahead and clean up.
So, this is taken right
out of that programming guide,
the Stream Programming Guide for
Cocoa that I brought up earlier.
So, for every session, you want to close the streams, remove
them from the runLoop, and set the delegate to nil.
And then finally, the last line
here, just release your session.
So finally this, App Store Interactions, very,
very excited to be able to bring this to you guys.
This was definitely one of the most
requested features that we had here.
So, if I were to describe how we did this before: when
you connected an accessory to your iPhone,
if, for every protocol that it supported, there was not a
single application that handled any one of those protocols,
you got the alert that you see here,
asking whether you want to go to the App Store
and find one. And also, in the Settings application,
you got Find App For Accessory, alright.
So this worked for 90 percent of the people out there,
but we've got a lot of requests for us to expand this
and give them different kinds of options.
So, before I go into those, I'd like to just talk
about how you get an application to be associated
with the protocols that it supports.
So, we have a key UISupportedExternalAccessoryProtocols.
It goes in your Info.plist.
You list all the protocols that your accessory supports,
and then from there, in iPhone OS 3.0,
if there is a protocol that's supported by
a single application, the alert disappears.
But you still see Find Application For Accessory here.
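A minimal Info.plist entry might look something like this. The protocol strings here are hypothetical reverse-DNS examples; you'd list your own:

```xml
<key>UISupportedExternalAccessoryProtocols</key>
<array>
    <string>com.example.accessory.protocol1</string>
    <string>com.example.accessory.protocol2</string>
</array>
```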
So, App Store Interactions.
There's a lot on this slide-- we've got pictures
later, so don't read all of this.
But basically, we've got a bunch of new
interactions that'll help you work with that alert
and help you work with the App Store a lot easier.
What's important to mention is that we now add properties
to these protocols, and this happens at the iAP layer.
OK, it doesn't happen in the application-- people who are writing
applications essentially have no control over this, right.
When you're writing your accessory, when you are
enumerating your protocols over iAP, you'll go ahead
and add a few properties
that'll give us hints about what you want
to do with respect to App Store Interactions.
So, all of these happen over iAP.
So, the first one is the one that we had in 3.0.
So, Show Alert if no EA Protocol is supported.
So let's say your accessory supports three protocols,
and not a single one of them is supported by any
app on the phone-- then you'll get the alert.
Next, we have a property-- oh, also to mention,
if I go back here: if you don't add any properties,
you'll get the default behavior from 3.0.
So, your applications going
forward will work; if you don't
add anything, they'll work
just as they did in 3.0.
So, going forward, you can also add a property that says,
I want you to show the Alert if
EA Protocol x is not supported.
So, let's say you have an accessory that has three
protocols, and two of them aren't very important--
they might already be supported by an application-- but you have one
that's really, really important, and you always want
to show this alert if there's no application that has
that particular protocol in its Info.plist.
So you can add this property, Show Alert if
EA Protocol x is not supported, and that'll always come
up until that particular protocol is supported.
Next, No Alert.
So you can just tell us for this
particular protocol, I don't want an alert.
That's fine, we won't show it.
Next, if you want to be in total stealth
mode, you can just say No Action.
And if you add that property to the protocol over iAP, we
will not show anything that has to do with the App Store.
So that concludes the External
Accessory framework portion of it.
If you have any more questions,
definitely talk to the Evangelist.
If they don't know the answer or who to get you
the answer from, there probably is no answer.
And then check out Developer Program
Documentation and the Apple Forums.
Next, associated with
this session, we have a test application,
and I definitely encourage everyone
to go through and see it.
It's very, very general.
It basically lets you form connections and
send data to any accessory that's out there.
So, everyone can take advantage of the
application that's bundled with this presentation.
So, definitely go look at it.
In particular, the program is called
EA Demo, so the prefix is EAD:
there's an EADSessionController.h and .m.
Those are there for you to plagiarize.
We want everyone to go through, read them, and
possibly just copy them as-is into your application.
They abstract the NSStream stuff a little bit
so that you guys can go through and use that, alright?