AudioGeneratorController
A controller that manages the playback of a real-time audio stream.
Declaration
@MainActor class AudioGeneratorController

Overview
To receive an audio generator controller, call an entity’s prepareAudio(configuration:_:) or playAudio(configuration:_:) method.
The following examples show how you can use the controller:
```objc
/// myHandler.mm
AVAudioSourceNodeRenderBlock myHandler = ^OSStatus(BOOL *isSilence,
                                                   const AudioTimeStamp *timestamp,
                                                   AVAudioFrameCount frameCount,
                                                   AudioBufferList *outputData) {
    double phase = FREQUENCY * timestamp->mSampleTime * (1.0 / SAMPLE_RATE);
    for (AVAudioFrameCount idx = 0; idx < frameCount; idx++) {
        ((Float32 *)outputData->mBuffers[0].mData)[idx] = sin(phase * 2.0 * M_PI) * 0.5;
        phase += (FREQUENCY / SAMPLE_RATE);
    }
    return noErr;
};
```

```swift
// Create a configuration.
let config = AudioGeneratorConfiguration(layoutTag: kAudioChannelLayoutTag_Mono)

// Prepare audio with the render block that you define in myHandler.mm.
let controller = try myEntity.prepareAudio(configuration: config, myHandler)
controller.gain = -3.0
controller.play()
```

During playback, the audio appears to come from the entity that you use to create the controller. As a person moves around the MR scene, RealityKit modulates the characteristics of the audio to account for their location.
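You can also write the render handler directly in Swift rather than Objective-C. The sketch below mirrors the sine generator above; the `FREQUENCY` and `SAMPLE_RATE` constants are illustrative values standing in for the ones the Objective-C example assumes.

```swift
import AVFoundation

// Illustrative constants; the original example leaves these to you.
let FREQUENCY = 440.0
let SAMPLE_RATE = 44_100.0

// A Swift sketch of the same render block: fills the first output
// buffer with a sine tone at half amplitude.
let myHandler: AVAudioSourceNodeRenderBlock = { isSilence, timestamp, frameCount, outputData in
    var phase = FREQUENCY * timestamp.pointee.mSampleTime * (1.0 / SAMPLE_RATE)
    let buffers = UnsafeMutableAudioBufferListPointer(outputData)
    guard let samples = buffers[0].mData?.assumingMemoryBound(to: Float32.self) else {
        return noErr
    }
    for idx in 0..<Int(frameCount) {
        samples[idx] = Float32(sin(phase * 2.0 * .pi) * 0.5)
        phase += FREQUENCY / SAMPLE_RATE
    }
    return noErr
}
```

Because `AVAudioSourceNodeRenderBlock` runs on the real-time audio thread, keep the body allocation-free and avoid locks, as this sketch does.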
Call stop() to halt the audio, and play() to restart the stream.
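The gain value in the example above (-3.0) is a level in decibels relative to full volume, where 0 means full volume and negative values attenuate the signal. As a reminder of how that scale relates to a linear amplitude multiplier, the hypothetical helper below (not part of the RealityKit API) performs the standard conversion:

```swift
import Foundation

// Hypothetical helper, not part of RealityKit: converts a linear
// amplitude multiplier in (0, 1] to relative decibels, where 0 dB
// corresponds to full volume.
func decibels(fromAmplitude amplitude: Double) -> Double {
    20.0 * log10(amplitude)
}
```

For example, halving the amplitude corresponds to roughly -6 dB, so a gain of -3.0 attenuates the signal to about 71% of full amplitude.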