Contents

AVAudioPlayerNode

An object for scheduling the playback of buffers or segments of audio files.

Declaration

class AVAudioPlayerNode

Overview

This audio node supports scheduling the playback of AVAudioPCMBuffer instances, or segments of audio files that you open through AVAudioFile. You can schedule buffers and segments to play at specific points in time or to play immediately following preceding segments.

Generally, you want to configure the node’s output format with the same number of channels as in the files and buffers. Otherwise, the node drops or adds channels as necessary. It’s usually preferable to use an AVAudioMixerNode for this configuration.

Similarly, when playing file segments, the node makes sample rate conversions, if necessary. It’s preferable to configure the node’s output sample rate to match that of the files, and to use a mixer to perform the rate conversion.

When playing buffers, there’s an implicit assumption that the buffers are at the same sample rate as the node’s output format.

The stop() method unschedules all previously scheduled buffers and file segments, and returns the player timeline to sample time 0.
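A minimal sketch of the typical setup, assuming a hypothetical file URL: the player connects to the engine's main mixer using the file's processing format, so the mixer performs any needed channel or sample rate conversion, as recommended above:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

// Hypothetical path; substitute your own audio asset.
let fileURL = URL(fileURLWithPath: "/path/to/audio.caf")
let file = try AVAudioFile(forReading: fileURL)

// Connecting with the file's format lets the mixer handle
// channel and sample rate conversion.
engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

try engine.start()
player.scheduleFile(file, at: nil, completionHandler: nil)
player.play()

// Later: player.stop() unschedules all events and returns the
// player timeline to sample time 0.
```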

Player Timeline

The usual AVAudioNode sample times, as observed by lastRenderTime, have an arbitrary zero point. The AVAudioPlayerNode class superimposes a second player timeline on top of this to reflect when the player starts and intervals when it pauses. The methods nodeTime(forPlayerTime:) and playerTime(forNodeTime:) convert between the two.
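For example, you can derive the current playback position on the player timeline by converting the node's last render time; a sketch:

```swift
import AVFoundation

// Current position on the player timeline, in seconds.
// Returns nil while the player has no valid render time
// (for example, before it starts playing).
func currentPosition(of player: AVAudioPlayerNode) -> TimeInterval? {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime)
    else { return nil }
    return Double(playerTime.sampleTime) / playerTime.sampleRate
}
```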

Scheduling Playback Time

The scheduleBuffer(_:at:options:completionHandler:), scheduleFile(_:at:completionHandler:), and scheduleSegment(_:startingFrame:frameCount:at:completionHandler:) methods take an AVAudioTime when parameter, and you interpret it as follows:

  • If the when parameter is nil:

      • If there are previous commands, the new one plays immediately following the last one.

      • Otherwise, if the node is in a playing state, the event plays in the very near future.

      • Otherwise, the command plays at sample time 0.

  • If the when parameter is a sample time, the node interprets it as such.

  • If the when parameter is a host time, the system uses it only when the sample time is invalid and the engine is rendering to an audio device; otherwise, the system ignores it.
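For instance, to start a buffer a fixed interval after the player's current position, you can construct a sample-time-based AVAudioTime; a sketch, assuming the buffer's format matches the player's output format as described earlier:

```swift
import AVFoundation

// Schedule `buffer` to start `seconds` after the player's
// current position on the player timeline.
func schedule(_ buffer: AVAudioPCMBuffer,
              on player: AVAudioPlayerNode,
              afterDelay seconds: Double) {
    let sampleRate = buffer.format.sampleRate
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime)
    else {
        // No valid render time yet: a nil `when` plays in the
        // very near future (or at sample time 0).
        player.scheduleBuffer(buffer, completionHandler: nil)
        return
    }
    let startSample = playerTime.sampleTime
        + AVAudioFramePosition(seconds * sampleRate)
    let when = AVAudioTime(sampleTime: startSample, atRate: sampleRate)
    player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
}
```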

The scheduling methods fail if:

  • A buffer’s channel count doesn’t match that of the node’s output format.

  • The system can’t access a file.

  • An AVAudioTime specifies neither a valid sample time nor a valid host time.

  • A segment’s start frame or frame count is a negative value.

Handling Buffer or File Completion

The buffer or file completion handlers provide a way to schedule more data, if available, on the player node. For more information on the different completion callback types, see AVAudioPlayerNodeCompletionCallbackType.
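A sketch of this pattern using an explicit callback type: with .dataPlayedBack, the handler fires after the audio is actually audible rather than merely consumed by the engine. The nextBuffer closure is a hypothetical data source, not part of the API:

```swift
import AVFoundation

// Keep the player fed: schedule a buffer and, once it has played
// back, ask `nextBuffer()` (hypothetical) for the next chunk.
func keepFeeding(_ player: AVAudioPlayerNode,
                 buffer: AVAudioPCMBuffer,
                 nextBuffer: @escaping () -> AVAudioPCMBuffer?) {
    player.scheduleBuffer(buffer,
                          at: nil,
                          options: [],
                          completionCallbackType: .dataPlayedBack) { _ in
        // Fires after playback, so scheduling here keeps the
        // stream gapless without racing the renderer.
        if let next = nextBuffer() {
            keepFeeding(player, buffer: next, nextBuffer: nextBuffer)
        }
    }
}
```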

Rendering Offline

When you use a player node with the engine operating in manual rendering mode, use the buffer or file completion handlers, lastRenderTime, and the latency properties (latency and outputPresentationLatency) to track how much data the player has rendered and how much remains to render.
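A sketch of offline rendering, assuming the engine and player are already connected and a file has been scheduled; manual rendering mode must be enabled before the engine starts:

```swift
import AVFoundation

// Assumes: `engine` configured, `player` attached and connected,
// and `file` scheduled on the player.
try engine.enableManualRenderingMode(.offline,
                                     format: file.processingFormat,
                                     maximumFrameCount: 4096)
try engine.start()
player.play()

let output = AVAudioPCMBuffer(
    pcmFormat: engine.manualRenderingFormat,
    frameCapacity: engine.manualRenderingMaximumFrameCount)!

// Render until the file's full length has been produced.
while engine.manualRenderingSampleTime < file.length {
    let framesLeft = file.length - engine.manualRenderingSampleTime
    let framesToRender = min(AVAudioFrameCount(framesLeft),
                             output.frameCapacity)
    let status = try engine.renderOffline(framesToRender, to: output)
    if status == .success {
        // Process or write `output` here.
    }
}
player.stop()
engine.stop()
```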

Topics

Creating a Player Node

Scheduling Playback

Converting Node and Player Times

Controlling Playback

See Also

Playback