Displaying video from connected devices
Show video from devices connected with the Developer Strap in your visionOS app.
Overview
Apple’s audiovisual frameworks allow your visionOS app to access video from USB video class (UVC) devices connected with the Developer Strap for Apple Vision Pro. You can use this functionality to display real-time video in your app. For example, a medical professional can view the output from an endoscopic camera during a procedure. This article outlines the requirements to access UVC devices in visionOS. The sample code project shows a picker for each device connected to Apple Vision Pro, and displays the selected device’s video feed.
Add usage descriptions for camera access
To help protect people’s privacy, visionOS limits app access to cameras and other sensors in Apple Vision Pro. You need to add an NSCameraUsageDescription to your app’s information property list file to provide a usage description that explains how your app uses the data those sensors provide. People see this description when your app prompts for access to camera data.
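For example, the usage description entry in the Info.plist source might look like the following. The description string here is illustrative; write one that reflects how your app actually uses the camera data.

```xml
<key>NSCameraUsageDescription</key>
<string>Displays live video from a connected USB video class (UVC) device.</string>
```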
Create the device picker
Use an AVCaptureDevice.DiscoverySession to obtain an array of connected devices.
// ConnectionManager
private let discoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.external],
                                                                mediaType: .video,
                                                                position: .unspecified)

private func updateDeviceList() {
    // Transform the `AVCaptureDevice` instances.
    let devices = discoverySession
        .devices
        .map { Device(id: $0.uniqueID, name: $0.localizedName) }
    ...
}

Next, observe wasConnectedNotification and wasDisconnectedNotification to update the array when a device connects or disconnects.
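The article doesn't show the sample's Device type. A minimal sketch consistent with how the code uses it — Identifiable for ForEach, Hashable for Picker tags, and a captureDevice accessor — might look like this:

```swift
import AVFoundation

// Hypothetical sketch of the sample's `Device` value type;
// the real definition isn't shown in this article.
struct Device: Identifiable, Hashable {
    let id: String   // `AVCaptureDevice.uniqueID`
    let name: String // `AVCaptureDevice.localizedName`

    // Resolve the underlying capture device from its unique ID.
    var captureDevice: AVCaptureDevice? {
        AVCaptureDevice(uniqueID: id)
    }
}
```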
// ConnectionManager
private func observeDeviceConnectionStates() {
    Task {
        // Await notification of the system connecting a new device.
        for await _ in NotificationCenter.default.notifications(named: AVCaptureDevice.wasConnectedNotification) {
            updateDeviceList()
        }
    }
    Task {
        // Await notification of the system disconnecting a device.
        for await _ in NotificationCenter.default.notifications(named: AVCaptureDevice.wasDisconnectedNotification) {
            updateDeviceList()
        }
    }
}

Render a picker with an option for each device.
// ContentView
Picker("Device Picker", selection: $previewManager.selectedDevice) {
    Text("Select Device").tag(nil as Device?)
    ForEach(devices) {
        Text($0.name).tag($0)
    }
}

Display the selected device’s video feed
Configure an AVCaptureSession to capture AVCaptureDeviceInput from the selected device and output it to an AVCaptureVideoDataOutput.
// CaptureManager
private let captureSession = AVCaptureSession()
private let videoDataOutput = AVCaptureVideoDataOutput()
...
private func setUpSession() {
    // Bracket the following configuration in a begin/commit configuration pair.
    captureSession.beginConfiguration()
    defer { captureSession.commitConfiguration() }
    // Drop frames that don't render in a timely manner.
    videoDataOutput.alwaysDiscardsLateVideoFrames = true
    videoDataOutput.setSampleBufferDelegate(self, queue: sessionQueue)
    if captureSession.canAddOutput(videoDataOutput) {
        captureSession.addOutput(videoDataOutput)
    } else {
        assertionFailure("Unable to add video data output to the capture session.")
    }
}

/// Stops capture from the previously selected device and, if provided, begins capture from the provided device.
/// - Parameter device: The device to capture video from, or nil to stop capture altogether.
func select(device: Device?) {
    // Bracket the following configuration in a begin/commit configuration pair.
    captureSession.beginConfiguration()
    defer { captureSession.commitConfiguration() }
    // Remove previous input, if it exists.
    for input in captureSession.inputs {
        captureSession.removeInput(input)
    }
    // Prepare the renderer to receive content from a new device.
    videoRenderer.flush(removingDisplayedImage: true)
    // Return early if the passed device is nil.
    guard let captureDevice = device?.captureDevice else { return }
    do {
        let authorizationStatus = AVCaptureDevice.authorizationStatus(for: .video)
        // In the context of this sample, this check generally passes because `ContentView`
        // displays a message and terminates when the system denies access to the camera.
        precondition(authorizationStatus == .authorized,
                     "Camera authorization is required to set up a device capture session.")
        let input = try AVCaptureDeviceInput(device: captureDevice)
        // Add the new input, if possible.
        if captureSession.canAddInput(input) {
            captureSession.addInput(input)
        } else {
            assertionFailure("Unable to add the input to the capture session.")
        }
    } catch {
        fatalError("Unable to create input for the device. \(error)")
    }
}

Call startRunning() on the capture session to start the flow of data from the capture session’s inputs to its outputs.
// CaptureManager
/// Begin the flow of data from the capture session's inputs to its outputs.
func start() {
    captureSession.startRunning()
}

AVCaptureSession delivers a steady stream of updates to the AVCaptureVideoDataOutputSampleBufferDelegate assigned to the AVCaptureVideoDataOutput. Each update includes a CMSampleBuffer that contains the latest video frame from the device. Render the CMSampleBuffer to an AVSampleBufferDisplayLayer using the layer’s AVSampleBufferVideoRenderer.
// CaptureManager
/// The video renderer from the `AVSampleBufferDisplayLayer`
/// this app uses to display video.
nonisolated private let videoRenderer: AVSampleBufferVideoRenderer
...
extension CaptureManager: AVCaptureVideoDataOutputSampleBufferDelegate {
    nonisolated func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        // If the renderer is ready for more data, queue the sample buffer for presentation.
        if videoRenderer.isReadyForMoreMediaData {
            videoRenderer.enqueue(sampleBuffer)
        }
    }
}

Add the AVSampleBufferDisplayLayer to a UIView and use a UIViewRepresentable to display the UIView in a SwiftUI view.
struct DevicePreview: UIViewRepresentable {
    /*
     In this sample, `preview` is an instance of `AVSampleBufferDisplayLayer`.
     `AVCaptureVideoDataOutputSampleBufferDelegate.captureOutput`
     uses the layer's `sampleBufferRenderer` to enqueue the provided
     `CMSampleBuffer` for rendering.
     */
    private let preview: CALayer

    init(preview: CALayer) {
        self.preview = preview
    }

    func makeUIView(context: Context) -> SampleBufferPreview {
        SampleBufferPreview(preview: preview)
    }

    func updateUIView(_ previewView: SampleBufferPreview, context: Context) {
        // Updates the state of the specified view with new information from SwiftUI.
    }

    class SampleBufferPreview: UIView {
        let preview: CALayer

        init(preview: CALayer) {
            self.preview = preview
            super.init(frame: .zero)
            layer.addSublayer(preview)
        }

        required init?(coder: NSCoder) {
            fatalError("init(coder:) hasn't been implemented")
        }

        override func layoutSubviews() {
            super.layoutSubviews()
            preview.frame = bounds
        }
    }
}

Display a prompt when the person denies camera access
If the person denies camera access, the sample app prompts them to grant access in the Settings app. For more information about providing camera access in your app, see Requesting authorization to capture and save media.
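The sample's exact prompt isn't shown in this article. A minimal SwiftUI sketch of such a view — the view name and strings are illustrative, not from the sample — could deep-link to the app's page in Settings:

```swift
import SwiftUI
import UIKit

// Illustrative view, not from the sample project: shown when
// `AVCaptureDevice.authorizationStatus(for: .video)` returns `.denied`.
struct CameraAccessDeniedView: View {
    var body: some View {
        VStack(spacing: 12) {
            Text("Allow camera access in Settings to display video from connected devices.")
            Button("Open Settings") {
                // `openSettingsURLString` links to this app's page in the Settings app.
                if let url = URL(string: UIApplication.openSettingsURLString) {
                    UIApplication.shared.open(url)
                }
            }
        }
        .padding()
    }
}
```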
See Also
Video playback
Destination Video
Playing immersive media with RealityKit
Rendering stereoscopic video with RealityKit
Creating a multiview video playback experience in visionOS
Configuring your app for media playback
Adopting the system player interface in visionOS
Controlling the transport behavior of a player
Monitoring playback progress in your app
Trimming and exporting media in visionOS