ARKit in visionOS

Create immersive augmented reality experiences.

Overview

ARKit in visionOS offers a new set of sensing capabilities that you adopt individually in your app, using data providers to deliver updates asynchronously. The available capabilities include:

  • Plane detection. Detect surfaces in a person’s surroundings and use them to anchor content.

  • World tracking. Determine the position and orientation of Apple Vision Pro relative to its surroundings, and add world anchors to place content.

  • Hand tracking. Use a person’s hand and finger positions as input for custom gestures and interactivity.

  • Scene reconstruction. Build a mesh of a person’s physical surroundings and incorporate it into your immersive spaces to support interactions.

  • Image tracking. Look for known images in a person’s surroundings and use them as anchor points for custom content.

  • Object tracking. Use 3D reference objects to find and track real-world objects in a person’s environment.

  • Barcode detection. Detect and scan QR codes and barcodes in a variety of formats in a person’s surroundings.

  • Room tracking. Use room anchors to identify specific rooms and implement per-room experiences.

  • Light estimation. Understand the lighting characteristics of a room to help improve the appearance of shiny or semi-reflective materials in your virtual content.

  • Camera frames. Access camera frames from a device in several formats.

  • Accessory tracking. Work with the real-time position and orientation of accessories that a person is using.
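As an illustration of the data-provider pattern described above, the sketch below runs an `ARKitSession` with two providers and consumes plane updates asynchronously. This is a minimal sketch, not a complete app: it assumes a visionOS target with the required authorization, and the classification handling and logging are placeholders.

```swift
import ARKit

// Minimal sketch of the provider-based session pattern on visionOS.
// Each sensing capability has its own data provider; run only the ones
// your app adopts.
let session = ARKitSession()
let planeDetection = PlaneDetectionProvider(alignments: [.horizontal, .vertical])
let worldTracking = WorldTrackingProvider()

Task {
    do {
        // Start the session with the providers this app needs.
        try await session.run([planeDetection, worldTracking])

        // Each provider delivers its updates as an asynchronous sequence.
        for await update in planeDetection.anchorUpdates {
            switch update.event {
            case .added, .updated:
                // Use the detected plane to anchor content.
                print("Plane \(update.anchor.id): \(update.anchor.classification)")
            case .removed:
                print("Plane removed: \(update.anchor.id)")
            }
        }
    } catch {
        // Handle authorization or capability errors from the session.
        print("ARKit session error: \(error)")
    }
}
```

Because each capability is a separate provider, an app that only needs hand tracking, for example, can run a `HandTrackingProvider` alone and avoid requesting data it never uses.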

Topics

Setup

Barcode detection

Camera sampling

Rendering

Camera region

Plane detection

World tracking

Hand tracking

Scene reconstruction

Image tracking

Geometry

Lighting estimation

Object tracking

Accessory tracking

Room tracking

Shared coordinate spaces

See Also

visionOS