Implementing scene understanding and reconstruction in your RealityKit app
Detect real-world objects and surfaces to create precise AR interactions.
Overview
RealityKit can detect planes in the real-world environment on any device, allowing virtual objects to interact with real-world surfaces. On devices with a LiDAR sensor, RealityKit can create a detailed reconstruction of the surrounding environment for more precise interactions between virtual content and the real world. With scene understanding enabled, RealityKit not only reconstructs the environment but also recognizes common real-world object types, such as tables, walls, and floors.
Configure scene understanding with RealityView
To enable scene understanding in a RealityView, configure and run a SpatialTrackingSession.
let session = SpatialTrackingSession()
let config = SpatialTrackingSession.Configuration(
tracking: [],
sceneUnderstanding: [
.occlusion,
.physics,
.collision,
.shadow
])
await session.run(config)
In iOS and macOS, all scene-understanding capabilities are available: occlusion, physics, collision, and shadow. In visionOS, you can enable only physics and collision.
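As a minimal sketch of where this setup can live, you might run the session inside a RealityView's make closure before adding content. The ImmersiveView name is illustrative, and the sketch requests only physics and collision so it also applies to visionOS:
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Run the spatial-tracking session before adding entities
            // that rely on scene understanding.
            let session = SpatialTrackingSession()
            let config = SpatialTrackingSession.Configuration(
                tracking: [],
                sceneUnderstanding: [.physics, .collision])
            await session.run(config)

            // Entities added here can now collide with real-world surfaces.
        }
    }
}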
Configure scene understanding with ARView
For existing iOS and macOS apps that use ARView, enable these features by inserting options into the view environment's sceneUnderstanding options set.
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.environment.sceneUnderstanding.options.insert(.physics)
arView.environment.sceneUnderstanding.options.insert(.collision)
arView.environment.sceneUnderstanding.options.insert(.receivesLighting)
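When automaticallyConfigureSession is enabled (the default), ARView manages the underlying ARKit session for you. If your app configures the session manually instead, the following sketch shows one way to opt into ARKit mesh reconstruction on devices that support it:
import ARKit
import RealityKit

arView.automaticallyConfigureSession = false

let worldConfig = ARWorldTrackingConfiguration()
// Mesh reconstruction requires hardware support, such as a LiDAR sensor.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    worldConfig.sceneReconstruction = .mesh
}
arView.session.run(worldConfig)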
Use scene-understanding meshes
After enabling scene-understanding options, RealityKit automatically generates entities representing real-world geometry with a SceneUnderstandingComponent.
Retrieve these entities using an EntityQuery. The following code example renders scene-understanding meshes with a custom wireframe debug material:
var debugMaterial = UnlitMaterial(color: .green)
debugMaterial.triangleFillMode = .lines
let sceneUnderstandingQuery = EntityQuery(where: .has(SceneUnderstandingComponent.self) && .has(ModelComponent.self))
let queryResult = scene.performQuery(sceneUnderstandingQuery)
for entity in queryResult {
entity.components[ModelComponent.self]?.materials = [debugMaterial]
}
With the physics and collision capabilities enabled, scene-understanding meshes participate in physics simulations and collision events.
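For example, a dynamic entity dropped above a real surface falls until its collision shape meets the reconstructed mesh. The following is a minimal sketch, with illustrative names and sizes, that assumes it runs where RealityView content is available:
let radius: Float = 0.1
let sphere = ModelEntity(
    mesh: .generateSphere(radius: radius),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)])
// Give the sphere a collision shape and a dynamic physics body so
// gravity pulls it down onto the scene-understanding mesh below it.
sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: radius)]))
sphere.components.set(PhysicsBodyComponent(
    shapes: [.generateSphere(radius: radius)],
    mass: 1.0,
    mode: .dynamic))
sphere.position = [0, 1.0, -0.5] // Start one meter up, slightly in front.
content.add(sphere)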
The following code example identifies scene-understanding meshes in a collision event:
let _ = content.subscribe(to: CollisionEvents.Began.self) { event in
if event.entityA.components.has(SceneUnderstandingComponent.self) {
        // entityA is a scene-understanding mesh.
}
}
Add virtual scene-understanding meshes in visionOS
You can add SceneUnderstandingComponent to your custom entities to make them behave as virtual scene-understanding meshes. A virtual scene-understanding mesh participates in system rendering features, such as shadows and depth mitigation, just like real-world geometry.
Custom virtual scene-understanding meshes work only in a progressive or full immersive space. They don’t work in a mixed space, or in a window or volume in the Shared Space.
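A minimal sketch, assuming the component's default initializer and that this runs where RealityView content is available; the tabletop-size occluder is illustrative:
// A virtual surface the system treats like real-world geometry.
let tabletop = ModelEntity(
    mesh: .generateBox(width: 1.0, height: 0.02, depth: 1.0),
    materials: [OcclusionMaterial()]) // Hides virtual content behind it.
tabletop.components.set(SceneUnderstandingComponent())
content.add(tabletop)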
Use scene reconstruction in visionOS
To enable scene reconstruction for a visionOS app, use a SceneReconstructionProvider.
let arSession = ARKitSession()
// An empty modes array provides mesh geometry only; add .classification
// to also receive per-face classifications.
let sceneReconstruction = SceneReconstructionProvider(modes: [])
Task {
do {
try await arSession.run([sceneReconstruction])
} catch {
// Handle the error.
}
}
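Once the provider is running, it delivers MeshAnchor values through its anchorUpdates sequence. A minimal sketch of consuming them, with illustrative logging:
Task {
    // React as the provider adds, updates, and removes mesh anchors.
    for await update in sceneReconstruction.anchorUpdates {
        switch update.event {
        case .added, .updated:
            // update.anchor is a MeshAnchor covering a region of the surroundings.
            print("Mesh anchor \(update.anchor.id) changed")
        case .removed:
            print("Mesh anchor \(update.anchor.id) removed")
        }
    }
}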
See Also
Scene reconstructions and analysis
Creating a game with scene understanding
Visualizing and interacting with a reconstructed scene
sceneReconstruction
supportsSceneReconstruction(_:)
SceneUnderstandingComponent
ARView.Environment.SceneUnderstanding
ARView.Environment.SceneUnderstanding.Options
HasSceneUnderstanding
SceneReconstructionProvider
ARSession