ARFaceTrackingConfiguration

A configuration that tracks facial movement and expressions using the front camera.

Declaration

class ARFaceTrackingConfiguration

Overview

A face-tracking configuration detects faces within 3 meters of the device’s front camera. When ARKit detects a face, it creates an ARFaceAnchor object that provides information about a person’s facial position, orientation, topology, and expressions.
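A minimal sketch of receiving that anchor through the session delegate (the `FaceTracker` class name is hypothetical; the `ARSessionDelegate` callback and `ARFaceAnchor` properties are from the framework):

```swift
import ARKit

class FaceTracker: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            guard let faceAnchor = anchor as? ARFaceAnchor else { continue }
            // transform carries the face's position and orientation,
            // geometry its topology, and blendShapes its expressions.
            print(faceAnchor.transform)
            print(faceAnchor.blendShapes[.jawOpen] ?? 0)
        }
    }
}
```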

Face tracking is supported on devices with the Apple Neural Engine in iOS 14 and iPadOS 14; on iOS 13 and iPadOS 13 and earlier, it requires a device with a TrueDepth camera. To determine whether the current device supports face tracking, check isSupported on ARFaceTrackingConfiguration before attempting to use this configuration.
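For example, the check might look like this (sketch only; `sceneView` is an assumed ARSCNView in the hosting view controller):

```swift
import ARKit

guard ARFaceTrackingConfiguration.isSupported else {
    // Face tracking is unavailable on this device; fall back gracefully.
    fatalError("Face tracking is not supported on this device.")
}
let configuration = ARFaceTrackingConfiguration()
sceneView.session.run(configuration)
```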

When you enable the isLightEstimationEnabled setting, a face-tracking configuration estimates directional and environmental lighting (an ARDirectionalLightEstimate object) by using the detected face as a light probe.
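A sketch of enabling the setting and reading the estimate from a frame (the `handle(_:)` helper is hypothetical; in practice you would read the frame in a session delegate callback such as session(_:didUpdate:)):

```swift
import ARKit

let configuration = ARFaceTrackingConfiguration()
configuration.isLightEstimationEnabled = true

func handle(_ frame: ARFrame) {
    if let estimate = frame.lightEstimate as? ARDirectionalLightEstimate {
        // A primary light direction and intensity inferred from
        // shading on the detected face.
        print(estimate.primaryLightDirection, estimate.primaryLightIntensity)
    }
}
```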

Topics

Creating a Configuration

Enabling World Tracking

Tracking Multiple Faces

See Also

Body and Face Tracking