Capturing still and Live Photos
Configure and capture single or multiple still images, Live Photos, and other forms of photography.
Overview
AVFoundation supports many ways to capture photos. You can simply capture still HEIF or JPEG images, capture in RAW format for custom processing, snap several images in one shot, create Live Photos with motion and sound, and much more. In iOS, all photography workflows use the AVCapturePhotoOutput class.
Prepare for photo capture
First, set up an AVCaptureSession containing a supported camera device as one of its inputs and an AVCapturePhotoOutput as one of its outputs. (For details, see Choosing a capture device and Setting up a capture session.) Each camera device supports a wide range of resolution and frame rate settings. To easily get the best photo quality for the user’s device, you can use the photo session preset instead of directly choosing individual settings.
Some capture options affect the internal configuration of the media capture pipeline. Because changing those options causes the pipeline to reconfigure itself, which takes time, enable them before offering the user the ability to shoot photos with those settings. Otherwise, the configuration delay could prevent the user from capturing a photo at the right moment.
For example, to configure the capture pipeline to support Live Photos, enable that capability on the photo output, as shown below. After you’ve enabled Live Photo capture, you can choose for each individual shot whether to capture a still image or a Live Photo (see Capturing and saving Live Photos).
self.captureSession.beginConfiguration()
let photoOutput = AVCapturePhotoOutput()
photoOutput.isHighResolutionCaptureEnabled = true
photoOutput.isLivePhotoCaptureEnabled = photoOutput.isLivePhotoCaptureSupported
guard self.captureSession.canAddOutput(photoOutput) else { return }
self.captureSession.sessionPreset = .photo
self.captureSession.addOutput(photoOutput)
self.previewView.session = captureSession
self.captureSession.commitConfiguration()
self.captureSession.startRunning()
Choose settings
To capture a photo, first create an AVCapturePhotoSettings object describing the settings you want to use for that shot and the data format for the resulting still photo. For example:
On supported devices, you can use the HEIF/HEVC format for improved image quality at smaller file sizes: use init(format:) and choose hevc for the video codec. On devices without HEVC support, use the default initializer init() to fall back to JPEG format.
To shoot in RAW format, use init(rawPixelFormatType:) with one of the availableRawPhotoPixelFormatTypes supported by the photo output.
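As a sketch of the RAW path, the helper below (the function name and fallback behavior are illustrative, not from this article) picks the first available RAW pixel format and falls back to a processed-format capture when the device offers none:

```swift
import AVFoundation

// Sketch: create RAW capture settings, falling back to a default
// (processed) settings object if the photo output reports no
// available RAW pixel formats on this device.
func makeRawSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    if let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first {
        return AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    } else {
        // RAW unsupported: capture in the default processed format instead.
        return AVCapturePhotoSettings()
    }
}
```

Because AVCapturePhotoSettings objects are single-use, call a helper like this before each shot rather than reusing one settings object.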
After creating a photo settings object, you can choose other settings for the photo. For example, the code below creates a settings object for HEIF/HEVC shooting, with automatic flash and image stabilization.
let photoSettings: AVCapturePhotoSettings
if self.photoOutput.availablePhotoCodecTypes.contains(.hevc) {
photoSettings = AVCapturePhotoSettings(format:
[AVVideoCodecKey: AVVideoCodecType.hevc])
} else {
photoSettings = AVCapturePhotoSettings()
}
photoSettings.flashMode = .auto
photoSettings.isAutoStillImageStabilizationEnabled =
self.photoOutput.isStillImageStabilizationSupported
Other possible photo settings include Live Photos, depth data capture, and multi-image (bracketed) capture, as well as options for embedding preview or thumbnail images in output image files. For more information, see Next Steps and More Capture Options below.
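For instance, to embed a downscaled preview image in the capture result, you can set the settings object’s previewPhotoFormat. A minimal sketch, assuming a photoSettings object like the one above (the 512-pixel dimensions are arbitrary):

```swift
// Sketch: request an embedded preview image alongside the main photo.
// Uses the first preview pixel format the settings object supports.
if let previewFormat = photoSettings.availablePreviewPhotoPixelFormatTypes.first {
    photoSettings.previewPhotoFormat = [
        kCVPixelBufferPixelFormatTypeKey as String: previewFormat,
        kCVPixelBufferWidthKey as String: 512,   // illustrative size
        kCVPixelBufferHeightKey as String: 512
    ]
}
```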
Capture the photo
Pass your photo settings object to the capturePhoto(with:delegate:) method to trigger photo capture with the settings you’ve chosen.
Handle capture results
The delegate you pass to the capturePhoto(with:delegate:) method is an object to track the progress of and handle results from that photo capture. Capturing a photo is an asynchronous process with multiple steps that unfold over time. Because your app can trigger additional captures while earlier captures are still processing, your delegate implementation should be able to handle multiple captures at once. An easy way to handle concurrent captures is to define a class adopting the AVCapturePhotoCaptureDelegate protocol and create a separate instance of that class for each capture:
class PhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {
// ...
}
let captureProcessor = PhotoCaptureProcessor()
self.photoOutput.capturePhoto(with: photoSettings, delegate: captureProcessor)
When your captured image data is ready for use, the photo output calls your delegate’s photoOutput(_:didFinishProcessingPhoto:error:) method. You can use the resulting AVCapturePhoto object there to display, process, or save the image.
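As one possible implementation of that callback, the sketch below saves the finished image to the Photos library; saving is just an example use of the AVCapturePhoto, and a real app would also request Photos authorization and report errors:

```swift
import AVFoundation
import Photos

// Sketch: a per-capture delegate that writes the finished photo
// to the user's Photos library.
class PhotoCaptureProcessor: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Bail out on capture errors or if no file data is available.
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: .photo, data: data, options: nil)
        }, completionHandler: nil)
    }
}
```

Keeping a strong reference to each processor until its capture finishes (for example, in a dictionary keyed by the settings object’s uniqueID) prevents the delegate from being deallocated mid-capture.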
Topics
Next steps
More capture options
See Also
Photo capture
Capturing consistent color images
Capturing photos in RAW and Apple ProRAW formats
Supporting Continuity Camera in Your Mac App
AVCapturePhoto
AVCaptureDeferredPhotoProxy
AVCapturePhotoOutput
AVCapturePhotoCaptureDelegate
AVCapturePhotoOutputReadinessCoordinator
AVCapturePhotoOutputReadinessCoordinatorDelegate
AVCaptureStillImageOutput