
I'm using an iPhone 7 Plus with iOS 11 installed, and I'm trying to adapt some code that captures regular images so that it also captures depth data.

When I check capturePhotoOutput?.isDepthDataDeliverySupported, it returns false. I was under the impression that the iPhone 7 Plus could capture depth.

Am I missing a permission in Info.plist, or have I made a more fundamental error?


//
//  RecorderViewController.swift


import UIKit
import AVFoundation


class RecorderViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!

    @IBAction func onTapTakePhoto(_ sender: Any) {
        // Make sure capturePhotoOutput is valid
        guard let capturePhotoOutput = self.capturePhotoOutput else { return }
        // Get an instance of AVCapturePhotoSettings class
        let photoSettings = AVCapturePhotoSettings()
        // Set photo settings for our need
        photoSettings.isAutoStillImageStabilizationEnabled = true
        photoSettings.isHighResolutionPhotoEnabled = true
        photoSettings.flashMode = .auto
        // Call capturePhoto method by passing our photo settings and a
        // delegate implementing AVCapturePhotoCaptureDelegate
        capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)

    }

    var captureSession: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    var capturePhotoOutput: AVCapturePhotoOutput?


    override func viewDidLoad() {
        super.viewDidLoad()

        //let captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
        let captureDevice = AVCaptureDevice.default(AVCaptureDevice.DeviceType.builtInDualCamera, for: .video, position: .back)
        do {
            let input = try AVCaptureDeviceInput(device: captureDevice!)

            captureSession = AVCaptureSession()
            captureSession?.addInput(input)

            videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
            videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
            videoPreviewLayer?.frame = view.layer.bounds
            previewView.layer.addSublayer(videoPreviewLayer!)

            capturePhotoOutput = AVCapturePhotoOutput()
            capturePhotoOutput?.isHighResolutionCaptureEnabled = true

            if capturePhotoOutput?.isDepthDataDeliverySupported == true {
                capturePhotoOutput?.isDepthDataDeliveryEnabled = true
            } else {
                print("DEPTH NOT SUPPORTED!")
            }



            // Set the output on the capture session
            captureSession?.addOutput(capturePhotoOutput!)

            captureSession?.startRunning()

        } catch {
            print(error)
        }




    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

}

extension RecorderViewController : AVCapturePhotoCaptureDelegate {
    func photoOutput(_ captureOutput: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?,
                     previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?) {
        // get captured image
        // Make sure we get some photo sample buffer
        guard error == nil,
            let photoSampleBuffer = photoSampleBuffer else {
                print("Error capturing photo: \(String(describing: error))")
                return
        }
        // Convert the photo sample buffer to JPEG image data using AVCapturePhotoOutput
        guard let imageData =
            AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer) else {
                return
        }
        // Initialise a UIImage with our image data
        let capturedImage = UIImage(data: imageData, scale: 1.0)
        if let image = capturedImage {
            // Save our captured image to photos album
            UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
        }
    }
}
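For context, here is a minimal sketch of the configuration order that is often suggested for depth capture: set the session preset to .photo, add the photo output to the session before checking isDepthDataDeliverySupported, and request depth on each AVCapturePhotoSettings as well. The ordering shown is an assumption drawn from the linked documentation and threads, not something I have verified on a device:

```swift
import AVFoundation

// Hedged sketch: isDepthDataDeliverySupported reportedly stays false until the
// photo output has been added to a session whose preset is .photo and whose
// input is a depth-capable camera (dual or TrueDepth). The function name and
// ordering here are illustrative assumptions.
func configureDepthCapture() -> AVCapturePhotoOutput? {
    guard let device = AVCaptureDevice.default(.builtInDualCamera,
                                               for: .video,
                                               position: .back),
          let input = try? AVCaptureDeviceInput(device: device) else {
        return nil
    }

    let session = AVCaptureSession()
    session.sessionPreset = .photo   // depth delivery requires the .photo preset
    session.addInput(input)

    let output = AVCapturePhotoOutput()
    session.addOutput(output)        // add the output BEFORE checking support

    guard output.isDepthDataDeliverySupported else {
        print("DEPTH NOT SUPPORTED!")
        return nil
    }
    output.isDepthDataDeliveryEnabled = true

    session.startRunning()
    return output
}

// Per capture, depth must also be requested on the settings object:
//     let settings = AVCapturePhotoSettings()
//     settings.isDepthDataDeliveryEnabled = true
//     output.capturePhoto(with: settings, delegate: self)
```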

James
  • What happens if you set isDepthDataDeliveryEnabled = true before checking the support property? https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/2866565-isdepthdatadeliveryenabled And maybe this will help: https://stackoverflow.com/questions/44506934/how-to-capture-depth-data-from-camera-in-ios-11-and-swift-4 Here is a WWDC tutorial as well: https://developer.apple.com/videos/play/wwdc2017/507/ – kuzdu Mar 14 '18 at 12:38
  • I cannot set that property; it is get-only. Thanks for the links, but I still cannot resolve this. – James Mar 14 '18 at 16:09
  • Hmm, is isDepthDataDeliveryEnabled really read-only? You already set it in your code, though perhaps too late? What about this: https://developer.apple.com/documentation/avfoundation/avcapturedevice.devicetype/2933376-builtintruedepthcamera – kuzdu Mar 14 '18 at 16:23
  • I'm on the move and can't edit my comment above from the app. Anyway, have a look here too: https://stackoverflow.com/a/46455058 – kuzdu Mar 14 '18 at 16:27

0 Answers