I want to detect a ball and have an AR model interact with it. I used OpenCV for ball detection and send the center of the ball, which I can use in hitTest to get coordinates in the sceneView. I have been converting the CVPixelBuffer to a UIImage using the following function:

static func convertToUIImage(buffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    // Note: creating a CIContext on every call is expensive; reuse one if this runs per frame.
    let temporaryContext = CIContext(options: nil)
    let rect = CGRect(x: 0, y: 0,
                      width: CVPixelBufferGetWidth(buffer),
                      height: CVPixelBufferGetHeight(buffer))
    if let temporaryImage = temporaryContext.createCGImage(ciImage, from: rect) {
        return UIImage(cgImage: temporaryImage)
    }
    return nil
}

This gave me a rotated image:


Then I found out about changing the orientation using:

let capturedImage = UIImage(cgImage: temporaryImage, scale: 1.0, orientation: .right)

While this gave the correct orientation while the device was in portrait, rotating the device to landscape again produced a rotated image.
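One way to avoid hard-coding .right is to pick the orientation dynamically. This is a sketch (not from the original post), assuming the pixel buffer arrives in the camera sensor's native landscape-right orientation, as ARKit's camera frames do; the helper name is hypothetical:

```swift
import UIKit

// Hypothetical helper: choose a UIImage.Orientation matching the current
// interface orientation, assuming the pixel buffer is delivered in the
// sensor's native landscape-right orientation (as ARKit frames are).
func imageOrientation(for interfaceOrientation: UIInterfaceOrientation) -> UIImage.Orientation {
    switch interfaceOrientation {
    case .portrait:           return .right
    case .portraitUpsideDown: return .left
    case .landscapeLeft:      return .down
    case .landscapeRight:     return .up
    default:                  return .right
    }
}

// Usage inside convertToUIImage, in place of the hard-coded .right:
// let capturedImage = UIImage(cgImage: temporaryImage,
//                             scale: 1.0,
//                             orientation: imageOrientation(for: UIApplication.shared.statusBarOrientation))
```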

Now I am thinking about handling it using viewWillTransition. But before that, I want to know:

  1. Is there another way to convert the image with the correct orientation?
  2. Why does this happen?

1 Answer


1. Is there another way to convert the image with the correct orientation?

You can try the snapshot() method of ARSCNView (inherited from SCNView), which:

Draws the contents of the view and returns them as a new image object

So if you have an outlet like:

@IBOutlet var arkitSceneView:ARSCNView!

you only need to call:

let imageFromArkitScene:UIImage? = arkitSceneView.snapshot()

2. Why does this happen?

It's because the CVPixelBuffer comes from the ARFrame, which is:

captured (continuously) from the device camera, by the running AR session.

Since the camera's orientation does not change when the device rotates (they are independent), to adjust the frame's orientation to the current view you should re-orient the image captured from the camera by applying the affine transform obtained from displayTransform(for:viewportSize:), which:

Returns an affine transform for converting between normalized image coordinates and a coordinate space appropriate for rendering the camera image onscreen.

The official documentation covers this well; a usage example:

let orient = UIApplication.shared.statusBarOrientation
let viewportSize = yourSceneView.bounds.size
// Apply the display transform (inverted) to re-orient the captured image.
let transform = frame.displayTransform(for: orient, viewportSize: viewportSize).inverted()
let finalImage = CIImage(cvPixelBuffer: pixelBuffer).transformed(by: transform)
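Note that, per the quoted documentation, displayTransform(for:viewportSize:) works in normalized coordinates (0...1), while a CIImage built from the pixel buffer is in pixel coordinates; that mismatch is one reason the result can come out stretched. A fuller sketch, under those assumptions (the function name is hypothetical, and `frame`/`sceneView` are placeholders for your own objects), normalizes before applying the transform and scales back up to the viewport afterwards:

```swift
import ARKit
import CoreImage

// Sketch: re-orient an ARFrame's captured image to match what is shown
// on screen. Assumes `frame` is the current ARFrame and `sceneView` the
// ARSCNView displaying it.
func displayOrientedImage(from frame: ARFrame, in sceneView: ARSCNView) -> CIImage {
    let pixelBuffer = frame.capturedImage
    let imageSize = CGSize(width: CVPixelBufferGetWidth(pixelBuffer),
                           height: CVPixelBufferGetHeight(pixelBuffer))
    let viewportSize = sceneView.bounds.size
    let orientation = UIApplication.shared.statusBarOrientation

    // 1. Scale pixel coordinates down to the unit square.
    let normalize = CGAffineTransform(scaleX: 1 / imageSize.width,
                                      y: 1 / imageSize.height)
    // 2. Apply the display transform (inverted, as in the snippet above).
    let display = frame.displayTransform(for: orientation,
                                         viewportSize: viewportSize).inverted()
    // 3. Scale normalized coordinates back up to viewport points.
    let denormalize = CGAffineTransform(scaleX: viewportSize.width,
                                        y: viewportSize.height)

    return CIImage(cvPixelBuffer: pixelBuffer)
        .transformed(by: normalize.concatenating(display).concatenating(denormalize))
}
```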
  • Thanks for your answer. Snapshot was never an option because I have to locate the ball every frame. displayTransform(for:viewportSize:) might be what I needed. – Alok Subedi Jan 26 '18 at 08:45
  • You are welcome, hope you may find a good compromise. Besides, if you have to process every frame, it might be better to use the pixel buffer + displayTransform; it looks like the more performant solution. – Andrea Mugnaini Jan 26 '18 at 08:47
  • Yup, that is what I wanted. Thanks. – Alok Subedi Jan 26 '18 at 09:04
  • I found that when the orientation is corrected, the images are still stretched. For example: https://stackoverflow.com/questions/58501761/live-camera-is-getting-stretched-while-rendering-using-cifilter-swift-4 – Joe Mar 12 '20 at 02:52