
I need to create an application that measures the distance between the device and the user's face. I think it's possible with ARKit, but I don't know how to do it. Is there any kind of sample or example?

halfer
Monish Bansal

3 Answers


If you are running an ARFaceTrackingConfiguration (only available on devices with a front-facing TrueDepth camera), there are at least two ways to achieve this (I think the second one is better).

First method

You can use the depthData captured by the TrueDepth (IR) camera:

yourARSceneView.session.currentFrame?.capturedDepthData?.depthDataMap

This will return a CVPixelBuffer of size 640x360 containing depth data for each pixel (basically the distance between the IR camera and the real objects in the world). You can access the CVPixelBuffer data through available extensions like this one. The depth data is expressed in meters. Once you have the depth data, you will have to choose or detect which values belong to the user's face. Also be careful that "the depth-sensing camera provides data at a different frame rate than the color camera, so this property's value can also be nil if no depth data was captured at the same time as the current color image". For more information, see AVDepthData.
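As a minimal sketch of the first method, here is how you might read a single depth value (for example, at the center of the buffer) from the depth map. The name `sceneView` is an assumption for your ARSCNView; converting to `kCVPixelFormatType_DepthFloat32` first avoids having to guess the buffer's native format:

```swift
import ARKit

// Sketch: read the depth value (in meters) at the center of the depth
// map delivered by the TrueDepth camera. Assumes `sceneView` is an
// ARSCNView running an ARFaceTrackingConfiguration.
func centerDepthInMeters(from sceneView: ARSCNView) -> Float? {
    // capturedDepthData can be nil: the depth camera runs at a lower
    // frame rate than the color camera.
    guard let rawDepth = sceneView.session.currentFrame?.capturedDepthData else {
        return nil
    }
    // Normalize to 32-bit float depth so we know the element type.
    let depthData = rawDepth.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
    let depthMap = depthData.depthDataMap

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)    // 640
    let height = CVPixelBufferGetHeight(depthMap)  // 360
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // One Float32 depth value per pixel; index into the middle row.
    let row = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return row[width / 2]
}
```

Note that this gives you the depth at one pixel; to measure the face specifically you would still need to pick pixels that actually lie on the face.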

Second method (recommended)

Another way to get the distance between the device and the user's face is to convert the position of the detected face into the camera's coordinate system. To do this, you use the convertPosition method from SceneKit to switch coordinate spaces, from face coordinate space to camera coordinate space.

let positionInCameraSpace = theFaceNode.convertPosition(pointInFaceCoordinateSpace, to: yourARSceneView.pointOfView)

theFaceNode is the SCNNode created by ARKit to represent the user's face. The pointOfView property of your ARSCNView returns the node from which the scene is viewed, i.e. the camera. pointInFaceCoordinateSpace could be any vertex of the face mesh, or simply the position of theFaceNode (which is the origin of the face coordinate system). Here, positionInCameraSpace is an SCNVector3 representing the position of the point you gave, in camera coordinate space. You can then get the distance between the point and the camera from the x, y, and z values of this SCNVector3 (expressed in meters).
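Putting the pieces of the second method together, a sketch of the delegate callback might look like this (the `sceneView` property and the print statement are illustrative assumptions, not part of any fixed API usage):

```swift
import ARKit

// Sketch: inside your ARSCNViewDelegate. Assumes `sceneView` is the
// ARSCNView whose session is running an ARFaceTrackingConfiguration.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard anchor is ARFaceAnchor, let cameraNode = sceneView.pointOfView else { return }

    // SCNVector3Zero is the origin of the face coordinate space,
    // i.e. the position of the face node itself.
    let positionInCameraSpace = node.convertPosition(SCNVector3Zero, to: cameraNode)

    // Euclidean distance from the camera to the face, in meters.
    let p = positionInCameraSpace
    let distance = sqrt(p.x * p.x + p.y * p.y + p.z * p.z)
    print("Distance to face: \(distance) m")
}
```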

I think the second method is better, as it looks more precise and lets you choose exactly which point of the face you want to measure. You can also use transforms, as Rom4in said (I guess the convertPosition method uses transforms under the hood). Hope this helps, and I'm also curious to know if there are easier ways to achieve this.


Both the camera and the face have transforms, you can then calculate the distance between them.
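As a minimal sketch of this transform-based idea (the function name and parameters are assumptions): both transforms are 4x4 matrices whose last column holds the world-space position, so the distance is just the distance between those two translation vectors.

```swift
import ARKit
import simd

// Sketch: distance between the camera transform and the face anchor's
// transform. Assumes `session` is your ARSession and `faceAnchor` the
// current ARFaceAnchor.
func distanceToFace(session: ARSession, faceAnchor: ARFaceAnchor) -> Float? {
    guard let cameraTransform = session.currentFrame?.camera.transform else { return nil }
    // columns.3 of a simd_float4x4 is the translation (world position).
    let cameraPosition = simd_make_float3(cameraTransform.columns.3)
    let facePosition = simd_make_float3(faceAnchor.transform.columns.3)
    return simd_distance(cameraPosition, facePosition)  // meters
}
```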

Rom4in

I did it using the renderer(_:didUpdate:for:) delegate method from ARKit; you can then store the values in a variable or write them to a file. Here I am storing the data in a string like this:

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    // One CSV line: timestamp in milliseconds, then the node's position.
    let data = "\(Date().timeIntervalSince1970 * 1000),\(node.position.x),\(node.position.y),\(node.position.z)\n"
    // Append `data` to your log variable or write it to a file here.
}
Mozz