I am using ARKit 2 under iOS 12 (16A5288q), building with Xcode 10 beta 6 and running on an iPhone X, and ARFaceAnchor.lookAtPoint is always (0, 0, 0).
I access the face data (in Swift) with:
import ARKit
import SceneKit

// ARSCNViewDelegate callback; fires each time the face anchor is updated.
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }
    faceAnchorsProcessedCount += 1
    // The per-eye transforms come through with sensible values...
    let rightEyeTransform: simd_float4x4 = faceAnchor.rightEyeTransform
    let leftEyeTransform:  simd_float4x4 = faceAnchor.leftEyeTransform
    // ...but lookAtPoint is always zero.
    let lookAtPoint:       simd_float3   = faceAnchor.lookAtPoint
}
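For context, the session is started in the usual way before any of this runs (a minimal sketch; sceneView is an assumed ARSCNView outlet whose delegate is set to this class):

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Face tracking requires a TrueDepth camera (e.g. iPhone X).
    guard ARFaceTrackingConfiguration.isSupported else { return }
    sceneView.session.run(ARFaceTrackingConfiguration(),
                          options: [.resetTracking, .removeExistingAnchors])
}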
And I get data like:
rightEyeTransform    simd_float4x4
[ [ 9.999874e-01,  0.000000e+00,  5.010252e-03, -3.208227e-02],
  [ 2.375229e-04,  9.988756e-01, -4.740678e-02,  2.703529e-02],
  [-5.004618e-03,  4.740737e-02,  9.988630e-01,  2.525132e-02],
  [ 0.000000e+00,  0.000000e+00,  0.000000e+00,  1.000000e+00] ]

leftEyeTransform     simd_float4x4
[ [ 9.978353e-01,  0.000000e+00, -6.576237e-02,  3.208223e-02],
  [-3.110934e-03,  9.988804e-01, -4.720329e-02,  2.703534e-02],
  [ 6.568874e-02,  4.730569e-02,  9.967182e-01,  2.525137e-02],
  [ 0.000000e+00,  0.000000e+00,  0.000000e+00,  1.000000e+00] ]

lookAtPoint          simd_float3
( 0.000000e+00,  0.000000e+00,  0.000000e+00)
What am I doing wrong? Or is this a known bug?
UPDATED 4 Oct 2018
I did a simple test of lookAtPoint today: I repeatedly moved my face close to the handset, then farther away, then close again.
The minimum z value reported by lookAtPoint was 38.59 inches and the maximum was 39.17 inches (converted from meters).
The actual face-to-handset distances, measured with a tape measure, ranged from ~4.5 inches to ~33 inches.
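For reference, the min/max tracking in that test looked roughly like this, inside the delegate callback above (a sketch; minZInches and maxZInches are hypothetical instance properties, initialized to +.infinity and -.infinity):

// Convert lookAtPoint's z from meters to inches and track its range.
let inchesPerMeter: Float = 39.3701
let zInches = faceAnchor.lookAtPoint.z * inchesPerMeter
minZInches = min(minZInches, zInches)
maxZInches = max(maxZInches, zInches)
print("lookAtPoint.z: \(zInches) in (min \(minZInches), max \(maxZInches))")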
Apple's documentation, which says lookAtPoint will "[...] estimate what point, relative to the face, the user's eyes are focused upon," does not seem to be correct.
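For anyone reproducing this: since lookAtPoint is documented as being relative to the face, comparing it against a tape-measured, real-world distance means mapping it through the anchor's transform into world space first, roughly like this (a sketch):

// Map lookAtPoint from face-anchor space into world space.
let lookAtLocal = simd_float4(faceAnchor.lookAtPoint, 1)
let lookAtWorld = faceAnchor.transform * lookAtLocal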