
Update description for WorldCameraWithUserFacingFaceTracking sample

/3.1
Tim Mowrer, 5 years ago
Current commit: e19314ee
1 file changed, 2 insertions, 2 deletions
README.md


 This sample uses the front-facing (i.e., selfie) camera and requires an iOS device with a TrueDepth camera.
-### RearCameraWithFrontCameraFaceMesh
+### WorldCameraWithUserFacingFaceTracking
-iOS 13 adds support for face tracking while the rear camera is active. This sample does not show much other than the number of currently tracked faces. To enable this mode in ARFoundation, you must enable both an `ARFaceManager` and at least one other manager which requires the rear camera. This sample enables both the `ARFaceManager` and `ARPlaneManager` to achieve this.
+iOS 13 adds support for face tracking while the world-facing (i.e., rear) camera is active. This means the user-facing (i.e., front) camera is used for face tracking, but the pass-through video uses the world-facing camera. To enable this mode in ARFoundation, you must enable an `ARFaceManager`, set the `ARSession` tracking mode to "Position and Rotation", and set the `ARCameraManager`'s facing direction to "World". Tap the screen to toggle between the user-facing and world-facing cameras.
 This feature requires a device with a TrueDepth camera and an A12 Bionic chip running iOS 13.
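
For reference, the sketch below shows roughly how the three settings named in the updated description could be applied from a script instead of the Inspector. It is an illustrative example, not the script shipped with the sample: the property names `ARSession.requestedTrackingMode` and `ARCameraManager.requestedFacingDirection` assume a recent AR Foundation package (older releases expose the same settings under different names), and the class name is made up.

```csharp
// Illustrative sketch only; not the sample's own script. The "requested*" property
// names below are assumptions based on a recent AR Foundation API.
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class WorldFacingFaceTrackingSetup : MonoBehaviour
{
    [SerializeField] ARSession session;             // the scene's ARSession
    [SerializeField] ARCameraManager cameraManager; // on the AR Camera
    [SerializeField] ARFaceManager faceManager;     // supplies tracked-face data

    void Start()
    {
        // Face tracking with the world-facing camera needs full 6DOF tracking,
        // so request "Position and Rotation" rather than rotation-only tracking.
        session.requestedTrackingMode = TrackingMode.PositionAndRotation;

        // Use the rear (world-facing) camera for the pass-through video.
        cameraManager.requestedFacingDirection = CameraFacingDirection.World;
    }

    void Update()
    {
        // Tap the screen to toggle between the user-facing and world-facing cameras.
        if (Input.GetMouseButtonDown(0))
        {
            cameraManager.requestedFacingDirection =
                cameraManager.requestedFacingDirection == CameraFacingDirection.World
                    ? CameraFacingDirection.User
                    : CameraFacingDirection.World;
        }

        // Number of currently tracked faces (the sample's UI displays a similar count).
        int trackedFaces = faceManager.trackables.count;
    }
}
```

Attach the component to any GameObject in the AR scene and assign the three references. As the README notes, the combined mode only works on a device with a TrueDepth camera and an A12 Bionic chip running iOS 13.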
