
Merge remote-tracking branch 'origin/master' into all-feature-points

Tim Mowrer, 5 years ago
Current commit 5738212e
2 files changed, 16 insertions(+), 12 deletions(-)
1. Packages/manifest.json (8 changes)
2. README.md (20 changes)

Packages/manifest.json (8 changes)


"com.unity.textmeshpro": "2.0.1",
"com.unity.timeline": "1.1.0",
"com.unity.ugui": "1.0.0",
"com.unity.xr.arcore": "3.0.0-preview.3",
"com.unity.xr.arfoundation": "3.0.0-preview.3",
"com.unity.xr.arkit": "3.0.0-preview.3",
"com.unity.xr.arkit-face-tracking": "3.0.0-preview.3",
"com.unity.xr.arcore": "3.0.0-preview.4",
"com.unity.xr.arfoundation": "3.0.0-preview.4",
"com.unity.xr.arkit": "3.0.0-preview.4",
"com.unity.xr.arkit-face-tracking": "3.0.0-preview.4",
"com.unity.xr.legacyinputhelpers": "2.0.6",
"com.unity.modules.ai": "1.0.0",
"com.unity.modules.androidjni": "1.0.0",

README.md (20 changes)


This set of samples relies on five Unity packages:
- * ARSubsystems
- * ARCore XR Plugin
- * ARKit XR Plugin
- * ARKit Face Tracking
- * ARFoundation
+ * ARSubsystems ([documentation](https://docs.unity3d.com/Packages/com.unity.xr.arsubsystems@3.0/manual/index.html))
+ * ARCore XR Plugin ([documentation](https://docs.unity3d.com/Packages/com.unity.xr.arcore@3.0/manual/index.html))
+ * ARKit XR Plugin ([documentation](https://docs.unity3d.com/Packages/com.unity.xr.arkit@3.0/manual/index.html))
+ * ARKit Face Tracking ([documentation](https://docs.unity3d.com/Packages/com.unity.xr.arkit-face-tracking@3.0/manual/index.html))
+ * ARFoundation ([documentation](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@3.0/manual/index.html))
+ ## Why is ARKit Face Tracking a separate package?
+ For privacy reasons, use of ARKit's face tracking feature requires additional validation in order to publish your app on the App Store. If your application binary contains certain face tracking related symbols, your app may fail validation. For this reason, we provide this feature as a separate package which must be explicitly included.
+ ## ARKit 3 Support
+ The ARKit XR Plugin and ARKit Face Tracking packages support both ARKit 2 and ARKit 3 simultaneously. We supply separate libraries and select the appropriate one based on the version of Xcode selected in your Build Settings. This should eliminate the confusion over which package version is compatible with which Xcode version.

## PlaneOcclusion
- This sample demonstrates basic plane detection, but uses an occlusion shader for the plane's material. This makes the plane appear invisible, but virtual objects behind the plane are culled. This provides an additional level of realism when, for example, when placing objects on a table.
+ This sample demonstrates basic plane detection, but uses an occlusion shader for the plane's material. This makes the plane appear invisible, but virtual objects behind the plane are culled. This provides an additional level of realism when, for example, placing objects on a table.
Move the device around until a plane is detected (its edges are still drawn) and then tap on the plane to place/move content.
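
The tap-to-place behaviour described above is typically implemented with an `ARRaycastManager` raycast against detected planes. The sketch below is only an illustration, not the sample's own script; the class name and the `placedPrefab` field are hypothetical, and it assumes ARFoundation 3.0's raycast API.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Illustrative sketch only: raycast from a screen tap against detected planes
// and place (or move) a prefab at the hit pose.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    [SerializeField] GameObject placedPrefab;   // hypothetical: content to place/move
    GameObject spawnedObject;
    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake() => raycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Only accept hits that fall inside a detected plane's polygon.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            var hitPose = hits[0].pose;
            if (spawnedObject == null)
                spawnedObject = Instantiate(placedPrefab, hitPose.position, hitPose.rotation);
            else
                spawnedObject.transform.position = hitPose.position;
        }
    }
}
```

The occlusion effect itself usually comes from the plane's material rather than from script: a shader that writes depth but no color (for example `ZWrite On` with `ColorMask 0` in ShaderLab) keeps the plane invisible while still culling virtual objects behind it.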

At runtime, ARFoundation will generate an `ARTrackedImage` for each detected reference image. This sample uses the [`TrackedImageInfoManager.cs`](https://github.com/Unity-Technologies/arfoundation-samples/blob/master/Assets/Scenes/ImageTracking/TrackedImageInfoManager.cs) script to overlay the original image on top of the detected image, along with some metadata.
- Run the sample on an ARCore or ARKit-capable device and point your device at one of the images in [`Assets/Scenes/ImageTracking/Images`](https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/ImageTracking/Images). They can be displayed on a computere monitor; they do not need to be printed out.
+ Run the sample on an ARCore or ARKit-capable device and point your device at one of the images in [`Assets/Scenes/ImageTracking/Images`](https://github.com/Unity-Technologies/arfoundation-samples/tree/master/Assets/Scenes/ImageTracking/Images). They can be displayed on a computer monitor; they do not need to be printed out.
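
At the script level, reacting to detected images is usually done by subscribing to `ARTrackedImageManager.trackedImagesChanged`. The snippet below is a minimal sketch under that assumption, not the repo's `TrackedImageInfoManager.cs`; the class name and log messages are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch only: log each reference image as it is detected or updated.
[RequireComponent(typeof(ARTrackedImageManager))]
public class TrackedImageLogger : MonoBehaviour
{
    ARTrackedImageManager manager;

    void Awake() => manager = GetComponent<ARTrackedImageManager>();
    void OnEnable() => manager.trackedImagesChanged += OnChanged;
    void OnDisable() => manager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // referenceImage carries the name and physical size defined in the image library.
            Debug.Log($"Detected '{trackedImage.referenceImage.name}' at {trackedImage.transform.position}");
        }

        foreach (var trackedImage in args.updated)
        {
            // trackingState indicates whether the image is currently being tracked.
            Debug.Log($"{trackedImage.referenceImage.name}: {trackedImage.trackingState}");
        }
    }
}
```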
## ObjectTracking

These samples demonstrate eye and fixation point tracking. Eye tracking produces a pose (position and rotation) for each eye in the detected face, and the "fixation point" is the point the face is looking at (i.e., fixated upon). `EyeLasers` uses the eye pose to draw laser beams emitted from the detected face.
- This sample uses the front-facing (i.e., selfie) camera.
+ This sample uses the front-facing (i.e., selfie) camera and requires an iOS device with a TrueDepth camera.
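
A minimal way to read these values, assuming ARFoundation 3.0's `ARFace.leftEye`, `ARFace.rightEye`, and `ARFace.fixationPoint` transforms, is sketched below. This is not the sample's `EyeLasers` script, and the class name is hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch only: read the eye poses and fixation point of a tracked face.
[RequireComponent(typeof(ARFace))]
public class EyePoseReader : MonoBehaviour
{
    ARFace face;

    void Awake() => face = GetComponent<ARFace>();

    void Update()
    {
        // These transforms stay null until the provider supplies eye-tracking data
        // (e.g. on devices without eye-tracking support).
        if (face.leftEye == null || face.rightEye == null)
            return;

        Debug.Log($"Left eye pose: {face.leftEye.position}, {face.leftEye.rotation}");
        Debug.Log($"Right eye pose: {face.rightEye.position}, {face.rightEye.rotation}");

        if (face.fixationPoint != null)
            Debug.Log($"Fixation point: {face.fixationPoint.position}");
    }
}
```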
### RearCameraWithFrontCameraFaceMesh
