# Mesh Placement
![img](https://user-images.githubusercontent.com/2120584/87866691-77e47080-c939-11ea-9fe9-25a68ddd8a4b.JPG)
An example scene that uses the [ARKit meshing](https://docs.unity3d.com/Packages/com.unity.xr.arkit@4.0/manual/arkit-meshing.html) feature together with the available surface [classifications](https://developer.apple.com/documentation/arkit/armeshclassification) to place unique objects on surfaces. The demo also includes functionality that is useful beyond this example, such as a placement reticle and the [DOTween tweening library](http://dotween.demigiant.com/).
## Mesh Classifications

To update the label at the top of the demo, we use a physics raycast against the megamesh generated by the ARMeshManager. The raycast hit returns a triangle index, which we use to look up the current classification and parse it into a more readable string label.
> To generate hits from a physics raycast, the megamesh must have a MeshCollider component on it.
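A minimal sketch of that lookup, assuming the megamesh has a MeshCollider. The `GetClassification(int triangleIndex)` helper here is hypothetical; in the demo the per-triangle data comes from the ARKit meshing subsystem:

```csharp
using UnityEngine;
using UnityEngine.XR.ARKit;

public class ClassificationLabel : MonoBehaviour
{
    [SerializeField] Camera m_ARCamera;

    void Update()
    {
        // Cast from the center of the screen against the megamesh's MeshCollider.
        Ray ray = m_ARCamera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // triangleIndex identifies which triangle of the megamesh was hit.
            // GetClassification is a hypothetical helper that maps a triangle
            // index to its ARMeshClassification.
            ARMeshClassification classification = GetClassification(hit.triangleIndex);

            // ToString() gives a readable label such as "Table" or "Floor".
            Debug.Log(classification.ToString());
        }
    }

    ARMeshClassification GetClassification(int triangleIndex)
    {
        // Placeholder: in the demo this comes from the per-face
        // classification buffer exposed by the ARKit meshing feature.
        return ARMeshClassification.None;
    }
}
```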
## Mesh Placement
The [Mesh Placement Manager](https://github.com/Unity-Technologies/arfoundation-demos/blob/master/Assets/Meshing/Scripts/ClassificationPlacementManager.cs) script handles showing the UI for each unique surface and spawning objects at the placement reticle position. In the Update method we check against [specific classifications](https://github.com/Unity-Technologies/arfoundation-demos/blob/master/Assets/Meshing/Scripts/ClassificationPlacementManager.cs#L101-L104), in this case Table, Floor, and Wall, to enable or disable specific UI buttons. The UI buttons are configured in the scene to pass an index and instantiate the assigned prefab from the object list for each surface.
There is also some additional logic when placing floor and table objects to rotate them towards the user (the Camera transform).
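That rotate-towards-user step can be sketched as a yaw-only look rotation (the names here are illustrative, not the demo's exact code):

```csharp
using UnityEngine;

public static class PlacementUtils
{
    // Rotate a placed object so it faces the AR camera while staying upright:
    // only the Y axis rotates, so floor and table objects don't tilt.
    public static void FaceCamera(Transform placed, Transform arCamera)
    {
        Vector3 toCamera = arCamera.position - placed.position;
        toCamera.y = 0f; // ignore the height difference

        if (toCamera.sqrMagnitude > 0.0001f)
            placed.rotation = Quaternion.LookRotation(toCamera, Vector3.up);
    }
}
```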
## Placement Reticle
The reticle places content on surfaces based on the center screen position of the user's device. It shows a visual that can snap to the generated ARKit mesh or to planes. It uses an AR raycast to find surfaces and snaps to the position and rotation of the AR raycast hit pose.
There is also additional logic that scales up the reticle's local scale based on its distance from the user (the AR Camera transform).
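A sketch of that distance-based scaling, assuming a base scale tuned in the Inspector (the field names are illustrative):

```csharp
using UnityEngine;

public class ReticleScaler : MonoBehaviour
{
    [SerializeField] Transform m_ARCamera;
    [SerializeField] float m_BaseScale = 0.1f; // scale at 1 meter away

    void Update()
    {
        // Scale the reticle linearly with its distance from the AR camera
        // so it keeps a roughly constant apparent size on screen.
        float distance = Vector3.Distance(m_ARCamera.position, transform.position);
        transform.localScale = Vector3.one * (m_BaseScale * distance);
    }
}
```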
To choose between snapping to a mesh and snapping to a plane, we use a raycast mask.

Mesh:
```csharp
m_RaycastMask = TrackableType.PlaneEstimated;
```
Plane:
```csharp
m_RaycastMask = TrackableType.PlaneWithinPolygon;
```
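The mask is then passed to an AR raycast from the screen center, roughly like this (a sketch using AR Foundation's `ARRaycastManager` API):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ReticleRaycaster : MonoBehaviour
{
    [SerializeField] ARRaycastManager m_RaycastManager;
    [SerializeField] TrackableType m_RaycastMask = TrackableType.PlaneWithinPolygon;

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();

    void Update()
    {
        // Raycast from the center of the screen against trackables
        // matching the configured mask.
        Vector2 center = new Vector2(Screen.width / 2f, Screen.height / 2f);
        if (m_RaycastManager.Raycast(center, s_Hits, m_RaycastMask))
        {
            // Snap the reticle to the first hit's pose.
            Pose hitPose = s_Hits[0].pose;
            transform.SetPositionAndRotation(hitPose.position, hitPose.rotation);
        }
    }
}
```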
## Mesh Visualization
To visualize and understand the different classified surfaces, we use a modified version of the [MeshClassificationFracking](https://github.com/Unity-Technologies/arfoundation-samples/blob/latest-preview/Assets/Scenes/Meshing/Scripts/MeshClassificationFracking.cs) script available in AR Foundation Samples. We've added an additional helper method, [ToggleVisability()](https://github.com/Unity-Technologies/arfoundation-demos/blob/master/Assets/Meshing/Scripts/MeshClassificationFracking.cs#L350-L366), to modify the alpha color of the generated meshes. This is driven by a Toggle UI button in the scene and changes the shared material color on each material on the generated prefabs. By default they are configured to be completely transparent.
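The alpha toggle boils down to rewriting the shared material color on each classified-mesh renderer, along these lines (a simplified sketch, not the exact `ToggleVisability()` implementation; the alpha value is illustrative):

```csharp
using UnityEngine;

public class MeshAlphaToggle : MonoBehaviour
{
    [SerializeField] MeshRenderer[] m_ClassifiedMeshRenderers;
    [SerializeField] float m_VisibleAlpha = 0.5f;

    // Wired to the Toggle UI button's onValueChanged event in the scene.
    public void ToggleVisibility(bool visible)
    {
        foreach (var meshRenderer in m_ClassifiedMeshRenderers)
        {
            // sharedMaterial affects every mesh that uses this material,
            // which is why all generated meshes update at once.
            Color color = meshRenderer.sharedMaterial.color;
            color.a = visible ? m_VisibleAlpha : 0f; // fully transparent by default
            meshRenderer.sharedMaterial.color = color;
        }
    }
}
```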
## DOTween
DOTween is available on the Unity Asset Store [here](https://assetstore.unity.com/packages/tools/animation/dotween-hotween-v2-27676). In this demo it is used to scale up the placed objects as they appear.
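With DOTween, that scale-up can be a one-liner on the spawned object (a sketch; the duration and ease values are illustrative):

```csharp
using DG.Tweening;
using UnityEngine;

public class SpawnScaleUp : MonoBehaviour
{
    void Start()
    {
        // Start invisible, then tween up to full size with a small overshoot.
        Vector3 targetScale = transform.localScale;
        transform.localScale = Vector3.zero;
        transform.DOScale(targetScale, 0.35f).SetEase(Ease.OutBack);
    }
}
```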