
Update README to describe the AllPointCloudPoints sample

Branch: /3.1
Tim Mowrer, 5 years ago
Current commit: 4b3fe7cf
2 files changed, 12 insertions(+), 0 deletions(-)
  1. README.md (4 changes)
  2. Assets/Scenes/PointCloud.meta (8 changes)

README.md (4 changes)

This sample demonstrates "people occlusion", which can produce stencil and depth textures for detected people. The sample is deliberately primitive and simply displays the raw texture on the screen; we are currently working on a better sample.
This sample requires a device with an A12 Bionic chip running iOS 13.
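
To inspect one of these textures yourself, a minimal sketch along the following lines should work. It assumes ARFoundation 3.x's `AROcclusionManager`; the class name `HumanStencilDisplay` is illustrative and not part of the sample.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch, not the sample's actual code: reads the person
// stencil texture produced by AROcclusionManager and draws it
// full-screen, similar in spirit to what this sample does.
[RequireComponent(typeof(AROcclusionManager))]
public class HumanStencilDisplay : MonoBehaviour
{
    AROcclusionManager m_OcclusionManager;

    void Awake()
    {
        m_OcclusionManager = GetComponent<AROcclusionManager>();
    }

    void OnGUI()
    {
        // humanStencilTexture is null until human segmentation is
        // running on a supported device (A12 or later, iOS 13).
        var stencil = m_OcclusionManager.humanStencilTexture;
        if (stencil != null)
            GUI.DrawTexture(new Rect(0f, 0f, Screen.width, Screen.height), stencil);
    }
}
```

On unsupported hardware the texture stays null, so this sketch simply draws nothing.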
## AllPointCloudPoints
This sample shows all feature points over time, not just the current frame's feature points as the "AR Default Point Cloud" prefab does. It does this with a slightly modified version of the `ARPointCloudParticleVisualizer` component that stores all feature points in a dictionary. Since each feature point has a unique identifier, the component can look up a stored point by its identifier and update its position if it already exists. This can be a useful starting point for custom solutions that require the entire map of point cloud points, e.g., custom mesh reconstruction techniques.
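
The accumulation idea looks roughly like this. This is a sketch rather than the sample's actual code: the class name `AllPointsAccumulator` is made up, and the particle rendering that the real visualizer performs is omitted.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch of the accumulation technique: store every feature point
// ever reported, keyed by its unique identifier, and refresh stored
// positions as the point cloud updates.
[RequireComponent(typeof(ARPointCloud))]
public class AllPointsAccumulator : MonoBehaviour
{
    // Every point observed so far, keyed by its unique identifier.
    readonly Dictionary<ulong, Vector3> m_Points = new Dictionary<ulong, Vector3>();

    ARPointCloud m_PointCloud;

    void Awake()
    {
        m_PointCloud = GetComponent<ARPointCloud>();
    }

    void OnEnable()  { m_PointCloud.updated += OnPointCloudUpdated; }
    void OnDisable() { m_PointCloud.updated -= OnPointCloudUpdated; }

    void OnPointCloudUpdated(ARPointCloudUpdatedEventArgs eventArgs)
    {
        // In ARFoundation 3.x, positions and identifiers are parallel
        // collections and may be null if the provider omits them.
        if (!m_PointCloud.positions.HasValue || !m_PointCloud.identifiers.HasValue)
            return;

        var positions = m_PointCloud.positions.Value;
        var identifiers = m_PointCloud.identifiers.Value;

        for (int i = 0; i < positions.Length; ++i)
        {
            // Insert a new point, or overwrite the stored position if
            // this identifier has been seen before.
            m_Points[identifiers[i]] = positions[i];
        }

        // m_Points now holds the entire point map; feed it to a
        // particle system or a custom mesh-reconstruction step here.
    }
}
```

Using the identifier as the dictionary key makes each update an upsert: points persist across frames, while stale positions are overwritten in place.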

Assets/Scenes/PointCloud.meta (8 changes)

fileFormatVersion: 2
guid: 1e5e2e06f121c434085c1c035420174d
folderAsset: yes
DefaultImporter:
  externalObjects: {}
  userData: 
  assetBundleName: 
  assetBundleVariant: 