
Updated tutorials to include the visualizer

/0.9.0.preview.1_staging
Aryan Mann · 3 years ago
Current commit
535443d5
7 files changed, with 1,867 insertions and 102 deletions
  1. README.md (3 changes)
  2. com.unity.perception/Documentation~/Tutorial/Phase1.md (43 changes)
  3. com.unity.perception/Documentation~/Tutorial/Phase3.md (78 changes)
  4. com.unity.perception/Documentation~/Tutorial/DatasetInsights.md (119 changes)
  5. com.unity.perception/Documentation~/Tutorial/Images/did_visualizer_open.png (725 changes)
  6. com.unity.perception/Documentation~/Tutorial/Images/visualizer_sample_synthdet.png (1001 changes)

README.md (3 changes)


**[FAQ](com.unity.perception/Documentation~/FAQ/FAQ.md)**
Check out our FAQ for a list of common questions, tips, tricks, and some sample code.
**[Verifying Datasets with Dataset Insights](com.unity.perception/Documentation~/DatasetInsights.md)**
Introduction to downloading Unity's Python-based Dataset Insights package and using it to verify your dataset's statistics.
## Documentation
In-depth documentation on individual components of the package.

com.unity.perception/Documentation~/Tutorial/Phase1.md (43 changes)


* [Step 5: Set Up Background Randomizers](#step-5)
* [Step 6: Set Up Foreground Randomizers](#step-6)
* [Step 7: Inspect Generated Synthetic Data](#step-7)
* [Step 8: Verify Data Using Dataset Insights](#step-8)
* [Step 8: Explore and Visualize Your Dataset](#step-8)
> :information_source: If you face any problems while following this tutorial, please create a post on the **[Unity Computer Vision forum](https://forum.unity.com/forums/computer-vision.626/)** or the **[GitHub issues](https://github.com/Unity-Technologies/com.unity.perception/issues)** page and include as much detail as possible.

* **:green_circle: Action**: Review the JSON meta-data and the images captured for the first annotated frame, and verify that the objects within them match.
### <a name="step-8">Step 8: Verify Data Using Dataset Insights</a>
### <a name="step-8">Step 8: Explore and Visualize Your Dataset</a>
Reviewing the individual files in the dataset can be error-prone and time-consuming. Instead, you can use our Dataset Visualizer to efficiently visualize the captured frames with Labeler outputs overlaid on them, as well as inspect the JSON data attached to each frame.
To verify and analyze a variety of metrics for the generated data, such as the number of foreground objects in each frame and the degree of representation for each foreground object (label), we will now use Unity's Dataset Insights framework. This involves running a Jupyter notebook that is conveniently packaged within a Docker image you can download from Unity.
* **:green_circle: Action**: Open _**Window**_ -> _**Dataset Visualizer**_ -> _**Open**_.
* **:green_circle: Action**: Download and install [Docker Desktop](https://www.docker.com/products/docker-desktop)
* **:green_circle: Action**: Open a command-line interface (Command Prompt on Windows, Terminal on macOS, etc.) and run the following command to start the Dataset Insights Docker image:
`docker run -p 8888:8888 -v <path to synthetic data>:/data -t unitytechnologies/datasetinsights:latest`, where `<path to synthetic data>` is the dataset path we looked at earlier. You can copy the path using the _**Copy Path**_ button in the `Perception Camera` UI.
This will first install the visualizer and then run it. The visualizer is a Python-based tool that runs in the browser. Once the tool is installed, a browser window will open at `http://localhost:8501/`.
> :information_source: If you get an error about the format of the command, try the command again **with quotation marks** around the folder mapping argument, i.e. `"<path to synthetic data>:/data"`.
This will download a Docker image from Unity. If you get an error regarding the path to your dataset, make sure you have not included the enclosing `<` and `>` in the path and that the spaces are properly escaped.
* **:green_circle: Action**: The image is now running on your computer. Open a web browser and navigate to `http://localhost:8888` to open the Jupyter notebook:
In certain circumstances, the tool may run but the browser window may not open. In that case, once the tool is running, a prompt will appear in the Unity Editor asking whether the browser window opened successfully. If it did not, you can force it to open by clicking **Manually Open**.
<img src="Images/jupyter1.png" width="800"/>
<img src="Images/did_visualizer_open.png" width = "300"/>
* **:green_circle: Action**: To make sure your data is properly mounted, navigate to the `data` folder. If you see the dataset's folders there, we are good to go.
* **:green_circle: Action**: Navigate to the `datasetinsights/notebooks` folder and open `Perception_Statistics.ipynb`.
* **:green_circle: Action**: Once in the notebook, remove the `/<GUID>` part of the `data_root = /data/<GUID>` path. Since the dataset root is already mapped to `/data`, you can use this path directly.
> :information_source: The visualizer is a standalone tool that may be updated outside of the normal release cycles of the Perception package. You can use _**Window**_ -> _**Dataset Visualizer**_ -> _**Check For Updates**_ to check for and install updates.
Once the browser window is open, the tool will automatically try to open the latest dataset generated using the Unity project from which you opened the visualizer. You can also open other datasets using the ***Open Dataset*** button on the left side of the screen.
The left sidebar also contains switches for enabling the visualization of each Labeler in the dataset. The image below shows a sample dataset opened in the visualizer, with the 2D bounding boxes overlay enabled.
<img src="Images/jupyter2.png" width="800"/>
<img src="Images/visualizer_sample_synthdet.png" width = "800"/>
This notebook contains a variety of functions for generating plots, tables, and bounding box images that help you analyze your generated dataset. Certain parts of this notebook are currently not of use to us, such as the code meant for downloading data generated through Unity Simulation (coming later in this tutorial).
Each of the code blocks in this notebook can be executed by clicking on them to select them, and then clicking the _**Run**_ button at the top of the notebook. When you run a code block, an **asterisk (\*)** will be shown next to it on the left side, until the code finishes executing.
Below, you can see a sample plot generated by the Dataset Insights notebook, depicting the number of times each of the 10 foreground objects appeared in the dataset. As shown in the histogram, there is a high level of uniformity between the labels, which is a desirable outcome.
* **:green_circle: Action**: Click _**Expand Frame**_ for the first image.
In expanded mode, the image is enlarged and the JSON data attached to it are displayed, as seen in the screenshot below.
<img src="Images/object_count_plot.png" width="600"/>
<img src="Images/vis_expanded.png" width = "800"/>
To further analyze your dataset and verify statistics such as number of objects in each frame and more, you can use Unity's Dataset Insights, which is a Python package created for processing Perception datasets. [This guide](DatasetInsights.md) can get you started.
* **:green_circle: Action**: Follow the instructions laid out in the notebook and run each code block to view its outputs.
This concludes Phase 1 of the Perception Tutorial. In the next phase, you will dive a little bit into randomization code and learn how to build your own custom Randomizer.
This concludes Phase 1 of the Perception Tutorial. In the next phase, we will dive a little bit into randomization code and learn how to build custom Randomizers.
**[Continue to Phase 2: Custom Randomizations](Phase2.md)**

com.unity.perception/Documentation~/Tutorial/Phase3.md (78 changes)


* [Step 1: Setup Unity Account, Unity Simulation, and Cloud Project](#step-1)
* [Step 2: Run Project on Unity Simulation](#step-2)
* [Step 3: Keep Track of Your Runs Using the Unity Simulation Command-Line Interface](#step-3)
* [Step 4: Analyze the Dataset using Dataset Insights](#step-4)
### <a name="step-1">Step 1: Setup Unity Account, Unity Simulation, and Cloud Project</a>

* **:green_circle: Action**: Open the manifest file to check it. Make sure there are links to various types of output and check a few of the links to see if they work.
### <a name="step-4">Step 4: Analyze the Dataset using Dataset Insights</a>
In order to download the actual data from your run, we will now use Dataset Insights again. This time, however, we will use some of the lines that were commented out when we previously worked with locally generated data.
* **:green_circle: Action**: Open the Dataset Insights Jupyter notebook again, using the command below:
`docker run -p 8888:8888 -v <download path>/data:/data -t unitytechnologies/datasetinsights:latest`
> :information_source: If you get an error about the format of the command, try the command again **with quotation marks** around the folder mapping argument, i.e. `"<download path>/data:/data"`.
In the above command, replace `<download path>` with the location on your computer to which you wish to download your data.
Once the Docker image is running, the rest of the workflow is quite similar to what we did in Phase 1, with certain differences caused by the need to download the data from Unity Simulation.
* **:green_circle: Action**: Open a web browser and navigate to `http://localhost:8888` to open the Jupyter notebook.
* **:green_circle: Action**: Navigate to the `datasetinsights/notebooks` folder and open `Perception_Statistics.ipynb`.
* **:green_circle: Action**: In the `data_root = /data/<GUID>` line, the `<GUID>` part is the location inside your `<download path>` where the data will be downloaded. You can simply remove it so that the data is downloaded directly to the path you specified earlier:
<p align="center">
<img src="Images/di_usim_1.png" width="900"/>
</p>
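The edit described above is a one-line change. As a minimal sketch (the surrounding contents of the notebook cell may differ in your version), the resulting line simply points `data_root` at the mounted download folder:

```python
# Sketch of the edited notebook line, assuming the run's data is downloaded
# straight into the folder mounted at /data (i.e. your <download path>/data):
data_root = "/data"  # previously "/data/<GUID>"
```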
The next few lines of code pertain to setting up your notebook for downloading data from Unity Simulation.
* **:green_circle: Action**: In the block of code titled "Unity Simulation [Optional]", uncomment the lines that assign values to variables, and insert the correct values, based on information from your Unity Simulation run.
We have previously learned how to obtain the `run_execution_id` and `project_id`. You can remove the value already present for `annotation_definition_id` and leave it blank. What's left is the `access_token`.
* **:green_circle: Action**: Return to your command-line interface and run the `usim inspect auth` command.
macOS:
`USimCLI/mac/usim inspect auth`
If you receive errors regarding authentication, your token might have timed out. Repeat the login step (`usim login auth`) to log in again and fix the issue.
A sample output from `usim inspect auth` looks like this:
```
Protect your credentials. They may be used to impersonate your requests.
access token: Bearer 0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp
expires in: 2:00:05.236227
expired: False
refresh token: FW4c3YRD4IXi6qQHv3Y9W-rwg59K7k0Te9myKe7Zo6M003f.k4Dqo0tuoBdf-ncm003fX2RAHQ
updated: 2020-10-02 14:50:11.412979
```
The `access_token` you need for your Dataset Insights notebook is the access token shown by the above command, minus the `'Bearer '` part. So, in this case, we should input `0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp` in the notebook.
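If you prefer not to trim the prefix by hand, a minimal sketch in plain Python (not part of the notebook or the usim CLI) illustrates the idea, using the sample token from the output above:

```python
# Strip the "Bearer " prefix from the access-token line printed by `usim inspect auth`.
line = "access token: Bearer 0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp"
access_token = line.split("Bearer ", 1)[1]
print(access_token)  # -> 0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp
```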
* **:green_circle: Action**: Copy the access token excluding the `'Bearer '` part to the corresponding field in the Dataset Insights notebook.
Once you have entered all the information, the block of code should look like the screenshot below (the actual values you input will be different):
<p align="center">
<img src="Images/di_usim_2.png" width="800"/>
</p>
* **:green_circle: Action**: Continue to the next code block and run it to download all the metadata files from the generated dataset. This includes JSON files and logs but does not include images (which will be downloaded later).
You will see a progress bar while the data downloads:
<p align="center">
<img src="Images/di_usim_3.png" width="800"/>
</p>
The next couple of code blocks (under "Load dataset metadata") analyze the downloaded metadata and display a table containing annotation-definition-ids for the various metrics defined in the dataset.
* **:green_circle: Action**: Once you reach the code block titled "Built-in Statistics", make sure the value assigned to the field `rendered_object_info_definition_id` matches the id displayed for this metric in the table output by the code block immediately before it. The screenshot below demonstrates this (note that your ids might differ from the ones here):
<p align="center">
<img src="Images/di_usim_4.png" width="800"/>
</p>
Follow the rest of the steps inside the notebook to generate a variety of plots and stats. Keep in mind that this notebook is provided just as an example, and you can modify and extend it according to your own needs using the tools provided by the [Dataset Insights framework](https://datasetinsights.readthedocs.io/en/latest/).
Once the run execution is complete, you can use Unity's [Dataset Insights](https://github.com/Unity-Technologies/datasetinsights) framework to download your dataset and analyze it. This is covered in the Unity Simulation section of the [Dataset Insights](DatasetInsights.md) guide.
The next step in this workflow, which is out of the scope of this tutorial, is to train an object-detection model using our synthetic dataset. It is important to note that the dataset of 1,000 captures we generated here is probably not large enough for training most models. We chose this number so that the Unity Simulation run would finish quickly, allowing us to move on to learning how to analyze the statistics of the dataset. To generate data for training, we recommend a minimum dataset size of around 50,000 captures with a large degree of randomization.

com.unity.perception/Documentation~/Tutorial/DatasetInsights.md (119 changes)


# Verifying and Analyzing Perception Datasets with Dataset Insights
Unity's [Dataset Insights](https://github.com/Unity-Technologies/datasetinsights) is a Python package that provides a variety of tools for downloading, processing, and analyzing datasets generated using the Perception package. In addition to a Python library, the package comes with a sample Jupyter notebook that helps you load datasets and verify some of their most commonly needed statistics.
In this guide, we will go through the steps involved in opening Perception datasets and verifying them using the provided Jupyter notebook. This covers both datasets generated locally and those generated with Unity Simulation. To learn how to generate datasets locally, follow [Phase 1](Tutorial/../Phase1.md) of the Perception Tutorial. For the Unity Simulation workflow, follow [Phase 3](Tutorial/../Phase3.md) of the tutorial.
* **:green_circle: Action**: Download and install [Docker Desktop](https://www.docker.com/products/docker-desktop)
## Locally generated datasets
* **:green_circle: Action**: Open a command-line interface (Command Prompt on Windows, Terminal on macOS, etc.) and run the following command to start the Dataset Insights Docker image:
`docker run -p 8888:8888 -v <path to synthetic data>:/data -t unitytechnologies/datasetinsights:latest`, where `<path to synthetic data>` is the dataset path we looked at earlier. You can copy the path using the _**Copy Path**_ button in the `Perception Camera` UI.
> :information_source: If you get an error about the format of the command, try the command again **with quotation marks** around the folder mapping argument, i.e. `"<path to synthetic data>:/data"`.
This will download a Docker image from Unity. If you get an error regarding the path to your dataset, make sure you have not included the enclosing `<` and `>` in the path and that the spaces are properly escaped.
* **:green_circle: Action**: The image is now running on your computer. Open a web browser and navigate to `http://localhost:8888` to open the Jupyter notebook:
<p align="center">
<img src="Images/jupyter1.png" width="800"/>
</p>
* **:green_circle: Action**: To make sure your data is properly mounted, navigate to the `data` folder. If you see the dataset's folders there, we are good to go.
* **:green_circle: Action**: Navigate to the `datasetinsights/notebooks` folder and open `Perception_Statistics.ipynb`.
* **:green_circle: Action**: Once in the notebook, remove the `/<GUID>` part of the `data_root = /data/<GUID>` path. Since the dataset root is already mapped to `/data`, you can use this path directly.
<p align="center">
<img src="Images/jupyter2.png" width="800"/>
</p>
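The edit in the previous step amounts to a single assignment. A minimal sketch of the resulting line, assuming the dataset folder was mounted at `/data` with the `-v` argument of the Docker command (your notebook cell may contain additional code around it):

```python
# Sketch: the dataset folder is mounted directly at /data, so the notebook
# can read it from the mount point itself.
data_root = "/data"  # previously "/data/<GUID>"
```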
This notebook contains a variety of functions for generating plots, tables, and bounding box images that help you analyze your generated dataset. Certain parts of this notebook are currently not of use to us, such as the code meant for downloading data generated through Unity Simulation (covered in the next section of this guide).
Each of the code blocks in this notebook can be executed by clicking on them to select them, and then clicking the _**Run**_ button at the top of the notebook. When you run a code block, an **asterisk (\*)** will be shown next to it on the left side, until the code finishes executing.
Below, you can see a sample plot generated by the Dataset Insights notebook, depicting the number of times each of the 10 foreground objects appeared in the dataset. As shown in the histogram, there is a high level of uniformity between the labels, which is a desirable outcome.
<p align="center">
<img src="Images/object_count_plot.png" width="600"/>
</p>
* **:green_circle: Action**: Follow the instructions laid out in the notebook and run each code block to view its outputs.
## Datasets generated with Unity Simulation
For these datasets we recommend using a slightly different command to open the notebook, as we do not need to mount a specific dataset folder. Instead, we mount a folder which will hold our downloaded datasets.
* **:green_circle: Action**: Open the notebook using the command below:
`docker run -p 8888:8888 -v <download path>/data:/data -t unitytechnologies/datasetinsights:latest`
In the above command, replace `<download path>` with the location on your computer to which you wish to download your data.
Once the Docker image is running, the rest of the workflow is quite similar to what we did for locally generated data. The only difference is that we need to uncomment certain lines of code from the notebook to download the dataset.
* **:green_circle: Action**: Open a web browser and navigate to `http://localhost:8888` to open the notebook.
* **:green_circle: Action**: Navigate to the `datasetinsights/notebooks` folder and open `Perception_Statistics.ipynb`.
* **:green_circle: Action**: In the `data_root = /data/<GUID>` line, the `<GUID>` part is the location inside your `<download path>` where the data will be downloaded. You can simply remove it so that the data is downloaded directly to the path you specified earlier:
<p align="center">
<img src="Images/di_usim_1.png" width="900"/>
</p>
* **:green_circle: Action**: In the block of code titled "Unity Simulation [Optional]", uncomment the lines that assign values to variables, and insert the correct values, based on information from your Unity Simulation run.
We have previously learned how to obtain the `run_execution_id` and `project_id`. You can remove the value already present for `annotation_definition_id` and leave it blank. What's left is the `access_token`.
* **:green_circle: Action**: Return to your command-line interface and run the `usim inspect auth` command.
macOS:
`USimCLI/mac/usim inspect auth`
If you receive errors regarding authentication, your token might have timed out. Repeat the login step (`usim login auth`) to log in again and fix the issue.
A sample output from `usim inspect auth` looks like this:
```
Protect your credentials. They may be used to impersonate your requests.
access token: Bearer 0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp
expires in: 2:00:05.236227
expired: False
refresh token: FW4c3YRD4IXi6qQHv3Y9W-rwg59K7k0Te9myKe7Zo6M003f.k4Dqo0tuoBdf-ncm003fX2RAHQ
updated: 2020-10-02 14:50:11.412979
```
The `access_token` you need for your Dataset Insights notebook is the access token shown by the above command, minus the `'Bearer '` part. So, in this case, we should copy `0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp` into the notebook.
* **:green_circle: Action**: Copy the access token excluding the `'Bearer '` part to the corresponding field in the notebook.
Once you have entered all the information, the block of code should look like the screenshot below (the actual values you input will be different):
<p align="center">
<img src="Images/di_usim_2.png" width="800"/>
</p>
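For reference, here is a rough textual sketch of what that filled-in cell might look like. Every value below is a placeholder; your actual ids and token will differ, and the exact layout of the cell may too:

```python
# Hypothetical values for the "Unity Simulation [Optional]" cell; replace each
# one with the values from your own Unity Simulation run.
run_execution_id = "Abc123Xy"                                     # obtained earlier from your run
project_id = "00000000-0000-0000-0000-000000000000"               # your Unity Cloud project id
annotation_definition_id = ""                                     # cleared, as instructed above
access_token = "0CfQbhJ6gjYIHjC6BaP5gkYn1x5xtAp7ZA9I003fTNT1sFp"  # from `usim inspect auth`, without "Bearer "
```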
* **:green_circle: Action**: Continue to the next code block and run it to download all the metadata files from the generated dataset. This includes JSON files and logs but does not include images (which will be downloaded later).
You will see a progress bar while the data downloads:
<p align="center">
<img src="Images/di_usim_3.png" width="800"/>
</p>
The next couple of code blocks (under "Load dataset metadata") analyze the downloaded metadata and display a table containing annotation-definition-ids for the various metrics defined in the dataset.
* **:green_circle: Action**: Once you reach the code block titled "Built-in Statistics", make sure the value assigned to the field `rendered_object_info_definition_id` matches the id displayed for this metric in the table output by the code block immediately before it. The screenshot below demonstrates this (note that your ids might differ from the ones here):
<p align="center">
<img src="Images/di_usim_4.png" width="800"/>
</p>
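In code terms, this check just means that the id assigned in the "Built-in Statistics" cell matches the one listed in the metadata table. A minimal sketch with a hypothetical id (the format of yours may differ):

```python
# Hypothetical id: copy the value shown for rendered object info in the table
# printed by the previous code block (yours will differ).
rendered_object_info_definition_id = "0bfbe00d-00fa-4555-88d1-471b58449f5c"
```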
Follow the rest of the steps inside the notebook to generate a variety of plots and stats.
Keep in mind that this notebook is provided just as an example, and you can modify and extend it according to your own needs using the tools provided by the [Dataset Insights framework](https://datasetinsights.readthedocs.io/en/latest/).

com.unity.perception/Documentation~/Tutorial/Images/did_visualizer_open.png (725 changes)

Width: 656 | Height: 754 | Size: 186 KiB

com.unity.perception/Documentation~/Tutorial/Images/visualizer_sample_synthdet.png (1001 changes)
The file diff is too large to display.

Some files were not shown because too many files changed in this commit.
