The output dataset includes a variety of information about the active sensors in the Scene (currently only one), as well as the ground truth generated by all active Labelers. [This page](https://github.com/Unity-Technologies/com.unity.perception/blob/master/com.unity.perception/Documentation%7E/Schema/Synthetic_Dataset_Schema.md) provides a comprehensive explanation of this dataset's schema. We strongly recommend having a look at that page once you have completed this tutorial.
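The ***Action*** step below walks through opening one of these JSON files by hand. If you would rather inspect the output programmatically, here is a minimal sketch (not part of the Perception package itself). It assumes Newtonsoft.Json is available in your project (for example via the `com.unity.nuget.newtonsoft-json` package), that `datasetFolder` is a placeholder for the path of the folder whose name starts with `Dataset`, and that the field names match the schema document linked above; verify them against your own output files.

```csharp
// Sketch: print the 2D bounding box annotations stored in captures_000.json.
// "datasetFolder" is a placeholder for the folder whose name starts with "Dataset".
using System.IO;
using Newtonsoft.Json.Linq;
using UnityEngine;

public static class DatasetPeek
{
    public static void PrintBoundingBoxes(string datasetFolder)
    {
        var root = JObject.Parse(File.ReadAllText(Path.Combine(datasetFolder, "captures_000.json")));

        foreach (var capture in (JArray)root["captures"])
        {
            // Pose of the sensor (camera) for this capture
            Debug.Log($"sensor translation: {capture["sensor"]["translation"]}");

            foreach (var annotation in (JArray)capture["annotations"])
            {
                // Bounding box annotations store their data in a "values" array
                if (annotation["values"] is JArray values)
                {
                    foreach (var value in values)
                    {
                        Debug.Log($"label_id={value["label_id"]}, label_name={value["label_name"]}, " +
                                  $"box=({value["x"]}, {value["y"]}, {value["width"]}, {value["height"]})");
                    }
                }
            }
        }
    }
}
```

Calling `DatasetPeek.PrintBoundingBoxes` with the path of your `Dataset...` folder should log one line per bounding box, mirroring the fields described in the Action step below.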
> :information_source: Are the RGB images blank? This may be a bug: when using URP on macOS, having MSAA enabled on the camera can cause the output RGB images to be blank. As a workaround, you can disable MSAA and use FXAA instead until the issue is fixed. To do this, select `Main Camera`, and in the ***Inspector*** view of the Camera component, set MSAA to `Off` in the ***Output*** section. If you would like to use FXAA, set the Anti-aliasing option in the ***Rendering*** section to `Fast Approximate Anti-aliasing (FXAA)`.
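The note above applies the workaround through the Inspector. If you prefer to apply it from a script instead, a rough sketch along the following lines should work; it assumes the standard URP camera API (`Camera.allowMSAA` and `UniversalAdditionalCameraData.antialiasing`), so double-check the property names against your URP version.

```csharp
// Sketch: apply the MSAA/FXAA workaround from code rather than the Inspector.
// Assumes the Universal Render Pipeline package is installed.
using UnityEngine;
using UnityEngine.Rendering.Universal;

public class DisableMsaaWorkaround : MonoBehaviour
{
    void Awake()
    {
        var cam = GetComponent<Camera>();

        // Equivalent of setting MSAA to Off in the camera's Output section
        cam.allowMSAA = false;

        // Equivalent of selecting Fast Approximate Anti-aliasing (FXAA) in the Rendering section
        var cameraData = cam.GetUniversalAdditionalCameraData();
        cameraData.antialiasing = AntialiasingMode.FastApproximateAntialiasing;
    }
}
```

Attaching this component to `Main Camera` applies the same settings at runtime that the Inspector steps above set in the Editor.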
* **:green_circle: Action**: To get a quick feel for how the data is stored, open the folder whose name starts with `Dataset`, then open the file named `captures_000.json`. This file contains the output from `BoundingBox2DLabeler`. The `captures` array contains the position and rotation of the sensor (camera), the position and rotation of the ego (the sensor group, currently only one), and the annotations made by `BoundingBox2DLabeler` for all visible objects defined in its label configuration. For each visible object, the annotations include:
* `label_id`: The numerical id assigned to this object's label in the Labeler's label configuration