
Merge branch 'master' into tutorial_sample_project

/main
Mohsen Kamalzadeh, 4 years ago
Current commit
f55ee988
30 files changed: 253 insertions, 107 deletions
  1. 5
      com.unity.perception/CHANGELOG.md
  2. 10
      com.unity.perception/Documentation~/DatasetCapture.md
  3. 55
      com.unity.perception/Documentation~/PerceptionCamera.md
  4. 2
      com.unity.perception/Documentation~/Randomization/Index.md
  5. 18
      com.unity.perception/Documentation~/TableOfContents.md
  6. 30
      com.unity.perception/Documentation~/index.md
  7. 13
      com.unity.perception/Runtime/GroundTruth/Labelers/ObjectCountLabeler.cs
  8. 8
      com.unity.perception/Runtime/GroundTruth/Labelers/RenderedObjectInfoLabeler.cs
  9. 10
      com.unity.perception/Runtime/GroundTruth/Labelers/Visualization/HUDPanel.cs
  10. 2
      com.unity.perception/Runtime/Randomization/Parameters/CategoricalParameter.cs
  11. 2
      com.unity.perception/Runtime/Randomization/Parameters/Parameter.cs
  12. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/BooleanParameter.cs
  13. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/ColorParameters/ColorHsvaParameter.cs
  14. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/ColorParameters/ColorRgbParameter.cs
  15. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/FloatParameter.cs
  16. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/IntegerParameter.cs
  17. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector2Parameter.cs
  18. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector3Parameter.cs
  19. 2
      com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector4Parameter.cs
  20. 11
      com.unity.perception/Runtime/Randomization/Randomizers/Randomizer.cs
  21. 1
      com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Tags/ColorRandomizerTag.cs
  22. 8
      com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/RotationRandomizer.cs
  23. 2
      com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/ColorRandomizer.cs
  24. 3
      com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/BackgroundObjectPlacementRandomizer.cs
  25. 9
      com.unity.perception/Runtime/Randomization/Scenarios/ScenarioBase.cs
  26. 28
      com.unity.perception/Documentation~/GroundTruthLabeling.md
  27. 102
      com.unity.perception/Documentation~/images/SemanticSegmentationLabelConfig.png
  28. 25
      com.unity.perception/Documentation~/GroundTruth-Labeling.md
  29. 0
      /com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples
  30. 0
      /com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples.meta

5
com.unity.perception/CHANGELOG.md


### Fixed
Fixed visualization issue where object count and pixel count labelers showed stale values
Fixed visualization issue where HUD entry labels could be too long and take up the entire panel
## [0.4.0-preview.1] - 2020-08-07
### Added

### Removed
### Fixed
Fixed 2d bounding boxes being reported for objects that do not match the label config.
Fixed a categorical parameter UI error in which deleting an individual option removed it from the UI but serialized it as null instead of removing it

Changed minimum Unity Editor version to 2019.4
### Fixed
Fixed compilation warnings with latest com.unity.simulation.core package.
Fixed errors in example script when exiting play mode

10
com.unity.perception/Documentation~/DatasetCapture.md


## Sensor scheduling
While sensors are registered, `DatasetCapture` ensures that frame timing is deterministic and that frames run at the appropriate simulation times to let each sensor run at its own rate.
Using [Time.CaptureDeltaTime](https://docs.unity3d.com/ScriptReference/Time-captureDeltaTime.html), it also decouples wall clock time from simulation time, allowing the simulation to run as fast as possible.
Custom sensors can be registered using `DatasetCapture.RegisterSensor()`. The `period` passed in at registration time determines how often in simulation time frames should be scheduled for the sensor to run. The sensor implementation would then check `ShouldCaptureThisFrame` on the returned `SensorHandle` each frame to determine whether it is time for the sensor to perform a capture. `SensorHandle.ReportCapture` should then be called in each of these frames to report the state of the sensor to populate the dataset.
You can register custom sensors using `DatasetCapture.RegisterSensor()`. The `period` you pass in at registration time determines how often (in simulation time) frames should be scheduled for the sensor to run. The sensor implementation then checks `ShouldCaptureThisFrame` on the returned `SensorHandle` each frame to determine whether it is time for the sensor to perform a capture. `SensorHandle.ReportCapture` should then be called in each of these frames to report the state of the sensor to populate the dataset.
In addition to the common annotations and metrics produced by [PerceptionCamera](PerceptionCamera.md), scripts can produce their own via `DatasetCapture`. Annotation and metric definitions must first be registered using `DatasetCapture.RegisterAnnotationDefinition()` or `DatasetCapture.RegisterMetricDefinition()`. These return `AnnotationDefinition` and `MetricDefinition` instances which can then be used to report values during runtime.
In addition to the common annotations and metrics produced by [PerceptionCamera](PerceptionCamera.md), scripts can produce their own via `DatasetCapture`. You must first register annotation and metric definitions using `DatasetCapture.RegisterAnnotationDefinition()` or `DatasetCapture.RegisterMetricDefinition()`. These return `AnnotationDefinition` and `MetricDefinition` instances which you can then use to report values during runtime.
Annotations and metrics are always associated with the frame they are reported in. They may also be associated with a specific sensor by using the `Report*` methods on `SensorHandle`.
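
The custom-sensor flow described above can be sketched as follows. This is a hypothetical sketch, not verbatim package code: `RegisterEgo`, the exact `RegisterSensor` signature, and the capture-reporting arguments vary between Perception versions, so the report call is left as a comment.

```csharp
using UnityEngine;
using UnityEngine.Perception.GroundTruth;

// Hypothetical sketch of a custom sensor; signatures are assumptions
// and differ between Perception package versions.
public class CustomSensorExample : MonoBehaviour
{
    SensorHandle m_SensorHandle;

    void Start()
    {
        var ego = DatasetCapture.RegisterEgo("Ego");
        // `period` is the simulation-time interval between scheduled captures.
        m_SensorHandle = DatasetCapture.RegisterSensor(
            ego, "custom sensor", "An example sensor", period: 0.1f, firstCaptureTime: 0f);
    }

    void Update()
    {
        // Only capture on the frames DatasetCapture has scheduled for this sensor.
        if (!m_SensorHandle.ShouldCaptureThisFrame)
            return;
        // Gather the sensor's state for this frame, then report it, e.g.:
        // m_SensorHandle.ReportCapture(...);
    }
}
```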

55
com.unity.perception/Documentation~/PerceptionCamera.md


# The Perception Camera component
The Perception Camera component ensures the attached [Camera](https://docs.unity3d.com/Manual/class-Camera.html) runs at deterministic rates and captures RGB and other Camera-related ground truth to the [JSON dataset](Schema/Synthetic_Dataset_Schema.md) using [DatasetCapture](DatasetCapture.md). It supports HDRP and URP.
The Perception Camera component ensures that the [Camera](https://docs.unity3d.com/Manual/class-Camera.html) runs at deterministic rates. It also ensures that the Camera uses [DatasetCapture](DatasetCapture.md) to capture RGB and other Camera-related ground truth in the [JSON dataset](Schema/Synthetic_Dataset_Schema.md). You can use the Perception Camera component on the High Definition Render Pipeline (HDRP) or the Universal Render Pipeline (URP).
<img src="images/PerceptionCamera.png" align="middle"/>
![Perception Camera component](images/PerceptionCameraFinished.png)
<br/>_Perception Camera component_
| Description | A description of the camera to be registered in the JSON dataset. |
| Period | The amount of simulation time in seconds between frames for this camera. For more on sensor scheduling, see [DatasetCapture](DatasetCapture.md). |
| Start Time | The simulation time at which to run the first frame. This time will offset the period, useful for allowing multiple cameras to run at the right times relative to each other. |
| Capture Rgb Images | When this is checked, RGB images will be captured as PNG files in the dataset each frame. |
| Camera Labelers | A list of labelers which generate data derived from this camera. |
| Description | A description of the Camera to be registered in the JSON dataset. |
| Period | The amount of simulation time in seconds between frames for this Camera. For more information on sensor scheduling, see [DatasetCapture](DatasetCapture.md). |
| Start Time | The simulation time at which to run the first frame. This time offsets the period, which allows multiple Cameras to run at the correct times relative to each other. |
| Capture Rgb Images | When you enable this property, Unity captures RGB images as PNG files in the dataset each frame. |
| Camera Labelers | A list of labelers that generate data derived from this Camera. |
## Camera Labelers
Camera Labelers capture data related to the camera into the JSON dataset. This data can be used for model training or for dataset statistics. A number of Camera Labelers are provided with Perception, and additional labelers can be defined by deriving from the CameraLabeler class.
## Camera labelers
Camera labelers capture data related to the Camera in the JSON dataset. You can use this data to train models and for dataset statistics. The Perception package provides several Camera labelers, and you can derive from the CameraLabeler class to define more labelers.
### Semantic Segmentation Labeler
<img src="images/semantic_segmentation.png" align="middle"/>
### SemanticSegmentationLabeler
![Example semantic segmentation image from a modified SynthDet project](images/semantic_segmentation.png)
<br/>_Example semantic segmentation image from a modified SynthDet project_
_Example semantic segmentation image from a modified [SynthDet](https://github.com/Unity-Technologies/SynthDet) project_
The SemanticSegmentationLabeler generates a 2D RGB image with the attached Camera. Unity draws objects in the color you associate with the label in the SemanticSegmentationLabelingConfiguration. If Unity can't find a label for an object, it draws it in black.
The Semantic Segmentation Labeler generates a 2D RGB image using the attached camera where objects are drawn with the color associated with their label in the provided SemanticSegmentationLabelConfig. If no label is resolved for an object, it is drawn black.
### Bounding Box 2D Labeler
<img src="images/bounding_boxes.png" align="middle"/>
_example bounding box visualization from [SynthDet](https://github.com/Unity-Technologies/SynthDet) generated by the `SynthDet_Statistics` jupyter notebook_
### BoundingBox2DLabeler
![Example bounding box visualization from SynthDet generated by the `SynthDet_Statistics` Jupyter notebook](images/bounding_boxes.png)
<br/>_Example bounding box visualization from SynthDet generated by the `SynthDet_Statistics` Jupyter notebook_
The Bounding Box 2D Labeler produces 2D bounding boxes for each visible object with a label resolved by the given ID Label Config. Bounding boxes are calculated using the rendered image, so only occluded or out-of-frame portions of the objects are not included.
The BoundingBox2DLabeler produces 2D bounding boxes for each visible object with a label you define in the IdLabelConfig. Unity calculates bounding boxes using the rendered image, so it only excludes occluded or out-of-frame portions of the objects.
### Object Count Labeler
### ObjectCountLabeler
```
{
"label_id": 25,

```
_Example object count for a single label_
The Object Count Labeler records object counts for each label in the provided ID Label Config. Only objects with at least one visible pixel in the camera frame will be recorded.
The ObjectCountLabeler records object counts for each label you define in the IdLabelConfig. Unity only records objects that have at least one visible pixel in the Camera frame.
### Rendered Object Info Labeler
### RenderedObjectInfoLabeler
```
{
"label_id": 24,

```
_Example rendered object info for a single object_
The Rendered Object Info Labeler records a list of all objects visible in the camera image, including its instance id, resolved label id and visible pixels. Objects not resolved to a label in the given ID Label Config are not recorded.
The RenderedObjectInfoLabeler records a list of all objects visible in the Camera image, including each object's instance ID, resolved label ID, and visible pixel count. If Unity cannot resolve an object to a label in the IdLabelConfig, it does not record that object.
Ground truth is not compatible with all rendering features, especially ones that modify the visibility or shape of objects in the frame.
Ground truth is not compatible with all rendering features, especially those that modify the visibility or shape of objects in the frame.
* Vertex and geometry shaders are not run
* Transparency is not considered. All geometry is considered opaque
* Besides built-in Lens Distortion in URP and HDRP, post-processing effects are not run
* Unity does not run Vertex and geometry shaders
* Unity does not consider transparency and considers all geometry opaque
* Unity does not run post-processing effects, except built-in lens distortion in URP and HDRP
If you encounter additional incompatibilities, please open an [issue](https://github.com/Unity-Technologies/com.unity.perception/issues)
If you discover more incompatibilities, please open an issue in the [Perception GitHub repository](https://github.com/Unity-Technologies/com.unity.perception/issues).

2
com.unity.perception/Documentation~/Randomization/Index.md


# Overview
*NOTICE: The perception randomization toolset is currently marked as experimental and will experience a number of updates in the near future.*
*NOTE: The Perception package's randomization toolset is currently marked as experimental and is subject to change.*
The randomization toolset simplifies randomizing aspects of generating synthetic data. It facilitates exposing parameters for randomization, offers samplers to pick random values from parameters, and provides scenarios to coordinate a full randomization process. Each of these also allows for custom implementations to fit particular randomization needs.
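
As a minimal sketch of how these pieces fit together, a parameter configured with a sampler can be used to draw random values. This assumes the experimental namespaces that appear elsewhere in this changeset, and that `FloatParameter` exposes a `value` sampler field and a `Sample()` method, mirroring the sample randomizers in this commit; the field name and range are arbitrary examples.

```csharp
using UnityEngine;
using UnityEngine.Experimental.Perception.Randomization.Parameters;
using UnityEngine.Experimental.Perception.Randomization.Samplers;

// Hypothetical sketch: a parameter whose sampler draws uniformly from [0, 10).
public class ScaleExample : MonoBehaviour
{
    public FloatParameter scale = new FloatParameter { value = new UniformSampler(0f, 10f) };

    void Start()
    {
        // Each call to Sample() draws a new value from the configured sampler.
        transform.localScale = Vector3.one * scale.Sample();
    }
}
```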

18
com.unity.perception/Documentation~/TableOfContents.md


* [Installation Instructions](SetupSteps.md)
* [Getting Started](GettingStarted.md)
* [Labeling](GroundTruth-Labeling.md)
* [Perception Camera](PerceptionCamera.md)
* [Dataset Capture](DatasetCapture.md)
* [Randomization](Randomization/Index.md)
* [Unity Perception Package](index.md)
* [Installation instructions](SetupSteps.md)
* [Getting started](GettingStarted.md)
* [Labeling](GroundTruthLabeling.md)
* [Perception Camera](PerceptionCamera.md)
* [Dataset capture](DatasetCapture.md)
* [Randomization](Randomization/index.md)
* [Parameters](Randomization/Parameters.md)
* [Samplers](Randomization/Samplers.md)
* [Scenarios](Randomization/Scenarios.md)
* [Tutorial](Randomization/Tutorial.md)

30
com.unity.perception/Documentation~/index.md


<img src="images/banner2.PNG" align="middle"/>
# Unity Perception package (com.unity.perception)
The Perception package provides a toolkit for generating large-scale datasets for perception-based machine learning training and validation. It is focused on capturing ground truth for camera-based use cases for now and will ultimately expand to other forms of sensors and machine learning tasks.
> The Perception package is in active development. Its features and API are subject to significant change as development progresses.
The Perception package provides a toolkit for generating large-scale datasets for perception-based machine learning training and validation. It is focused on capturing ground truth for Camera-based use cases. In the future, the Perception package will include other types of sensors and machine learning tasks.
## Preview package
This package is available as a preview, so it is not ready for production use. The features and documentation in this package might change before it is verified for release.
## Example projects using Perception

[SynthDet](https://github.com/Unity-Technologies/SynthDet) is an end-to-end solution for training a 2d object detection model using synthetic data.
[SynthDet](https://github.com/Unity-Technologies/SynthDet) is an end-to-end solution for training a 2D object detection model using synthetic data.
### Unity Simulation Smart Camera Example
### Unity Simulation Smart Camera example
The [Unity Simulation Smart Camera Example](https://github.com/Unity-Technologies/Unity-Simulation-Smart-Camera-Outdoor) illustrates how Perception could be used in a smart city or autonomous vehicle simulation. Datasets can be generated locally or at scale in [Unity Simulation](https://unity.com/products/unity-simulation).
The [Unity Simulation Smart Camera Example](https://github.com/Unity-Technologies/Unity-Simulation-Smart-Camera-Outdoor) illustrates how Perception could be used in a smart city or autonomous vehicle simulation. You can generate datasets locally or at scale in [Unity Simulation](https://unity.com/products/unity-simulation).
|Feature|Description
|Feature|Description|
|[Labeling](GroundTruth-Labeling.md)|Component which marks a GameObject and its descendants with a set of labels|
|[LabelConfig](GroundTruth-Labeling.md#LabelConfig)|Asset which defines a taxonomy of labels for ground truth generation|
|[Perception Camera](PerceptionCamera.md)|Captures RGB images and ground truth from a [Camera](https://docs.unity3d.com/Manual/class-Camera.html)|
|[DatasetCapture](DatasetCapture.md)|Ensures sensors are triggered at proper rates and accepts data for the JSON dataset|
|[Randomization (Experimental)](Randomization/Index.md)|Integrate domain randomization principles into your simulation|
|[Labeling](GroundTruth-Labeling.md)|A component that marks a GameObject and its descendants with a set of labels|
|[LabelConfig](GroundTruth-Labeling.md#LabelConfig)|An asset that defines a taxonomy of labels for ground truth generation|
|[Perception Camera](PerceptionCamera.md)|Captures RGB images and ground truth from a [Camera](https://docs.unity3d.com/Manual/class-Camera.html).|
|[DatasetCapture](DatasetCapture.md)|Ensures sensors are triggered at proper rates and accepts data for the JSON dataset.|
|[Randomization (Experimental)](Randomization/Index.md)|The Randomization tool set lets you integrate domain randomization principles into your simulation.|
## Known Issues
## Known issues
* The Linux Editor 2019.4.7f1 and 2019.4.8f1 have been found to hang when importing HDRP-based perception projects. For Linux Editor support, use 2019.4.6f1 or 2020.1
* The Linux Editor 2019.4.7f1 and 2019.4.8f1 might hang when importing HDRP-based Perception projects. For Linux Editor support, use 2019.4.6f1 or 2020.1

13
com.unity.perception/Runtime/GroundTruth/Labelers/ObjectCountLabeler.cs


if (m_ClassCountValues == null || m_ClassCountValues.Length != entries.Count)
m_ClassCountValues = new ClassCountValue[entries.Count];
bool visualize = visualizationEnabled;
var visualize = visualizationEnabled;
if (visualize)
{
// Clear out all of the old entries...
hudPanel.RemoveEntries(this);
}
for (var i = 0; i < entries.Count; i++)
{
m_ClassCountValues[i] = new ClassCountValue()

count = counts[i]
};
if (visualize)
// Only display entries with a count greater than 0
if (visualize && counts[i] > 0)
}
}

8
com.unity.perception/Runtime/GroundTruth/Labelers/RenderedObjectInfoLabeler.cs


if (m_VisiblePixelsValues == null || m_VisiblePixelsValues.Length != renderedObjectInfos.Length)
m_VisiblePixelsValues = new RenderedObjectInfoValue[renderedObjectInfos.Length];
bool visualize = visualizationEnabled;
var visualize = visualizationEnabled;
if (visualize)
{
// Clear out all of the old entries...
hudPanel.RemoveEntries(this);
}
for (var i = 0; i < renderedObjectInfos.Length; i++)
{
var objectInfo = renderedObjectInfos[i];

10
com.unity.perception/Runtime/GroundTruth/Labelers/Visualization/HUDPanel.cs


using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using UnityEngine;
using UnityEngine.UI;

const int k_YPadding = 10;
const int k_BoxWidth = 200;
const int k_YLineSpacing = 4;
const int k_MaxKeyLength = 20;
/// <summary>
/// The number of labelers currently displaying real-time information on the visualization HUD

GUI.skin.label.font = Resources.Load<Font>("Inter-Light");
GUI.skin.label.padding = new RectOffset(0, 0, 1, 1);
GUI.skin.label.margin = new RectOffset(0, 0, 1, 1);
GUI.skin.label.wordWrap = false;
GUI.skin.label.clipping = TextClipping.Clip;
GUI.skin.box.padding = new RectOffset(5, 5, 5, 5);
GUI.skin.toggle.margin = new RectOffset(0, 0, 0, 0);
GUI.skin.horizontalSlider.margin = new RectOffset(0, 0, 0, 0);

{
GUILayout.BeginHorizontal();
GUILayout.Space(5);
GUILayout.Label(entry.Key);
var k = new StringBuilder(entry.Key.Substring(0, Math.Min(entry.Key.Length, k_MaxKeyLength)));
if (k.Length != entry.Key.Length)
k.Append("...");
GUILayout.Label(k.ToString());
GUILayout.FlexibleSpace();
GUILayout.Label(entry.Value);
GUILayout.EndHorizontal();

2
com.unity.perception/Runtime/Randomization/Parameters/CategoricalParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get { yield return m_Sampler; }
}

2
com.unity.perception/Runtime/Randomization/Parameters/Parameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public abstract IEnumerable<ISampler> samplers { get; }
internal abstract IEnumerable<ISampler> samplers { get; }
/// <summary>
/// Constructs a new parameter

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/BooleanParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get { yield return value; }
}

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/ColorParameters/ColorHsvaParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get
{

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/ColorParameters/ColorRgbParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get
{

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/FloatParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get { yield return value; }
}

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/IntegerParameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get { yield return value; }
}

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector2Parameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get
{

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector3Parameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get
{

2
com.unity.perception/Runtime/Randomization/Parameters/ParameterTypes/NumericParameters/Vector4Parameter.cs


/// <summary>
/// Returns an IEnumerable that iterates over each sampler field in this parameter
/// </summary>
public override IEnumerable<ISampler> samplers
internal override IEnumerable<ISampler> samplers
{
get
{

11
com.unity.perception/Runtime/Randomization/Randomizers/Randomizer.cs


using System;
using System.Collections.Generic;
using System.ComponentModel;
using UnityEngine.Experimental.Perception.Randomization.Parameters;
using UnityEngine.Experimental.Perception.Randomization.Scenarios;

{
bool m_PreviouslyEnabled;
// ReSharper disable once InconsistentNaming
internal ScenarioBase m_Scenario;
ScenarioBase m_Scenario;
internal RandomizerTagManager m_TagManager;
RandomizerTagManager m_TagManager;
[HideInInspector, SerializeField] internal bool collapsed;

/// OnUpdate is executed every frame for enabled Randomizers
/// </summary>
protected virtual void OnUpdate() { }
internal void Initialize(ScenarioBase parentScenario, RandomizerTagManager parentTagManager)
{
m_Scenario = parentScenario;
m_TagManager = parentTagManager;
}
internal virtual void Create()
{

1
com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Tags/ColorRandomizerTag.cs


/// <summary>
/// Used in conjunction with a ColorRandomizer to vary the material color of GameObjects
/// </summary>
[RequireComponent(typeof(Renderer))]
[AddComponentMenu("Perception/RandomizerTags/Color Randomizer Tag")]
public class ColorRandomizerTag : RandomizerTag { }
}

8
com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/RotationRandomizer.cs


using System;
using UnityEngine.Experimental.Perception.Randomization.Parameters;
using UnityEngine.Experimental.Perception.Randomization.Randomizers.SampleRandomizers.Tags;
using UnityEngine.Experimental.Perception.Randomization.Samplers;
namespace UnityEngine.Experimental.Perception.Randomization.Randomizers.SampleRandomizers
{

/// <summary>
/// Defines the range of random rotations that can be assigned to tagged objects
/// </summary>
public Vector3Parameter rotation = new Vector3Parameter();
public Vector3Parameter rotation = new Vector3Parameter
{
x = new UniformSampler(0, 360),
y = new UniformSampler(0, 360),
z = new UniformSampler(0, 360)
};
/// <summary>
/// Randomizes the rotation of tagged objects at the start of each scenario iteration

2
com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/ColorRandomizer.cs


var taggedObjects = tagManager.Query<ColorRandomizerTag>();
foreach (var taggedObject in taggedObjects)
{
var renderer = taggedObject.GetComponent<MeshRenderer>();
var renderer = taggedObject.GetComponent<Renderer>();
renderer.material.SetColor(k_BaseColor, colorParameter.Sample());
}
}

3
com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples/Randomizers/BackgroundObjectPlacementRandomizer.cs


/// </summary>
protected override void OnIterationStart()
{
m_SpawnedObjects ??= new List<GameObject>();
if (m_SpawnedObjects == null)
m_SpawnedObjects = new List<GameObject>();
for (var i = 0; i < layerCount; i++)
{

9
com.unity.perception/Runtime/Randomization/Scenarios/ScenarioBase.cs


activeScenario = this;
OnAwake();
foreach (var randomizer in m_Randomizers)
{
randomizer.m_Scenario = this;
randomizer.m_TagManager = tagManager;
}
randomizer.Initialize(this, tagManager);
ValidateParameters();
}
void OnEnable()

$"Two Randomizers of the same type ({randomizerType.Name}) cannot both be active simultaneously");
var newRandomizer = (Randomizer)Activator.CreateInstance(randomizerType);
m_Randomizers.Add(newRandomizer);
newRandomizer.m_Scenario = this;
newRandomizer.m_TagManager = tagManager;
newRandomizer.Initialize(this, tagManager);
newRandomizer.Create();
return newRandomizer;
}

28
com.unity.perception/Documentation~/GroundTruthLabeling.md


# Labeling
Many labelers require mapping the objects in the view to the values recorded in the dataset. As an example, Semantic Segmentation needs to determine the color to draw each object in the segmentation image.
This mapping is accomplished for a GameObject by:
* Finding the nearest Labeling component attached to the object or its parents.
* Finding the first label in the Labeling component that is present anywhere in the Labeler's Label Config.
Unity uses the resolved Label Entry from the Label Config to produce the final output.
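
The label-matching step can be sketched in plain C# (illustrative only, assuming simple string collections; the package's actual implementation differs):

```csharp
using System.Collections.Generic;
using System.Linq;

static class LabelResolutionSketch
{
    // Returns the first label on the object's Labeling component that also
    // appears in the Labeler's Label Config, or null if none resolves
    // (e.g. Semantic Segmentation then draws the object black).
    public static string Resolve(IReadOnlyList<string> objectLabels, ISet<string> configLabels)
    {
        return objectLabels.FirstOrDefault(configLabels.Contains);
    }
}
```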
## Labeling component
The Labeling component associates a list of string-based labels with a GameObject and its descendants. A Labeling component on a descendant overrides its parent's labels.
## Label Config
Many labelers require a Label Config asset. This asset specifies a list of all labels to be captured in the dataset along with extra information used by the various labelers.
## Best practices
Generally algorithm testing and training requires a single label on an asset for proper identification such as "chair", "table" or "door". To maximize asset reuse, however, it is useful to give each object multiple labels in a hierarchy.
For example, you could label an asset representing a box of Rice Krispies as `food\cereal\kellogs\ricekrispies`
* "food": type
* "cereal": subtype
* "kellogs": main descriptor
* "ricekrispies": sub descriptor
If the goal of the algorithm is to identify all objects in a Scene that are "food", that label is available and can be used. Conversely if the goal is to identify only Rice Krispies cereal within a Scene that label is also available. Depending on the goal of the algorithm, you can use any mix of labels in the hierarchy.

102
com.unity.perception/Documentation~/images/SemanticSegmentationLabelConfig.png

Width: 454  |  Height: 245  |  Size: 30 KiB

25
com.unity.perception/Documentation~/GroundTruth-Labeling.md


# Labeling
Many labelers require mapping the objects in the view to the values recorded in the dataset. As an example, Semantic Segmentation needs to determine the color to draw each object in the segmentation image.
This mapping is accomplished for a GameObject by:
* Finding the nearest Labeling component attached to the object or its ancestors.
* Find the first label in the Labeling which is present anywhere in the Labeler's Label Config.
* The resolved Label Entry from the Label Config is used to produce the final output.
## Labeling component
The `Labeling` component associates a list of string-based labels with a GameObject and its descendants. A `Labeling` component on a descendant overrides its parent's labels.
## Label Config
Many labelers require a `Label Config` asset. This asset specifies a list of all labels to be captured in the dataset along with extra information used by the various labelers.
## Best practices
Generally algorithm testing and training requires a single label on an asset for proper identification such as “chair”, “table”, or “door". To maximize asset reuse, however, it is useful to give each object multiple labels in a hierarchy.
For example, an asset representing a box of Rice Krispies cereal could be labeled as `food\cereal\kellogs\ricekrispies`
* “food” - type
* “cereal” - subtype
* “kellogs” - main descriptor
* “ricekrispies” - sub descriptor
If the goal of the algorithm is to identify all objects in a scene that is “food” that label is available and can be used. Conversely if the goal is to identify only Rice Krispies cereal within a scene that label is also available. Depending on the goal of the algorithm any mix of labels in the hierarchy can be used.

/com.unity.perception/Runtime/Randomization/Randomizers/SampleRandomizers → /com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples

/com.unity.perception/Runtime/Randomization/Randomizers/SampleRandomizers.meta → /com.unity.perception/Runtime/Randomization/Randomizers/RandomizerExamples.meta
