
Update docs to reflect new package installation workflow. (#3362)

- Fix old material name references.
- Update outdated code comments.
GitHub, 5 years ago
Current commit: bde6cfaf
7 files changed: 546 insertions(+), 16 deletions(-)
1. .gitignore (4 changes)
2. docs/Installation.md (30 changes)
3. docs/Learning-Environment-Create-New.md (6 changes)
4. docs/Migrating.md (2 changes)
5. ml-agents/tests/yamato/yamato_utils.py (2 changes)
6. docs/images/unity_package_json.png (224 changes)
7. docs/images/unity_package_manager_window.png (294 changes)

.gitignore (4 changes)


/envs
# Environment logfile
- *UnitySDK.log
+ *Project.log
- /UnitySDK/.vs/
+ /Project/.vs/
# Autogenerated VS/MD/Consulo solution and project files
/com.unity.ml-agentsExportedObj/

docs/Installation.md (30 changes)


to help you get started.
### Package Installation
The ML-Agents C# SDK is transitioning to a Unity Package. While we work on getting it into the official package list, you can add the `com.unity.ml-agents` package to your project by navigating to the menu `Window` -> `Package Manager`. In the Package Manager window, click the `+` button.
**NOTE:** In Unity 2018.4 it's on the bottom right of the packages list, and in Unity 2019.3 it's on the top left of the packages list.
<p align="center">
<img src="images/unity_package_manager_window.png"
alt="Unity Package Manager window"
width="500" border="10" />
</p>
Select `Add package from disk...`, navigate into the `com.unity.ml-agents` folder, and select the `package.json` file.
<img src="images/unity_package_json.png"
alt="Selecting the package.json file"
width="500" border="10" />
If you intend to copy the `com.unity.ml-agents` folder into your project instead, ensure that you have the [Barracuda preview package](https://docs.unity3d.com/Packages/com.unity.barracuda@0.3/manual/index.html) installed. To install the Barracuda package in later versions of Unity, open the Package Manager window (`Window` -> `Package Manager`), click the `Advanced` dropdown to the left of the search bar, and make sure "Show Preview Packages" is checked. Then search for or select the `Barracuda` package and install the latest version.
<img src="images/barracuda-package.png"
alt="Barracuda Package Manager"
width="710" border="10"
height="569" />
If you are going to follow the examples from our documentation, you can open the `Project`
folder in Unity and start tinkering immediately.
The `ml-agents` subdirectory contains a Python package which provides deep reinforcement
learning trainers to use with Unity environments.
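As a quick smoke test of the Python side, here is a minimal sketch that connects `mlagents_envs` to a running environment. Method names vary between releases; this follows a recent API, so check the docs for your installed version:

```python
from mlagents_envs.environment import UnityEnvironment

# file_name=None connects to the Unity Editor: launch this script, then press Play.
env = UnityEnvironment(file_name=None)
env.reset()

# Each Agent behavior in the scene is exposed under a behavior name.
behavior_name = list(env.behavior_specs)[0]
spec = env.behavior_specs[behavior_name]
print(f"Behavior: {behavior_name}, observations: {len(spec.observation_specs)}")

# DecisionSteps holds the agents requesting an action this step.
decision_steps, terminal_steps = env.get_steps(behavior_name)
print(f"{len(decision_steps)} agent(s) requesting a decision")

env.close()
```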

docs/Learning-Environment-Create-New.md (6 changes)


3. Select the Floor Plane to view its properties in the Inspector window.
4. Set Transform to Position = (0, 0, 0), Rotation = (0, 0, 0), Scale = (1, 1, 1).
5. On the Plane's Mesh Renderer, expand the Materials property and change the
- default-material to *LightGridFloorSquare* (or any suitable material of your choice).
+ default-material to *GridMatFloor* (or any suitable material of your choice).
(To set a new material, click the small circle icon next to the current material
name. This opens the **Object Picker** dialog so that you can choose a

3. Select the Target Cube to view its properties in the Inspector window.
4. Set Transform to Position = (3, 0.5, 3), Rotation = (0, 0, 0), Scale = (1, 1, 1).
5. On the Cube's Mesh Renderer, expand the Materials property and change the
- default-material to *Block*.
+ default-material to *AgentBlue*.
![The Target Cube in the Inspector window](images/mlagents-NewTutBlock.png)

3. Select the RollerAgent Sphere to view its properties in the Inspector window.
4. Set Transform to Position = (0, 0.5, 0), Rotation = (0, 0, 0), Scale = (1, 1, 1).
5. On the Sphere's Mesh Renderer, expand the Materials property and change the
- default-material to *CheckerSquare*.
+ default-material to *Checkers_Ball*.
6. Click **Add Component**.
7. Add the Physics/Rigidbody component to the Sphere.

docs/Migrating.md (2 changes)


## Migrating from 0.13 to latest
### Important changes
* The `UnitySDK` folder has been split into a Unity Package (`com.unity.ml-agents`) and an examples project (`Project`). Please follow the [Installation Guide](Installation.md) to get up and running with this new repo structure.
* Several changes were made to how agents are reset and marked as done:
* Calling `Done()` on the Agent now resets it immediately and calls the `AgentReset` virtual method. (This simplifies the previous logic, in which the Agent had to wait for the next `EnvironmentStep` to reset.)
* The "Reset on Done" setting in AgentParameters was removed; this is now effectively always true. The `AgentOnDone` virtual method on the Agent has also been removed.

* RayPerceptionSensor was inconsistent in how it handled scale on the Agent's transform. It now scales the ray length and sphere size for casting as the transform's scale changes.
### Steps to Migrate
* Follow the instructions on how to install the `com.unity.ml-agents` package into your project in the [Installation Guide](Installation.md).
* If your Agent implemented `AgentOnDone` and did not have the checkbox `Reset On Done` checked in the inspector, you must call the code that was in `AgentOnDone` manually.
* If you give your Agent a reward or penalty at the end of an episode (e.g., for reaching a goal or falling off a platform), make sure you call `AddReward()` or `SetReward()` *before* calling `Done()`. Previously, the order didn't matter.
* If you were not using `On Demand Decision` for your Agent, you **must** add a `DecisionRequester` component to your Agent GameObject and set its `Decision Period` field to the old `Decision Period` of the Agent.

ml-agents/tests/yamato/yamato_utils.py (2 changes)


def run_standalone_build(base_path: str, verbose: bool = False) -> int:
"""
- Run BuildStandalonePlayerOSX test to produce a player at UnitySDK/testPlayer
+ Run BuildStandalonePlayerOSX test to produce a player at Project/testPlayer
:param base_path:
:return:
"""

docs/images/unity_package_json.png (224 changes)

Width: 601 | Height: 449 | Size: 72 KiB

docs/images/unity_package_manager_window.png (294 changes)

Width: 700 | Height: 465 | Size: 99 KiB