The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents.
README.md

Unity ML-Agents (Python API)

Python Setup

Requirements

  • Jupyter
  • docopt
  • Matplotlib
  • numpy
  • Pillow
  • Python (2 or 3)
  • TensorFlow (1.0+)

Installing Dependencies

To install dependencies, run:

pip install .

or

pip3 install .

If your Python environment doesn't include pip, see these instructions for installing it.

Provided Jupyter Notebooks

  • Basic - Demonstrates usage of the UnityEnvironment class for launching and interfacing with Unity environments.
  • PPO - Used for training agents. Contains an implementation of the Proximal Policy Optimization (PPO) reinforcement learning algorithm.
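As a taste of what the Basic notebook walks through, a minimal sketch of launching an environment and inspecting its brains follows. The attribute names (`brain_names`, `brains`) follow the v0.2 `unityagents` API and should be treated as assumptions; the environment path passed in is whatever build you have made.

```python
def connect_and_inspect(env_path):
    """Launch a built Unity environment and print its brain configuration.

    Assumes the `unityagents` package from this repo is installed and that
    `env_path` names a built Unity environment binary (an assumption for
    illustration; no specific build ships with the repo).
    """
    # Imported lazily: unityagents is only needed when actually connecting.
    from unityagents import UnityEnvironment

    env = UnityEnvironment(file_name=env_path)
    try:
        # Each "brain" describes the state/action setup for a set of agents.
        for name in env.brain_names:
            print(name, env.brains[name])
    finally:
        env.close()

# Usage (requires an actual Unity build on disk):
# connect_and_inspect("<env_name>")
```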

Running each notebook

To launch Jupyter, run:

jupyter notebook

Then navigate to localhost:8888 to access each training notebook.

To monitor training progress, run the following from the root directory of this repo:

tensorboard --logdir=summaries

Then navigate to localhost:6006 to monitor progress with TensorBoard.

Training PPO directly

To train using PPO without the notebook, run: python3 ppo.py <env_name> --train

Where <env_name> corresponds to the name of the built Unity environment.

For a list of additional hyperparameters, run: python3 ppo.py --help

Using Python API

See this documentation for a detailed description of the functions and uses of the Python API.
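For orientation before reading that documentation, here is a hedged sketch of the stepping loop the Python API exposes. The names used (`reset(train_mode=...)`, `step(...)`, per-brain BrainInfo objects with `agents`, `rewards`, and `local_done`, and `action_space_size` on the brain) follow the v0.2 `unityagents` API as best recalled; treat them as assumptions and defer to the linked documentation where they differ.

```python
def run_random_episode(env_path, train_mode=False, max_steps=100):
    """Step a built Unity environment with zero-valued actions.

    `env_path`, the zero action, and the attribute names below are
    illustrative assumptions; consult the Python API documentation
    for the authoritative interface.
    """
    # Imported lazily: unityagents is only needed when actually connecting.
    from unityagents import UnityEnvironment

    env = UnityEnvironment(file_name=env_path)
    try:
        brain_name = env.brain_names[0]
        brain = env.brains[brain_name]
        # reset/step return a dict of BrainInfo objects keyed by brain name.
        info = env.reset(train_mode=train_mode)[brain_name]
        total_reward = 0.0
        for _ in range(max_steps):
            # One zero-valued action vector per agent controlled by this brain.
            actions = [0.0] * (brain.action_space_size * len(info.agents))
            info = env.step(actions)[brain_name]
            total_reward += sum(info.rewards)
            if all(info.local_done):
                break
        return total_reward
    finally:
        env.close()

# Usage (requires an actual Unity build on disk):
# print(run_random_episode("<env_name>"))
```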

Training on AWS

See this related blog post for a description of how to run Unity environments on GPU-equipped AWS EC2 instances.