# Unity ML-Agents Python Interface and Trainers

The `mlagents` Python package is part of the ML-Agents Toolkit. `mlagents`
provides a Python API that allows direct interaction with the Unity game
engine, as well as a collection of trainers and algorithms to train agents in
Unity environments.
The `mlagents` Python package contains two sub-packages:

- `mlagents.envs`: A low-level API which allows you to interact directly with a
  Unity environment. See here for more information on using this package.
- `mlagents.trainers`: A set of reinforcement learning algorithms designed to
  be used with Unity environments. Access them using the `mlagents-learn`
  access point. See here for more information on using this package.
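To make the low-level API concrete, here is a minimal sketch of driving a
Unity environment from Python with `mlagents.envs`. The binary path
`./MyEnvironment` is a placeholder (you need an environment built with the
ML-Agents Toolkit), and the exact class and attribute names may vary between
package versions:

```python
import numpy as np
from mlagents.envs import UnityEnvironment

# Launch and connect to a Unity environment binary. The path is a
# placeholder; `worker_id` offsets the communication port so several
# environments can run side by side on one machine.
env = UnityEnvironment(file_name="./MyEnvironment", worker_id=0)

# Environments expose one or more "brains" that control groups of agents.
brain_name = env.brain_names[0]
brain = env.brains[brain_name]

# Reset returns a dict mapping brain names to BrainInfo objects.
info = env.reset(train_mode=True)[brain_name]
num_agents = len(info.agents)

for _ in range(10):
    # Step every agent with random continuous actions (illustration only).
    action = np.random.randn(num_agents, brain.vector_action_space_size[0])
    info = env.step(action)[brain_name]

env.close()
```

This is the kind of loop a custom training script would build on; `mlagents.trainers` wraps the same API behind the `mlagents-learn` access point.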
## Installation

Install the `mlagents` package with:

    pip install mlagents
## Usage & More Information

For more detailed documentation, check out the ML-Agents Toolkit
documentation.
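As a quick sketch, a training run with the `mlagents-learn` access point looks
like the following. The configuration path, environment path, and run id are
placeholders; consult the ML-Agents Toolkit documentation for the options
supported by your version:

```shell
# Train agents in a Unity environment using a trainer configuration file.
# All paths and the run id below are placeholders.
mlagents-learn config/trainer_config.yaml --env=./MyEnvironment \
    --run-id=first-run --train
```

Omitting `--env` makes the trainer wait for a connection from the Unity Editor instead of launching a standalone build.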