The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents.

Memory-enhanced agents using Recurrent Neural Networks

How to use

When configuring the trainer parameters in the config/trainer_config.yaml file, add the following parameters to the Behavior you want to use.

```yaml
use_recurrent: true
sequence_length: 64
memory_size: 256
```
  • use_recurrent is a flag that notifies the trainer that you want to use a Recurrent Neural Network.
  • sequence_length defines how long the sequences of experiences must be while training. To use an LSTM, training requires sequences of experiences instead of single experiences.
  • memory_size corresponds to the size of the memory the agent must keep. If this number is too small, the agent will not be able to remember much; if it is too large, the neural network will take longer to train.
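
Put together, a Behavior entry in config/trainer_config.yaml might look like the sketch below. The behavior name MyBehavior and the hyperparameter values shown are illustrative placeholders, not required settings; only the three recurrent keys come from this document.

```yaml
MyBehavior:                # hypothetical behavior name
    trainer: ppo
    num_layers: 1          # fewer layers are often used alongside a recurrent layer
    use_recurrent: true
    sequence_length: 64
    memory_size: 256
```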

Limitations

  • LSTM does not work well with continuous vector action spaces. Please use a discrete vector action space for better results.
  • Since the memories must be sent back and forth between Python and Unity, using too large memory_size will slow down training.
  • Adding a recurrent layer increases the complexity of the neural network; it is recommended to decrease num_layers when using recurrent networks.
  • It is required that memory_size be divisible by 4.
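
The numeric constraints above can be sanity-checked before launching a run. The helper below is a hypothetical sketch, not part of the ML-Agents API; it only encodes the rules stated in this document.

```python
def validate_recurrent_config(use_recurrent, sequence_length, memory_size):
    """Check recurrent trainer settings against the documented constraints.

    Hypothetical helper for illustration; ML-Agents performs its own
    validation internally.
    """
    if not use_recurrent:
        return  # nothing to check when the recurrent network is disabled
    if memory_size % 4 != 0:
        # The document requires memory_size to be divisible by 4.
        raise ValueError(f"memory_size must be divisible by 4, got {memory_size}")
    if sequence_length < 1:
        raise ValueError(f"sequence_length must be positive, got {sequence_length}")


# The values from the example config above pass the check.
validate_recurrent_config(True, sequence_length=64, memory_size=256)
```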