
[skip ci] modify learning rate in horovod optimizer

Branch: distributed-training
Anupam Bhatnagar, 5 years ago
Current commit: 5d180caf
3 changed files, with 1 addition and 3 deletions
  1. config/trainer_config.yaml (1 changed line)
  2. ml-agents/mlagents/trainers/optimizer/tf_optimizer.py (3 changed lines)

config/trainer_config.yaml


    time_horizon: 1000
    lambd: 0.99
    beta: 0.001
    max_steps: 1.0e5

3DBallHard:
    normalize: true
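The fragment above lost its original indentation: in trainer_config.yaml each top-level key (such as 3DBallHard) names a behavior, and the hyperparameters nested beneath it override the defaults. As a minimal sketch of that structure, not part of ml-agents itself, the file can be read into nested dicts with PyYAML; the path matches the file changed in this commit, everything else is illustrative.

import yaml

# Load the trainer config; the path is the file touched in this commit.
with open("config/trainer_config.yaml") as f:
    config = yaml.safe_load(f)

# Each top-level key names a behavior; its value is a dict of hyperparameters
# (e.g. time_horizon, lambd, beta, max_steps) for that behavior.
for behavior, params in config.items():
    print(behavior, params)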

ml-agents/mlagents/trainers/optimizer/tf_optimizer.py


 if hvd is not None:
     adam_optimizer = tf.train.AdamOptimizer(
-        learning_rate=learning_rate, name=name)
+        learning_rate=learning_rate * hvd.size(), name=name
+    )
     horovod_optimizer = hvd.DistributedOptimizer(adam_optimizer)
 else:
     adam_optimizer = tf.train.AdamOptimizer(
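The modified line follows the common Horovod recipe: scale the base learning rate by the number of workers (hvd.size()) and wrap the local optimizer in hvd.DistributedOptimizer so gradients are averaged across ranks. Below is a minimal, self-contained TensorFlow 1.x sketch of that pattern; the toy regression model and names such as base_lr are illustrative and not taken from ml-agents.

import numpy as np
import tensorflow as tf
import horovod.tensorflow as hvd

hvd.init()

# Toy regression problem (illustrative only).
x = tf.placeholder(tf.float32, shape=[None, 1])
y = tf.placeholder(tf.float32, shape=[None, 1])
pred = tf.layers.dense(x, 1)
loss = tf.reduce_mean(tf.square(pred - y))

# Scale the learning rate by the number of workers, as in the diff above,
# then let DistributedOptimizer average gradients across workers.
base_lr = 1e-3
adam_optimizer = tf.train.AdamOptimizer(
    learning_rate=base_lr * hvd.size(), name="Adam"
)
horovod_optimizer = hvd.DistributedOptimizer(adam_optimizer)
train_op = horovod_optimizer.minimize(loss)

# Broadcast the initial variables from rank 0 so all workers start in sync.
hooks = [hvd.BroadcastGlobalVariablesHook(0)]
with tf.train.MonitoredTrainingSession(hooks=hooks) as sess:
    for _ in range(10):
        batch_x = np.random.rand(32, 1).astype(np.float32)
        sess.run(train_op, feed_dict={x: batch_x, y: 2 * batch_x})

A script like this would typically be launched with, for example, horovodrun -np 4 python train.py, at which point hvd.size() is 4 and the effective learning rate becomes 4 * base_lr.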
