Fix PPO optimizer creation

/develop/nopreviousactions
Ervin Teng, 5 years ago
Commit 30e4424c
1 changed file, with 1 addition and 1 deletion
ml-agents/mlagents/trainers/ppo/optimizer.py (+1, -1)

     )
     def create_ppo_optimizer(self):
-        self.tf_optimizer = self.create_tf_optimizer(self.learning_rate)
+        self.tf_optimizer = self.create_optimizer_op(self.learning_rate)
         self.grads = self.tf_optimizer.compute_gradients(self.loss)
         self.update_batch = self.tf_optimizer.minimize(self.loss)
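The one-line fix swaps the name of the helper that builds the optimizer. A minimal, library-free sketch (hypothetical classes, not the actual ml-agents code) of why a call to a renamed method breaks and how the fix resolves it:

```python
class BaseOptimizer:
    # Hypothetical stand-in for the base optimizer class: it defines
    # create_optimizer_op, so any call to the old name
    # create_tf_optimizer raises AttributeError.
    def create_optimizer_op(self, learning_rate):
        # Placeholder for constructing a real TF optimizer op.
        return {"lr": learning_rate}


class PPOOptimizer(BaseOptimizer):
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

    def create_ppo_optimizer(self):
        # The old code called self.create_tf_optimizer(...), a method
        # that does not exist on the base class; the fix calls the
        # method that actually exists.
        self.tf_optimizer = self.create_optimizer_op(self.learning_rate)


opt = PPOOptimizer(3e-4)
opt.create_ppo_optimizer()
print(opt.tf_optimizer["lr"])  # 0.0003
```

With the corrected name, `create_ppo_optimizer` succeeds, and the returned optimizer object can then be used to build the gradient and update ops as in the diff above.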
