ppo-LunarLander / results.json
Initial commit for a PPO LunarLander
23db29c
{"mean_reward": 260.5837960643443, "std_reward": 25.83407916632778, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-03-04T10:18:20.656199"}