ppo-LunarLander-v2 / config.json

Commit History

b9c2e41 · Upload my updated PPO model to the hub · committed by UmberH
ebb8309 · Upload my updated PPO model to the hub · committed by UmberH
5daf5a4 · Upload my first PPO model to the hub · committed by UmberH