PPO-LunarLander-v2 / results.json
Upload PPO LunarLander-v2 model to the Hugging Face Hub for the first time!
{"mean_reward": 239.79507610000002, "std_reward": 45.17348183379063, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2025-01-17T12:53:45.384805"}