ppo-LunarLander-v2 / results.json
Commit b8cb891: "Try PPO algorithm" (rdesarz)
{"mean_reward": 271.1695162594388, "std_reward": 17.18993322050464, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2023-02-18T13:47:42.803894"}