PPO-LunarLander-v2 / results.json
Commit bb665d6: "My first commit to ykirpichev/PPO-LunarLander-v2"
{"mean_reward": 263.21327556857295, "std_reward": 17.133974017497433, "is_deterministic": true, "n_eval_episodes": 10, "eval_datetime": "2022-05-07T18:32:08.660798"}