Reinforcement Learning (RL) Reward Shaping Task


A Reinforcement Learning (RL) Reward Shaping Task is a reinforcement learning task that engineers an RL algorithm's reward function by incorporating domain knowledge, typically by adding a supplementary shaping reward to the environment's reward signal so that the agent receives denser feedback and learns faster.
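For illustration, a common way to incorporate such domain knowledge is potential-based reward shaping, in which a shaping term F(s, s') = γΦ(s') − Φ(s), defined by a potential function Φ over states, is added to the environment's reward. The minimal Python sketch below assumes a hypothetical grid-world task where Φ is the negative Manhattan distance to a goal cell; the environment interface, goal coordinates, and all names are illustrative assumptions rather than part of any particular library.

```python
# Minimal sketch of potential-based reward shaping.
# The distance-based potential and the grid-world state layout are
# illustrative assumptions, not taken from the wiki entry above.

GAMMA = 0.99  # discount factor


def potential(state, goal):
    """Domain knowledge: states closer to the goal get higher potential."""
    # Negative Manhattan distance to the goal (grid-world style (x, y) states).
    return -(abs(state[0] - goal[0]) + abs(state[1] - goal[1]))


def shaped_reward(env_reward, state, next_state, goal, gamma=GAMMA):
    """Environment reward plus the shaping term F(s, s') = gamma * Phi(s') - Phi(s).

    Shaping rewards of this potential-based form give the agent denser
    feedback without changing which policies are optimal.
    """
    shaping = gamma * potential(next_state, goal) - potential(state, goal)
    return env_reward + shaping


if __name__ == "__main__":
    # Example: a step that moves one cell closer to the goal earns a small bonus,
    # even though the environment itself returns zero reward for that step.
    goal = (4, 4)
    r = shaped_reward(env_reward=0.0, state=(0, 0), next_state=(0, 1), goal=goal)
    print(r)  # 0.99 * (-7) - (-8) = 1.07
```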


