Abstract
Computational human body models (HBMs) are important tools for predicting human biomechanical responses in automotive crash environments. In many scenarios, prediction of the occupant response can be improved by incorporating active muscle control into the HBMs to generate biofidelic kinematics during different vehicle maneuvers. In this study, we propose an approach to developing an active muscle controller based on reinforcement learning (RL). The RL muscle activation control (RL-MAC) approach is a shift away from traditional closed-loop feedback controllers, which can mimic accurate active muscle behavior only under the limited range of loading conditions for which the controller has been tuned. Instead, the RL-MAC uses an iterative training approach to generate active muscle forces for a desired joint motion, analogous to how a child develops gross motor skills. Here, the ability of a deep deterministic policy gradient (DDPG) RL controller to generate accurate human kinematics is demonstrated using a multibody model of the human arm. The arm model was trained to perform goal-directed elbow rotation by activating the responsible muscles, which were recruited under two schemes: as independent muscles or as antagonistic muscle groups. Simulations with the trained controller show that the arm can move to the target position in the presence or absence of externally applied loads. The RL-MAC trained under constant external loads was able to maintain the desired elbow joint angle in a simplified automotive impact scenario, suggesting the robustness of the motor control approach.