Lower limb exoskeletons can help transform how humans move within their communities. Human movement, however, is highly variable – as we go about our day, we may walk, run, jump, lift heavy objects, and more. Current state-of-the-art exoskeleton controllers are typically limited to the small set of tasks they are programmed to assist and are often restricted to laboratory or research settings. Developing exoskeletons that can accommodate the wide range of human movement is paramount to transitioning these devices into the real world.
We have developed a novel deep learning-based exoskeleton controller capable of assisting users regardless of task or activity. Using sensors embedded in the device, the controller continuously estimates the user's joint moments and seamlessly modulates assistance via a clothing-integrated hip-knee exoskeleton. The task-agnostic controller was trained on a variety of activities, ranging from highly repeatable tasks such as walking and running to more sporadic and undefined movements such as tug of war, perturbation recovery, and lateral cutting maneuvers. Our approach not only accurately estimated the torque demands at each joint but also reduced user energy expenditure across 10 different human activities compared to our baseline, no-assistance condition. These findings provide a novel path to generalizing exoskeleton assistance across users, activities, and environments, as well as a critical link to the adoption of exoskeleton technology in the real world.
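To make the estimate-then-assist loop concrete, here is a minimal sketch of that idea: a toy neural network maps a window of sensor readings to estimated hip and knee moments, which are then scaled and clipped into an assistance torque command. The network weights, sensor channel count, assistance fraction, and torque limit are all illustrative placeholders, not the model or parameters used on the actual device.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained" MLP: 6 sensor channels -> 16 hidden units -> 2 joint moments
# (hip, knee). In practice these weights would come from model training.
W1, b1 = rng.standard_normal((16, 6)) * 0.1, np.zeros(16)
W2, b2 = rng.standard_normal((2, 16)) * 0.1, np.zeros(2)

def estimate_joint_moments(sensors: np.ndarray) -> np.ndarray:
    """Forward pass: sensor reading -> estimated hip/knee joint moments."""
    hidden = np.tanh(W1 @ sensors + b1)
    return W2 @ hidden + b2

def assistance_torque(moments: np.ndarray,
                      assist_fraction: float = 0.2,
                      limit: float = 0.5) -> np.ndarray:
    """Scale estimated biological moments and clip to a safe torque limit."""
    return np.clip(assist_fraction * moments, -limit, limit)

# One control-loop tick with simulated sensor readings
sensors = rng.standard_normal(6)
moments = estimate_joint_moments(sensors)
torques = assistance_torque(moments)
print(torques.shape)  # one torque command per assisted joint
```

Because the estimator outputs biological moments directly rather than task labels, the same loop runs unchanged whether the user is walking, cutting, or recovering from a perturbation.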
Watch as Dr. Keaton Scherpereel takes us through a selection of the different tasks, with live predictions and assistance from the exoskeleton! On each graph, the real biological joint moments are shown in red and the joint moments estimated by the deep learning estimator are shown in green.
Open Source Dataset
Training this deep learning model required a dataset of 25 users and 28 activities. We have provided our dataset for you to explore and develop your own models with. Please see our webpage describing the dataset structure, and get the data directly from the Georgia Tech Library Repository.
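If you want to train your own models on time-series data like this, a common first step is slicing continuous recordings into fixed-length overlapping windows. Here is a hypothetical preprocessing sketch; the channel count, window size, and stride are illustrative and not taken from the published dataset structure, so consult the dataset webpage for the actual layout.

```python
import numpy as np

def make_windows(signal: np.ndarray, window: int = 100, stride: int = 10) -> np.ndarray:
    """Split a (time, channels) array into overlapping (window, channels) samples."""
    starts = range(0, signal.shape[0] - window + 1, stride)
    return np.stack([signal[s:s + window] for s in starts])

# Simulated recording: 1,000 timesteps of 6 sensor channels
recording = np.random.default_rng(1).standard_normal((1000, 6))
X = make_windows(recording)
print(X.shape)  # (num_windows, window_length, channels)
```

Each resulting window can then be paired with the joint moments at its final timestep to form an input/target pair for supervised training.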