User-friendly Robot Empowers Individuals with Mobility Disabilities
Shalutha Rajapakshe, Emmanuel Senft and Jean-Marc Odobez have developed software that makes using assistive robots seamless, with the goal of creating a natural and intuitive experience.
The software allows the robot and the human to work together to complete a task. It is designed to intuitively integrate human corrections to refine the robot's motions. The solution includes a joystick, similar to those found in electric wheelchairs, to control a robotic arm that can handle a range of activities, from simple tasks like loading a washing machine to more creative ones like painting.
A key innovation of the software is its flexible design, which uses a learning-from-demonstration method based on "canal surfaces". This approach requires only two demonstrations of a movement to represent the expected robot behavior and its acceptable variations. The robot then moves along the learned path automatically. When the user operates the joystick, the software interprets their inputs based on the robot's location, making control easier and more natural without requiring any understanding of the robot's technical details.
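The canal-surface idea described above can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: the synthetic demonstrations, the `step` helper, and the radius-clamping rule are all assumptions made for the sake of the example.

```python
import numpy as np

# Two demonstrations of the same reaching motion (50 timesteps x 3D positions).
# Synthetic here; in practice they would be recorded from the real robot.
t = np.linspace(0.0, 1.0, 50)
demo_a = np.stack([t, np.sin(np.pi * t) + 0.10, np.zeros_like(t)], axis=1)
demo_b = np.stack([t, np.sin(np.pi * t) - 0.10, np.zeros_like(t)], axis=1)

# Canal surface: the centerline is the mean of the demonstrations,
# and the local radius is half the distance between them.
center = (demo_a + demo_b) / 2.0
radius = np.linalg.norm(demo_a - demo_b, axis=1) / 2.0

def step(i, joystick):
    """Combine autonomous progress along the centerline with a user offset,
    clamped so the end-effector never leaves the canal surface."""
    offset = np.asarray(joystick, dtype=float) * radius[i]  # scale input by local radius
    norm = np.linalg.norm(offset)
    if norm > radius[i]:  # project back inside the tube of valid variations
        offset *= radius[i] / norm
    return center[i] + offset

# A full sideways joystick deflection at the midpoint lands on the tube boundary;
# with the stick at rest the robot simply follows the learned centerline.
p = step(25, [0.0, 1.0, 0.0])
```

The point of the clamp is that the user only shapes the motion *within* the corridor spanned by the two demonstrations, so corrections stay safe and the user never needs to reason about the robot's full configuration space.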
An initial proof of concept at Idiap involved two groups: 20 able-bodied individuals and 3 wheelchair users, aged 52 to 61. The testing demonstrated the solution’s superior performance compared to basic control methods and highlighted its effectiveness in meeting the needs of people with disabilities. Participants with movement impairments reported a keen interest in the method and found that multiple aspects of their daily lives could be improved by such a system.
This approach significantly reduces user effort, improves task performance, and offers a more intuitive control experience, setting it apart from existing systems.
This study highlights the transformative potential of assistive robotics in enhancing the quality of life for all. The next step for the team is to explore the feasibility of deploying this system in real-world home environments, bringing this life-changing technology closer to those who need it most.
The study is being presented at the 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
Reference:
Rajapakshe, S., Odobez, J.-M., & Senft, E. (2025). Giving sense to inputs: Toward an accessible control framework for shared autonomy. ACM/IEEE International Conference on Human-Robot Interaction (HRI).