Baxter, What Else?
We target the development of user-friendly interfaces for transferring skills to robots, using statistical learning and optimal control to learn tasks from human demonstrations. The resulting controllers not only adapt the learned skills to new situations, but also autonomously regulate compliance and tracking behavior according to the accuracy required at each step of the task. Notably, the approach enables the transfer of skills composed of multiple movement options, based on partial demonstrations that the robot automatically stitches together to refine and reconstruct a complete task.
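To make the compliance-regulation idea concrete, here is a minimal, illustrative sketch (not the actual controller used on Baxter): it assumes time-aligned kinesthetic demonstrations and maps low demonstrated variability to stiff tracking and high variability to compliant behavior, feeding a simple impedance command. The function names, gain bounds, and inverse-variance rule are illustrative assumptions.

```python
import numpy as np

def learn_variable_stiffness(demos, kp_min=10.0, kp_max=400.0):
    """Estimate a reference trajectory and per-step stiffness gains
    from time-aligned demonstrations.

    demos: array of shape (n_demos, n_steps, n_dims), e.g. end-effector
           positions recorded during kinesthetic teaching (assumed aligned).
    Returns (mean trajectory, per-step per-axis stiffness gains).
    """
    mean_traj = demos.mean(axis=0)            # (n_steps, n_dims)
    var_traj = demos.var(axis=0) + 1e-6       # per-axis variance across demos

    # Low variability across demonstrations -> high precision required -> stiff;
    # high variability -> compliant. Map inverse variance into [kp_min, kp_max].
    inv_var = 1.0 / var_traj
    inv_var = (inv_var - inv_var.min()) / (inv_var.max() - inv_var.min() + 1e-12)
    kp = kp_min + inv_var * (kp_max - kp_min)
    return mean_traj, kp

def impedance_command(x, dx, x_ref, kp, damping_ratio=1.0):
    """Desired force of a simple per-axis impedance controller
    tracking x_ref with stiffness kp and (near-)critical damping."""
    kv = 2.0 * damping_ratio * np.sqrt(kp)
    return kp * (x_ref - x) - kv * dx

# Example with synthetic demonstrations (3 demos, 100 steps, 2-D):
rng = np.random.default_rng(0)
base = np.stack([np.linspace(0, 1, 100),
                 np.sin(np.linspace(0, np.pi, 100))], axis=1)
demos = np.stack([base + rng.normal(0, 0.01 + 0.05 * base[:, :1], base.shape)
                  for _ in range(3)])
traj, gains = learn_variable_stiffness(demos)
force = impedance_command(x=np.zeros(2), dx=np.zeros(2),
                          x_ref=traj[0], kp=gains[0])
```

In the actual work, the tracking and compliance behavior is derived from the demonstration statistics through optimal control rather than this hand-tuned mapping, but the intuition is the same: the robot is precise where the demonstrations agree and compliant where they do not.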
To demonstrate the developed approach, we placed Baxter in front of a tall Nespresso coffee machine (enterprise model) and asked two people, one of whom had never interacted with robots before, to show Baxter the different steps of the task. The kinesthetic interaction started by showing the robot how to hold a cup horizontally. It quickly moved on to teaching the robot bimanual skills: grasping a cup, selecting the desired Nespresso grand cru, picking up an RFID card and using it to purchase a coffee capsule, grasping and inserting the capsule, selecting the coffee size, and so on. We then taught the robot that, while the coffee was being prepared, it could open the drawer to pick up a sugar cube and coffee creamer to place on the saucer. The video shows the training and the results after demonstration, with Baxter autonomously assembling the different parts of the skill. The robot, with the dashing wobbling moustache of a tireless waiter on its face, could replicate the complete task at quite an impressive pace. Baxter... What Else?