Workshop IROS 2015

Workshop on Physical Human-Robot Collaboration: Safety, Control, Learning and Applications
Friday, October 2nd, 2015, 08:30-17:30, Saal E

Time Talk
08:30-08:40 Introduction by the organizers
SAFETY
08:40 - 09:20 Alin Albu-Schäffer, DLR Institute of Robotics and Mechatronics, Germany
09:20 - 10:00 Jae-Bok Song, Korea University, Korea
10:00 - 10:30 Coffee Break
CONTROL & LEARNING
10:30 - 11:10 Sandra Hirche, TUM, Germany
11:10 - 11:50 Sami Haddadin, Leibniz Universität Hannover, Germany
11:50 - 12:30 Michael Mistry, University of Birmingham, UK
12:30 - 14:00 Lunch
APPLICATIONS
14:00 - 14:40 Heni Ben Amor, Arizona State University, USA
14:40 - 15:20 Bjoern Matthias, ABB Corporate Research, Germany
15:20 - 16:00 Coffee Break
16:00 - 16:40 Kazuhiro Kosuge, Tohoku University, Japan
16:40 - 17:30 Panel discussion

Alin Albu-Schäffer - Designing, Controlling and Programming Robots for Direct Interaction with Humans
Bio:
Alin Albu-Schäffer graduated in electrical engineering from the Technical University of Timisoara in 1993 and received his PhD in automatic control from the Technical University of Munich in 2002. Since 2012 he has been the head of the Institute of Robotics and Mechatronics at the German Aerospace Center (DLR), which he joined in 1995 as a PhD candidate.
Moreover, he is a professor at the Technical University of Munich, holding the Chair for “Sensorbased Robotic Systems and Intelligent Assistance Systems” at the Computer Science Department. His research interests include robot design, modeling and control; flexible-joint and variable-compliance robots for manipulation and locomotion; physical human-robot interaction; and bio-inspired robot design. He has received several awards, including the IEEE King-Sun Fu Best Paper Award of the Transactions on Robotics in 2012 and 2014, several ICRA and IROS Best Paper Awards, and the DLR Science Award.

Jae-Bok Song - Collision Detection for Physical Human-Robot Collaboration
Abstract:
Various solutions to collision detection have been proposed to address safety issues, as physical human-robot collaboration has drawn much attention in recent years. These solutions are usually model-based and therefore require accurate models of the robot and of the objects it grasps. However, the dynamic model of an object that interacts with a robot is uncertain or unknown in most cases where the robot performs tasks with various objects or tools, which is a problem when implementing physical human-robot interaction. Furthermore, existing collision detection methods rely on skin sensors or joint torque sensors and cannot be applied to robot arms without these sensors (e.g., industrial manipulators). This talk will present our research on a novel algorithm that separates intended interaction from collision, and on a sensorless collision detection approach.
Collision detection for human-robot interaction: When robots interact with the external environment, including a human or a grasped object, for tasks such as direct teaching of a pick-and-place operation, they should react to an unexpected collision but not to an intended interaction force, because that force is inevitable for the task. To cope with this problem, we propose a collision detection method for a robot that interacts with unknown environments or various objects. For this purpose, a collision detection index, decoupled from the inevitable external force generated by the object the robot is handling, is developed by projecting the joint torque space onto a suitable subspace. The proposed index is verified through various simulations and experiments, and the results show that collisions can be detected regardless of the object being handled.
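To make the projection idea concrete, here is a minimal, dependency-free sketch (not the authors' implementation; the torque directions and values are invented for illustration): components of the estimated external joint torque that the handled object can explain are removed before thresholding, so only the unexplained remainder counts as a collision.

```python
def gram_schmidt(cols):
    """Orthonormalize a list of vectors spanning the object's torque subspace."""
    basis = []
    for v in cols:
        w = list(v)
        for b in basis:
            dot = sum(wi * bi for wi, bi in zip(w, b))
            w = [wi - dot * bi for wi, bi in zip(w, b)]
        norm = sum(wi * wi for wi in w) ** 0.5
        if norm > 1e-12:
            basis.append([wi / norm for wi in w])
    return basis

def collision_index(tau_ext, object_torque_dirs):
    """Norm of tau_ext after removing what the handled object can explain."""
    residual = list(tau_ext)
    for b in gram_schmidt(object_torque_dirs):
        dot = sum(ri * bi for ri, bi in zip(residual, b))
        residual = [ri - dot * bi for ri, bi in zip(residual, b)]
    return sum(ri * ri for ri in residual) ** 0.5

# invented 3-joint example: object wrenches map into the first two joint axes
obj_dirs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
idx_free = collision_index([3.0, -2.0, 0.0], obj_dirs)  # 0.0: fully explained
idx_hit = collision_index([3.0, -2.0, 1.5], obj_dirs)   # 1.5: collision part
```

A threshold on this index then triggers detection independently of the grasped load.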
Sensorless collision detection: Conventional collision detection methods usually require additional sensors such as skin sensors, joint torque sensors, or acceleration sensors, which are often impractical due to their high cost. To address this problem, we propose a collision detection method that uses only an encoder, without any extra sensors. In the proposed scheme, the external torque due to collision is estimated using a generalized momentum-based observer and a friction torque model of the harmonic drive, developed for a robot under position control. This allows collisions to be reliably detected without extra sensors on any type of robot manipulator.
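A toy numerical illustration of a generalized momentum-based observer, under strong simplifying assumptions (one joint, unit inertia, viscous friction only; all gains and signals are invented, not the speaker's implementation): the residual converges to the external torque using only quantities available from the encoder and the commanded torque.

```python
def simulate_observer(K=50.0, dt=1e-3, T=1.0, m=1.0, b=0.5):
    """Return the residual history r(t); r should track tau_ext."""
    q_dot = 0.0       # joint velocity (from the encoder)
    p_hat_int = 0.0   # observer's integral of the modeled dynamics
    r = 0.0           # residual = estimated external torque
    history = []
    for k in range(int(round(T / dt))):
        t = k * dt
        tau_motor = 1.0                      # commanded motor torque
        tau_ext = 2.0 if t > 0.5 else 0.0    # ground-truth collision torque
        tau_fric = b * q_dot                 # assumed viscous friction model
        # plant: m * q_ddot = tau_motor + tau_ext - tau_fric
        q_dot += dt * (tau_motor + tau_ext - tau_fric) / m
        p = m * q_dot                        # generalized momentum
        # observer integrates the model's prediction plus the residual
        p_hat_int += dt * (tau_motor - tau_fric + r)
        r = K * (p - p_hat_int)              # converges to tau_ext
        history.append(r)
    return history

residual = simulate_observer()  # near 0.0 before the impact, near 2.0 after
```

Thresholding the residual then flags the collision without any torque or skin sensor.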
Bio:
Jae-Bok Song has been a Professor at the School of Mechanical Engineering, Korea University, since 1993. He served as President of the Korea Robotics Society (2014) and as Editor-in-Chief of the International Journal of Control, Automation & Systems (2011-2013). He holds a PhD in Mechanical Engineering from MIT (1992). His research interests include robot control (robotic assembly, variable stiffness actuators) and the development of robot arms (collaborative robots, counterbalance robots, safe robots).

Sandra Hirche - Uncertainty-dependent Control in pHRI
Abstract:
Physical human-robot interaction is relevant for many societally important application domains such as machine-based physical rehabilitation, mobility and manipulation aids for the elderly, and collaborative human-machine production systems. Intuitive and goal-oriented interaction is one of the key challenges of current research. From psychological studies it is well known that anticipation and prediction of the interaction partner's behavior are key to joint action. However, any prediction based on a human behavior model will be uncertain, due to sparsely available training data and inherent human variability. This prediction uncertainty must be considered when designing control mechanisms for pHRI. In addition, because of the physical contact between the human and the machine, not only information but also energy is exchanged, posing fundamental challenges for real-time human-adaptive and safe decision making and control. In this talk we will introduce a class of control algorithms suitable for assistive control in pHRI that explicitly incorporate prediction uncertainties, their combination with statistical human motion models, and their evaluation in physical human-machine interaction.

Sami Haddadin - Unified Force/Impedance Control with Collision Handling
Abstract:
Enabling robots for direct physical interaction and cooperation with humans has been one of the primary goals of robotics research for decades. In this talk I will outline how our recent work on unified force/impedance control and systematic collision handling, including learning-based contact classification, contributes to this ambitious aim.
First, I will present a novel hybrid Cartesian force/impedance controller that is equipped with energy tanks to preserve passivity. Our approach overcomes the problems of (hybrid) force control, impedance control, and set-point-based indirect force control. It simultaneously allows accurate force tracking, fully compliant impedance behavior, and safe contact behavior by introducing a controller shaping function that robustly handles unexpected contact loss and avoids the chattering that switching-based approaches suffer from. Furthermore, we propose a constructive way of initializing the energy tanks via the concept of task energy.
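The energy-tank mechanism can be caricatured in a few lines (a sketch of the general idea only, not the controller in the talk; the bounds and the constant drain are made-up values): a control action that injects energy is gated by a valve that closes when the tank reaches its lower bound, so the controller can never output more energy than was stored.

```python
def step_tank(tank, power_out, dt, t_min=0.1, t_max=5.0):
    """Drain the tank by the injected power; return (level, valve)."""
    valve = 1.0 if tank > t_min else 0.0   # close the valve when empty
    tank = min(t_max, tank - valve * power_out * dt)
    return tank, valve

# drain a half-full tank with a constant injected power of 2 W
tank, valve_log = 0.5, []
for _ in range(1000):
    tank, valve = step_tank(tank, power_out=2.0, dt=1e-3)
    valve_log.append(valve)
# the valve stays open until the tank hits its lower bound, then closes
```

In an actual controller the valve would scale the force-tracking term, and dissipated energy could refill the tank up to t_max.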
Second, I will show preliminary results on detecting and interpreting contacts in physical human-robot interaction. To discriminate between intended and unintended contact types, we derive a set of linear and non-linear features based on physical contact-model insights and on observations of real impact data, which may rely on proprioceptive sensing only. We implement a classification system with a standard non-linear Support Vector Machine and show empirically, both in simulation and on a real robot, the system's high accuracy in off-line as well as on-line settings. I will argue that these results rest on our feature design derived from first principles.
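The following dependency-free sketch illustrates the feature-based classification idea (the real system uses an SVM; a nearest-centroid rule and synthetic torque signals are substituted here purely to keep the example self-contained): physically motivated features such as peak torque and peak rate of change separate slow intended contacts from hard impacts.

```python
def features(torque, dt=1e-3):
    """Peak magnitude and peak rate of change of a joint-torque signal."""
    peak = max(abs(x) for x in torque)
    slew = max(abs(b - a) for a, b in zip(torque, torque[1:])) / dt
    return (peak, slew)

def nearest_centroid(train_set):
    """train_set: list of (feature_tuple, label); returns a classify()."""
    sums, counts = {}, {}
    for f, lab in train_set:
        s = sums.setdefault(lab, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    cents = {lab: [v / counts[lab] for v in s] for lab, s in sums.items()}
    def classify(f):
        return min(cents, key=lambda lab: sum((a - b) ** 2
                                              for a, b in zip(f, cents[lab])))
    return classify

# synthetic demos: intended contacts ramp up slowly, impacts spike sharply
ramp = lambda slope: [slope * i for i in range(100)]
spike = lambda h: [0.0] * 50 + [h] + [0.0] * 49
train = [(features(ramp(0.01)), "intended"),
         (features(ramp(0.02)), "intended"),
         (features(spike(5.0)), "unintended"),
         (features(spike(8.0)), "unintended")]
classify = nearest_centroid(train)
```

A held-out spike is then classified as "unintended" and a held-out ramp as "intended"; the SVM in the talk plays the role of the centroid rule, on richer features.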
Bio:
Sami Haddadin is a full Professor and Director of the Institute of Automatic Control (IRT) at Leibniz Universität Hannover. He holds a Dipl.-Ing. degree in EE and an M.Sc. in CS from TUM, as well as an Honours degree in Technology Management from TUM and LMU. He obtained his PhD summa cum laude from RWTH Aachen in 2011. Until 2013, he was head of the DLR research group "Human-Centered Robotics" and the program "Terrestrial Assistant Robotics". His main research interests are pHRI, non-linear control, real-time motion, task and reflex planning, robot learning, optimal control, VIA, brain-controlled assistive robots, and safety in robotics. He has served on the program and organisation committees of several international robotics conferences and as Guest Editor of IJRR. Currently, he is an Associate Editor of IEEE Transactions on Robotics. He has published more than 100 peer-reviewed papers in international journals, books, and conferences. Among other things, he received the 2015 IEEE/RAS Early Career Award, the 2015 RSS Early Career Spotlight, 8 best paper/video awards, the euRobotics Technology Transfer Award 2011, the 2012 George Giralt Award, and the 2015 Alfried Krupp Award for Young Professors. He was strongly involved in the development and technology transfer of the DLR Lightweight Robot to KUKA.

Michael Mistry - Exploiting contact for whole-body physical human-robot collaboration
Abstract:
In my research, I work towards humans and robots collaborating together in natural, unrestricted settings, making use of their whole bodies in contact. For example, a team of humans and humanoid robots may coordinate to lift and move heavy furniture together. Of course such whole body interaction has safety concerns, particularly for postural control and balance. In this talk I will outline some of our approaches towards natural whole-body physical human-robot collaboration. I will discuss how robots and humans may exploit external contacts, including compliant contacts, to help achieve goals. I will highlight our work on understanding which postural and contact configurations are best for a particular task. I will also explore how a robot may enhance interaction with a human partner in contact, by emulating a human’s natural postural sway.
Bio:
Michael Mistry is a Senior Lecturer in Robotics at the School of Computer Science, University of Birmingham, where he is also a member of the Intelligent Robotics Lab and the Centre for Computational Neuroscience and Cognitive Robotics. Michael is broadly interested in human motion and humanoid robotics. His research focuses on issues relevant to dexterous movement in both humans and humanoid robots, including redundancy resolution and inverse kinematics, operational space control and manipulation, stochastic optimal control, and internal model learning and control, particularly in contact. Michael is currently an investigator in the EU funded projects codyco.eu and cogimon.eu, as well as an EPSRC funded project on nuclear robotics.

Heni Ben Amor - Learning Human-Robot Collaboration from Human-Human Demonstrations
Abstract:
Human-friendly collaborative robotics requires robots with manipulation abilities and safe compliant control, as well as algorithms for human-robot interaction during skill acquisition. Programming robots for such interaction scenarios is notoriously hard, as it is difficult to foresee all possible actions and responses of the human partner. In this talk, I will present motor skill learning methods that allow anthropomorphic robots to engage in joint physical activities with humans. To this end, human-human demonstrations of collaborative tasks are first recorded and subsequently used to learn compact models of the interaction dynamics. I will introduce "Interaction Primitives", a novel framework for action anticipation, recognition, and generation, and show how it can be used to solve complex collaborative assembly tasks. Interaction Primitives allow robots to anticipate human actions and produce appropriate responses in a proactive manner. Finally, I will also present novel HRI methods that allow robots to project their next actions into the environment in order to clearly communicate their intentions to a human interaction partner.
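The core conditioning step behind such interaction models can be sketched in scalar form (a deliberate simplification of the actual Interaction Primitives framework; the demonstration data are synthetic): fit a joint Gaussian over human and robot movement parameters from the demonstrations, then condition on an observed human parameter to predict the robot's response.

```python
def fit_joint_gaussian(pairs):
    """pairs: (human_param, robot_param) from human-human demonstrations."""
    n = len(pairs)
    mh = sum(h for h, _ in pairs) / n        # mean human parameter
    mr = sum(r for _, r in pairs) / n        # mean robot parameter
    shh = sum((h - mh) ** 2 for h, _ in pairs) / n          # var(human)
    shr = sum((h - mh) * (r - mr) for h, r in pairs) / n    # cov(h, r)
    return mh, mr, shh, shr

def predict_robot(model, h_obs):
    """Condition the joint Gaussian on the observed human parameter."""
    mh, mr, shh, shr = model
    return mr + shr / shh * (h_obs - mh)

# demos where the partner's motion scales the human's motion by two
demos = [(0.1, 0.2), (0.2, 0.4), (0.3, 0.6), (0.4, 0.8)]
model = fit_joint_gaussian(demos)
robot_response = predict_robot(model, 0.25)   # ≈ 0.5 for these demos
```

The full framework does the same conditioning over vectors of movement-primitive weights, which also yields recognition (likelihood of the observation) and anticipation (early conditioning on partial trajectories).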
Bio:
Heni Ben Amor is an Assistant Professor of Robotics at Arizona State University (USA). Prior to that, he was a Research Scientist at the Institute for Robotics and Intelligent Machines at Georgia Tech in Atlanta. Heni studied Computer Science at the University of Koblenz-Landau (GER) and earned a Ph.D. in robotics from the Technical University Freiberg and Osaka University in 2010, where he worked with Hiroshi Ishiguro and Minoru Asada. Before moving to the US, Heni was a postdoctoral scholar at the Technical University Darmstadt, working with Jan Peters. His research focuses on artificial intelligence, machine learning, human-robot interaction, robot vision, and automatic motor skill acquisition. He received the highly competitive Daimler-and-Benz Fellowship as well as several best paper awards at major robotics and AI conferences.

Bjoern Matthias - Risk Assessment for Human-Robot Collaborative Applications
Abstract:
Collaborative automation, bringing humans and robots into closer and more flexible interaction, is an area into which high hopes are placed by manufacturing companies. In the eternal quest to unlock additional productivity potential, the area of small-lot, high-variant manufacturing has been a challenging context for the introduction of automation. With the advent of new safety functionality in industrial robot systems, the community is now eagerly mapping out how collaborative robots can be the tool of choice for partial automation in such environments.
The successful deployment of collaborative industrial robots in mixed human-robot manufacturing environments hinges on a number of ingredients. Firstly, the application for which a partial automation concept is targeted must promise economic viability through productivity gains. These improvements can range from improved workplace ergonomics for the human worker, through better and more reproducible quality for selected manufacturing steps, to increased throughput of the overall production. Secondly, the intended application must be reviewed for its safeguarding requirements when a collaborative robot is introduced into the production scenario. This step, also referred to as the "application risk assessment", identifies the application's requirements on safeguarding and on the safety functions that the control system must support. Finally, given the economic and safety boundary conditions on the overall application, one must select the appropriate components for realizing it. The most important component is, of course, the robot with its toolbox of safety functions suitable for collaborative applications.
This contribution focuses on the application risk assessment step, outlining the principal procedure and casting light on the specific issues that the integrator and end-user must address in their efforts to realize a safe and efficient collaborative application.
Bio:
After studying physics at the California Institute of Technology (Caltech) and obtaining his PhD in experimental particle physics at Yale University in 1990, Bjoern Matthias spent just over 3 years as a post-doc at the University of Heidelberg, doing research on spectroscopy of exotic atoms and on rare decays. In 1994, he joined ABB Corporate Research in Germany, working in the area of computational design and simulation of power electrical systems and of electrostatic paint application.
Since 2001, Bjoern has been working on safety for industrial robots and human-robot collaboration and since 2003 he is the ABB Senior Principal Scientist for Robotic Automation. His present research is focused on collaborative small-parts assembly, on safety-related design of collaborative robots, and on the ergonomics of the collaborative workplace. Bjoern is active in national and international (ISO) committees for robot safety standardization. He has also served on the Board of Directors of the European robotics interest group euRobotics aisbl, which works with the European Commission on establishing R&D&I priorities for Europe, and continues to contribute to the topical roadmapping work.

Kazuhiro Kosuge - A Co-worker Robot "PaDY" for Automobile Assembly Line
Bio:
Kazuhiro Kosuge is a Professor in the Department of Bioengineering and Robotics at Tohoku University. He received the B.S., M.S., and Ph.D. degrees in control engineering from the Tokyo Institute of Technology. For more than 25 years, he has been doing research on various robotics problems. He received the JSME Awards for the best papers from the Japan Society of Mechanical Engineers in 2002 and 2005, the RSJ Award for the best papers from the Robotics Society of Japan in 2005, the Best Paper Award of IROS'97, etc. He is an IEEE Fellow, a JSME Fellow, a SICE Fellow, and an RSJ Fellow. He served as President of the IEEE Robotics and Automation Society for 2010-2011. He is serving as a Division X Representative/Director of IEEE for 2015-2016.