
Robotics and Autonomous Systems

Volume 94, August 2017, Pages 102-119

Multimodal sensor-based whole-body control for human–robot collaboration in industrial settings

https://doi.org/10.1016/j.robot.2017.04.007

Abstract

This paper describes the development of a dual-arm robotic system for industrial human–robot collaboration. The robot demonstrator described here possesses multiple sensor modalities for monitoring the shared human–robot workspace and is capable of real-time collision-free dual-arm manipulation. A whole-body control framework serves as the key control element: it generates a coherent output signal for the robot's joints from the multiple controller inputs, task priorities, physical constraints, and the current situation. Furthermore, sets of controller–constraint combinations of the whole-body controller constitute the basic building blocks that describe the actions of a high-level action plan to be executed sequentially. In addition, the robotic system can be controlled in an intuitive manner via human gestures. These individual robotic capabilities are combined into an industrial demonstrator which is validated in a gearbox assembly station of a Volkswagen factory.
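The whole-body controller itself is not detailed in this snippet. As a generic illustration of how several prioritized task controllers can be merged into a single joint-space output, the following Python sketch implements the classic nullspace-projection scheme; it is a textbook method shown for orientation only, not the authors' actual framework.

```python
import numpy as np

def prioritized_joint_velocities(tasks, n_joints):
    """Strict task-priority resolution via nullspace projection.

    tasks: list of (J, xdot) pairs, highest priority first, where J is the
    task Jacobian and xdot the desired task-space velocity. Generic textbook
    scheme, not the paper's whole-body controller.
    """
    qdot = np.zeros(n_joints)
    P = np.eye(n_joints)              # projector onto nullspace of higher tasks
    for J, xdot in tasks:
        JP = J @ P
        JP_pinv = np.linalg.pinv(JP)
        qdot = qdot + JP_pinv @ (xdot - J @ qdot)  # fix residual task error
        P = P - JP_pinv @ JP          # shrink nullspace for lower priorities
    return qdot

# Hypothetical example: a 2D position task and a 1D posture task on 4 joints
J1, J2 = np.random.randn(2, 4), np.random.randn(1, 4)
qdot = prioritized_joint_velocities(
    [(J1, np.array([0.1, 0.0])), (J2, np.array([0.05]))], 4)
```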

Section snippets

Motivation

Industrial manufacturing is undergoing major changes and transformations. Digitization is introducing new technologies; consumers are demanding manufactured products in an increasing number of variants, while demographic change is shifting the workforce's average age. All of these factors are important drivers for introducing workplaces where humans and robots work together.

In recent years, a new generation of industrial robots is being deployed to allow the

Related works

The area of human–robot collaboration has experienced a significant increase in interest in recent years, first from the research community and, more recently, from the industrial community as well. The reason lies in key enabling technologies appearing on the market, most importantly a new generation of lightweight robots that incorporate different concepts (in control software or mechatronic design) to allow interaction with humans while ensuring a certain degree of safety.


System description

The developed robotic system is based on two KUKA iiwa lightweight robots [3] equipped with 3-finger grippers from Robotiq [23] (see Fig. 1). Moreover, three RGB-D cameras (ASUS Xtion Pro Live) monitor the shared human–robot workspace to ensure real-time collision-free robot movements. For monitoring the surroundings of the system, two SICK LMS100 laser scanners are used, which are mounted on opposite corners of the table and jointly perceive a 360° view of the area around the robot.
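The scan fusion is not detailed in this snippet. As a minimal sketch, two planar scans taken from scanners on opposite table corners can be transformed into one common frame to obtain the combined 360° view; the scanner poses and angular resolution below are illustrative assumptions (each LMS100 covers 270° on its own).

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment, pose_xy_theta):
    """Convert one planar laser scan into 2D points in the common table frame.

    ranges: range readings [m]; pose_xy_theta: assumed scanner pose
    (x, y, yaw) in the table frame (illustrative, not from the paper).
    """
    angles = angle_min + angle_increment * np.arange(len(ranges))
    x_l, y_l = ranges * np.cos(angles), ranges * np.sin(angles)
    x0, y0, th = pose_xy_theta
    c, s = np.cos(th), np.sin(th)
    return np.stack([c * x_l - s * y_l + x0, s * x_l + c * y_l + y0], axis=1)

# Two 270° scans at 0.25° resolution (1081 beams), mounted on opposite
# corners and rotated 180° against each other -> full 360° coverage.
scan_a = np.full(1081, 4.0)           # dummy readings of 4 m
scan_b = np.full(1081, 4.0)
points = np.vstack([
    scan_to_points(scan_a, np.radians(-135), np.radians(0.25), (0.0, 0.0, np.radians(45))),
    scan_to_points(scan_b, np.radians(-135), np.radians(0.25), (1.2, 0.8, np.radians(225))),
])
```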


Rock framework

The software for this work was developed using the Robot Construction Kit (Rock) [24]. Rock is a framework to develop software for robotic systems based on the component model of the Orocos Real Time Toolkit [26]. In addition to the software component model, further tools for robot software development are included. In this subsection, the relevant parts of the framework will be introduced.
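Rock applications are assembled from data-flow components that exchange data through typed input and output ports, normally specified in oroGen and executed on top of the Orocos RTT. The Python sketch below merely mimics this port-based component idea for illustration; the class and method names are invented here and are not Rock's actual API.

```python
from collections import deque

class Port:
    """Illustrative data-flow connection holding only the newest sample."""
    def __init__(self):
        self._buf = deque(maxlen=1)
    def write(self, sample):
        self._buf.append(sample)
    def read(self):
        return self._buf[-1] if self._buf else None

class JointCommandRelay:
    """Hypothetical periodically triggered component that forwards the
    latest joint command downstream, loosely mirroring an RTT update hook."""
    def __init__(self):
        self.command_in, self.command_out = Port(), Port()
    def update_hook(self):            # called once per activity period
        cmd = self.command_in.read()
        if cmd is not None:
            self.command_out.write(cmd)

relay = JointCommandRelay()
relay.command_in.write([0.0] * 7)     # e.g. 7 joint targets for one iiwa arm
relay.update_hook()
assert relay.command_out.read() == [0.0] * 7
```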

Communication between hardware components

The KUKA iiwa system is an industrial robot with a native control interface that runs solely on the KUKA Sunrise cabinet. In order to connect the robot to our Rock framework, a UDP-based client/server application (see Fig. 6) was implemented. The client is a Rock component that is able to send control commands to the arm at a cycle time of 5 ms, change the dynamic model of the gripper, adapt the impedance of the robot, and handle error states. By connecting to our Rock framework, both arms
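The paper states only that the bridge runs over UDP with a 5 ms cycle; the address, port, and wire format in the following sketch are illustrative assumptions about what such a fixed-rate command stream could look like.

```python
import socket, struct, time

SERVER_ADDR = ("192.168.1.10", 30200)   # hypothetical Sunrise-side endpoint
CYCLE_S = 0.005                         # 5 ms control cycle (200 Hz)

def pack_joint_command(q):
    """Pack 7 joint position targets [rad] as little-endian doubles.

    The wire format is an assumption; the paper does not document it.
    """
    assert len(q) == 7                  # the KUKA iiwa has 7 joints
    return struct.pack("<7d", *q)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
target = [0.0, 0.5, 0.0, -1.2, 0.0, 0.8, 0.0]
deadline = time.monotonic()
for _ in range(1000):                   # stream commands for ~5 s
    sock.sendto(pack_joint_command(target), SERVER_ADDR)
    deadline += CYCLE_S
    time.sleep(max(0.0, deadline - time.monotonic()))
```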

Environment perception

The demonstrator robot is equipped with a heterogeneous set of sensors: three RGB-D cameras and two 2D laser scanners. Through a series of processing steps, point clouds corresponding to potential collision objects (such as persons, tools or carts) are extracted from the RGB-D data and tracked. These object point clouds are directly passed on to the dynamic collision avoidance module (Section 7.2), which (if necessary) rapidly executes an avoidance motion using our whole-body control framework.
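The individual processing steps are only summarized above. As a rough illustration, candidate collision objects could be isolated by cropping the point cloud to the shared workspace and clustering what remains; the workspace bounds and clustering parameters below are illustrative, not values from the paper.

```python
import numpy as np
from sklearn.cluster import DBSCAN      # pip install scikit-learn

def extract_object_clouds(points, ws_min, ws_max, eps=0.05, min_samples=20):
    """Crop a point cloud (N x 3, meters) to the shared workspace and split
    the remaining points into candidate collision objects via density-based
    clustering. Thresholds are illustrative assumptions."""
    inside = np.all((points >= ws_min) & (points <= ws_max), axis=1)
    pts = points[inside]
    if len(pts) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(pts)
    return [pts[labels == k] for k in set(labels) if k != -1]  # -1 = noise

# Synthetic cloud: one compact "object" plus uniformly scattered outliers
cloud = np.vstack([np.random.normal([0.5, 0.2, 0.9], 0.02, (500, 3)),
                   np.random.uniform(-2.0, 2.0, (200, 3))])
objects = extract_object_clouds(cloud, np.array([-1.0, -1.0, 0.0]),
                                np.array([1.0, 1.0, 2.0]))
```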

Safety aspects

According to the ISO 10218 standard and the ISO/TS 15066 technical specification, there are currently four modes allowed for human–robot collaboration:

  1. Safety-rated monitored stop, in which human and robot are separated and no robot motion is allowed while the operator is in the collaborative workspace,

  2. Hand guiding, in which robot motion happens only through direct input of the operator (for instance, by holding a device mounted on the robot's end-effector),

  3. Speed and separation monitoring, in which the robot maintains a protective separation distance from the operator and reduces its speed as that distance decreases (a simplified sketch of this mode follows the list), and

  4. Power and force limiting, in which the robot's forces and power are limited so that incidental contact with the operator does not cause harm.
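As a rough illustration of the idea behind speed and separation monitoring, the following sketch scales the commanded robot speed with the measured human–robot separation; the distance thresholds and the linear interpolation are assumptions for illustration, not the protective-distance formula of ISO/TS 15066.

```python
def speed_scale(separation_m, stop_dist=0.5, full_speed_dist=1.5):
    """Illustrative speed-and-separation rule: stop below stop_dist, run at
    nominal speed beyond full_speed_dist, interpolate linearly in between.
    The distances are assumptions, not values from the standard or paper."""
    if separation_m <= stop_dist:
        return 0.0
    if separation_m >= full_speed_dist:
        return 1.0
    return (separation_m - stop_dist) / (full_speed_dist - stop_dist)

assert speed_scale(0.3) == 0.0          # human too close: protective stop
assert speed_scale(2.0) == 1.0          # human far away: nominal speed
```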

Gesture recognition

A key aspect of future intelligent assistance systems for industrial applications is intuitive interaction with humans. There are multiple approaches to communication between humans and robots, but in industrial settings only a few are suitable for safe and easy operation. Humans usually communicate using voice, which is difficult to implement for a robot in noisy environments such as automotive assembly lines. Also, operation via input devices like keyboards or

Practical application

In order to underline the practical relevance as well as the future potential of this demonstrator, a representative handling and assembly scenario from a gearbox manufacturing plant was chosen. The current workplace is depicted in Fig. 15. Here, the gear shaft and a coupling have to be joined manually with a tight tolerance, which is a tiring job since the parts are heavy, the surfaces are easily scratched, and the gear wheels are shock-sensitive. Therefore, the chosen workplace is a good example

Conclusions

Human–robot interaction requires a holistic approach that coherently integrates different subcomponents into a more complex system. Great advances are currently being made in different areas of robotics, from hardware and mechatronic design, through perception, control, and autonomy algorithms, up to planning, learning, and reasoning capabilities. However, one of the remaining challenges is the integration of such capabilities into a single system which possesses greater abilities

Outlook

The test results of the presented technological demonstrator are promising and at the same time outline the next steps for the industrialization of the system and its components: (1) The safety measures, especially the collision avoidance system, which marks an important improvement for collaborative robots, need to be certified for industrial use. In this case, work in two areas is required: on the one hand, safety-rated versions of sensors like the RGB-D cameras are needed; on the other hand,


References (42)

  • Motoman HC10 collaborative robot from Yaskawa, 2016....
  • AURA collaborative robot from COMAU, 2016. http://www.comau.com/EN/media/news/2016/06/comau-at-automatica-2016....
  • SafetyEye from PILZ, 2014. http://brochures.pilz.nl/bro_pdf/SafetyEYE_2014.pdf. (Online; accessed 16 August...
  • Christian Vogel et al., A projection-based sensor system for safe physical human–robot collaboration
  • Elsa Andrea Kirchner et al., Intuitive interaction with robots – technical approaches and challenges
  • Michael Van den Bergh et al., Real-time 3D hand gesture interaction with a robot for understanding directions from humans
  • Wei Wang, Drazen Brscic, Zhiwei He, Sandra Hirche, Kolja Kühnlenz, Real-time human body motion estimation based on...
  • Ari Y. Benbasat et al., An inertial measurement framework for gesture recognition and applications

  • European Project SAPHARI, 2012. http://www.saphari.eu/. (Online; accessed 16 August...
  • European Project VALERI, 2015. http://www.valeri-project.eu/. (Online; accessed 16 August...
  • Mads Hvilshoej, Simon Boegh, Ole Madsen, Morten Kristiansen, The mobile robot little helper: Concepts, ideas and...

José de Gea Fernández received his M.Sc. in Electronics Engineering (2002) from the Technical University of Catalunya (UPC), Spain, and his Ph.D. in Robotics (2011) from the University of Bremen, Germany. Between 2003 and 2009 he was a Researcher at the Robotics Group of the University of Bremen. Since 2009 he has been working at the Robotics Innovation Center of DFKI (German Research Center for Artificial Intelligence) in Bremen. There, from 2011 to 2013, he acted as Deputy Head of the Department for “Mobility and Manipulation”. Currently, he is Senior Researcher and Head of the Team “Robot Control”. He has co-authored over 45 scientific publications and has been involved in various German national (BMBF, DFG, BMWi, DLR) and European (EU, ESA) projects in several areas of his research on robotic manipulation. He led the DFKI team in the German project SemProm, which specified the software/hardware characteristics and designed the control strategies for the robot AILA. He also led the DFKI contributions in the EU project Robofoot and the project BesMan (Behaviors for Mobile Manipulation), funded by BMWi (German Federal Ministry of Economics and Technology) and DLR (German Space Agency). His research focuses on mobile manipulation and safe human–robot collaboration.

Dennis Mronga is working as a researcher at the German Research Center for Artificial Intelligence in Bremen, Germany. He received his diploma degree in Electrical Engineering from the University of Bremen in 2009. His research interests focus on whole-body control of complex robotic systems and applied machine learning.

Martin Günther is a researcher at DFKI, the German Research Center for Artificial Intelligence. He earned his Diploma in Computer Science from the Technical University of Dresden, Germany, in 2008. From 2009 until 2015 he worked as a research associate in the Knowledge-Based Systems group at Osnabrück University, Germany. His research interests include 3D perception, semantic mapping, context-aware object tracking and anchoring, active perception, and goal-directed action in autonomous robot control.

Tobias Knobloch studies Mechatronics at the University of Applied Sciences Aschaffenburg. His focus is on industrial and automation technology and microelectronic systems. He completed his practical semester at the Robotics Innovation Center of the German Research Center for Artificial Intelligence (DFKI) in 2016. Since 2006 he has been a participant in the worldwide robotics competition RoboCup.

Malte Wirkus received his Diploma in Computer Science from the University of Bremen in 2010. He joined the Robotics Innovation Center (RIC) of the German Research Center for Artificial Intelligence (DFKI GmbH) in 2010.

In various research and industry projects, he has gained experience in the fields of robotic mobile manipulation, multi-agent architectures, and human–robot collaboration. With his current scientific research interest in control architectures and frameworks for robotic applications, he works as a researcher and project leader at DFKI-RIC.

Martin Schröer studied at the Carl von Ossietzky University of Oldenburg, Germany, graduating with a diploma degree in computer science (2007) and an additional diploma degree in psychology (2011). He has been an employee at the DFKI RIC, Bremen, since July 2010, working on human–machine interaction. Noteworthy activities include the development of the DFKI modular electric car “EO” and gesture-based human–machine interaction in various projects (Moonwalk, Volkswagen iMRK, AIRBUS autonomous transport).

Mathias Trampler studied Systems Engineering at the University of Bremen. He is currently working as a Junior Researcher at the German Research Center for Artificial Intelligence.

Dr. Stefan Stiene holds a Ph.D. in computer science from the University of Osnabrück, where he received his Master of Science in Computer Science in 2006. After receiving his Ph.D. in 2009, he worked for ROSEN Technology and Research Center GmbH, developing intelligent inspection systems for gas and oil pipelines. In 2011 he joined the DFKI Robotics Research Group as a senior researcher and team leader of the planning and perception team at the DFKI-RIC, and since 2016 he has been the leader of the DFKI competence center Smart Agriculture Technologies (CC-SaAT).

Elsa A. Kirchner, born in 1976, studied Biology at the University of Bremen, Germany, from 1994 to 1999. In 1999 she received her Diploma (Dipl. Inf.). Her diploma thesis was the result of a cooperation between the Brain Research Institute I: Behavioral Physiology and Developmental Neurobiology of the University of Bremen and the Department of Epileptology at the University Hospital of Bonn, Germany. From 1997 to 2000 she was a fellow of the Studienstiftung des Deutschen Volkes, and during her studies she received several other scholarships. With the help of the Stiftung Familie Klee award, she was able to work as a guest researcher at the Department of Brain and Cognitive Sciences, MIT, Boston, USA, from 1999 to 2000. Since 2005 she has been a staff scientist of the Robotics Lab at the University of Bremen, Germany, leading the Brain & Behavioral Labs. Since 2008 she has led the team Interaction and, since 2016, the extended team Sustained Interaction and Learning at the Robotics Innovation Center of the German Research Center for Artificial Intelligence (DFKI GmbH) in Bremen, Germany. In 2014 she received her doctorate (Dr. rer. nat.) in Computer Science from the University of Bremen. Her scientific interests focus on human–machine interaction, cognitive architectures, neuropsychology, and electrophysiological methods. She has supervised several Diploma, B.Sc., and M.Sc. students and has served as a reviewer for international journals and conferences. She is also the author of more than 40 publications in peer-reviewed international journals and of two book chapters.

Vinzenz Bargsten received the degree of Dipl.-Ing. in Systems Engineering and Engineering Cybernetics from the Otto von Guericke University Magdeburg, Germany, in 2012. Since then, he has been working as a researcher at the University of Bremen, Workgroup Robotics. His main areas of research interest are modeling, identification, and control of robotic systems.

Timo Bänziger, after finishing his studies in mechanical engineering at ETH Zürich and UC Berkeley in 2014, worked as a software developer in a research team for solar energy. In 2015 he became a Ph.D. candidate in the field of human–robot collaboration at the Smart Production Lab of the Group IT of Volkswagen.

Thomas Krüger studied aerospace engineering at the University of Braunschweig, where he received his diploma in 2006. Afterwards he worked in the field of unmanned aircraft systems at the Institute of Aerospace Systems at the University of Braunschweig, where he received his Ph.D. on adaptive control systems using neural networks in 2012. From 2012 to 2015 he was program manager for special mission aircraft at Aerodata AG, and since 2015 he has been working on new intelligent robotic systems for manufacturing applications at Volkswagen AG. His main scientific interests are robotics, neural networks, and adaptive control.

Frank Kirchner studied computer science and neurobiology at the University of Bonn, Germany, where he received the Dr. rer. nat. degree in computer science in 1999. From 1994 he was a Senior Scientist at the Gesellschaft für Mathematik und Datenverarbeitung (GMD) in Sankt Augustin, Germany, and from 1998 also a Senior Scientist at the Department of Electrical Engineering at Northeastern University in Boston, MA, USA. In 1999 he was first appointed Tenure Track Assistant Professor at Northeastern University, and since 2002 he has been a Full Professor at the University of Bremen, Germany. Since December 2005 he has also been the CEO of the Robotics Innovation Center and speaker of the Bremen location of the German Research Center for Artificial Intelligence (DFKI), and since 2013 Scientific Director of the Brazilian Institute of Robotics (BIR).

Frank Kirchner is a leading expert in research on biologically inspired locomotion and behaviour of highly redundant, multifunctional robotic systems. He is the principal supervisor of a number of Ph.D. students and regularly serves as a reviewer for international scientific journals and conferences. He is also the author of more than 200 publications in the field of robotics and AI.
