
2020 | Book

Wearable Technology for Robotic Manipulation and Learning

Authors: Dr. Bin Fang, Prof. Fuchun Sun, Prof. Huaping Liu, Dr. Chunfang Liu, Dr. Di Guo

Publisher: Springer Singapore


About this Book

Over the next few decades, millions of people, with varying backgrounds and levels of technical expertise, will have to effectively interact with robotic technologies on a daily basis. This means it will have to be possible to modify robot behavior without explicitly writing code, but instead via a small number of wearable devices or visual demonstrations. At the same time, robots will need to infer and predict humans’ intentions and internal objectives on the basis of past interactions in order to provide assistance before it is explicitly requested; this is the basis of imitation learning for robotics.

This book introduces readers to robotic imitation learning based on human demonstration with wearable devices. It presents an advanced calibration method for wearable sensors and fusion approaches under the Kalman filter framework, as well as a novel wearable device for capturing gestures and other motions. Furthermore, it describes wearable-device-based and vision-based imitation learning methods for robotic manipulation, making it a valuable reference guide for graduate students with a basic knowledge of machine learning, and for researchers interested in wearable computing and robotic learning.

Table of Contents

Frontmatter

Background

Frontmatter
Chapter 1. Introduction
Abstract
Wearable sensing devices are smart electronic devices that can be worn on the body as implants or accessories. With the rapid development of manufacturing and sensor technologies, they have attracted worldwide interest and are used in many practical applications. In this chapter, we provide an overview of research on wearable technologies and their future research trends. Wearable devices, sensor technologies, wearable computing algorithms, and their applications are presented in sequence.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo

Wearable Technology

Frontmatter
Chapter 2. Wearable Sensors
Abstract
Wearable sensors are the basis of wearable devices. In this chapter, the two most popular types of wearable sensors, the inertial sensor and the tactile sensor, are introduced. The principle of the inertial sensor is described, and optimal calibration methods are employed to improve its performance. Then, two tactile sensors with different sensing mechanisms are introduced. Calibration experiment results are shown at the end.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
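The inertial-sensor calibration the chapter describes can be illustrated with a minimal sketch. The book's optimal calibration method is more elaborate; the snippet below only shows the classic two-position idea for one accelerometer axis (readings taken with the axis pointing along +g and then -g yield the scale factor and bias), with the function name and the linear model `a = scale * (raw - bias)` being illustrative assumptions, not the authors' procedure.

```python
import numpy as np

def calibrate_axis(raw_up, raw_down, g=9.81):
    """Two-position calibration for one accelerometer axis.

    raw_up:   raw samples with the axis aligned along +g
    raw_down: raw samples with the axis aligned along -g
    Returns (scale, bias) for the model  a = scale * (raw - bias).
    """
    mean_up = np.mean(raw_up)
    mean_down = np.mean(raw_down)
    scale = 2.0 * g / (mean_up - mean_down)  # span between +g and -g readings
    bias = mean_up - g / scale               # raw count corresponding to 0 m/s^2
    return scale, bias
```

With simulated counts `raw = a / scale + bias`, the fit recovers the true scale and bias exactly; real sensors add noise, which averaging over many static samples suppresses.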
Chapter 3. Wearable Design and Computing
Abstract
Wearable design and computing determine the performance of the wearable device. In this chapter, we introduce a wearable device that comprises eighteen low-cost inertial and magnetic measurement units (IMMUs). It overcomes the drawback of traditional data gloves, which capture only incomplete gesture information. The IMMUs are compact and small enough to be worn on the upper arm, forearm, palm, and fingers. The orientation algorithms, including the Quaternion Extended Kalman Filter (QEKF) and a two-step optimal filter, are presented. We integrate the kinematic models of the arm, hand, and fingers into the whole system to capture the motion gesture. A position algorithm is also derived to compute the positions of the fingertips. Experimental results demonstrate that the proposed wearable device can accurately capture gestures.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
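The prediction step shared by quaternion orientation filters such as the QEKF can be sketched as plain gyroscope integration: the quaternion derivative is half the quaternion product of the current orientation and the body angular rate. This is only the generic propagation step, not the book's full filter (which also fuses accelerometer and magnetometer measurements in the update step); function names here are illustrative.

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of two quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate_gyro(q, omega, dt):
    """One prediction step: q_dot = 0.5 * q (x) [0, omega], Euler-integrated."""
    dq = np.concatenate(([0.0], omega))
    q = q + 0.5 * dt * quat_mul(q, dq)
    return q / np.linalg.norm(q)  # renormalize to stay on the unit sphere
```

Integrating a constant rate of pi rad/s about z for one second turns the identity quaternion into (approximately) a 180-degree rotation about z.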
Chapter 4. Applications of Developed Wearable Devices
Abstract
This chapter describes the applications of the developed wearable devices. Gesture recognition provides an intelligent, natural, and convenient way for human–robot interaction. Gesture datasets are built with the inertial-sensor-based wearable device, and the ELM and CNN approaches are applied to gesture recognition. Furthermore, the wearable device with the capacitive tactile sensor is applied to tactile interaction. Finally, the wearable device based on the piezo-resistive tactile sensor is developed for grasp perception.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
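The ELM (extreme learning machine) approach mentioned above has a compact core: a random, fixed hidden layer followed by a least-squares readout. The sketch below is a minimal generic ELM regressor, not the chapter's gesture classifier; class and parameter names are assumptions.

```python
import numpy as np

class ELM:
    """Minimal extreme learning machine: random fixed hidden layer,
    output weights solved in closed form by least squares."""

    def __init__(self, n_in, n_hidden, rng=None):
        rng = np.random.default_rng(rng)
        self.W = rng.normal(size=(n_in, n_hidden))  # random input weights (never trained)
        self.b = rng.normal(size=n_hidden)          # random hidden biases

    def _hidden(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, Y):
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ Y  # minimum-norm least-squares readout
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

Because only the readout is solved (one pseudo-inverse), training is orders of magnitude faster than backpropagation, which is why ELMs are attractive for on-device gesture recognition.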

Manipulation Learning from Demonstration

Frontmatter
Chapter 5. Learning from Wearable-Based Teleoperation Demonstration
Abstract
In order to effectively transfer humans' manipulation skills to a robot, in this chapter we investigate skill-learning functions via our proposed wearable device. A robotic teleoperation system, implemented by directly controlling motor speeds, is developed. Then the rotation-invariant dynamical movement primitive (DMP) method is presented for learning interaction skills. Imitation learning experiments are designed and implemented, and the experimental results verify the effectiveness of the proposed method.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
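A dynamical movement primitive, the building block of the skill-learning method above, is a spring-damper system driven toward a goal plus a learned forcing term. The sketch below integrates the standard one-dimensional transformation system (with temporal scaling tau = 1 and zero forcing by default); it illustrates the plain DMP formulation, not the chapter's rotation-invariant variant, and all names are illustrative.

```python
import numpy as np

def dmp_rollout(y0, goal, T=1.0, dt=0.001, alpha=25.0, beta=6.25,
                forcing=lambda x: 0.0):
    """Integrate a 1-D DMP transformation system:
        z_dot = alpha * (beta * (goal - y) - z) + f(x)
        y_dot = z
        x_dot = -ax * x        (canonical system, decays 1 -> 0)
    The forcing term is scaled by x and (goal - y0) so it vanishes at the goal."""
    y, z, x = y0, 0.0, 1.0
    ax = 1.0
    traj = []
    for _ in range(int(T / dt)):
        f = forcing(x) * x * (goal - y0)
        dz = alpha * (beta * (goal - y) - z) + f
        z += dz * dt
        y += z * dt
        x += -ax * x * dt
        traj.append(y)
    return np.array(traj)
```

With beta = alpha / 4 the unforced system is critically damped, so the trajectory converges to the goal without overshoot; learning a skill amounts to fitting `forcing` to reproduce a demonstrated trajectory.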
Chapter 6. Learning from Visual-Based Teleoperation Demonstration
Abstract
This chapter proposes deep neural networks to enhance visual teleoperation while maintaining human–robot posture consistency. Firstly, the teacher–student network (TeachNet) is introduced: a novel neural network architecture for intuitive, markerless vision-based teleoperation of dexterous robotic hands. It is combined with a consistency loss function that handles the differences in appearance and anatomy between human and robotic hands. Then a multi-stage visual teleoperation network is designed for the robotic arm. Finally, imitation experiments are carried out on the robots to demonstrate that the proposed vision-based posture-consistent teleoperation is effective and reliable.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
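The role of a consistency loss in a teacher–student setup like the one described can be sketched as a combined objective: a regression term on the predicted robot joint angles plus a term pulling the human-branch and robot-branch embeddings together. This is a generic illustration only; TeachNet's actual objective, weighting, and branch structure differ, and every name below is an assumption.

```python
import numpy as np

def combined_loss(joints_pred, joints_gt, human_emb, robot_emb, w_cons=0.1):
    """Illustrative teacher-student objective:
    joint-angle regression + embedding-consistency term."""
    reg = np.mean((joints_pred - joints_gt) ** 2)     # supervise robot joints
    cons = np.mean((human_emb - robot_emb) ** 2)      # align the two branches
    return reg + w_cons * cons
```

Minimizing the consistency term forces the robot branch to map robot-hand images near the features the human branch produces for the corresponding human pose, which is what lets the network bridge the appearance gap between the two hands.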
Chapter 7. Learning from Wearable-Based Indirect Demonstration
Abstract
Compared with direct ways of learning humans' experience, such as teleoperation, this chapter focuses on indirect manipulation demonstration using a wearable device. The expectation is that robots can learn humans' manipulation experience from the collected human manipulation data. In this chapter, we first introduce three kinds of approaches for collecting humans' manipulation data with the wearable device; then, an effective method for building the experience model of human manipulation is illustrated.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo

Conclusions

Frontmatter
Chapter 8. Conclusions
Abstract
This book comprehensively introduces the developed wearable devices and robotic manipulation learning from their demonstrations. Under the developed unified framework of wearable technologies, the wearable demonstration and manipulation learning problems summarized here are systematically addressed.
Bin Fang, Fuchun Sun, Huaping Liu, Chunfang Liu, Di Guo
Metadata
Title
Wearable Technology for Robotic Manipulation and Learning
Authors
Dr. Bin Fang
Prof. Fuchun Sun
Prof. Huaping Liu
Dr. Chunfang Liu
Dr. Di Guo
Copyright Year
2020
Publisher
Springer Singapore
Electronic ISBN
978-981-15-5124-6
Print ISBN
978-981-15-5123-9
DOI
https://doi.org/10.1007/978-981-15-5124-6
