Published in: Advances in Manufacturing 3/2017

Open Access 18-08-2017

Haptic and visual augmented reality interface for programming welding robots

Authors: D. Ni, A. W. W. Yew, S. K. Ong, A. Y. C. Nee


Abstract

It is a challenging task for operators to program a remote robot for welding manipulation relying only on the visual information from the remote site. This paper proposes an intuitive user interface for programming welding robots remotely using augmented reality (AR) with haptic feedback. The proposed system uses a depth camera to reconstruct the surfaces of workpieces. A haptic input device is used to allow users to define welding paths along these surfaces. An AR user interface is developed to allow users to visualize and adjust the orientation of the welding torch. Compared with traditional robotic welding path programming methods, which rely on prior CAD models or on contact between the robot end-effector and the workpiece, the proposed approach allows fast and intuitive remote robotic welding path programming without prior knowledge of the CAD models of the workpieces. The experimental results show that the proposed approach provides a user-friendly interface and can assist users in obtaining an accurate welding path.

1 Introduction

Teleoperation systems are required to perform remote robotic tasks in hazardous or uninhabitable environments, such as nuclear facilities, underwater environments and outer space. Teleoperation systems are typically based on bilateral control, whereby motion from an operator is directly transmitted to a remote robot and forces experienced by the remote robot are transmitted back to the operator. For telerobotic applications such as assembly or pick-and-place, simple bilateral control can be acceptable as the task can be repeated. However, a welding operation is irreversible once it has been executed. Defining welding tasks is challenging due to the stringent requirements on robotic task parameters, such as welding torch position, orientation and speed. Many robotic welding tasks are programmed on-site, where the pose of the welding torch can be verified and adjusted by the operator. Robotic welding can also be programmed offline with CAD models of the workpieces. However, in the programming of a remote robot for welding, the operator cannot verify the welding torch poses directly and needs to rely on video and other information transmitted from the remote site to define the welding paths. Therefore, the research in this paper addresses two challenges, namely:
(i)
Intuitive definition of welding paths and poses remotely, and
 
(ii)
Remote welding in unstructured environments where knowledge of the robot workspace is not available.
 
This paper presents a user-friendly and intuitive robot programming interface for remote robotic welding tasks. Challenge #1 is addressed through the development of an augmented reality (AR) interface that is combined with haptic feedback. The haptic feedback allows users to perceive the surfaces of the workpieces at the remote location so as to guide the users in defining welding paths along a workpiece surface. The AR user interface overlays a virtual robot on the real robot in the camera view of the remote workspace, thus allowing a user to visualize and adjust the end-effector pose, as well as validate the reachability of the user-defined welding paths. Challenge #2 is addressed using a point cloud data set that is acquired using a depth sensor to reconstruct implicit surfaces that represent the surfaces of workpieces. The welding task is first planned and simulated using a virtual robot via the AR interface before the task is executed by the real robot. The welding paths generated by the system are intended to be further processed by welding seam trackers at the actual robot workspace to accurately locate the welding seams on the workpieces. Seam tracking is achieved using optical sensors, such as laser scanners, to locate the differences in the height of a workpiece surface so as to define an edge for welding. However, a path along the welding seam must still be defined so that the sensors can scan and locate the seam. Therefore, the proposed system is designed to define paths at a distance offset from the surface of a workpiece to allow the trackers to scan the welding seam and to prevent the welding torch from colliding with the workpiece.
Overall, the main contributions of this paper are as follows:
(i)
To propose a novel prototype of a haptic and visual augmented reality interface for welding robot programming.
 
(ii)
To implement a welding path definition method based on the point cloud data of unknown workpieces.
 
The remainder of this paper is organized as follows. A discussion of the related work in remote welding robot programming is presented in Sect. 2. Section 3 gives an overview of the system, Sect. 4 presents a description of the AR user interface, and Sect. 5 describes the haptic input method for welding task definition. Lastly, an evaluation of the prototype system is presented in Sect. 6.

2 Related work

In the area of manufacturing, visual, haptic and audio feedback are often adopted to enhance the information perception of users. Visual feedback is widely used to realize an immersive and accurate human-machine interface. In Ref. [1], a product design method using virtual reality (VR) technology for complex human-product interactions is proposed. Mavrikios et al. [2] investigated the use of VR-based methods to support human-integrated simulation for manual welding processes. The interface enables the user to set up, execute and validate the results of a welding process [2]. AR is an enhancement of VR. By augmenting the physical environment with virtual objects, AR further enhances information perception and situational awareness, giving users a live view of the manufacturing environment. AR has been applied in many areas in manufacturing, including assembly and human-robot interaction [3–5].
Haptic feedback is often applied in robot teleoperation systems because of the lack of situational awareness of the remote environment in which the robot operates. It is usually employed for human telepresence reconstruction, i.e., imbuing remote users with the haptic sensation of the physical properties of remote objects, such as texture and roughness [6–8]. Another application of haptic feedback is the provision of augmented information that can be used to guide users in performing specific tasks. In Rosenberg's work [9], virtual fixtures are overlaid on a workspace as guiding forces that are transmitted to a remote operator via haptic feedback. This approach has since been adopted in robotic surgery [10], micro-scale teleoperation systems [11], maintenance [12] and assembly [13]. Haptic feedback, as a perception channel additional to visual feedback, can provide users with a better spatial understanding of the surfaces and edges of a workpiece. Wang et al. [14] proposed an arc welding training method based on VR and haptic guidance, which provides force feedback to a welder to convey the proper force/position relation along pre-defined trajectories for attaining hand-mind-eye coordination skills in a virtual environment. However, this system is only suitable for a pre-defined welding environment where the welding path needs to be preset.
In robotic welding, the objective is to define a path for the end-effector so that it follows the topology of a surface at a constant distance from the surface, known as the tip-to-work distance, and with a specific end-effector orientation. Nichol and Manic [15] developed a haptic interface for teleoperating a remote arc welding robot, using force sensors installed on the end-effector to transmit forces encountered at the end-effector to the users of the haptic interface. The limitation of this system is that it relies on contact between the end-effector and the workpiece for users to follow the surface with haptic feedback. Reddy and Reddy [16] developed a visual and haptic interface for teleoperating a remote welding robot, where a camera is used to give users a view of the remote environment and an ultrasonic sensor is used to determine the distance between the workpiece and the end-effector. In this system, haptic feedback is provided to users via vibration motors that are embedded in a glove. Thus, a user has to hold his or her hand steadily at a fixed offset from the welding surface, and for highly curved surfaces this approach will not result in smooth paths. For remote robotic welding, it is challenging to program the motion and end-effector orientation of the robot to carry out the task precisely, and it is important to have an accurate representation of the geometry of the workpiece since the user does not have physical access to it. Thus, in contrast with the reported works presented in this section, this research proposes a system in which the haptic perception channel is combined with AR. With AR instead of VR, the user is provided with better situational awareness of the remote environment, while the addition of the haptic channel provides the user with a good spatial understanding of the workpiece so that welding paths can be defined more accurately. In addition, this research accounts for dynamic remote environments by utilizing 3D point cloud data obtained from a depth camera at the remote environment in order to generate haptic guidance along the surface of the workpiece without prior knowledge of the workpiece.
In this research, an AR interface has been developed in which a virtual robot simulates the motion of the real robot. The virtual robot is overlaid on the real robot in the view of the real robot workspace. Therefore, users are able to validate the motion of the virtual robot to ensure that the welding task can be executed safely and properly before transferring the task to the real robot. Meanwhile, a haptic interface based on the PHANToM device is used to control the virtual robot and generate haptic feedback that acts as a guiding fixture, enabling users to control the virtual welding robot to define welding paths at a fixed distance offset from a workpiece surface. The haptic force is calculated based on an implicit surface simulation method. With the proposed system, users can program a welding robot remotely in an unknown environment.

3 System overview

The proposed system consists of the user, the AR interface, and visual sensors, namely a PC camera and a Kinect sensor. The AR interface is composed of two main modules: the visual augmented reality interface and the haptic interface. The visual augmented reality interface provides an AR display for simulating the motion of the real robot; it consists of a virtual robot overlaid on real-time video of the remote site captured using the PC camera. Meanwhile, users can control the virtual robot using the haptic interface. When the virtual robot is near the workpiece, the relative distance is estimated from the positions of the virtual robot end-effector and the point cloud captured by the Kinect sensor. Thus, the force feedback can help users maintain a constant distance from the workpiece surface.
The detailed dataflow is shown in Fig. 1. As the user controls the virtual robot with the visual and haptic indications, if a position is a suitable welding candidate point, the virtual robot end-effector position and orientation are recorded in a text file by pressing the button on the PHANToM device. The recorded welding candidate points are then fitted to a curve for programming the real welding robot, as illustrated in the sketch below.
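The paper does not specify the curve-fitting method. The sketch below illustrates this step with a smoothing spline; the use of SciPy, the function name fit_welding_path and the sample points are assumptions made purely for illustration.

```python
import numpy as np
from scipy.interpolate import splprep, splev

def fit_welding_path(candidate_points, num_samples=100, smoothing=0.0):
    """Fit a smooth parametric curve through the recorded candidate
    welding points (an Nx3 array of end-effector positions)."""
    pts = np.asarray(candidate_points, dtype=float)
    # splprep expects one coordinate array per axis
    tck, _ = splprep([pts[:, 0], pts[:, 1], pts[:, 2]], s=smoothing)
    u = np.linspace(0.0, 1.0, num_samples)
    x, y, z = splev(u, tck)
    return np.column_stack([x, y, z])

# Example: hypothetical candidate points recorded via the PHANToM button
recorded = [(0.40, 0.10, 0.05), (0.42, 0.12, 0.05),
            (0.44, 0.15, 0.05), (0.46, 0.19, 0.05)]
path = fit_welding_path(recorded)
```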

4 Visual augmented reality interface

An AR interface has been developed for the simulation of the motion of a welding robot during a welding process. The virtual robot is controlled using a PHANToM device. The pose of the end-effector of the virtual robot is set according to the pose of the end-effector of the PHANToM device (see Fig. 2a). In the AR scene, which is a view of the remote workspace, the virtual welding robot is overlaid on the real robot (see Fig. 2b). This allows users to set the angle of the welding torch with respect to the workpiece visually, so as to ensure that the welding path is reachable and collision-free.
The movement of the PHANToM device is reflected in the movement of the virtual robot. An inverse kinematics solver [17] is used to compute a valid joint configuration of the virtual robot based on the pose of the PHANToM end-effector, so as to simulate the movement of the virtual robot. The joint configuration is used to set the joint angles of the virtual robot model, so that the configuration of the virtual robot matches the configuration the real robot will assume when it reaches a point along a welding path. The joint angles of the virtual robot are updated in real time as the user moves the PHANToM device.
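As a concrete illustration of this kind of pose-to-joint-angle mapping, the sketch below solves inverse kinematics numerically with a damped least-squares (Jacobian) iteration. It uses a toy three-link planar arm rather than the actual welding robot, and it is not the specific solver of Ref. [17]; all names and parameters are illustrative assumptions.

```python
import numpy as np

def fk_planar(q, link_lengths=(0.3, 0.25, 0.15)):
    """Forward kinematics of a toy 3-link planar arm (illustration only)."""
    x = y = angle = 0.0
    for qi, li in zip(q, link_lengths):
        angle += qi
        x += li * np.cos(angle)
        y += li * np.sin(angle)
    return np.array([x, y])

def ik_damped_least_squares(target, q0, damping=0.05, iters=200, tol=1e-5):
    """Find joint angles that bring the end-effector to 'target' using a
    damped least-squares update on a finite-difference Jacobian."""
    q = np.array(q0, dtype=float)
    for _ in range(iters):
        err = target - fk_planar(q)
        if np.linalg.norm(err) < tol:
            break
        # Numerical Jacobian by finite differences
        eps = 1e-6
        J = np.zeros((2, len(q)))
        for j in range(len(q)):
            dq = np.zeros_like(q)
            dq[j] = eps
            J[:, j] = (fk_planar(q + dq) - fk_planar(q)) / eps
        # Damped least-squares step: dq = J^T (J J^T + lambda^2 I)^-1 err
        JJt = J @ J.T + (damping ** 2) * np.eye(2)
        q += J.T @ np.linalg.solve(JJt, err)
    return q

# The resulting joint angles would be applied to the virtual robot model
q = ik_damped_least_squares(target=np.array([0.4, 0.3]), q0=[0.1, 0.1, 0.1])
```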

5 Haptic interface

The PHANToM device is used to move the virtual robot along a surface of a workpiece without the need for a prior CAD model of the workpiece. The position of the end-effector of the PHANToM device is mapped to the robot workspace as a haptic interaction point (HIP), and the virtual robot end-effector trails the HIP as the HIP is moved by the PHANToM. A point cloud of the robot workspace is acquired using a Kinect sensor, and the surfaces of objects near the HIP are reconstructed as implicit surfaces from the local point cloud data [18]. When the HIP is near a workpiece, the virtual robot end-effector is prevented from penetrating an isosurface, which is a virtual surface created above the actual surface of the workpiece. Haptic force feedback is applied via the PHANToM to stop the HIP when the virtual robot end-effector touches the isosurface. This mechanism implicitly guides a user to maintain a constant distance from the workpiece surface while defining a welding path, by limiting the position of the virtual robot end-effector to that distance from the surface.

5.1 Implicit surface method

The method for creating implicit surfaces from point clouds is based on Ref. [18]. An implicit surface near the HIP is estimated based on the weighted contributions of the points within an R-radius sphere centered at the position of the HIP, where R is a pre-set radius. Equation (1) is the weighting function applied in the proposed system, where d(p) represents the distance between the virtual robot end-effector and the local points \( p \) that are obtained from the Kinect sensor within the R-radius sphere region.
$$ W(p) = \begin{cases} \left(1 - \dfrac{d(p)}{R}\right)^{6} \left(35\left(\dfrac{d(p)}{R}\right)^{2} + 18\left(\dfrac{d(p)}{R}\right) + 3\right), & d(p) \le R, \\ 0, & d(p) > R. \end{cases} $$
(1)
With the weighted contribution of each point, the center and normal vector of the implicit surface can be calculated. The center of the point cloud points \( p \) within the R-radius sphere is defined as c in Eq. (2), and the normal vector is defined as n in Eq. (3). The local implicit surface can then be calculated from c and n.
$$ {\varvec{c}} = \frac{{\mathop \sum \nolimits_{i} W(p_{i} ) p_{i} }}{{\mathop \sum \nolimits_{i} W\left( {p_{i} } \right)}}, $$
(2)
$$ {\varvec{n}} = \frac{\sum\nolimits_{i} W(p_{i})\,({\varvec{e}} - p_{i})}{\left\| \sum\nolimits_{i} W(p_{i})\,({\varvec{e}} - p_{i}) \right\|}, $$
(3)
where \( p_{i} \) is one of the points \( p \), and \( {\varvec{e}} \) is the position of the virtual robot end-effector.
With the estimated center and normal vector, the local implicit surface is described as Eq. (4), where P is a point on the surface.
$$ S = {\varvec{n}}^{\text{T}} \left( {\varvec{P}} - {\varvec{c}} \right). $$
(4)
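A minimal sketch of Eqs. (1)-(4) is given below, assuming the point-cloud points passed in are those lying within the R-radius sphere around the HIP; the NumPy implementation and function names are illustrative assumptions rather than the actual implementation.

```python
import numpy as np

def weight(d, R):
    """Weighting function of Eq. (1): smooth and compactly supported on [0, R]."""
    w = np.zeros_like(d)
    inside = d <= R
    t = d[inside] / R
    w[inside] = (1.0 - t) ** 6 * (35.0 * t ** 2 + 18.0 * t + 3.0)
    return w

def local_implicit_surface(points, e, R=0.02):
    """Estimate the local implicit surface from the point-cloud points near
    the HIP.  Returns the weighted center c, the unit normal n (oriented
    towards the end-effector position e), and a function evaluating Eq. (4)."""
    p = np.asarray(points, dtype=float)
    d = np.linalg.norm(p - e, axis=1)               # d(p) of Eq. (1)
    w = weight(d, R)
    if w.sum() == 0.0:                              # no points within range R
        return None, None, None
    c = (w[:, None] * p).sum(axis=0) / w.sum()      # Eq. (2)
    n_raw = (w[:, None] * (e - p)).sum(axis=0)      # numerator of Eq. (3)
    n = n_raw / np.linalg.norm(n_raw)               # normalized as in Eq. (3)
    S = lambda P: float(n @ (np.asarray(P) - c))    # Eq. (4)
    return c, n, S
```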

5.2 Haptic force feedback for surface offset

As shown in Fig. 3, the black point indicates the position of the HIP and the other points are part of the point cloud acquired by the Kinect sensor within the R-radius sphere. The position of the HIP is controlled by the user through the PHANToM device. When the HIP is near the point cloud, shown as purple points, the local implicit surface near the HIP is estimated using the implicit surface method (blue plane in Fig. 3), and the corresponding isosurface is estimated (green plane in Fig. 3). The virtual robot end-effector, shown as a yellow point in Fig. 3, is constrained to stay above the isosurface.
To guide users along a workpiece at an offset distance D from a workpiece surface, an isosurface is defined D away from the local implicit surface. This is achieved by setting the value of \( S \) in Eq. (4) to the value of D. The haptic force feedback that prevents the virtual robot end-effector from penetrating the isosurface is applied using the spring-damper model as
$$ {\varvec{f}} = k \left( {{\varvec{e}} - {\varvec{h}}} \right) + b {\varvec{v}}. $$
(5)
In Eq. (5), k is the spring parameter; b is the damping parameter; h is the position of the PHANToM end-effector; \( \varvec{e} \) is the position of the virtual robot end-effector; \({\varvec{v}}\) is the velocity of the HIP, and f is the force vector to be applied as haptic feedback.
Figure 4 illustrates a simulated relationship among the haptic force feedback (red line), the distance between the HIP and the workpiece surface (blue line), and the distance between the virtual robot end-effector and the workpiece surface (green line). In Fig. 4, the values of D and R are 0.02 m. When there are no points in the R-radius sphere, no haptic force feedback is applied, which means that there are no suitable points for defining a welding path. When the HIP is brought closer to the workpiece surface such that the distance is less than D, an isosurface is estimated based on Eq. (4). The virtual robot end-effector is kept on the isosurface and haptic feedback is calculated based on Eq. (5) to indicate that the desired offset from the surface of the workpiece has been reached.
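The sketch below illustrates how the isosurface constraint and the force of Eq. (5) could be combined in a single update step. The gain values and function names are assumptions made for illustration; they are not parameters reported in the paper.

```python
import numpy as np

def constrain_and_feedback(h, v, c, n, D=0.02, k=200.0, b=5.0):
    """Keep the virtual robot end-effector on the isosurface S = D above the
    local implicit surface and compute the haptic feedback of Eq. (5).
    h : PHANToM end-effector (HIP) position
    v : HIP velocity
    c, n : center and unit normal of the local implicit surface"""
    s = float(n @ (h - c))            # signed distance of the HIP, Eq. (4)
    if s >= D:
        # HIP is outside the isosurface: end-effector follows it, no force
        return h.copy(), np.zeros(3)
    # Project the end-effector onto the isosurface at offset D
    e = h + (D - s) * n
    # Spring-damper force of Eq. (5)
    f = k * (e - h) + b * v
    return e, f
```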

6 Validation of the haptic and visual augmented reality interfaces

To validate the proposed approach, a prototype of the system has been developed (see Fig. 5). The robot workspace is captured using a PC camera to compose the AR scene, while a Kinect sensor is used to capture the 3D point cloud data of the workpiece. A coordinate registration must be obtained between the PC camera, the Kinect sensor and the real robot in order to overlay the virtual robot on the real robot and to represent the point cloud data of workpieces with respect to the coordinate system of the real robot. This is achieved by placing a fiducial marker in the physical workspace of the real robot. The coordinate transformation between the real robot and the fiducial marker is obtained by moving the real robot end-effector to the corners of the marker, giving the coordinates of the corners with respect to the real robot and hence the transformation matrix between the robot and the marker. The PC camera and the Kinect sensor are registered to the fiducial marker using a marker tracking algorithm [19] to obtain their respective transformation matrices from the fiducial marker in real time.
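As an illustration of this registration chain, the sketch below composes the camera-to-robot transform from the two marker registrations and uses it to express Kinect points in the robot frame. The 4x4 homogeneous-matrix representation and the function names are assumptions for illustration only.

```python
import numpy as np

def camera_to_robot(T_robot_marker, T_cam_marker):
    """Compose the camera-to-robot transform from the two marker registrations:
    T_robot_cam = T_robot_marker @ inv(T_cam_marker).
    All transforms are 4x4 homogeneous matrices."""
    return T_robot_marker @ np.linalg.inv(T_cam_marker)

def to_robot_frame(points_cam, T_robot_cam):
    """Express Kinect point-cloud points (Nx3, camera frame) in the robot frame."""
    pts = np.asarray(points_cam, dtype=float)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return (T_robot_cam @ homog.T).T[:, :3]
```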
The system prototype was tested by ten users with no background in welding or robot programming. Before the test, the users were allowed to familiarize themselves with the visual AR and haptic interfaces and the haptic sensations through several trials. During the test, the users were asked to carry out two robotic welding programming tasks, namely, one straight path and one curved path, as shown in Fig. 5, and to record the welding candidate points.
For each task, the users were asked to use the PHANToM device to define the candidate welding points. In order to evaluate the proposed haptic and visual interaction interface, two experiments were carried out: experiment A was conducted with the assistance of haptic force feedback, while experiment B was conducted without haptic force feedback. In experiment A, the distance D was set to 10 mm. In both experiments, users clicked a button on the PHANToM device to choose the candidate welding points as they moved the virtual robot, and the candidate welding points defined by the users were recorded. The use of the system is shown in Fig. 6.
The candidate welding points were plotted in 3D using MATLAB and overlaid on the point cloud data. The candidate points from experiment B are shown in Fig. 7; it can be observed that the tasks were completed reasonably well.
For a detailed analysis of the accuracy of path definition in the two experiments, Fig. 8 shows the average deviation of the paths defined by the users from the actual welding path. The actual welding paths were obtained accurately by moving the real robot end-effector to points on the path and recording the point coordinates from the robot controller. It can be observed that welding paths are defined more accurately with haptic feedback: the deviation of the user-defined welding paths from the actual paths along each coordinate axis is within ±15 mm.
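The paper does not state exactly how the deviation was computed; the sketch below shows one plausible interpretation, comparing each user-defined point with its nearest point on the recorded actual path and averaging the per-axis error.

```python
import numpy as np

def per_axis_deviation(user_points, reference_points):
    """For each user-defined point, find the closest point on the reference
    (actual) welding path and return the mean absolute deviation per axis."""
    user = np.asarray(user_points, dtype=float)
    ref = np.asarray(reference_points, dtype=float)
    # Pairwise distances between user points and reference points
    diffs = user[:, None, :] - ref[None, :, :]
    nearest = np.argmin(np.linalg.norm(diffs, axis=2), axis=1)
    dev = user - ref[nearest]
    return np.abs(dev).mean(axis=0)   # mean |dx|, |dy|, |dz|
```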
A subjective evaluation of the interface was conducted by collecting user feedback after the tests. Most users reported that the system is easy to use and user-friendly, and felt confident in their ability to define welding paths using it. This is because the AR interface allows the users to visualize the robot in the remote environment and to validate visually that the welding task is reachable and can be executed safely without collisions. Without haptic feedback, users reported losing confidence in their ability to locate the workpiece surface and to record the welding path points accurately. A few users suggested developing visual aids to assist in gauging the angle between the welding torch and the workpiece surface.
The test and validation of the system was conducted with ten users, which is a relatively small sample size. However, the experimental results show that the haptic and visual interface greatly improves the welding path definition accuracy compared with a purely visual interface. Furthermore, the subjective evaluation shows that the haptic and visual aspects of the system enhance the user experience in remote robot welding path definition, even for lay users who have no background in robot programming. Therefore, the system shows promising potential for intuitive remote robotic welding programming.

7 Conclusions and future work

A novel method that integrates a haptic input device with an AR interface for programming robotic welding tasks in unstructured and dynamic welding environments, where the 3D models of workpieces are not known, has been developed and implemented in this research. In the prototype system, haptic feedback is used to guide a user in controlling a PHANToM haptic input device to follow the surface of a workpiece, and a virtual robot is overlaid on a view of the real robot workcell to allow users to visualize and adjust the pose of the end-effector of the virtual robot. The proposed approach helps a user follow the topology of a workpiece surface at a specific welding torch orientation with respect to the workpiece during robotic welding programming.
A pilot study shows that the approach is user-friendly, and the deviations of user-defined paths from the actual welding path are within ±15 mm. As seam tracking sensors can typically scan within an area of ±15 mm or ±20 mm, the study indicates that the user-defined paths are suitable for further processing using seam tracking sensors to achieve actual welding paths. The authors aim to improve the accuracy by investigating alternative point cloud acquisition methods and using more precise methods to calibrate the registration between the depth sensor and the real robot.

Acknowledgements

This research is supported by the Singapore A*STAR Agency for Science, Technology and Research Thematic Programme on Industrial Robotics (Grant No. 1225100001), and the China Scholarship Council.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Literature
1. Rentzos L, Vourtsis C, Mavrikios D et al (2014) Using VR for complex product design. In: Proceedings of the international conference on virtual, augmented and mixed reality 2014, Crete, Greece, 22–27 June, pp 455–464
2. Mavrikios D, Karabatsou V, Fragos D et al (2006) A prototype virtual reality based demonstrator for immersive and interactive simulation of welding processes. Int J Comput Integr Manuf 19(3):294–300
3. Makris S, Karagiannis P, Koukas S et al (2016) Augmented reality system for operator support in human-robot collaborative assembly. CIRP Ann Manuf Technol 65(1):61–64
4. Kon T, Oikawa T, Choi Y et al (2006) A method for supporting robot's actions using virtual reality in a smart space. In: Proceedings of the SICE-ICASE international joint conference 2006, Busan, Korea, 18–21 Oct, pp 3519–3522
5. Andersson N, Argyrou A, Nägele F et al (2016) AR-enhanced human-robot-interaction: methodologies, algorithms, tools. Procedia CIRP 44:193–198
6. Chotiprayanakul P, Wang D, Kwok N et al (2008) A haptic base human robot interaction approach for robotic grit blasting. In: Proceedings of the 25th international symposium on automation and robotics in construction, Vilnius, Lithuania, 26–29 June 2008, pp 148–154
7. El Saddik A (2007) The potential of haptics technologies. IEEE Instrum Meas Mag 10(1):10–17
8. Velanas SV, Tzafestas CS (2010) Human telehaptic perception of stiffness using an adaptive impedance reflection bilateral teleoperation control scheme. In: Proceedings of the IEEE 19th international symposium in robot and human interactive communication, Viareggio, Italy, 13–16 Sept, pp 21–26
9. Rosenberg LB (1993) Virtual fixtures: perceptual tools for telerobotic manipulation. In: Proceedings of the IEEE virtual reality annual international symposium, Seattle, USA, 18–22 Sept, pp 76–82
10. Li M, Kapoor A, Taylor RH (2007) Telerobotic control by virtual fixtures for surgical applications. Springer Tracts Adv Robot 31(1):381–401
11. Bolopion A, Régnier S (2013) A review of haptic feedback teleoperation systems for micromanipulation and microassembly. IEEE Trans Autom Sci Eng 10(3):496–502
12. Xia T, Leonard S, Kandaswamy I et al (2013) Model-based telerobotic control with virtual fixtures for satellite servicing tasks. In: Proceedings of the IEEE international conference on robotics and automation (ICRA) 2013, Karlsruhe, Germany, 6–10 May, pp 1479–1484
13. Aleotti J, Reggiani M (2005) Evaluation of virtual fixtures for a robot programming by demonstration interface. IEEE Trans Syst Man Cybern Part A: Syst Hum 35(4):536–545
14. Wang Y, Chen Y, Nan Z et al (2006) Study on welder training by means of haptic guidance and virtual reality for arc welding. In: Proceedings of the IEEE international conference on robotics and biomimetics 2006, Kunming, China, 17–20 Dec, pp 954–958
15. Nichol CI, Manic M (2009) Video game device haptic interface for robotic arc welding. In: Proceedings of the 2nd conference on human system interactions 2009, Catania, Italy, 21–23 May, pp 648–653
16. Reddy PA, Reddy TD (2016) Design and development of a telemanipulated welding robot with visual and haptic feedback. Int J Res Eng Technol 5(8):371–376
18. Leeper A, Chan S, Salisbury K (2012) Point clouds can be represented as implicit surfaces for constraint-based haptic rendering. In: IEEE international conference on robotics and automation 2012, St Paul, USA, 14–18 May, pp 5000–5005
19. Garrido-Jurado S, Muñoz-Salinas R, Marín-Jiménez MJ (2014) Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit 47(6):2280–2292