Published in: International Journal of Computer Assisted Radiology and Surgery 6/2015

Open Access 01.06.2015 | Original Article

Brain activation in parietal area during manipulation with a surgical robot simulator

Authors: Satoshi Miura, Yo Kobayashi, Kazuya Kawamura, Yasutaka Nakashima, Masakatsu G. Fujie


Abstract

Purpose

We present an evaluation method to quantify the embodiment caused by the physical difference between master and slave in surgical robots by measuring the activation of the intraparietal sulcus in the user's brain during surgical robot manipulation. We show the change of embodiment as the optical axis-to-target view angle in a surgical simulator is varied, which changes the manipulator's appearance in the monitor in terms of hand–eye coordination. The objective is to explore how brain activation changes with the optical axis-to-target view angle.

Methods

In the experiments, we used a functional near-infrared spectroscopic topography (f-NIRS) brain imaging device to measure the brain activity of the seven subjects while they moved the hand controller to insert a curved needle into a target using the manipulator in a surgical simulator. The experiment was carried out several times with a variety of optical axis-to-target view angles.

Results

Some participants showed a significant peak in activation (P = 0.037, F = 2.841) when the optical axis-to-target view angle was \(75^{\circ }\).

Conclusions

The positional relationship between the manipulators and endoscope at \(75^{\circ }\) would be the closest to the human physical relationship between the hands and eyes.
Notes

Conflict of interest

   The authors declare that they have no conflict of interest.

Ethical standard

   All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. These experiments were approved by the Waseda University Institutional Review Board (No. 2013-201).

Informed consent

   Informed consent was obtained from all individual participants included in the study.

Introduction

Robotic surgery offers the advantage of minimally invasive surgery, which can reduce both scarring and patient recovery time because the surgical manipulator is small and precise [1, 2]. Surgical robots are therefore used worldwide [3]. For example, 2000 da Vinci surgical robots have been sold worldwide, and these surgical robots were used in more than 278,000 cases prior to 2010 [4].
The method of operation when using a surgical robot mainly involves a master–slave arrangement: the surgeon inserts the slave manipulators and an endoscope into the patient's body, and then operates the slave manipulators from the master console while simultaneously observing the operative field through the endoscope. In robotic surgery, the surgeon's control depends on a combination of visual observation of the slaves via the endoscope and the proprioceptive sense of the operator's hand [5]. When the surgeon moves the master, he or she depends on proprioceptive feedback from the hand; when the surgeon examines the slave's movement, he or she depends on visual feedback from the endoscope. When the visual and proprioceptive senses agree, the surgeon can control the slaves and endoscope as intuitively as his or her own hands and eyes.
Surgical robots must be designed to make the best use of the surgeon's skill and experience, and to maximize the intuitiveness of operation. Although intuitiveness has been studied by many scientists in a variety of fields, the master–slave systems used in surgical robots still present some problems. One is how exactly the posture and position of the manipulator tips are synchronized between master and slave. In conventional endoscopic surgery, the direction in which the surgeon's hand moves is contrary to the direction in which the forceps move in the endoscope monitor, so the surgeon's visual and proprioceptive senses do not agree. Robotic surgery resolves this problem by matching the tip kinematics between surgeon and manipulator. Another issue is whether the surgeon feels that the manipulator belongs to his or her body, because the surgeon operates using the manipulator and endoscope instead of the hands and eyes. This feeling is called hand–eye coordination. When the trocar port moves to a different location, the surgeon's cognitive sense of hand–eye coordination changes. From the viewpoint of robotics, hand–eye coordination arising from the physical difference between the human body and the robot mechanism is known as embodiment [6–8]. Embodiment refers to cognition that is strongly influenced by aspects of the human body beyond the brain itself; hand–eye coordination is one aspect of embodiment [7, 8].
Although surgical robots must be designed with embodiment in mind, there is currently no good method for evaluating it. Conventionally, engineers design surgical robots taking mechanical performance into account, such as the time taken to complete a given task, and the average speed and curvature of a movement under test conditions [9]. These working scores are useful, and the mechanical performance of surgical robots has improved considerably in recent years, but improved mechanical performance does not necessarily reflect the embodiment that the user feels.
In the field of cognitive neuroscience, many related studies have reported that the intraparietal sulcus is the specific brain area important for the function of embodiment. Some reported that the intraparietal sulcus shows how strongly a human perceives that a tool belongs to his or her body [10, 11]. In Iriki's experiments, the response of the intraparietal sulcus of the macaque changed before and after tool use; Iriki reported that this occurred because the macaque perceived the tool as belonging to its body [12, 13]. The intraparietal sulcus was also activated when the macaque saw its hand in virtual space [14], and its activity was measured in real time during tool use with positron emission tomography (PET) [15]. The function of the intraparietal sulcus has been reported to apply not only to macaques but also to humans [16, 17]. In addition, activity in the intraparietal sulcus has been found using not only f-MRI but also f-NIRS [18–20]. Such findings are often reported in the cognitive science field, but very few attempts have been made to apply them to robot design in the field of engineering.
In this paper, we validate the feasibility of a method to quantify the embodiment caused by the physical difference between master and slave in surgical robots by measuring the activation of the intraparietal sulcus in the user's brain during surgical robot manipulation, as shown in Fig. 1. We measured brain activity while the user controlled the manipulator in a surgical simulator, and determined an optimal robot design. In the experiment, to change the manipulator's appearance in the monitor in terms of hand–eye coordination, we changed the endoscope viewpoint, i.e., the optical axis-to-target view angle, as shown in Fig. 2. We show the change of embodiment as a function of the optical axis-to-target view angle, and explore the corresponding change in brain activation.

Method

Experimental setup

We performed brain imaging using a functional near-infrared spectroscopic topography (f-NIRS) device, shown in Fig. 3. f-NIRS is a relatively new brain imaging technique in which brain activity is indicated by the relative changes in the concentrations of oxygenated and deoxygenated hemoglobin. f-NIRS transmits near-infrared light into the subject's head and receives the light scattered back from the cerebral cortex. In the cerebral cortex, the near-infrared light is absorbed by the oxygenated hemoglobin in the vessels of the brain, whose concentration changes in proportion to brain activation. f-NIRS therefore estimates brain activation from the change in the near-infrared light between entering and leaving the subject's head. We used an f-NIRS device (ETG-4000; Hitachi Medico Co., Tokyo, Japan) to evaluate the activity around the intraparietal sulcus. Because the intraparietal sulcus serves the function of embodiment [10, 11], the larger the change in oxygenated hemoglobin concentration around the intraparietal sulcus, the stronger the agreement the participant feels despite the physical difference between master and slave. Whereas f-NIRS has inferior spatial resolution compared with f-MRI, it allows brain activity to be measured without the use of magnetic fields [21, 22]. This makes f-NIRS well suited to measurement while the subject controls robots and machines, including master–slave systems such as automobiles and artificial limbs [23, 24]. Additionally, when using f-MRI the participant must lie down, whereas with f-NIRS the participant can perform body movement tasks [21, 22]. Finally, f-NIRS is reasonably compact.
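In practice, f-NIRS instruments derive these hemoglobin changes from measured light-attenuation changes via the modified Beer–Lambert law. The following is a minimal sketch of that conversion, not the ETG-4000's actual processing; the extinction coefficients and differential pathlength factor used here are illustrative placeholders, not calibrated values.

```python
import numpy as np

def delta_od_to_hb(delta_od, epsilon, distance_mm, dpf):
    """Modified Beer-Lambert law: convert changes in optical density at two
    wavelengths into oxy-/deoxy-hemoglobin concentration changes.

    delta_od:    (2,) attenuation changes at the two wavelengths
    epsilon:     (2, 2) extinction coefficients, rows = wavelengths,
                 columns = [HbO2, HbR]
    distance_mm: source-detector separation (30 mm in this study)
    dpf:         differential pathlength factor
    Returns (delta_HbO2, delta_HbR) in concentration x path-length units.
    """
    # Effective optical path = geometric separation x pathlength factor.
    path = distance_mm * dpf
    # Solve the 2x2 linear system (epsilon * path) @ delta_c = delta_od.
    delta_c = np.linalg.solve(epsilon * path, np.asarray(delta_od))
    return delta_c[0], delta_c[1]

# Illustrative (made-up) extinction coefficients for two wavelengths:
eps = np.array([[0.6, 1.5],   # wavelength 1: [HbO2, HbR]
                [1.2, 0.8]])  # wavelength 2: [HbO2, HbR]
d_hbo2, d_hbr = delta_od_to_hb([0.02, 0.03], eps, distance_mm=30.0, dpf=6.0)
```

Solving the two-wavelength system jointly is what lets the instrument separate the oxygenated and deoxygenated contributions that a single wavelength could not distinguish.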
We used two 3 \(\times \) 3 matrices of photodiodes, consisting of 10 light transmitters and 8 receivers, for the measurements, as shown in Fig. 4. The blood oxygen level was measured in the 30-mm region between each transmitter and receiver pair. In the parietal area, each scalp patch consisted of nine photodiodes, and the setup formed 12 measurement channels. Because the intraparietal sulcus lies within the parietal area, we identified the channel on each participant's head that corresponded to that brain area using the following procedure.
First, we measured the three-dimensional (3D) coordinates of each point on the participant's head using a 3D position measurement device (PATRIOT digitizer; Polhemus, USA). Next, we compared these 3D coordinates with a standard brain model using the platform for optical topography analysis tools (POTATO) combined with MATLAB to identify the channel lying above the intraparietal sulcus [25].
While we measured the brain activity of each participant, they moved a hand controller (Geomagic Touch, Geomagic, Raleigh, NC, USA) to control the virtual arm in the surgical simulator (Fig. 3). The simulation was presented to the user on a 24-inch liquid crystal display (LCD) monitor with a vertical refresh rate of 60 Hz. The time course of the stimulus presentation was controlled using a personal computer (PC). The participants set the monitor position to be perpendicular to their line of sight at their own discretion.
The virtual manipulator has six degrees-of-freedom (DOF), as shown in Fig. 5. Figure 6 shows the simulator with blue and green circle targets on the white plane around the red wall, and a gray virtual curved needle with two virtual manipulators. The virtual arm mechanism is similar to that of the da Vinci. The position and posture of the tip was synchronized between the virtual arm and the hand controller while the user operated the foot switch. The manipulator can grip the curved needle when the participant pushes the stylus button.
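The foot-switch behavior described above is a clutch: the virtual arm follows the stylus only while the switch is engaged, so the stylus can be repositioned without moving the arm. A hypothetical sketch of this coupling (position only, orientation omitted; not the simulator's actual code):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Tip position; orientation is omitted for brevity."""
    x: float
    y: float
    z: float

class ClutchedTeleop:
    """Master-slave coupling with a foot-switch clutch: while engaged, the
    slave follows master *increments*; while released, the master can be
    repositioned freely without moving the slave."""

    def __init__(self, slave: Pose):
        self.slave = slave
        self.last_master = None
        self.engaged = False

    def set_clutch(self, pressed: bool):
        self.engaged = pressed
        self.last_master = None  # re-anchor on the next update

    def update(self, master: Pose):
        # Apply only the incremental master motion since the last update.
        if self.engaged and self.last_master is not None:
            self.slave = Pose(self.slave.x + master.x - self.last_master.x,
                              self.slave.y + master.y - self.last_master.y,
                              self.slave.z + master.z - self.last_master.z)
        self.last_master = master
```

Incremental (rather than absolute) coupling is what allows the participant to align the stylus parallel to the plane at the start of the task without the virtual manipulator jumping.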

Experimental conditions

The experimental conditions were based on the optical axis-to-target view angle, because we examined a hand–eye coordination configuration consisting of the two hands and the eyes. The optical axis-to-target view angle was defined by the downward inclination of the endoscope relative to the plane, as shown in Fig. 2. We conducted tests using a total of six optical axis-to-target view angles: \(15^{\circ }, 30^{\circ }, 45^{\circ }, 60^{\circ }, 75^{\circ }\) and \(90^{\circ }\), as shown in Fig. 6. At each view angle, the two pivot points were level with the tip of the endoscope, and the distance between the endoscope and the origin remained constant.
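This geometry can be sketched as placing the virtual endoscope on an arc of constant radius around the target, so that only the inclination of the optical axis changes between conditions. The following is an illustrative reconstruction under that assumption, not the simulator's actual camera code; the coordinate convention and distance value are hypothetical.

```python
import math

def endoscope_pose(view_angle_deg: float, distance: float):
    """Place a virtual endoscope at a fixed distance from the target origin,
    inclined downward by the optical axis-to-target view angle.

    At 90 degrees the camera looks straight down at the plane; at smaller
    angles it views the target more obliquely.
    Returns (position, look_direction) as 3-tuples.
    """
    theta = math.radians(view_angle_deg)
    # Camera sits on an arc of constant radius around the origin.
    position = (0.0, -distance * math.cos(theta), distance * math.sin(theta))
    # The optical axis points from the camera toward the target origin.
    look = tuple(-c / distance for c in position)
    return position, look

# The six experimental conditions share the same endoscope-to-origin distance.
poses = {angle: endoscope_pose(angle, distance=100.0)
         for angle in (15, 30, 45, 60, 75, 90)}
```

Keeping the radius constant isolates the view angle as the only variable between conditions, matching the constant endoscope-to-origin distance stated above.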
The task was to manipulate the virtual arm to insert the curved needle. The participant moved the hand controller to control the virtual arm while operating the foot switch, inserting the needle from the left green circle into the right blue circle, as shown in Fig. 6. When the participant finished the task, he or she released the needle from the manipulator; we confirmed the release and stopped the measurement. Under the initial conditions, the simulator started with the manipulator gripping the needle, which was positioned parallel to the plane. At the start of the task, the participant positioned the stylus parallel to the plane while operating the foot switch and synchronizing the virtual manipulator, so that the posture of the virtual manipulator corresponded to the posture of the stylus relative to the plane.
Five healthy adults (four men and one woman; mean age of 24 years; age range of 23–26 years; six right-handed and one left-handed) participated in the experiment. All participants had normal or corrected-to-normal vision. The participants were informed about the measurement of their brain activity and the purpose of the experiment. Informed consent was obtained from all individual participants included in the study. The experiments were conducted in accordance with the Declaration of Helsinki and were approved by the Waseda University Institutional Review Board (No. 2013-201). All participants were students at Waseda University, Japan, and were not surgeons. We considered them to be appropriate participants for determination of human cognitive function based on measurements of their brain activity.

Experimental procedure

First, we placed the cap of the imaging device on the participant’s head. Next, the participant trained by performing a few trials under each experimental condition before starting the experiment. Third, the participant performed five trials for insertion of the curved needle into the target under each of six conditions. At the start of the measurement sequence, the imaging device scanned the oxygenated hemoglobin concentration in the participant’s brain for 10.0 s. When the scan was complete, the display showed the surgical simulator automatically, and the participant started the task. During each measurement session, the participant tried to maintain the same posture and minimize body movement. The order of the experimental conditions was random.
We corrected the raw data using the following procedures. First, the raw data were digitally low-pass filtered at 0.10 Hz to remove measurement noise. Next, a baseline correction was performed to remove the linear trend in the hemoglobin concentration: we fitted a linear function to the data points sampled in the 5-s intervals before and after each task period. Because the raw f-NIRS data consist of relative values, we could not directly compare data across participants or channels.
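The two correction steps above can be sketched as follows. This is a reconstruction of the described pipeline, not the authors' analysis code; the 10-Hz sampling rate, filter order, and synthetic data are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(raw: np.ndarray, fs: float, task_onset: int, task_end: int) -> np.ndarray:
    """Low-pass filter at 0.10 Hz, then remove the linear trend estimated
    from the 5-s baseline windows bracketing the task period."""
    # 1) Zero-phase low-pass filter to suppress measurement noise.
    b, a = butter(N=2, Wn=0.10, btype="low", fs=fs)
    filtered = filtfilt(b, a, raw)

    # 2) Fit a line to the samples in the 5-s windows before onset and
    #    after the end of the task, then subtract that trend everywhere.
    w = int(5 * fs)  # 5-s window in samples
    idx = np.r_[task_onset - w:task_onset, task_end:task_end + w]
    coef = np.polyfit(idx, filtered[idx], deg=1)
    return filtered - np.polyval(coef, np.arange(len(filtered)))

# Synthetic example: 60 s at an assumed 10 Hz with a slow linear drift.
fs = 10.0
t = np.arange(0, 60, 1 / fs)
signal = 0.002 * t + 0.01 * np.random.randn(t.size)
corrected = preprocess(signal, fs, task_onset=100, task_end=500)
```

Fitting the trend only to the pre- and post-task windows avoids absorbing the task-related activation itself into the baseline estimate.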

Results

The mean and standard deviation of the oxygenated hemoglobin concentration data over all subjects were \(0.00399\pm 0.06431\) mMmm. The maximum of all oxygenated data was 0.39403 mMmm, and the minimum was \(-0.4214\) mMmm. A one-way analysis of variance (ANOVA) showed no significant difference among the channels (P = 0.68843, F = 0.83077), so the null hypothesis could not be rejected at the channel level.
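The one-way ANOVA used here (and in the per-angle comparison below) tests whether the group means differ more than chance would allow. A minimal sketch with synthetic, hypothetical data drawn to resemble the reported mean and spread; the values are not the study's measurements.

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical oxygenated-Hb means (mMmm): five trial values per
# optical axis-to-target view angle condition.
rng = np.random.default_rng(1)
conditions = {angle: rng.normal(0.004, 0.064, size=5)
              for angle in (15, 30, 45, 60, 75, 90)}

# One-way ANOVA: does mean activation differ across the six angles?
f_value, p_value = f_oneway(*conditions.values())
significant = p_value < 0.05  # reject the null hypothesis at alpha = 0.05
```

With six groups of five trials, the test statistic follows an F distribution with (5, 24) degrees of freedom under the null hypothesis of equal means.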
Figure 7 shows the longitudinal data of the oxygenated hemoglobin concentration from the channel that corresponded with the intraparietal sulcus during the curved needle insertion task for one participant. The oxygenated hemoglobin concentration changed depending on the phase of the needle insertion process. The oxygenated hemoglobin concentration increased when the needle was close to the green circle and peaked when the tip of the needle touched the circle. The oxygenated hemoglobin concentration decreased during insertion of the curved needle until the needle tip appeared.
Figure 8 shows the mean value of the oxygenated hemoglobin concentration against the optical axis-to-target view angle for each participant. The mean was calculated over the task period across five trials. We used ANOVA for statistical testing and found that the data showed significant differences (P = 0.037, F = 2.841). Figure 9 shows the average oxygenated hemoglobin concentration at each optical axis-to-target view angle over all trials and all participants.

Discussion

The mean of the oxygenated data is only 0.00399 mMmm, whereas the standard deviation is 0.06431 mMmm, because the data fluctuated widely according to the phase of the manipulation. Figure 7 shows that brain activation changed widely during the manipulation. The trend observed in Fig. 7 suggests that the intraparietal sulcus can be used to determine the user's introspective sense of physical human–robot correspondence, because the intraparietal sulcus activated depending on whether the tip of the virtual arm was visible. For the user to perceive the tool as part of the body, it would be necessary to synchronize the posture and position of the tip between master and slave. In Fig. 7, the oxygenated hemoglobin concentration increased while the curved needle was visible and decreased while it was hidden. During the period when the participant could see the tip, he or she perceived the manipulator as part of the body.
Figure 8 shows that the brain activated most significantly at \(75^{\circ }\) in four of the five subjects. In Fig. 9, the average oxygenated hemoglobin concentration is also highest at \(75^{\circ }\). The positional relationship between the manipulators and the endoscope at \(75^{\circ }\) would be the closest to the human physical relationship between the hands and eyes, because the positional relationship between the subject's eyes and the screen is at \(75^{\circ }\). When the view angle of the endoscope in the simulator matches the relationship between the eyes and the screen, the operator has better hand–eye coordination. At angles of \(15^{\circ }\) and \(30^{\circ }\), the brain activated more than at \(45^{\circ }\) and \(60^{\circ }\): the participants could perceive depth easily at these shallow angles, but rotated their wrists with difficulty. Figure 10 shows the difference between rotating the wrist at \(15^{\circ }\) and at \(75^{\circ }\). At \(75^{\circ }\), each participant manipulated the virtual arm to insert the curved needle by rotating the wrist only, whereas at \(15^{\circ }\) the participants operated the virtual arm by bending their elbows. This ease of wrist rotation would have affected the difference in brain activity observed between \(15^{\circ }\) and \(75^{\circ }\): brain activity at \(75^{\circ }\) is higher than at \(15^{\circ }\) because the participants could rotate their wrists more easily at \(75^{\circ }\). If the kinematics of the manipulator were adjusted so that the subject's arm could move easily at \(15^{\circ }\), the oxygenated hemoglobin concentration would likely increase. At \(45^{\circ }\), \(60^{\circ }\) and \(90^{\circ }\), brain activation was low.
At \(90^{\circ }\), the participants could rotate their wrists easily, but this is the most difficult angle at which to perceive depth. The view at \(45^{\circ }\) marks a halfway point between ease of wrist rotation and depth perception. Therefore, for intuitive control, it is important to support both ease of wrist rotation and perception of depth. In future work, we will measure the user's brain activity and arm movement to compare cognitive function with a musculoskeletal model.
In this paper, the optical axis-to-target view angle was varied over only a limited range; brain activation should also be measured over other ranges. Other parameters, such as the distance between stereo scopes and the lighting direction, would also affect brain activation. In addition, the DOF of the master and slave differed. In future, brain activation should be validated with these conditions taken into consideration. Furthermore, beyond indirect manipulation systems such as master–slave robots, it is necessary to obtain f-NIRS measurements while the user directly manipulates objects naturally, to establish a comparative baseline.

Conclusion

In this study, we validated an evaluation method to quantify the embodiment caused by the physical difference between master and slave in surgical robots by measuring the activation of the intraparietal sulcus in the user's brain during surgical robot manipulation. The objective was to explore the change in brain activation according to the change in the optical axis-to-target view angle. We measured the brain activity of the participants while they moved a hand controller to control a virtual arm under a variety of optical axis-to-target view angles. We found that some participants showed a significant peak (P = 0.037, F = 2.841) in brain activity at \(75^{\circ }\). We conclude that the positional relationship between the manipulators and the endoscope at \(75^{\circ }\) would be the closest to the human physical relationship between the hands and eyes. The proposed method and the results obtained in this paper are likely to carry over to artificial arms and other master–slave systems, such as infrastructure-building robots.


Acknowledgments

This work was supported in part by the Ministry of Education, Culture, Sports, Science and Technology, Japan; by Waseda University, Tokyo, Japan; in part by the High-Tech Research Center Project from the Ministry of Education, Culture, Sports, Science and Technology; in part by a Grant-in-Aid for Scientific Research (A) (no. 26242061), a Grant-in-Aid for Challenging Exploratory Research (no. 25560243) and a Grant-in-Aid for JSPS Fellows (no. 14J07261); and in part by the Council for Science, Technology and Innovation (CSTI), Cross-ministerial Strategic Innovation Promotion Program (SIP), “Development of Intuitive Teleoperation Robot using the Human Measurement” (funding agency: JST).
Open Access This article is distributed under the terms of the Creative Commons Attribution License, which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.
References
1.
Leven J, Burschka D, Kumar R, Zhang G, Blumenkranz S, Dai XD, Awad M, Hager GD, Marohn M, Choti M, Hasser C, Taylor RH (2005) DaVinci canvas: a telerobotic surgical system with integrated, robot-assisted, laparoscopic ultrasound capability. Med Image Comput Comput Assist Interv 8(Pt 1):811–818
2.
Osa T, Staub C, Knoll A (2010) Framework of automatic robot surgery system using visual servoing. In: 2010 IEEE/RSJ international conference on intelligent robots and systems, pp 1837–1842
3.
Ballantyne GH (2002) Robotic surgery, telerobotic surgery, telepresence, and telementoring. Review of early clinical results. Surg Endosc 16(10):1389–1402
4.
Turchetti G, Palla I, Pierotti F, Cuschieri A (2012) Economic evaluation of da Vinci-assisted robotic surgery: a systematic review. Surg Endosc 26(3):598–606
5.
Wang F, Su E, Burdet E, Bleuler H (2008) Development of a microsurgery training system. Conf Proc IEEE Eng Med Biol Soc 2008:1935–1938
6.
Hawkins J, Blakeslee S (2007) On intelligence. Macmillan
8.
Rosch E, Thompson E, Varela F (1991) The embodied mind: cognitive science and human experience. MIT Press, Cambridge
9.
Suh I, Siu K (2011) Training program for fundamental surgical skill in robotic laparoscopic surgery. Int J Med Robot Comput Assist Surg 7:327–333
10.
Maravita A, Iriki A (2004) Tools for the body (schema). Trends Cognit Sci 8(2):79–86
11.
Maravita A, Spence C, Kennett S, Driver J (2002) Tool-use changes multimodal spatial interactions between vision and touch in normal humans. Cognition 83(2):B25–34
12.
Colby CL, Goldberg ME (1999) Space and attention in parietal cortex. Annu Rev Neurosci 22:319–349
13.
Bladen JS, Anderson A, Bell GD, Heatley DJ (1993) A non-radiological technique for the real time imaging of endoscopes in 3 dimensions. In: 1993 IEEE conference record nuclear science symposium and medical imaging conference, pp 1891–1894
14.
Iriki A, Tanaka M, Obayashi S, Iwamura Y (2001) Self-images in the video monitor coded by monkey intraparietal neurons. Neurosci Res 40(2):163–173
15.
Obayashi S, Suhara T, Nagai Y, Maeda J, Hihara S, Iriki A (2002) Macaque prefrontal activity associated with extensive tool use. Neuroreport 13(17):2349–2354
16.
Cardinali L, Frassinetti F, Brozzoli C, Urquizar C, Roy AC, Farnè A (2009) Tool-use induces morphological updating of the body schema. Curr Biol 19(12):R478–R479
17.
Makin TR, Holmes NP, Zohary E (2007) Is that near my hand? Multisensory representation of peripersonal space in human intraparietal sulcus. J Neurosci 27(4):731–740
18.
Fuster J, Guiou M, Ardestani A, Cannestra A, Sheth S, Zhou Y-D, Toga A, Bodner M (2005) Near-infrared spectroscopy (NIRS) in cognitive neuroscience of the primate brain. Neuroimage 26(1):215–220
19.
Germon TJ, Evans PD, Barnett NJ, Wall P, Manara AR, Nelson RJ (1999) Cerebral near infrared spectroscopy: emitter-detector separation must be increased. Br J Anaesth 82(6):831–837
20.
Villringer A, Planck J, Hock C, Schleinkofer L, Dirnagl U (1993) Near infrared spectroscopy (NIRS): a new tool to study hemodynamic changes during activation of brain function in human adults. Neurosci Lett 154(1–2):101–104
21.
Lee J, Folley BS, Gore J, Park S (2008) Origins of spatial working memory deficits in schizophrenia: an event-related FMRI and near-infrared spectroscopy study. PLoS One 3(3):e1760
22.
Homan RW, Herman J, Purdy P (1987) Cerebral location of international 10–20 system electrode placement. Electroencephalogr Clin Neurophysiol 66(4):376–382
23.
Shimizu S, Takahashi N, Inoue H, Nara H, Miwakeichi F, Hirai N, Kikuchi S, Watanabe E, Kato S (2011) Basic study for a new assistive system based on brain activity associated with spatial perception task during car driving. In: 2011 IEEE international conference on robotics and biomimetics, pp 2884–2889
24.
Attenberger A, Buchenrieder K (2012) Modeling and visualization of classification-based control schemes for upper limb prostheses. In: 2012 IEEE 19th international conference and workshops on engineering of computer-based systems, pp 188–194
25.
Tanaka H, Homma K, Imamizu H (2012) Illusory reversal of causality between touch and vision has no effect on prism adaptation rate. Front Psychol 3:545
26.
Santos VJ, Valero-Cuevas FJ (2006) Reported anatomical variability naturally leads to multimodal distributions of Denavit–Hartenberg parameters for the human thumb. IEEE Trans Biomed Eng 53(2):155–163
Metadata
Title
Brain activation in parietal area during manipulation with a surgical robot simulator
Authors
Satoshi Miura
Yo Kobayashi
Kazuya Kawamura
Yasutaka Nakashima
Masakatsu G. Fujie
Publication date
01.06.2015
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Computer Assisted Radiology and Surgery / Issue 6/2015
Print ISSN: 1861-6410
Electronic ISSN: 1861-6429
DOI
https://doi.org/10.1007/s11548-015-1178-1
