Medical Image Analysis

Volume 11, Issue 5, October 2007, Pages 458-464

GPU based real-time instrument tracking with three-dimensional ultrasound

https://doi.org/10.1016/j.media.2007.06.009

Abstract

Real-time three-dimensional ultrasound enables new intracardiac surgical procedures, but the distorted appearance of instruments in ultrasound poses a challenge to surgeons. This paper presents a detection technique that identifies the position of the instrument within the ultrasound volume. The algorithm uses a form of the generalized Radon transform to search for long straight objects in the ultrasound image, a feature characteristic of instruments and not found in cardiac tissue. When combined with passive markers placed on the instrument shaft, the full position and orientation of the instrument are found in 3D space. This detection technique is amenable to rapid execution on the current generation of personal-computer graphics processing units (GPUs). Our GPU implementation detected a surgical instrument in 31 ms, sufficient for real-time tracking at the 25 volumes per second rate of the ultrasound machine. A water tank experiment found instrument orientation errors of 1.1° and tip position errors of less than 1.8 mm. Finally, an in vivo study demonstrated successful instrument tracking inside a beating porcine heart.

Introduction

Real-time three-dimensional ultrasound has been demonstrated as a viable tool for guiding minimally invasive surgery (Cannon et al., 2003). For example, beating heart intracardiac surgery is now possible with the use of three-dimensional ultrasound and minimally invasive instruments (Suematsu et al., 2004). These techniques eliminate the need for cardiopulmonary bypass and its well-documented adverse effects, including delay of neural development in children, mechanical damage produced by inserting tubing into the major vessels, stroke risk, and significant decline in cognitive performance (Murkin et al., 1999, Zeitlhofer et al., 1993, Bellinger et al., 1999). Cannon et al. (2003) showed that complex surgical tasks, such as navigation, approximation, and grasping, are possible with 3D ultrasound. However, initial animal trials revealed many challenges to the goal of ultrasound-guided intracardiac surgery (Suematsu et al., 2004). In a prototypical intracardiac procedure, atrial septal defect closure, an anchor driver was inserted through the cardiac wall to secure a patch covering the defect (Fig. 1). Surgeons found it difficult to navigate instruments with 3D ultrasound in the dynamic, confined intracardiac space. Tools look incomplete and distorted (Fig. 1), making it difficult to distinguish and orient instruments.

To address this issue, researchers are developing techniques to localize instruments in ultrasound. Enhancing the displayed position of the instrument allows surgeons to more accurately control the instruments as they perform surgical tasks. In addition, real-time tracking of instruments in conjunction with a surgical robot opens the door for a range of enhancements, such as surgical macros, virtual fixtures, and other visual servoing techniques (Kragic et al., 2005, Shinsuk et al., 2001).

Previous work in instrument detection can be broadly separated into two categories: external tracking systems such as electromagnetic and optical tracking (Leotta, 2004, Lindseth et al., 2003) and image-based detection algorithms (Draper et al., 2000, Novotny et al., 2003, Zhouping et al., 2004, Ortmaier et al., 2005). External tracking systems have suffered from the limitations of the surgical environment. Electromagnetic tracking has limited accuracy and is problematic to implement due to the abundance of ferromagnetic objects in the operating room. Optical tracking of instruments is complicated by line-of-sight requirements. Both of these systems suffer from errors introduced by improper registration of the ultrasound image coordinates to the tracking coordinate frame. To eliminate such errors, image-based algorithms are used to track instruments within the ultrasound image itself. Most of this work has focused on tracking needles (Draper et al., 2000, Novotny et al., 2003, Zhouping et al., 2004) and, more recently, surgical graspers (Ortmaier et al., 2005) in 2D ultrasound images. As 3D ultrasound systems have become widely available, these 2D techniques have been adapted for implementation in 3D. An appealing approach to instrument localization is the Radon or Hough transform. These techniques are widely used in 2D image analysis to detect a wide variety of shapes. Applications have largely focused on detecting 2D objects in 2D images; however, Hough- and Radon-based techniques have also shown promise in 3D medical image analysis. Most relevant is a needle tracking technique for prostate biopsy (Ding and Fenster, 2003) that projects the ultrasound volume onto two orthogonal planes. A Hough transform is then performed on the two 2D images to identify the needle.
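As a rough illustration of this projection-based strategy (a minimal sketch only: the array names, the use of maximum-intensity projections, and the scikit-image Hough routines are our assumptions, not details taken from Ding and Fenster, 2003):

    # Sketch: locate a needle-like line in a 3D ultrasound volume by projecting
    # onto two orthogonal planes and running a 2D Hough line transform on each.
    import numpy as np
    from skimage.transform import hough_line, hough_line_peaks

    def detect_line_by_projection(volume):
        """volume: 3D NumPy array indexed as (x, y, z)."""
        proj_xy = volume.max(axis=2)   # maximum-intensity projection onto x-y
        proj_xz = volume.max(axis=1)   # maximum-intensity projection onto x-z

        planar_lines = []
        for proj in (proj_xy, proj_xz):
            hspace, angles, dists = hough_line(proj)              # vote accumulation
            _, peak_angles, peak_dists = hough_line_peaks(
                hspace, angles, dists, num_peaks=1)               # strongest line
            planar_lines.append((peak_angles[0], peak_dists[0]))  # (theta, rho)
        return planar_lines

Each projection yields a planar line parameterized by (theta, rho); combining the two orthogonal estimates recovers the needle axis in 3D.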

Beating heart intracardiac procedures pose different challenges and requirements than the breast and prostate biopsy procedures addressed in previous 2D work. For example, the high data rates of 3D ultrasound machines, 30–40 MB/s, require very efficient algorithms for real-time implementation. Previous 2D ultrasound techniques are either too computationally costly or inappropriate for three dimensions, and they are only suited to finding bright objects, such as needles, that stand out against relatively homogeneous tissue. In cardiac procedures, larger instruments such as anchor drivers and graspers are used, and these do not stand out against the surrounding dynamic, heterogeneous environment. To work in this environment, the algorithm must be efficient enough to handle the large data rates and capable of distinguishing instruments from fast-moving cardiac structures of similar intensity.
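To put these requirements in concrete terms (a back-of-the-envelope check using only the figures quoted in this paper, not an analysis from the original text): at the ultrasound machine's 25 volumes per second,

\[
\frac{1}{25\ \text{volumes/s}} = 40\ \text{ms per volume},
\qquad
\frac{30\text{--}40\ \text{MB/s}}{25\ \text{volumes/s}} \approx 1.2\text{--}1.6\ \text{MB per volume},
\]

so any detection step must complete within roughly 40 ms for each 1.2–1.6 MB volume; the 31 ms GPU detection time reported in the abstract fits within this budget.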

Section snippets

Methods and materials

In this work we present a technique capable of detecting instruments used in minimally invasive procedures, such as endoscopic graspers, staplers, and cutting devices. These instruments are fundamentally cylindrical in shape and typically 3–10 mm in diameter, a form not found in cardiac tissue. We use a form of the Radon transform to identify these instruments within the ultrasound volumes. In the following sections we describe a generalization of the …
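The sketch below illustrates the core idea of such a line-integral (generalized Radon) search for a long straight object in a 3D volume; it is a brute-force CPU illustration under our own assumptions about line parameterization and sampling, not the paper's GPU implementation:

    # Sketch: score candidate 3D lines by integrating image intensity along them;
    # the highest-scoring line is taken as the instrument shaft axis.
    import numpy as np
    from scipy.ndimage import map_coordinates

    def line_integral(volume, point, theta, phi, half_length, n_samples=64):
        """Sum intensity along a line through `point` with direction (theta, phi)."""
        direction = np.array([np.sin(phi) * np.cos(theta),
                              np.sin(phi) * np.sin(theta),
                              np.cos(phi)])
        t = np.linspace(-half_length, half_length, n_samples)
        coords = point[:, None] + direction[:, None] * t     # shape (3, n_samples)
        samples = map_coordinates(volume, coords, order=1, mode='constant')
        return samples.sum()

    def find_instrument_axis(volume, candidate_points, thetas, phis, half_length):
        """Exhaustive search over candidate lines; returns the best (point, theta, phi)."""
        best_score, best_line = -np.inf, None
        for p in candidate_points:          # each p is a NumPy array (x, y, z)
            for th in thetas:
                for ph in phis:
                    score = line_integral(volume, p, th, ph, half_length)
                    if score > best_score:
                        best_score, best_line = score, (p, th, ph)
        return best_line

The exhaustive search over candidate lines is what makes this formulation expensive, which is why the evaluation is mapped onto the GPU in the paper; in this sketch the line with the largest integral is simply taken as the instrument axis.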

Results

The results of the tank studies demonstrate the accuracy of the method. In Fig. 6, the angular accuracy of the method is shown for different orientations of the instrument with respect to the ultrasound probe. Fig. 6 shows that for angles of ϕinstr from 0° to 60°, the instrument tracking algorithm accurately determined the instrument orientation. Across all trials, the RMS difference between the angle calculated by the tracking algorithm and the angle measured by the testing setup was 1.07°. There was no …

Discussion

This paper demonstrates, for the first time, real-time tracking of surgical instruments in intracardiac procedures with 3D ultrasound (3DUS). The algorithm was both capable of distinguishing instruments from fast-moving cardiac structures and efficient enough to work in real time. The generalized Radon transform is effective here because it integrates over the length of the instrument shaft, minimizing the effects of noise and spatial distortion in ultrasound images. By taking advantage of the unique shape of …
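To make the noise-suppression argument concrete (an illustrative back-of-the-envelope model, not a derivation from the paper): if the voxel intensities sampled along a candidate shaft are modeled as $I_i = \mu + n_i$ with independent zero-mean noise of standard deviation $\sigma$, then the line integral over $N$ samples has

\[
\mathbb{E}\left[\sum_{i=1}^{N} I_i\right] = N\mu,
\qquad
\operatorname{std}\left[\sum_{i=1}^{N} I_i\right] = \sqrt{N}\,\sigma,
\]

so its relative fluctuation falls as $1/\sqrt{N}$; integrating along the full shaft is correspondingly more robust to speckle and dropout than any single-voxel measurement.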

References (22)

  • M. Ding et al., 2003. A real-time biopsy needle segmentation technique using Hough transform. Medical Physics.

Supported by United States National Institutes of Health (R01 HL073647-01).
