
Open Access 22-07-2020 | Short communication

Needle tip force estimation by deep learning from raw spectral OCT data

Authors: M. Gromniak, N. Gessert, T. Saathoff, A. Schlaefer

Published in: International Journal of Computer Assisted Radiology and Surgery | Issue 10/2020

Abstract

Purpose

Needle placement is a challenging problem for applications such as biopsy or brachytherapy. Tip force sensing can provide valuable feedback for needle navigation inside the tissue. For this purpose, fiber-optical sensors can be directly integrated into the needle tip. Optical coherence tomography (OCT) can be used to image tissue. Here, we study how to calibrate OCT to sense forces, e.g., during robotic needle placement.

Methods

We investigate whether using raw spectral OCT data without a typical image reconstruction can improve a deep learning-based calibration between optical signal and forces. For this purpose, we consider three different needles with a new, more robust design which are calibrated using convolutional neural networks (CNNs). We compare training the CNNs with the raw OCT signal and the reconstructed depth profiles.

Results

We find that using raw data as input to the largest CNN model outperforms using reconstructed data, with a mean absolute error of 5.81 mN compared to 8.04 mN.

Conclusions

We find that deep learning with raw spectral OCT data can improve learning for the task of force estimation. Our needle design and calibration approach constitute a very accurate fiber-optical sensor for measuring forces at the needle tip.
Notes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Introduction

Needle placement is a challenging problem for a variety of medical interventions, including brachytherapy or biopsy [12]. The force acting on the needle tip allows for inference about the currently penetrated tissue. This information can be used to navigate the needle and to prevent injuries of delicate structures [9]. In order to distinguish tissue based on tip forces, it may be required to measure them with an accuracy of approximately \(0.01 \, \hbox {N}\) [8]. Tip forces cannot be measured with external sensors due to friction forces at the needle shaft [5]. Therefore, small-scale fiber-optical force estimation methods have been directly integrated into the needle tip. Several sensor concepts are based on Fabry–Pérot interferometry [1] and fiber Bragg gratings [6]. Here, we consider a setting where optical coherence tomography is available, e.g., to study tissue deformation [10] or to realize elastography [7]. While OCT has been proposed for tip force estimation before [2, 3], these approaches rely on the reconstructed gray value data. However, using the reconstructed data has two limitations. First, the signal processing is based on a number of assumptions which may cause some loss of signal information. Second, it does not incorporate the phase part of the complex OCT signal, which is particularly sensitive to small axial shifts. Therefore, we explore whether the tip force estimation accuracy can be improved by directly using the raw spectral OCT data. Thus, we perform a calibration between the optical signal and forces applied to the needle tip with convolutional neural networks. We validate our approach with three different needles using a new, improved needle design.

Methods

Needle design

We used an improved needle design for force estimation at the needle tip. A scheme and an image of the needle are shown in Fig. 1. A brass tip with a piston is fitted onto a brass sleeve such that the piston can slide inside the sleeve. An epoxy layer between tip and sleeve acts as a spring. An optical fiber is embedded into a ceramic ferrule. The ferrule is positioned relative to the tip piston such that the light beam travels a distance of approximately \(1 \, \hbox {mm}\) through air before hitting the surface of the piston. For protection, the ferrule is embedded into a polymer tube which is glued to the brass sleeve. When forces act on the needle tip, the epoxy layer is compressed and the piston moves closer to the exit point of the laser beam, which can be detected in the OCT signal. The diameter of the needle is \(2 \, \hbox {mm}\).
In [2], the needle tip was constructed as a cone and attached to the needle shaft with a deformable epoxy layer. Thus, radial forces on the needle tip could easily tilt it. The improved piston construction has the advantage that it guides the tip in axial needle direction and prevents tilt. This contributes to a more reproducible signal, an important aspect in the calibration of the needle, and to the overall durability.
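As a simple idealization (our illustration, not the paper's calibration model), the epoxy layer behaves approximately like a linear spring, so an axial force at the tip maps to a piston displacement that the OCT signal can resolve:
$$\begin{aligned} F \approx k \, \Delta x \end{aligned}$$
where \(k\) denotes an effective stiffness of the epoxy layer and \(\Delta x\) the axial displacement of the piston relative to the fiber exit point. In practice the mapping need not be perfectly linear, which motivates learning the calibration with a CNN rather than fitting a single stiffness constant.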
Table 1
Mean absolute error results in mN and inference times in ms

| Model     | Needle 1 raw | Needle 1 recon | Needle 2 raw | Needle 2 recon | Needle 3 raw | Needle 3 recon | Inf. times  |
|-----------|--------------|----------------|--------------|----------------|--------------|----------------|-------------|
| ResNet 6  | 8.54 ± 0.14  | 7.22 ± 0.14    | 23.10 ± 0.40 | 17.15 ± 0.33   | 9.02 ± 0.08  | 11.29 ± 0.09   | 1.11 ± 0.00 |
| ResNet 18 | 4.36 ± 0.05  | 7.09 ± 0.18    | 8.16 ± 0.23  | 11.42 ± 0.32   | 7.18 ± 0.66  | 5.65 ± 0.06    | 3.56 ± 0.00 |
| ResNet 34 | 4.40 ± 0.06  | 6.61 ± 0.19    | 6.95 ± 0.24  | 11.14 ± 0.32   | 6.08 ± 0.06  | 6.37 ± 0.05    | 6.43 ± 0.00 |

Calibration data

We acquire calibration data for three custom-built needles identical in construction. The data acquisition is performed similarly to [3], where a needle is driven against a flat surface with a stepper motor. The force in the axial direction of the needle is measured with a force sensor and recorded together with the associated raw OCT data. Approximately 180,000 OCT-force pairs are collected for each needle for forces between \(0 \, \hbox {N}\) and \(1 \, \hbox {N}\). We perform regular OCT data reconstruction, which includes the following steps:
1. Dechirping the data by resampling and interpolating to new sampling points, based on manufacturer specifications
2. Estimation of the DC spectrum using an exponential moving average with a damping coefficient of \(d = 0.05\)
3. Subtraction of the DC spectrum
4. Apodization by using a Hann window for filtering the spectral data
5. Fourier transform for mapping frequency values to spatial values
6. Selection of the absolute signal value as the final reconstructed intensity image
Throughout the reconstruction process, information can be lost due to the DC spectrum estimation strategy (2), the apodization (4), which eliminates high-frequency signal parts, and the selection of the absolute signal value (6) as the final image. Figure 2 shows an excerpt of the collected data, both in raw (left) and reconstructed (right) form.
The raw OCT signal has a size of \(1024 \times N_t\) where \(N_t\) is the number of scans acquired over time. Reconstruction up to step (5) results in a complex signal which is also of size \(1024 \times N_t\). Finally, by taking the absolute signal value, the A-Scan image sequence of size \(512 \times N_t\) is obtained. This can be interpreted as a sequence of 1D depth images over time.
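Steps (2)–(6) of the reconstruction above can be sketched in NumPy as follows. This is our illustrative implementation, not the authors' code; the dechirping step (1) is omitted because it depends on manufacturer-specific resampling tables.

```python
import numpy as np

def reconstruct_ascans(raw, d=0.05):
    """Sketch of reconstruction steps (2)-(6).

    raw: array of shape (1024, N_t), spectral fringes over time.
    Returns intensity A-Scans of shape (512, N_t).
    """
    n_samples, n_t = raw.shape
    # (2) DC spectrum via an exponential moving average over time
    dc = np.empty_like(raw, dtype=float)
    dc[:, 0] = raw[:, 0]
    for t in range(1, n_t):
        dc[:, t] = (1 - d) * dc[:, t - 1] + d * raw[:, t]
    # (3) subtract the DC spectrum
    fringes = raw - dc
    # (4) apodization with a Hann window along the spectral axis
    fringes *= np.hanning(n_samples)[:, None]
    # (5) Fourier transform: spectral frequency -> depth
    complex_ascans = np.fft.fft(fringes, axis=0)
    # (6) magnitude; keep one half of the conjugate-symmetric spectrum,
    #     yielding the 512-sample depth profile described in the text
    return np.abs(complex_ascans[: n_samples // 2, :])

raw = np.random.rand(1024, 8)
ascans = reconstruct_ascans(raw)
print(ascans.shape)  # (512, 8)
```

The intermediate `complex_ascans` corresponds to the complex signal after step (5); taking its magnitude in step (6) is exactly where the phase information discussed above is discarded.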

Deep learning architectures

We compare prediction performance for needle tip forces from both raw and reconstructed OCT data with different convolutional neural network (CNN) architectures. The considered architectures are variants of the ResNet [4], an extension of CNNs that enables better training through improved gradient flow. The ResNet models were originally built for 2D images with 2D convolutions. Here, the A-Scan data represent 1D images; therefore, we adapt the architectures by replacing 2D convolutions with 1D convolutions. Furthermore, we replace the network's original multi-class classification output layer with a fully connected layer with a single output for needle force regression. We consider several ResNet variants of different sizes: the regular ResNet34 and ResNet18 architectures as well as a smaller architecture with 2 residual blocks and 6 convolutional layers, which we name ResNet6. All deep learning models are implemented in PyTorch [11]. Learning is performed over 150 epochs with a batch size of \(N_B = 128\) and a learning rate of 0.005 using the Adam optimizer. As a loss function, we use the mean squared error, which is defined as
$$\begin{aligned} MSE = \frac{1}{N_B}\sum _{j=1}^{N_B}(y^{j}-{\hat{y}}^{j})^2 \end{aligned}$$
(1)
where y is the ground-truth force and \({\hat{y}}\) is the predicted force value. We use \(20\%\) of the data as a hold-out validation set. We performed five training runs with different random seeds and averaged the individual results.
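The 1D adaptation and the training setup can be sketched in PyTorch as below. This is a minimal illustration under our own layer choices (channel widths, kernel sizes, and the `TinyResNet1d` name are ours), roughly in the spirit of the ResNet6 variant: a stem, two residual blocks with 1D convolutions, global average pooling, and a single-output regression head trained with Adam and the MSE loss of Eq. (1).

```python
import torch
import torch.nn as nn

class ResBlock1d(nn.Module):
    """Basic residual block with 1D convolutions and an identity shortcut."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv1d(channels, channels, 3, padding=1)
        self.bn1 = nn.BatchNorm1d(channels)
        self.conv2 = nn.Conv1d(channels, channels, 3, padding=1)
        self.bn2 = nn.BatchNorm1d(channels)

    def forward(self, x):
        out = torch.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return torch.relu(out + x)  # shortcut improves gradient flow

class TinyResNet1d(nn.Module):
    """Sketch of a small 1D ResNet: stem + 2 residual blocks + regression head."""
    def __init__(self, in_channels=1, width=32):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv1d(in_channels, width, 7, stride=2, padding=3),
            nn.BatchNorm1d(width), nn.ReLU())
        self.blocks = nn.Sequential(ResBlock1d(width), ResBlock1d(width))
        self.head = nn.Linear(width, 1)  # single output: force regression

    def forward(self, x):
        x = self.blocks(self.stem(x))
        x = x.mean(dim=-1)  # global average pooling over the depth axis
        return self.head(x).squeeze(-1)

model = TinyResNet1d()
opt = torch.optim.Adam(model.parameters(), lr=0.005)
x = torch.randn(128, 1, 1024)  # one batch of raw spectra, N_B = 128
y = torch.rand(128)            # synthetic force targets in N
loss = nn.functional.mse_loss(model(x), y)  # Eq. (1)
opt.zero_grad(); loss.backward(); opt.step()
```

For the reconstructed input, the same model would simply receive length-512 depth profiles instead of length-1024 spectra.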

Results

We report the mean absolute error (MAE) in mN between force predictions and force targets on the validation set. Additionally, we report inference times for the examined neural network architectures. All results are shown in Table 1. Figure 3 shows the relative differences in errors graphically. For most combinations of needle and model architecture, the error for learning on raw OCT data is lower compared with learning from reconstructed OCT scans. Particularly for the ResNet34 architecture, the calibration performance improves for all needles.
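For clarity, the reported metric is the mean absolute deviation between predicted and target forces, scaled from N to mN (the force values here are made up for illustration):

```python
import numpy as np

y_true = np.array([0.10, 0.25, 0.40])    # ground-truth forces in N
y_pred = np.array([0.105, 0.243, 0.41])  # model predictions in N
mae_mn = np.mean(np.abs(y_true - y_pred)) * 1000.0  # convert N -> mN
print(round(mae_mn, 2))  # 7.33
```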

Discussion and conclusion

In this paper, we address the calibration problem of OCT-based needle tip force estimation using a new and improved sensor concept with a piston and a guiding sleeve. In contrast to a previous OCT-based concept [2], the needle is by design less sensitive to lateral forces while achieving similar calibration results with intensity data.
Furthermore, we study an approach for improving deep learning-based calibration performance even further. During OCT acquisition, a spectral signal is obtained which is typically reconstructed to an intensity depth image. We illustrate that learning is possible in an end-to-end fashion, i.e., the process of image reconstruction can be avoided. Moreover, the results improve in six out of nine setups, indicating that there may be information lost in the original processing that can be used when training the network on the raw signal. The proposed approach allows for precise estimation of forces at the needle tip, which is particularly interesting for force-based robotic needle placement. With typical robot control cycles of \(1 \, \hbox {ms}\), a trade-off between accuracy and inference time must be made.
We find that learning forces from raw OCT data instead of reconstructed images works well, in particular for larger deep learning models. For future work, our approach could be studied in more detail with additional deep learning methods and in different application scenarios. Also, our approach could be extended to other applications where an imaging modality is used as a sensor signal, for example, in the context of motion tracking with OCT or ultrasound.

Acknowledgements

Open Access funding provided by Projekt DEAL.

Compliance with ethical standards

Conflict of interest

The authors M. Gromniak, N. Gessert, T. Saathoff and A. Schlaefer declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals.
Informed consent

Not applicable.
Open AccessThis article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Literature
1. Beekmans S, Lembrechts T, van den Dobbelsteen J, van Gerwen D (2017) Fiber-optic Fabry–Pérot interferometers for axial force sensing on the tip of a needle. Sensors 17(1):38
2. Gessert N, Priegnitz T, Saathoff T, Antoni ST, Meyer D, Hamann MF, Jünemann KP, Otte C, Schlaefer A (2018) Needle tip force estimation using an OCT fiber and a fused convGRU-CNN architecture. In: MICCAI. Springer, pp 222–229
3. Gessert N, Priegnitz T, Saathoff T, Antoni ST, Meyer D, Hamann MF, Jünemann KP, Otte C, Schlaefer A (2019) Spatio-temporal deep learning models for tip force estimation during needle insertion. Int J CARS 14:1485–1493
4. He K, Zhang X, Ren S, Sun J (2016) Deep residual learning for image recognition. In: CVPR. IEEE, pp 770–778
5. Kataoka H, Washio T, Chinzei K, Mizuhara K, Simone C, Okamura AM (2002) Measurement of the tip and friction force acting on a needle during penetration. In: MICCAI. Springer, pp 216–223
6. Kumar S, Shrikanth V, Amrutur B, Asokan S, Bobji MS (2016) Detecting stages of needle penetration into tissues through force estimation at needle tip using fiber Bragg grating sensors. J Biomed Opt 21(12):127009
7. Latus S, Otte C, Schlüter M, Rehra J, Bizon K, Schulz-Hildebrandt H, Saathoff T, Hüttmann G, Schlaefer A (2017) An approach for needle based optical coherence elastography measurements. In: MICCAI. Springer, pp 655–663
9. Okamura AM, Simone C, O'Leary MD (2004) Force modeling for needle insertion into soft tissue. IEEE Trans Biomed Eng 51(10):1707–1716
10. Otte C, Hüttmann G, Schlaefer A (2012) Feasibility of optical detection of soft tissue deformation during needle insertion. In: SPIE Medical Imaging 2012: image-guided procedures, robotic interventions, and modeling, vol 8316, pp 282–292
11. Paszke A, Gross S, Massa F, Lerer A, Bradbury J, Chanan G, Killeen T, Lin Z, Gimelshein N, Antiga L, Desmaison A, Kopf A, Yang E, DeVito Z, Raison M, Tejani A, Chilamkurthy S, Steiner B, Fang L, Bai J, Chintala S (2019) PyTorch: an imperative style, high-performance deep learning library. In: Advances in neural information processing systems 32. Curran Associates, Inc., pp 8024–8035
12. Taylor RH, Menciassi A, Fichtinger G, Fiorini P, Dario P (2016) Medical robotics and computer-integrated surgery. In: Springer handbook of robotics. Springer, pp 1657–1684
Metadata
Title
Needle tip force estimation by deep learning from raw spectral OCT data
Authors
M. Gromniak
N. Gessert
T. Saathoff
A. Schlaefer
Publication date
22-07-2020
Publisher
Springer International Publishing
Published in
International Journal of Computer Assisted Radiology and Surgery / Issue 10/2020
Print ISSN: 1861-6410
Electronic ISSN: 1861-6429
DOI
https://doi.org/10.1007/s11548-020-02224-w
