Published in: Intelligent Service Robotics 2/2024

06-11-2023 | Original Research Paper

Dynamic liquid volume estimation using optical tactile sensors and spiking neural network

Authors: Binhua Huang, Senlin Fang, Meng Yin, Zhengkun Yi, Chaoxiang Ye, Xiaoyu Li, Zhenning Zhou, Xinyu Wu


Abstract

Tactile sensing plays a unique role in robotic applications such as liquid volume estimation. Recent research shows that current computer vision methods for liquid volume estimation remain unsatisfactory in accuracy, and their performance depends on the environment, for example requiring high illumination or suffering from occlusions. To address these challenges, we design an optical tactile sensor for the task of liquid volume estimation. The sensor consists of four photoresistors and an LED on a printed circuit board, with an elastic dome secured to a 3D-printed base by four screw sets. The tactile sensor is attached to a Robotiq gripper and applied to liquid volume estimation across different movement trajectories. The experiments comprise multiple trajectories designed to imitate human movements. We propose the tactile regression spiking neural network (TR-SNN) for liquid volume estimation. The TR-SNN uses a binary decoding method to decode the output spike trains of the network into liquid volume values; this is a novel and effective spiking neural network (SNN) decoding method for the liquid volume estimation task. Experimental results show that, compared to baseline models, the TR-SNN achieves state-of-the-art estimation performance, with a coefficient of determination of up to 0.986.
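The abstract does not spell out the TR-SNN's binary decoding step, so the sketch below shows one plausible interpretation: each output neuron contributes one bit (1 if its spike count over the readout window crosses a threshold, else 0), and the resulting binary code is scaled to the volume range. The function name, threshold rule, and volume range are all illustrative assumptions, not the authors' implementation.

```python
def decode_spike_trains(spike_counts, threshold, v_min, v_max):
    """Hypothetical binary decoding of SNN output spikes into a volume.

    spike_counts: per-neuron spike counts, most significant bit first.
    threshold:    minimum spikes for a neuron to be read as a 1-bit.
    v_min, v_max: volume range covered by the binary code (e.g. in ml).
    """
    # Threshold each neuron's activity to a single bit.
    bits = [1 if c >= threshold else 0 for c in spike_counts]
    # Interpret the bit vector as an unsigned binary integer.
    code = int("".join(map(str, bits)), 2)
    max_code = 2 ** len(bits) - 1  # largest representable code
    # Linearly map the code onto the physical volume range.
    return v_min + (v_max - v_min) * code / max_code

# Example: 4 output neurons, bits 1010 -> code 10 of 15, range 0-300 ml
volume = decode_spike_trains([12, 1, 9, 0], threshold=5, v_min=0.0, v_max=300.0)
print(round(volume, 1))  # 200.0
```

With this scheme, resolution grows exponentially in the number of output neurons (n neurons give 2^n distinguishable volume levels), which is one reason a binary readout can be attractive for regression with a small output layer.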


Metadata
Title
Dynamic liquid volume estimation using optical tactile sensors and spiking neural network
Authors
Binhua Huang
Senlin Fang
Meng Yin
Zhengkun Yi
Chaoxiang Ye
Xiaoyu Li
Zhenning Zhou
Xinyu Wu
Publication date
06-11-2023
Publisher
Springer Berlin Heidelberg
Published in
Intelligent Service Robotics / Issue 2/2024
Print ISSN: 1861-2776
Electronic ISSN: 1861-2784
DOI
https://doi.org/10.1007/s11370-023-00488-0