Published in: Wireless Personal Communications 4/2014

01.12.2014

Real-Time Tracking and Recognition Systems for Interactive Telemedicine Health Services

Abstract

Recent changes affecting the health industry include the digitization of medical information and the exchange of medical information through a network-connected medical infrastructure. In this paper, we propose a real-time tracking and recognition system for interactive telemedicine health services. The proposed method detects both the hand and the fingers and applies the result to posture recognition in telemedicine. The detected hand or finger can be used to implement a non-contact mouse in a machine-to-machine environment, which in turn can control telemedicine health devices such as a public healthcare system, a pedometer-based health information reader, a glucose-monitoring device, or a blood pressure gauge. Skin color is used to segment the hand region from the background, and the contour is extracted from the segmented hand. Contour analysis yields the locations of the fingertips, which are then tracked using a constant-velocity model combined with a pixel-labeling approach. From the tracking process, several hand features are extracted and fed into a finite-state classifier that identifies the hand configuration, classifying the hand into one of many gesture classes or several different movement directions. Extensive experiments with this method produced very encouraging results: whereas the method used in previous studies loses some fingertip points, the proposed method recovers all lost points with little or no displacement error. Ultimately, this paper provides empirical verification of the adequacy and validity of the proposed system for telemedicine health services; accordingly, gesture recognition for interactive telemedicine health services improves both user satisfaction and service quality.
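The fingertip-tracking step described in the abstract can be sketched in pure Python. This is a minimal illustration, not the authors' implementation: it uses a constant-velocity prediction with greedy nearest-neighbour association (standing in for the paper's pixel-labeling step), and the function names, `gate` threshold, and data layout are all illustrative assumptions. An unmatched track coasts on its prediction, which is how a fingertip missed by the detector can be recovered with little displacement error.

```python
import math

def predict(pos, vel, dt=1.0):
    # Constant-velocity model: x' = x + vx*dt, y' = y + vy*dt
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def update_tracks(tracks, detections, gate=30.0, dt=1.0):
    """tracks: list of (pos, vel) pairs; detections: list of (x, y)
    fingertip points from the current frame.

    Each track is matched greedily to the nearest unused detection that
    lies within `gate` pixels of its predicted position. Unmatched tracks
    keep the constant-velocity prediction (they "coast"), so a fingertip
    lost by the detector for a frame is still tracked.
    """
    used = set()
    new_tracks = []
    for pos, vel in tracks:
        pred = predict(pos, vel, dt)
        best, best_d = None, gate
        for i, det in enumerate(detections):
            if i in used:
                continue
            d = math.hypot(det[0] - pred[0], det[1] - pred[1])
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            used.add(best)
            det = detections[best]
            # Re-estimate velocity from the observed displacement.
            new_vel = ((det[0] - pos[0]) / dt, (det[1] - pos[1]) / dt)
            new_tracks.append((det, new_vel))
        else:
            new_tracks.append((pred, vel))  # coast: recover the lost point
    return new_tracks
```

For example, a track at (0, 0) moving 10 px/frame to the right is updated to the nearby detection (11, 1) when one exists, and to its predicted position (10, 0) when the detector returns nothing for that frame.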


Metadata
Title
Real-Time Tracking and Recognition Systems for Interactive Telemedicine Health Services
Publication date
01.12.2014
Published in
Wireless Personal Communications / Issue 4/2014
Print ISSN: 0929-6212
Electronic ISSN: 1572-834X
DOI
https://doi.org/10.1007/s11277-014-1784-1
