
21.01.2022

HDL-PSR: Modelling Spatio-Temporal Features Using Hybrid Deep Learning Approach for Post-Stroke Rehabilitation

Authored by: Vishwanath Bijalwan, Vijay Bhaskar Semwal, Ghanapriya Singh, Tapan Kumar Mandal

Published in: Neural Processing Letters | Issue 1/2023


Abstract

Physiotherapy exercises such as extension, flexion, and rotation are essential for patients undergoing post-stroke rehabilitation (PSR). A physiotherapist draws on many techniques to restore the movements needed in daily life, including nerve re-education, task training, muscle strengthening, and various assistive methods. However, having a physiotherapist guide every exercise session is time-consuming, tedious, and costly. In this paper, a novel automated system is designed that detects and recognizes upper-limb exercises using an RGB-Depth camera, so that patients can be guided through physiotherapy exercises in real time without human intervention. Hybrid deep learning (HDL) approaches are exploited to build a highly accurate and robust recognizer of upper-limb physiotherapy exercises for PSR. As a baseline, a deep convolutional neural network (CNN) is designed that automatically extracts features from the pre-processed data and classifies the performed exercise. To capture the temporal dependencies that arise as an exercise is performed, recurrent neural network (RNN) architectures are added. In the CNN-LSTM model, the CNN derives useful features that are fed to an LSTM, increasing the accuracy of exercise recognition. For faster training, a further hybrid model, CNN-GRU, is implemented, in which a focal-loss criterion is used to overcome the drawbacks of standard cross-entropy loss. Experimental evaluation is carried out on RGB-D data obtained from a Microsoft Kinect v2 sensor, and a dataset comprising 10 different physiotherapy exercises was created. Experimental results show strong activity recognition accuracy: 98% for the CNN and 99% for the CNN-LSTM model, while the CNN-GRU model proves the most suitable architecture with 100% accuracy.
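
For intuition only, the following is a minimal PyTorch sketch of how a CNN-GRU classifier with a focal-loss criterion could be assembled for skeleton sequences of this kind; the layer sizes, the 25-joint (75-feature) Kinect v2 input shape, and the names CNNGRU and FocalLoss are illustrative assumptions rather than the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FocalLoss(nn.Module):
    """Focal loss: cross-entropy with easy examples down-weighted by (1 - p_t)^gamma."""
    def __init__(self, gamma=2.0):
        super().__init__()
        self.gamma = gamma

    def forward(self, logits, targets):
        ce = F.cross_entropy(logits, targets, reduction="none")  # per-sample cross-entropy
        pt = torch.exp(-ce)                                      # probability of the true class
        return ((1.0 - pt) ** self.gamma * ce).mean()

class CNNGRU(nn.Module):
    """1-D CNN extracts spatial features per frame; a GRU models temporal dependencies."""
    def __init__(self, n_features=75, n_classes=10):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.gru = nn.GRU(input_size=64, hidden_size=128, batch_first=True)
        self.fc = nn.Linear(128, n_classes)

    def forward(self, x):                       # x: (batch, time, features)
        z = self.conv(x.transpose(1, 2))        # -> (batch, 64, time/2)
        out, _ = self.gru(z.transpose(1, 2))    # -> (batch, time/2, 128)
        return self.fc(out[:, -1])              # last hidden state -> class logits

model, criterion = CNNGRU(), FocalLoss()
logits = model(torch.randn(8, 60, 75))          # e.g. 8 clips, 60 frames, 25 joints x 3 coords
loss = criterion(logits, torch.randint(0, 10, (8,)))

The CNN-LSTM variant described in the abstract would follow the same pattern, with nn.LSTM in place of nn.GRU.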

Metadata
Title
HDL-PSR: Modelling Spatio-Temporal Features Using Hybrid Deep Learning Approach for Post-Stroke Rehabilitation
Authored by
Vishwanath Bijalwan
Vijay Bhaskar Semwal
Ghanapriya Singh
Tapan Kumar Mandal
Publication date
21.01.2022
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2023
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-022-10744-6
