
21.11.2020

Densemask RCNN: A Hybrid Model for Skin Burn Image Classification and Severity Grading

Authors: C. Pabitha, B. Vanathi

Published in: Neural Processing Letters | Issue 1/2021


Abstract

In medical image processing, automatic segmentation of burn images is a critical task for classifying skin into normal and burned areas. Traditional models detect burns in an image and distinguish burn from non-burn regions; however, they cannot accurately delineate the wound region and require considerable time to predict burns. Burn depth analysis is also important, since the degree of severity is assessed from the percentage of the total body surface area (TBSA) affected. To address these issues, we design a hybrid approach, the DenseMask Region-based convolutional neural network (RCNN), for segmenting the skin burn region according to the degree of burn severity. DenseMask RCNN integrates the Mask region-based convolutional neural network (Mask R-CNN) with dense pose estimation, so that it estimates the full-body human pose and performs semantic segmentation. First, a residual network with dilated convolutions and a weighted mapping model generates a dense feature map. The feature map is then fed into a region proposal network (RPN) that uses a feature pyramid network (FPN) to detect objects at different spatial scales and pyramid levels of the input image. For accurate pixel-to-pixel label alignment, we introduce a Region of Interest (RoI)-pose align module that normalizes detected objects with respect to the human pose, accounting for scale, translation, and left-right flips, to a standard scale. After alignment, a cascaded fully convolutional architecture on top of the RoI module performs mask segmentation and dense pose regression simultaneously. Finally, a transfer learning model classifies the detected burn regions into three classes of wound depth. Experiments on the burn dataset show better accuracy than state-of-the-art approaches.
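
The following is a minimal, hypothetical sketch of the overall pipeline described in the abstract, using a recent PyTorch/torchvision installation and the stock Mask R-CNN model (ResNet-50 + FPN backbone, RPN, and RoI heads) as a stand-in for the detection and segmentation stage. The paper's specific components, namely the dilated weight-mapped backbone, the RoI-pose align module, and the cascaded dense-pose head, are not reproduced here; the sketch only illustrates how region detection and a downstream three-class burn-depth classifier could be chained. The function name grade_burns, the score threshold, and the crop-and-classify step are illustrative assumptions, not the authors' implementation.

```python
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.models import resnet50

# Detection/segmentation stage: proposes candidate burn regions with masks.
# (Stock Mask R-CNN used here purely as a placeholder for DenseMask RCNN.)
detector = maskrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

# Transfer-learning stage: a pretrained ResNet whose final layer is replaced
# to output the three burn-depth classes mentioned in the abstract.
classifier = resnet50(weights="DEFAULT")
classifier.fc = torch.nn.Linear(classifier.fc.in_features, 3)
classifier.eval()

@torch.no_grad()
def grade_burns(image: torch.Tensor, score_thresh: float = 0.5):
    """image: float tensor of shape (3, H, W) with values in [0, 1]."""
    detections = detector([image])[0]  # dict with boxes, labels, scores, masks
    results = []
    for box, score in zip(detections["boxes"], detections["scores"]):
        if score < score_thresh:
            continue
        x1, y1, x2, y2 = box.int().tolist()
        crop = image[:, y1:y2, x1:x2]
        # Resize the detected region to the classifier input size
        # (ImageNet normalization omitted for brevity).
        crop = torch.nn.functional.interpolate(
            crop.unsqueeze(0), size=(224, 224), mode="bilinear"
        )
        depth_class = classifier(crop).argmax(dim=1).item()
        results.append((box.tolist(), depth_class))
    return results
```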


Metadata
Title
Densemask RCNN: A Hybrid Model for Skin Burn Image Classification and Severity Grading
Authors
C. Pabitha
B. Vanathi
Publication date
21.11.2020
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2021
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-020-10387-5
