
Open Access 2018 | OriginalPaper | Chapter

Smart Endoscope—Firmware Complex for Real-Time Analysis and Recognition of Endoscopic Videos

Authors: K. U. Erendgenova, E. D. Fedorov, R. M. Kadushnikov, O. A. Kulagina, V. V. Mizgulin, D. I. Starodubov, S. I. Studenok

Published in: Proceedings of the Scientific-Practical Conference "Research and Development - 2016"

Publisher: Springer International Publishing


Abstract

A method is proposed for analyzing endoscopic video images of gastric and colon mucosa microstructures obtained with high-resolution endoscopes. The method was implemented as the high-performance "Smart Endoscope" firmware complex for real-time endoscopic video analysis supported by a neural network. The complex was tested, and the accuracy of neoplasm recognition was analyzed.

Introduction

Endoscopic examination is a minimally invasive medical procedure that allows inspection of body cavities, including the gastrointestinal tract. It is the most reliable method for revealing early-stage malignant neoplasms of the gastric mucosa and pre-cancerous diseases that increase the risk of cancer development [2, P.88].
The first flexible endoscopes appeared in the early 1960s, and the ability to transform an optical signal into electric impulses made it possible to display images and store them in analog and, later, digital form. In the late 1980s and early 1990s, the expanded availability of computers and the development of programming languages and environments enabled information systems for the search and classification of gastric surface epithelial neoplasms, referred to as CADs (computer-aided diagnosis systems). CADs produce additional diagnostic data or a preliminary diagnosis (either a posteriori or in real time), or series of marked images used by a specialist to select the case that best matches the observed one. The recent decade was marked by growing interest in CADs [10, P.73], which can be related to the accumulation of an extensive set of image and video files, turning visual examination, analysis, interpretation, and classification of that data into an extremely labor-intensive process. In addition, developments in wireless capsule endoscopy have made the task of creating automated image analysis and diagnostic methods highly relevant.
A physician performing an endoscopic examination has to possess substantial experience and remain focused during the whole procedure. Endoscopic image recognition accuracy greatly depends on the specialist's qualification and is usually considered to be between 80 and 100% [5, P.174; 1, P.A507]. Modern CADs come close to visual examination by this criterion; however, they require further improvement before implementation in clinical practice [6, P.15; 7, P.350; 4, P.7130]. It is necessary to refine the algorithms used to select, analyze, and classify characteristic image elements while the medical community reaches consensus on the visual criteria of endoscopic image assessment [3, P.471; 8, P.526; 11, P.17].
The purpose of the research was to develop and implement, in a CAD system, a methodology for making diagnostic decisions based on high-resolution endoscopic images in order to improve the quality of diagnosis of stomach and colon oncological diseases.
Research goals included the following:
  • Formalization of the visual criteria used to assess endoscopic images, and definition of the computed microstructure parameters;
  • Development of algorithms for analysis of gastric and colon mucosa microstructure;
  • Assessment of gastric and colon neoplasm recognition algorithm accuracy;
  • Development of a high-performance endoscopic complex for real-time analysis of endoscopic videos.

Gastric and Colon Mucosa Microstructure Analysis and Recognition Algorithm

An algorithm used to analyze the microstructure of stomach or colon mucosa, which consists of capillaries, glands, and their excretory ducts, includes the following stages:
  • Selecting the boundaries of the areas of interest: the focused part of an image without background, glares, and other artifacts that complicate processing (Fig. 1a);
  • Selecting pits and capillaries (Fig. 1b), skeletonizing (Fig. 1c), and thickness calculation (Fig. 1d).
The skeleton of the capillaries is built using the Zhang–Suen algorithm [12, P.237]. The image of the pits is obtained by subtracting the image of the capillaries from the part of the initial image limited by the area of interest. On each skeleton branch, secants perpendicular to the branch are drawn at equal intervals. Intensity profiles are built along the secants and used to determine the thickness of pits and capillaries. The degree of distortion of pit and capillary boundaries is characterized by the relative intensity drop calculated using formula (1):
$$\text{drop} = \frac{\min\!\left(1,\ \frac{\text{LeftMin}}{\text{Max}}\right) + \min\!\left(1,\ \frac{\text{RightMin}}{\text{Max}}\right)}{2},$$
(1)
where Max is the intensity of the central pixel, and LeftMin and RightMin are the closest local minima located to the left and right of the central maximum.
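For illustration, the sketch below computes formula (1) for a single secant profile. It assumes the profile has already been extracted (skeletonization with the Zhang–Suen algorithm and drawing of perpendicular secants) and is centered on the skeleton point; the function names and the NumPy representation are our own choices, not the published implementation.

```python
import numpy as np

def _nearest_local_min(profile: np.ndarray, start: int, step: int) -> float:
    """Walk from `start` in direction `step` while intensity keeps decreasing."""
    i = start
    while 0 <= i + step < len(profile) and profile[i + step] <= profile[i]:
        i += step
    return float(profile[i])

def relative_intensity_drop(profile: np.ndarray) -> float:
    """Relative intensity drop (1) for one secant intensity profile.

    The central sample plays the role of Max; the nearest local minima to
    its left and right play the roles of LeftMin and RightMin.
    """
    center = len(profile) // 2
    peak = float(profile[center])
    if peak <= 0:
        return 0.0
    left_min = _nearest_local_min(profile, center, -1)
    right_min = _nearest_local_min(profile, center, +1)
    return (min(1.0, left_min / peak) + min(1.0, right_min / peak)) / 2.0

# Example: a 7-sample profile across a capillary, peak at the center.
profile = np.array([120, 80, 60, 150, 70, 90, 130], dtype=float)
print(relative_intensity_drop(profile))  # (60/150 + 70/150) / 2 = 0.4333...
```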
In order to perform recognition, the area of interest is divided into squares of a preset size. For each square, six parameters are calculated: \(C_1\), the ratio of the average pit thickness to the average capillary thickness; \(C_2\), the standard deviation of pit thickness; \(C_3\), the average relative intensity drop in secant profiles for the pits (distortion), calculated using (1); \(C_4\), the share of single-node clusters; \(C_5\), the share of two-node clusters (rods); \(C_6\), the number of end points. The coefficient that characterizes the mucosa within a given square is calculated using the following formula:
$$K_{a} = \sum_{i=1}^{6} B_{i} p_{i},$$
(2)
where \(B_i\) is a coefficient equal to 0 if \(C_i \ge C_{i2}\) and equal to 1 if \(C_i < C_{i2}\); \(p_i\) are weights and \(C_{i2}\) are threshold values, both determined by an expert.
The range of \(K_a\) values is divided into four equal intervals. Depending on the value of the \(K_a\) coefficient, the boundaries of the square areas are color coded and serve as local indicators of the status of the corresponding parts of the mucosa (Fig. 2a).
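A minimal sketch of formula (2) and of the interval-based color coding for one square is given below, assuming the six parameters \(C_1, \ldots, C_6\) have already been measured; the threshold values, weights, and color names are placeholders rather than the expert-tuned values used in the complex.

```python
from typing import Sequence

def square_coefficient(c: Sequence[float],
                       thresholds: Sequence[float],
                       weights: Sequence[float]) -> float:
    """K_a from (2): sum of weights p_i for which C_i falls below its threshold C_i2."""
    return sum(p for value, thr, p in zip(c, thresholds, weights) if value < thr)

def square_color(k_a: float, k_max: float) -> str:
    """Map K_a onto four equal intervals of [0, k_max]; color names are placeholders."""
    colors = ("green", "yellow", "orange", "red")
    idx = min(3, int(4 * k_a / k_max)) if k_max > 0 else 0
    return colors[idx]

# Hypothetical thresholds C_i2 and weights p_i (expert-defined in the real system).
THRESHOLDS = [1.2, 0.3, 0.5, 0.4, 0.3, 25.0]
WEIGHTS = [1.0, 1.0, 1.0, 0.5, 0.5, 0.5]

c_params = [0.9, 0.45, 0.35, 0.2, 0.6, 30.0]   # measured C_1 ... C_6 for one square
k_a = square_coefficient(c_params, THRESHOLDS, WEIGHTS)
print(k_a, square_color(k_a, k_max=sum(WEIGHTS)))   # 2.5 orange
```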
In order to characterize the overall state of the imaged part of the gastric or colon mucosa as a whole, the value of \(C_1\) for each square is additionally compared with an expert-defined threshold value \(C_{10}\). Squares with \(C_1 \ge C_{10}\) are marked as key ones. For the eight squares adjacent to a key one, the following pairs of conditions are checked: (a) \(C_2 \ge C_{20}\) AND \(C_1 \ge C_{11}\), and (b) \(C_3 \ge C_{30}\) AND \(C_1 \ge C_{11}\). The squares for which at least one pair of conditions (a or b) is true, together with the key square, form the risk zone R. An image can contain several risk zones. Indicative coloring of the risk zones is performed by comparing the values of \(C_1\), \(C_2\), and \(C_3\), averaged over a zone, with threshold values set by an expert. After the examination, the boundaries of the risk-zone squares are colored with the indicative colors (Fig. 2b).
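The risk-zone logic can be sketched over a grid of per-square parameters as follows; the NumPy array layout and the reconstructed threshold names (\(C_{10}\), \(C_{11}\), \(C_{20}\), \(C_{30}\)) are illustrative assumptions.

```python
import numpy as np

def risk_zone_mask(c1: np.ndarray, c2: np.ndarray, c3: np.ndarray,
                   c10: float, c11: float, c20: float, c30: float) -> np.ndarray:
    """Boolean mask of squares belonging to risk zones.

    c1, c2, c3 hold the per-square parameters on a 2D grid; c10, c11, c20,
    c30 are expert thresholds (names follow the reconstruction in the text).
    """
    key = c1 >= c10                          # key squares
    cond_a = (c2 >= c20) & (c1 >= c11)       # condition (a)
    cond_b = (c3 >= c30) & (c1 >= c11)       # condition (b)
    candidate = cond_a | cond_b

    risk = key.copy()
    rows, cols = c1.shape
    for i, j in zip(*np.nonzero(key)):       # 8-neighborhood of every key square
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < rows and 0 <= nj < cols and candidate[ni, nj]:
                    risk[ni, nj] = True
    return risk
```

Separating the resulting mask into individual connected zones and averaging \(C_1\), \(C_2\), \(C_3\) over each zone for the indicative coloring would follow the same pattern.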
Endoscopic images were also processed using the GoogLeNet neural network. The training set of endoscopic images contained 200 neoplasm cases and 300 cases with no neoplasms. All endoscopic images were loaded into a web atlas (http://endoscopy.siams.com). The histologic composition was specified for each image in accordance with the 2010 WHO classification of gastric tumors, taking into account the 2002 Vienna classification of gastrointestinal epithelial neoplasia [9, P.52]. Medical practitioners applied the web atlas tools to the loaded images from the training set in order to select areas of interest to be examined with different methods, including histology. The initial image descriptions in the web atlas, created by medical professionals, contained additional data on the regularity of the mucosa microstructure. Since no unambiguous correspondence was found between the binary regularity criterion and the binary neoplasm criterion, the following model for qualitative assessment of the changes was proposed:
$$K_{n} = p_{1} K_{c} + p_{2} K_{r},$$
(3)
where \(K_n\) is the value of the relative coefficient of gastric mucosa change obtained using the neural network; \(p_1\) and \(p_2\) are weights; \(K_c\) is the binary neoplasm criterion; and \(K_r\) is the binary criterion of regularity.
Table 1 presents the interpretation of the model. Examples of endoscopic image fragments with assessment variants are presented in Fig. 3a–d.
Table 1 Neural network-based model for qualitative assessment of mucosal changes

| Irregular pattern | Neoplasm | Relative change coefficient, \(K_n\) | Color markup | Illustration |
|-------------------|----------|--------------------------------------|--------------|--------------|
| Yes               | No       | 0                                    | Green        | Fig. 3a      |
| No                | No       | 0.25                                 | Yellow       | Fig. 3b      |
| Yes               | Yes      | 0.75                                 | Orange       | Fig. 3c      |
| No                | Yes      | 1                                    | Red          | Fig. 3d      |
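Table 1 can be encoded directly as a lookup from the two binary criteria to the relative change coefficient and its indicative color; the dictionary form below is only an illustration, with the values taken from the table.

```python
# (irregular_pattern, neoplasm) -> (K_n, color), as listed in Table 1.
MUCOSAL_CHANGE = {
    (True,  False): (0.00, "green"),
    (False, False): (0.25, "yellow"),
    (True,  True):  (0.75, "orange"),
    (False, True):  (1.00, "red"),
}

def assess_fragment(irregular_pattern: bool, neoplasm: bool) -> tuple:
    """Qualitative assessment of one image fragment according to Table 1 / formula (3)."""
    return MUCOSAL_CHANGE[(irregular_pattern, neoplasm)]

print(assess_fragment(irregular_pattern=False, neoplasm=True))  # (1.0, 'red')
```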
From the images marked up by the physicians, fragments were cut from the selected areas of interest using a sliding window. The empirically selected window size was 135 × 135 pixels. Each selected fragment was assigned a value of the relative change coefficient \(K_n\). That resulted in a training set containing more than 30,000 images, equally distributed among the four classes corresponding to the values of \(K_n\). The neural network was additionally trained to remove glares and to recognize background, distorted fragments, and blood. According to cross-validation of the neural network results, the accuracy of assessing the \(K_n\) value with regard to the initial markup was 99%. An example of an image marked up using the neural network is presented in Fig. 3. The markup is an overlapping regular grid, with a calculated \(K_n\) value and color marker for each cell. For each endoscopic image, the relative shares of the cells of each color were calculated. The descriptor of an endoscopic image is a four-dimensional vector \(D_4\) that represents the calculated shares.
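A sketch of the \(D_4\) descriptor computation is shown below, assuming each grid cell of a marked-up image already carries its \(K_n\) class; the grid representation and the function name are illustrative.

```python
import numpy as np

# The four K_n classes from Table 1, in a fixed order.
KN_CLASSES = (0.0, 0.25, 0.75, 1.0)

def image_descriptor(kn_grid: np.ndarray) -> np.ndarray:
    """D_4: relative share of grid cells falling into each K_n class."""
    return np.array([np.count_nonzero(kn_grid == k) for k in KN_CLASSES]) / kn_grid.size

# Example: a 3 x 4 grid of per-cell K_n values produced by the network.
grid = np.array([[0.0,  0.25, 0.25, 0.75],
                 [0.0,  0.0,  0.75, 1.0],
                 [0.25, 0.0,  0.0,  1.0]])
print(image_descriptor(grid))  # [0.4167 0.25 0.1667 0.1667]
```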
When using the neural network, the accuracy of assessing \(K_n\), obtained by cross-validation with regard to the initial markup, was 93%.

Complex Architecture

The complex architecture is presented in Fig. 4; it is distributed and depends upon user roles. The real-time mode operator (role 1) uses the complex during the endoscopic examination and therefore needs the complete functionality of the autonomous mode. The hardware of the autonomous module (module 1 in Fig. 4) is based on a high-capacity graphics card and includes a wide-screen touch display used to visualize recognition results and hints. The signal from the endoscope recorder is transmitted to one of the graphics cards of the autonomous module, which interacts with the local server (module 2) using the HTTPS protocol. During the operation of the autonomous module, marked-up images are uploaded to the local server.
The deferred-mode operator (role 2) can change the image markup, set values of image attributes, and place an image into the training set using the web interface. Images placed into the training set can be reached by a remote operator of the local server (role 3) through the web atlas mode. In order to supplement the training set, selected image fragments are transmitted to the central server (module 3) over HTTPS. The training set on the central server is obtained by combining the image collections from the local servers. The central server operator (role 4) controls image quality.
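A minimal sketch of the HTTPS upload step between modules, using the widely available Python requests library; the endpoint path, form fields, and token-based authentication are hypothetical, since the actual server API is not described here.

```python
import requests

def upload_fragment(server_url: str, image_path: str, attributes: dict, token: str) -> None:
    """Upload one marked-up image fragment to a local or central server over HTTPS.

    The endpoint, form fields, and bearer-token auth are illustrative assumptions.
    """
    with open(image_path, "rb") as f:
        response = requests.post(
            f"{server_url}/api/training-set/fragments",   # hypothetical endpoint
            files={"image": f},
            data=attributes,                               # e.g. K_n class and markup metadata
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
    response.raise_for_status()

# Example call (values are placeholders):
# upload_fragment("https://local-server.example", "fragment_135x135.png",
#                 {"k_n": "0.75", "irregular": "true"}, token="...")
```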
Neural network training is performed on an external cluster (module 4) in order not to hamper the work of the central server operator. Training takes several hours and is regularly performed once every few days. The trained neural network, represented by a complex mathematical model, is transferred from the central server to the local servers, and then to the autonomous modules, in order to revalidate the recognition results.

Conclusions

The research allowed the transformation of visual assessment criteria provided by endoscopists into a set of formal characteristics describing the microstructure of gastric and colon mucosa. That was an important step in the development of "Smart Endoscope", a firmware complex implementing automated endoscopic image analysis and recognition algorithms in real-time mode. The combination of the neoplasm recognition approaches described above yielded experimental accuracy comparable to that of specialist analysis.
The research results give hope that the future wide use of CAD systems in endoscopy will decrease the subjectivity of analysis, improve the quality of diagnostics, and cut research cost and time. The use of CADs as pathology classification tools for education and diagnostics will facilitate improving the qualification of young specialists and distributing the expert knowledge that forms the basis of the system's intellectual and analytic kernel.

Acknowledgments

This work was carried out within the course of a project performed by the SIAMS Company and supported by the Ministry of Education and Science of the Russian Federation (Subsidy Agreement no. 14.576.21.0018 of 27 June 2014, applied research project UID RFMEFI57614X0018).
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Literature
1. Buntseva, O.A., Fedorov, E.D., Erendgenova, K.Y.: Delicate features and subtle details for characterization of gastric epithelial neoplasias by high definition and magnified narrow-band endoscopy. Proceedings of 24th UEG Week 2016, pp. A507–A508. Vienna, Austria (2016)
2. Buntseva, O.A., Galkova, Z.V., Plakhov, R.V.: Modern endoscopic diagnosis of precancerous lesions and early cancers of the stomach and colon using computer decision support systems. Exp. Clin. Gastroenterol. 10, 88–96 (2014)
3. Ezoe, Y., Muto, M., Horimatsu, T., Minashi, K., Yano, T., Sano, Y., Chiba, T., Ohtsu, A.: Magnifying narrow-band imaging versus magnifying white-light imaging for the differential diagnosis of gastric small depressive lesions: a prospective study. Gastrointest. Endosc. 71(3), 477–484 (2010)
4. Gadermayr, M., Kogler, H., Karla, M.: Computer-aided texture analysis combined with experts' knowledge: improving endoscopic celiac disease diagnosis. World J. Gastroenterol. 22(31), 7124–7134
5. Gadermayr, M., Uhl, A., Vecsei, A.: The effect of endoscopic lens distortion correction on physicians' diagnosis performance. Proceedings of BVM, pp. 174–179 (2014)
6. Gadermayr, M., Uhl, A., Vecsei, A.: Getting one step closer to fully automatized celiac disease diagnosis. Proceedings of IPTA, pp. 13–17 (2014)
7. Hegenbart, S., Uhl, A., Vecsei, A.: Survey on computer aided decision support for diagnosis of celiac disease. Comput. Biol. Med. 65, 348–358 (2015)
8. Kato, M., Kaise, M., Yonezawa, J., Toyoizumi, H., Yoshimura, N., Yoshida, Y., Kawamura, M., Tajiri, H.: Magnifying endoscopy with narrow-band imaging achieves superior accuracy in the differential diagnosis of superficial gastric lesions identified with white-light endoscopy: a prospective study. Gastrointest. Endosc. 72(3), 523–529 (2010)
9. Lauwers, G.Y., Carneiro, F., Graham, D.Y.: Gastric carcinoma. In: Bosman, F.T., Carneiro, F., Hruban, R.H., Theise, N.D. (eds.) WHO Classification of Tumours of the Digestive System, 4th edn, pp. 48–58. IARC Press, Lyon (2010)
10. Liedlgruber, M., Uhl, A.: Computer-aided decision support systems for endoscopy in the gastrointestinal tract: a review. IEEE Rev. Biomed. Eng. 4, 73–88
11. Omori, T., Kamiya, Y., Tahara, T., Shibata, T., Nakamura, M., Yonemura, J., Okubo, M., Yoshioka, D., Ishizuka, T., Maruyama, N., Kamano, T., Fujita, H., Nakagawa, Y., Nagasaka, M., Iwata, M., Arisawa, T., Hirata, I.: Correlation between magnifying narrow band imaging and histopathology in gastric protruding/or polypoid lesions: a pilot feasibility trial. BMC Gastroenterol. 12, 17 (2012)
12. Zhang, T.Y., Suen, C.Y.: A fast parallel algorithm for thinning digital patterns. Image Process. Comput. Vision 27(3), 236–239 (1984)
DOI: https://doi.org/10.1007/978-3-319-62870-7_3
