
2020 | Original Paper | Book Chapter

Marine Organisms Tracking and Recognizing Using YOLO

Authors: Tomoki Uemura, Huimin Lu, Hyoungseop Kim

Published in: 2nd EAI International Conference on Robotic Sensor Networks

Publisher: Springer International Publishing

Abstract

No system has yet been developed that surveys the deep sea automatically. The purpose of this study is to develop such a system. We employ a multi-object recognition and tracking technique called "You Only Look Once" (YOLO), which provides a very fast and accurate detector. In our system, we first remove the haze caused by the turbidity of the water from each image. After this preprocessing, we apply YOLO to track and recognize marine organisms, including shrimp, squid, crab, and shark. The developed system shows generally satisfactory performance.
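For readers who want to see the shape of such a dehaze-then-detect pipeline, the following is a minimal Python sketch, not the authors' implementation. It assumes a generic dark-channel-prior dehazing step as a stand-in for the chapter's underwater restoration, and it loads a Darknet-format YOLO model through OpenCV's DNN module; the file names (yolo-marine.cfg, yolo-marine.weights, deep_sea_frame.png) and the class list are placeholders.

```python
# Sketch of a dehaze-then-detect pipeline for underwater frames.
# Assumptions: generic dark-channel-prior dehazing stands in for the paper's
# restoration method; model files and class names are placeholders.
import cv2
import numpy as np

def dehaze_dark_channel(bgr, patch=15, omega=0.95, t_min=0.1):
    """Generic dark-channel-prior dehazing (stand-in, not the authors' method)."""
    img = bgr.astype(np.float32) / 255.0
    kernel = np.ones((patch, patch), np.uint8)
    dark = cv2.erode(img.min(axis=2), kernel)                 # dark channel
    idx = np.argsort(dark.ravel())[-max(1, dark.size // 1000):]
    A = img.reshape(-1, 3)[idx].max(axis=0)                   # atmospheric light
    t = 1.0 - omega * cv2.erode((img / A).min(axis=2), kernel)
    t = np.clip(t, t_min, 1.0)[..., None]                     # transmission map
    return (np.clip((img - A) / t + A, 0.0, 1.0) * 255).astype(np.uint8)

def detect_marine_organisms(bgr, net, classes, conf_thr=0.5, nms_thr=0.4):
    """Run a Darknet-format YOLO model on one dehazed frame via OpenCV DNN."""
    h, w = bgr.shape[:2]
    blob = cv2.dnn.blobFromImage(bgr, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, confidences, class_ids = [], [], []
    for out in net.forward(net.getUnconnectedOutLayersNames()):
        for det in out:                                        # [cx, cy, bw, bh, obj, scores...]
            scores = det[5:]
            cid = int(np.argmax(scores))
            conf = float(scores[cid])
            if conf < conf_thr:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            confidences.append(conf)
            class_ids.append(cid)
    keep = cv2.dnn.NMSBoxes(boxes, confidences, conf_thr, nms_thr)
    return [(classes[class_ids[i]], confidences[i], boxes[i])
            for i in np.array(keep).flatten()]

if __name__ == "__main__":
    # Placeholder weights/config; a real run needs a model trained on marine data.
    net = cv2.dnn.readNetFromDarknet("yolo-marine.cfg", "yolo-marine.weights")
    classes = ["shrimp", "squid", "crab", "shark"]
    frame = cv2.imread("deep_sea_frame.png")
    for name, conf, (x, y, bw, bh) in detect_marine_organisms(
            dehaze_dark_channel(frame), net, classes):
        print(f"{name}: {conf:.2f} at ({x}, {y}, {bw}, {bh})")
```

Tracking across frames would sit on top of these per-frame detections; the sketch only covers the restoration and recognition stages described in the abstract.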

References
1. Redmon, J., Divvala, S. K., Girshick, R. B., & Farhadi, A. (2016). You only look once: Unified, real-time object detection. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 779–788).
2. Redmon, J., & Farhadi, A. (2017). YOLO9000: Better, faster, stronger. In 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (pp. 6517–6525).
3. Li, Y., Lu, H., Li, J., Li, X., Li, Y., & Serikawa, S. (2016). Underwater image de-scattering and classification by deep neural network. Computers and Electrical Engineering, 54, 68–77.
4. Lu, H., Li, Y., Uemura, T., Ge, Z., Xu, X., He, L., Serikawa, S., & Kim, H. (2017). FDCNet: Filtering deep convolutional network for marine organism classification. Multimedia Tools and Applications, 77, 21847–21860.
5. Lu, H., Li, Y., Mu, S., Wang, D., Kim, H., & Serikawa, S. (2017). Motor anomaly detection for unmanned aerial vehicles using reinforcement learning. IEEE Internet of Things Journal, 5(4), 2315–2322.
6. Lu, H., Li, Y., Chen, M., Kim, H., & Serikawa, S. (2018). Brain intelligence: Go beyond artificial intelligence. Mobile Networks and Applications, 23(2), 368–375.
7. Lu, H., Li, Y., Nakashima, S., & Serikawa, S. (2016). Turbidity underwater image restoration using spectral properties and light compensation. IEICE Transactions on Information and Systems, E99-D(1), 219–226.
8. Lu, H., Li, Y., Zhang, L., & Serikawa, S. (2015). Contrast enhancement for images in turbid water. Journal of the Optical Society of America A, 32(5), 886–893.
9. Serikawa, S., & Lu, H. (2014). Underwater image dehazing using joint trilateral filter. Computers and Electrical Engineering, 40(1), 41–50.
Metadata
Title
Marine Organisms Tracking and Recognizing Using YOLO
Authors
Tomoki Uemura
Huimin Lu
Hyoungseop Kim
Copyright year
2020
DOI
https://doi.org/10.1007/978-3-030-17763-8_6
