
2022 | Original Paper | Book Chapter

3D Scenery Construction of Agricultural Environments for Robotics Awareness

Authors: Aristotelis Christos Tagarakis, Damianos Kalaitzidis, Evangelia Filippou, Lefteris Benos, Dionysis Bochtis

Published in: Information and Communication Technologies for Agriculture—Theme III: Decision

Publisher: Springer International Publishing


Abstract

Depth cameras have gained popularity in agricultural applications in recent years. These cameras have been used mainly for three-dimensional (3D) reconstruction of objects in indoor and outdoor scenes. Using them to build 3D models of the complex structures commonly found in nature, such as trees and other plants, for simulation purposes remains a major challenge. Agricultural environments are particularly complex, so the proper setup and deployment of such technologies is essential for obtaining usable data. The depth information collected with these cameras varies with the structure of the objects and the sensing conditions, owing to the uncertainty of the outdoor environment. A methodology combining color and depth images makes it possible to extract geometrical information from point clouds of the targeted objects. This chapter explores the different technologies used by depth cameras and presents applications in indoor and outdoor environments through indicative scenarios for agriculture. To this end, a 3D reconstruction of trees was carried out, producing point clouds from Red Green Blue Depth (RGB-D) images acquired under real field conditions. The point cloud samples of trees were collected using an unmanned ground vehicle (UGV) and imported into Gazebo to visualize a simulation of the environment. This simulation technique can be used to test and evaluate the navigation of robotic systems. By further analyzing the resulting 3D point clouds, various geometrical measurements of the simulated samples, such as the volume and height of tree canopies, can be derived. Possible weaknesses of this procedure are mainly attributed to the camera's limitations and the sampling parameters. Nevertheless, the results show that a suitable simulation environment can be established and applied to several agricultural applications utilizing automated unmanned robotic platforms.
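
The canopy measurements mentioned in the abstract can be illustrated with a minimal sketch. The snippet below assumes a tree point cloud exported from the RGB-D reconstruction step as a PLY file and processed with the Open3D and NumPy libraries; these libraries, the file name, and the filter parameters are illustrative assumptions and are not the chapter's reported implementation.

    # Minimal sketch: canopy height and convex-hull volume from a tree point cloud.
    # Assumptions (not from the chapter): Open3D/NumPy are used, the cloud is stored
    # as "tree_sample.ply" with the Z axis pointing up, and ground points have
    # already been removed from the sample.
    import numpy as np
    import open3d as o3d

    # Load the point cloud produced by the RGB-D reconstruction step.
    pcd = o3d.io.read_point_cloud("tree_sample.ply")

    # Remove isolated points (sensor noise); the thresholds are illustrative.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

    points = np.asarray(pcd.points)

    # Canopy height: vertical extent of the remaining points.
    canopy_height = points[:, 2].max() - points[:, 2].min()

    # Canopy volume: volume of the convex hull enclosing the points
    # (an upper-bound approximation of the true canopy volume).
    hull, _ = pcd.compute_convex_hull()
    canopy_volume = hull.get_volume()

    print(f"Canopy height: {canopy_height:.2f} m")
    print(f"Convex-hull volume: {canopy_volume:.2f} m^3")

A hull or surface mesh obtained this way could also be exported and referenced from a Gazebo model so that the reconstructed tree is placed in the simulated scene used for testing robot navigation.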


Metadata
Title
3D Scenery Construction of Agricultural Environments for Robotics Awareness
Authors
Aristotelis Christos Tagarakis
Damianos Kalaitzidis
Evangelia Filippou
Lefteris Benos
Dionysis Bochtis
Copyright year
2022
DOI
https://doi.org/10.1007/978-3-030-84152-2_6
