
2025 | OriginalPaper | Chapter

ShizishanGPT: An Agricultural Large Language Model Integrating Tools and Resources

Authors : Shuting Yang, Zehui Liu, Wolfgang Mayer, Ningpei Ding, Ying Wang, Yu Huang, Pengfei Wu, Wanli Li, Lin Li, Hong-Yu Zhang, Zaiwen Feng

Published in: Web Information Systems Engineering – WISE 2024

Publisher: Springer Nature Singapore


Abstract

Recent developments in large language models (LLMs) have led to significant improvements in intelligent dialogue systems’ ability to handle complex inquiries. However, current LLMs still exhibit limitations in specialized domain knowledge, particularly in technical fields such as agriculture. To address this problem, we propose ShizishanGPT, an intelligent question answering system for agriculture based on the Retrieval Augmented Generation (RAG) framework and an agent architecture. ShizishanGPT consists of five key modules: a generic GPT-4 based module for answering general questions; a search engine module that compensates for the inability of the large language model's own knowledge to be updated in a timely manner; an agricultural knowledge graph module that provides domain facts; a retrieval module that uses RAG to supplement domain knowledge; and an agricultural agent module that invokes specialized models for crop phenotype prediction, gene expression analysis, and related tasks. We evaluated ShizishanGPT using a dataset of 100 agricultural questions specially designed for this study. The experimental results show that the tool significantly outperforms general LLMs, providing more accurate and detailed answers thanks to its modular design and integration of different domain knowledge sources. Our source code, dataset, and model weights are publicly available at https://github.com/Zaiwen/CropGPT.
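The five-module dispatch described in the abstract can be sketched as a simple router. This is a minimal illustration, not the paper's actual implementation: the function names are hypothetical, and the keyword heuristics stand in for the LLM-driven tool selection the agent architecture actually performs.

```python
# Hypothetical stand-ins for the five modules named in the abstract.
def search_engine(q):   return f"[web search] results for: {q}"
def knowledge_graph(q): return f"[knowledge graph] facts for: {q}"
def rag_retrieval(q):   return f"[RAG] domain passages for: {q}"
def crop_agent(q):      return f"[agent] specialized model output for: {q}"
def general_llm(q):     return f"[GPT-4] general answer for: {q}"

# Illustrative keyword routing; the real system would let the agent
# (e.g. a ReAct-style loop) decide which tool to invoke.
ROUTES = [
    (("latest", "news", "price"), search_engine),
    (("gene", "phenotype", "expression"), crop_agent),
    (("variety", "pest", "relation"), knowledge_graph),
    (("cultivation", "fertilizer", "how to"), rag_retrieval),
]

def answer(question: str) -> str:
    q = question.lower()
    for keywords, tool in ROUTES:
        if any(k in q for k in keywords):
            return tool(q)
    # Fall back to the generic GPT-4 module for general questions.
    return general_llm(q)
```

A general question with no domain trigger falls through to the generic module, mirroring the abstract's division between general and specialized queries.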


Metadata
Title
ShizishanGPT: An Agricultural Large Language Model Integrating Tools and Resources
Authors
Shuting Yang
Zehui Liu
Wolfgang Mayer
Ningpei Ding
Ying Wang
Yu Huang
Pengfei Wu
Wanli Li
Lin Li
Hong-Yu Zhang
Zaiwen Feng
Copyright Year
2025
Publisher
Springer Nature Singapore
DOI
https://doi.org/10.1007/978-981-96-0573-6_21