
2021 | Original Paper | Book Chapter

Entropic Analysis of Garhwali Text

Authors: Manoj Kumar Riyal, Rajeev Kumar Upadhyay, Sanjay Kumar

Published in: Recent Developments in Acoustics

Publisher: Springer Singapore


Abstract

In the present study, a systematic statistical analysis of word usage in a continuous Garhwali speech corpus has been performed. The words were collected from a variety of Garhwali sources, viz. newspapers, storybooks, poems, song lyrics and magazines. The analysis shows a quantitative relation between the role of content words in Garhwali and the Shannon information entropy [S] defined by the word probability distribution. So far, very little research has been conducted on the Garhwali language, and no prior knowledge of its syntactic structure is available. We have taken a finite continuous corpus of the Garhwali language. The word occurrence frequencies follow an approximate inverse power law, i.e. Zipf's law, with an exponent very close to 1.
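The two quantities the abstract refers to, the Shannon entropy of the word probability distribution and the Zipf exponent of the rank-frequency curve, can be estimated directly from word counts. The following is a minimal sketch in Python, assuming a hypothetical plain-text file ("garhwali_corpus.txt") of whitespace-separated words; the authors' actual corpus, tokenization and fitting procedure are not specified here.

```python
# Minimal sketch: entropy and Zipf-exponent estimation for a word corpus.
# The corpus file name and whitespace tokenization are assumptions,
# not the authors' actual pipeline.
import math
from collections import Counter


def word_statistics(path):
    with open(path, encoding="utf-8") as f:
        words = f.read().split()

    counts = Counter(words)
    total = sum(counts.values())

    # Shannon information entropy S = -sum_i p_i * log2(p_i),
    # where p_i is the relative frequency of word i.
    probs = [c / total for c in counts.values()]
    entropy = -sum(p * math.log2(p) for p in probs)

    # Zipf's law: frequency f(r) ~ r^(-alpha).
    # Estimate alpha by a least-squares fit of log f against log r.
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    alpha = -slope  # the study reports an exponent very close to 1

    return entropy, alpha


if __name__ == "__main__":
    S, alpha = word_statistics("garhwali_corpus.txt")
    print(f"Shannon entropy S = {S:.3f} bits/word, Zipf exponent = {alpha:.3f}")
```

A usage note: fitting the full rank-frequency curve in log-log space weights the long tail of rare words heavily; restricting the fit to the higher-frequency ranks is a common variant and would change the estimated exponent somewhat.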


Metadata
Title
Entropic Analysis of Garhwali Text
Authors
Manoj Kumar Riyal
Rajeev Kumar Upadhyay
Sanjay Kumar
Copyright Year
2021
Publisher
Springer Singapore
DOI
https://doi.org/10.1007/978-981-15-5776-7_3
