2003 | Chapter

Entropy and Complexity of Sequences

Authors: Werner Ebeling, Miguel Jimenez-Montano, Thomas Pohl

Published in: Entropy Measures, Maximum Entropy Principle and Emerging Applications

Publisher: Springer Berlin Heidelberg

We analyze and discuss sequences of letters and time series coded as letter sequences over certain alphabets. The main subjects are macromolecular sequences (e.g., nucleotides in DNA or amino acids in proteins), neural spike trains, and financial time series. Several sequence representations are introduced, including return plots, surrogate sequences, and surrogate processes. We give a short review of the definitions of entropies and some other informational concepts. We also point out that entropies have to be considered as fluctuating quantities and study the corresponding distributions. In the last part we consider grammatical concepts. We discuss algorithms for evaluating syntactic complexity and information content and apply them to several special sequences. We compare the data from seven neurons, before and after penicillin treatment, by encoding their inter-spike intervals and evaluating their entropies, syntactic complexity, and information content. Classifying these sequences by either kind of measure, with respect to their structure or randomness, gives similar results. The other examples show significantly less order.
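
The abstract's basic pipeline (coding a time series into letters on a finite alphabet, then estimating block entropies of the resulting symbol sequence) can be illustrated with a minimal sketch. The chapter does not specify an implementation; the function names, the threshold-based coding, and the sample data below are illustrative assumptions, not the authors' method.

# Minimal sketch (assumptions, not from the chapter): code a time series into
# letters on a 4-letter alphabet and estimate Shannon block entropies H_n.
from collections import Counter
from math import log2

def code_series(series, thresholds, alphabet="ABCD"):
    # Map each value to a letter according to which threshold bin it falls into.
    return "".join(alphabet[sum(x > t for t in thresholds)] for x in series)

def block_entropy(seq, n):
    # Shannon entropy (in bits) of overlapping words of length n.
    blocks = [seq[i:i + n] for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * log2(c / total) for c in counts.values())

# Illustrative usage with made-up data and thresholds.
series = [0.1, 0.7, 0.3, 0.9, 0.2, 0.8, 0.4, 0.6]
seq = code_series(series, thresholds=[0.25, 0.5, 0.75])
for n in (1, 2, 3):
    print(f"H_{n} = {block_entropy(seq, n):.3f} bits")

For short sequences such as spike-train codings, these estimates fluctuate strongly with sample size, which is one reason the chapter treats entropies as fluctuating quantities with their own distributions.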

Metadata
Title
Entropy and Complexity of Sequences
Authors
Werner Ebeling
Miguel Jimenez-Montano
Thomas Pohl
Copyright Year
2003
Publisher
Springer Berlin Heidelberg
DOI
https://doi.org/10.1007/978-3-540-36212-8_11