
2025 | Open Access | Book

The Mathematical Theory of Semantic Communication

Authors: Kai Niu, Ping Zhang

Publisher: Springer Nature Singapore

Book series: SpringerBriefs in Computer Science


About this book

This open access book examines how the focus of classical information theory on syntactic information has limited the further development of communication science. Recently, communication technologies that process and transmit semantic information have attracted broad attention in academia. Semantic communication has opened new avenues for the future development of communication technology, but it still lacks a general guiding mathematical theory. To address this challenge, this open access book establishes a theoretical framework for semantic information theory and provides a systematic treatment of the measurement system for semantic information and of the theoretical limits of semantic communication, serving as a professional reference for researchers in information and communication. The book begins with an in-depth analysis of the data characteristics of various information sources and the needs of downstream tasks, in order to summarize and generalize the universal attribute of semantic information, synonymity, and to define the synonymous mapping between semantic information and syntactic information. Starting from this core concept of synonymous mapping, the book introduces the measures of semantic information. It then presents a new mathematical tool, the synonymous asymptotic equipartition property (AEP), to explore the mathematical properties of synonymous typical sequences, and applies random coding together with synonymous typical sequence decoding/encoding to prove the semantic lossless source coding theorem, the semantic channel coding theorem, and the semantic limited-distortion source coding theorem. Furthermore, the semantic information measures are discussed in the continuous case, and a new channel capacity formula for the band-limited Gaussian channel is obtained, which constitutes an important extension of the classical channel capacity. The book is a valuable resource for researchers, scientists, and professionals in the field of information and communication, particularly those interested in pushing the boundaries of semantic communication technology.

Table of Contents

Frontmatter

Open Access

Chapter 1. Introduction
Abstract
In this chapter, we first review the research history of semantic information. By investigating the semantic information of various source modalities and downstream tasks, we find that synonymity is the essence of semantic information and that synonymous mapping is a basic relationship between semantic information and syntactic information. Then we present a systematic framework of semantic information theory and summarize the major contributions of this book.
Kai Niu, Ping Zhang

Open Access

Chapter 2. Semantic Communication System and Synonymous Mapping
Abstract
In this chapter, we develop a systematic model for semantic communication and describe the specific design criteria of semantic information transmission. Then we introduce the formal definition of synonymous mapping as a one-to-many mapping from the semantic alphabet to the syntactic alphabet. We emphasize that the synonymous mapping plays a key role in semantic communication.
Kai Niu, Ping Zhang
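
For illustration only, the following minimal Python sketch shows one way the one-to-many synonymous mapping described in this chapter could be represented and sanity-checked; the function name, the example alphabets, and the assumption that the synonymous sets partition the syntactic alphabet are hypothetical and not taken from the book.

# Hypothetical sketch: a synonymous mapping from a semantic alphabet to
# disjoint sets of syntactic symbols (one-to-many), plus a validity check.
from typing import Dict, Set

def is_valid_synonymous_mapping(f: Dict[str, Set[str]], syntactic_alphabet: Set[str]) -> bool:
    """Check that every syntactic symbol belongs to exactly one synonymous set."""
    seen: Set[str] = set()
    for synonymous_set in f.values():
        if synonymous_set & seen:          # synonymous sets must not overlap
            return False
        seen |= synonymous_set
    return seen == syntactic_alphabet      # together they must cover the alphabet

# Example: the semantic symbol "greeting" maps to several synonymous words.
f = {"greeting": {"hi", "hello"}, "farewell": {"bye", "goodbye"}}
print(is_valid_synonymous_mapping(f, {"hi", "hello", "bye", "goodbye"}))  # True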

Open Access

Chapter 3. Semantic Entropy
Abstract
In this chapter, based on the synonymous mapping, we first give the definition of semantic entropy, the core measure of semantic information, and name its unit the semantic bit (sebit). As in classical information theory, this measure can also be extended to semantic conditional and joint entropy. The mathematical properties of these measures are discussed in detail, such as the chain rule of semantic sequential entropy. In addition, we prove that all semantic entropies are no larger than their classical counterparts, and that the classical measures can be regarded as special cases of the semantic ones.
Kai Niu, Ping Zhang
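
To make the inequality between semantic and classical entropy concrete, here is a small numerical sketch; it assumes, as one plausible reading of the definition above, that semantic entropy is the Shannon entropy of the probabilities aggregated over synonymous sets, and all symbols and probabilities are illustrative.

# Hypothetical check that semantic entropy, computed over synonymous sets,
# does not exceed the classical entropy of the underlying syntactic source.
import math

def entropy(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p_syntactic = {"hi": 0.3, "hello": 0.2, "bye": 0.4, "goodbye": 0.1}   # syntactic source
f = {"greeting": {"hi", "hello"}, "farewell": {"bye", "goodbye"}}      # synonymous mapping

# Aggregate the probability mass inside each synonymous set.
p_semantic = {s: sum(p_syntactic[x] for x in xs) for s, xs in f.items()}

H  = entropy(p_syntactic.values())   # classical entropy (bits)
Hs = entropy(p_semantic.values())    # semantic entropy under this reading (sebits)
print(f"H = {H:.3f} bits, Hs = {Hs:.3f} sebits, Hs <= H: {Hs <= H}")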

Open Access

Chapter 4. Semantic Relative Entropy and Mutual Information
Abstract
In this chapter, we introduce the concepts of semantic relative entropy and semantic mutual information. First, three measures of semantic relative entropy are defined, and their relationship to the classical counterpart is discussed. Unlike classical mutual information, semantic mutual information is characterized by two measures: the up semantic mutual information and the down semantic mutual information. In addition, the chain rule of semantic mutual information is addressed.
Kai Niu, Ping Zhang
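
For reference, the classical counterparts mentioned in this abstract are the standard relative entropy and mutual information recalled below; the exact forms of the semantic up/down variants depend on the synonymous mapping and are given in the chapter itself.

\[
D(p \,\|\, q) = \sum_{x} p(x) \log \frac{p(x)}{q(x)}, \qquad
I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y).
\]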

Open Access

Chapter 5. Semantic Channel Capacity and Semantic Rate-Distortion
Abstract
In this chapter, we introduce the semantic channel capacity and the semantic rate-distortion function. We use the maximum up semantic mutual information to define the former; similarly, the minimum down semantic mutual information is used to define the latter. These two measures are the theoretical limits of semantic communication.
Kai Niu, Ping Zhang
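
In symbols, the abstract's two definitions can be sketched as follows; the notation (the up/down mutual informations I^{\uparrow}, I^{\downarrow} and the tilded variables) is placeholder notation chosen here for illustration and may differ from the book's.

\[
C_s = \max_{p(x)} I^{\uparrow}\!\left(\tilde{X};\tilde{Y}\right), \qquad
R_s(D) = \min_{p(\hat{x}\mid x):\; \mathbb{E}\left[d(X,\hat{X})\right] \le D} I^{\downarrow}\!\left(\tilde{X};\hat{\tilde{X}}\right).
\]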

Open Access

Chapter 6. Semantic Lossless Source Coding
Abstract
In this chapter, we discuss semantic lossless source coding. First, we investigate the asymptotic equipartition property (AEP) of semantic coding and introduce the synonymous typical set. Then we prove the semantic source coding theorem and give the optimal code length of semantic coding. Finally, we design a semantic Huffman code to demonstrate the advantage of semantic data compression.
Kai Niu, Ping Zhang
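
As a rough illustration of the idea behind semantic Huffman coding, the sketch below builds an ordinary Huffman code over synonymous sets (with aggregated probabilities) instead of individual syntactic symbols; this is only one plausible reading of the scheme, and all names and numbers are hypothetical.

# Hypothetical sketch: Huffman coding over synonymous sets rather than
# over individual syntactic symbols, using aggregated probabilities.
import heapq
import itertools

def huffman_code(probs):
    """Return a prefix code (symbol -> bitstring) for a dict of probabilities."""
    counter = itertools.count()                       # tie-breaker for the heap
    heap = [(p, next(counter), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c0.items()}
        merged.update({s: "1" + b for s, b in c1.items()})
        heapq.heappush(heap, (p0 + p1, next(counter), merged))
    return heap[0][2]

p_syntactic = {"hi": 0.3, "hello": 0.2, "bye": 0.4, "goodbye": 0.1}
f = {"greeting": {"hi", "hello"}, "farewell": {"bye", "goodbye"}}
p_semantic = {s: sum(p_syntactic[x] for x in xs) for s, xs in f.items()}

print(huffman_code(p_syntactic))  # code over syntactic symbols
print(huffman_code(p_semantic))   # shorter code over synonymous sets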

Open Access

Chapter 7. Semantic Channel Coding
Abstract
In this chapter, we investigate semantic channel coding. First, we introduce the joint asymptotic equipartition property (JAEP) in the semantic sense and define the jointly synonymous typical set. Then we prove the semantic channel coding theorem by using jointly typical decoding over the jointly synonymous typical set, which states that the semantic capacity, i.e., the maximum up semantic mutual information, is the largest achievable rate of semantic communication. Finally, we consider the semantic channel decoding problem and propose the maximum likelihood group (MLG) decoding algorithm. A simple example of a semantic Hamming code is analyzed under the MLG decoding rule.
Kai Niu, Ping Zhang
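
The following sketch illustrates one way a maximum likelihood group (MLG) decoder could work over a binary symmetric channel: codewords are partitioned into synonymous groups and the decoder returns the group with the largest total likelihood. Scoring a group by the sum of its members' likelihoods, as well as the codewords and parameters, are assumptions made here purely for illustration.

# Hypothetical sketch of MLG decoding on a binary symmetric channel (BSC).
def bsc_likelihood(received, codeword, p):
    """Likelihood of a codeword given the received word on a BSC with flip probability p."""
    flips = sum(r != c for r, c in zip(received, codeword))
    return (p ** flips) * ((1 - p) ** (len(received) - flips))

def mlg_decode(received, groups, p=0.1):
    """Return the index of the synonymous group with maximum total likelihood."""
    scores = [sum(bsc_likelihood(received, c, p) for c in group) for group in groups]
    return max(range(len(groups)), key=scores.__getitem__)

# Two synonymous groups of length-3 codewords (illustrative).
groups = [[(0, 0, 0), (1, 1, 0)], [(1, 1, 1), (0, 0, 1)]]
print(mlg_decode((1, 0, 1), groups))  # -> 1: the second group is more likely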

Open Access

Chapter 8. Semantic Lossy Source Coding
Abstract
In this chapter, we mainly discuss semantic lossy source coding. First, we investigate the semantic distortion measure and extend the concept of the jointly typical sequence to the semantic sense. Then we prove the semantic rate-distortion coding theorem by using jointly typical encoding over the synonymous typical set, which means that the semantic rate-distortion function, namely the minimum down semantic mutual information, is the lowest compression rate achievable by semantic lossy source coding.
Kai Niu, Ping Zhang
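
As a purely illustrative example of what a semantic distortion measure might look like (not taken from the book), one could declare a reproduction distortion-free whenever it lies in the same synonymous set as the source symbol:

\[
d_s(x,\hat{x}) =
\begin{cases}
0, & x \text{ and } \hat{x} \text{ belong to the same synonymous set},\\[2pt]
1, & \text{otherwise}.
\end{cases}
\]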

Open Access

Chapter 9. Semantic Information Measure of Continuous Message
Abstract
In this chapter, we extend the semantic information measures, such as semantic entropy and semantic mutual information, to continuous messages. First, we give the definitions of semantic entropy and semantic mutual information in the continuous case. Second, we investigate the capacity of the Gaussian channel in the semantic sense and obtain the semantic channel capacity formula of the band-limited Gaussian channel, which is an important extension of the famous Shannon channel capacity formula. Finally, we derive the semantic rate-distortion function for the Gaussian source.
Kai Niu, Ping Zhang
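
For reference, the classical Shannon capacity of the band-limited Gaussian channel, which the semantic formula in this chapter extends, is

\[
C = W \log_2\!\left(1 + \frac{P}{N_0 W}\right) \ \text{bits/s},
\]

where \(W\) is the bandwidth, \(P\) the average signal power, and \(N_0\) the one-sided noise power spectral density; the semantic counterpart derived in the chapter additionally depends on the synonymous mapping.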

Open Access

Chapter 10. Semantic Joint Source Channel Coding
Abstract
In this chapter, we consider semantic joint source-channel coding. As in classical information theory, we tie together the two basic methods of semantic communication, semantic source coding and semantic channel coding, and prove the semantic source-channel coding theorem. Compared with the classical communication system, the code rate range of semantic communication can be further extended. This reveals the significant potential of semantic coding.
Kai Niu, Ping Zhang
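
For orientation, the classical source-channel coding theorem states that a source can be transmitted reliably whenever its entropy is below the channel capacity; the semantic theorem referred to above presumably takes the analogous form with the semantic quantities, which would be consistent with the extended code rate range noted in the abstract.

\[
\text{classical: } H(X) < C, \qquad \text{semantic (presumed form): } H_s(\tilde{X}) < C_s .
\]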

Open Access

Chapter 11. Conclusions
Abstract
In this chapter, we summarize the framework of the semantic information theory proposed in this work, including the measures of semantic information, the basic theorems of semantic encoding, and the performance limits of semantic communication systems. We explore the future development of semantic information theory and highlight the promising prospects of semantic communication.
Kai Niu, Ping Zhang
Metadata
Title
The Mathematical Theory of Semantic Communication
Authors
Kai Niu
Ping Zhang
Copyright Year
2025
Publisher
Springer Nature Singapore
Electronic ISBN
978-981-96-5132-0
Print ISBN
978-981-96-5131-3
DOI
https://doi.org/10.1007/978-981-96-5132-0