
About this book

The LNCS journal Transactions on Computational Science reflects recent developments in the field of computational science, conceiving the field not as a mere ancillary science but rather as an innovative approach supporting many other scientific disciplines. The journal focuses on original high-quality research in the realm of computational science in parallel and distributed environments, encompassing both the underlying theoretical foundations and the applications of large-scale computation and massive data processing. It addresses researchers and practitioners in areas ranging from aerospace to biochemistry, from electronics to geosciences, and from mathematics to software architecture, presenting verifiable computational methods, findings, and solutions, and enabling industrial users to apply leading-edge, large-scale, high-performance computational methods.

The fifth volume of the Transactions on Computational Science journal, edited by Yingxu Wang and Keith C.C. Chan, is devoted to the subject of cognitive knowledge representation. This field of study focuses on the internal knowledge representation mechanisms of the brain and how these can be applied to computer science and engineering. The issue includes the latest research results in internal knowledge representation at the logical, functional, physiological, and biological levels and describes their impacts on computing, artificial intelligence, and computational intelligence.



Toward a Formal Knowledge System Theory and Its Cognitive Informatics Foundations

Knowledge science and engineering is an emerging field that studies the nature of human knowledge and its manipulations, such as acquisition, representation, creation, composition, memorization, retrieval, and storage. This paper presents the nature of human knowledge and its mathematical models, internal representations, and formal manipulations. The taxonomy of knowledge and the hierarchical abstraction model of knowledge are investigated. Based on a set of mathematical models of knowledge and the Object-Attribute-Relation (OAR) model for internal knowledge representation, rigorous knowledge manipulations are formally described by concept algebra. A coherent framework of formalized knowledge systems is modeled based on analyses of the roles of formal and empirical knowledge. The theory of knowledge acquisition and the cognitive model of knowledge spaces are then systematically developed.
Yingxu Wang

On Temporal Properties of Knowledge Base Inconsistency

Inconsistent knowledge and information in the real world often have to do not only with what the conflicting circumstances are but also with when they occur. As a result, scrutinizing only the logical forms that cause inconsistency may not be adequate. In this paper we describe our research on the temporal characteristics of inconsistent knowledge that can exist in an intelligent system. We provide a formal definition of temporal inconsistency based on a dichotomy of logical conflict and temporal coincidence. Interval temporal logic underpins our treatment of temporally inconsistent propositions in a knowledge base. We also propose a systematic approach to identifying the conflicting intervals of temporally inconsistent propositions. The results help delineate the semantic difference between classical and temporal inconsistency.
Du Zhang
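The dichotomy above, logical conflict plus temporal coincidence, can be sketched as an interval-overlap check over timestamped literals. This is only an illustration: the `TimedLiteral` and `conflicting_intervals` names are ours, and the discrete closed intervals stand in for the paper's interval temporal logic.

```python
from typing import List, NamedTuple, Optional, Tuple

class TimedLiteral(NamedTuple):
    atom: str       # proposition symbol
    negated: bool   # True if the literal asserts not-atom
    start: int      # interval start (inclusive)
    end: int        # interval end (inclusive)

def overlap(a: TimedLiteral, b: TimedLiteral) -> Optional[Tuple[int, int]]:
    """Return the common sub-interval of two literals, or None."""
    lo, hi = max(a.start, b.start), min(a.end, b.end)
    return (lo, hi) if lo <= hi else None

def conflicting_intervals(kb: List[TimedLiteral]) -> List[Tuple[str, Tuple[int, int]]]:
    """A pair is temporally inconsistent iff it is both logically
    conflicting (p vs. not-p) and temporally coinciding."""
    out = []
    for i, a in enumerate(kb):
        for b in kb[i + 1:]:
            if a.atom == b.atom and a.negated != b.negated:
                common = overlap(a, b)
                if common is not None:
                    out.append((a.atom, common))
    return out

kb = [TimedLiteral("valve_open", False, 0, 10),
      TimedLiteral("valve_open", True, 8, 15),   # conflicts on [8, 10]
      TimedLiteral("pump_on", False, 0, 5)]
print(conflicting_intervals(kb))  # [('valve_open', (8, 10))]
```

Note that the two `valve_open` literals are classically inconsistent regardless of time; the temporal reading confines the inconsistency to the sub-interval where both hold.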

Images as Symbols: An Associative Neurotransmitter-Field Model of the Brodmann Areas

The ability to associate images is the basis for learning relationships involving vision, hearing, tactile sensation, and kinetic motion. A new architecture is described that has only local, recurrent connections, but can directly form global image associations. This architecture has many similarities to the structure of the cerebral cortex, including the division into Brodmann areas, the distinct internal and external lamina, and the pattern of neuron interconnection. The images are represented as neurotransmitter fields, which differ from neural fields in the underlying principle that the state variables are not the neuron action potentials, but the chemical concentration of neurotransmitters in the extracellular space. The neurotransmitter cloud hypothesis, which asserts that functions of space, time and frequency, are encoded by the density of identifiable molecules, allows the abstract mathematical power of cellular processing to be extended by incorporating a new chemical model of computation. This makes it possible for a small number of neurons, even a single neuron, to establish an association between arbitrary images. A single layer of neurons, in effect, performs the computation of a two-layer neural network.
Analogous to the bits in an SR flip-flop, two arbitrary images can hold each other in place in an association processor and thereby form a short-term image memory. Just as the reciprocal voltage levels in a flip-flop can produce a dynamical system with two stable states, reciprocal-image pairs can generate stable attractors thereby allowing the images to serve as symbols. Spherically symmetric wavelets, identical to those found in the receptive fields of the retina, enable efficient image computations. Noise reduction in the continuous wavelet transform representations is possible using an orthogonal projection based on the reproducing kernel. Experimental results demonstrating stable reciprocal-image attractors are presented.
Douglas S. Greer

Knowledge Reduction of Covering Approximation Space

Knowledge reduction is a key issue in data mining. In order to simplify the covering approximation space and mine rules from it, Zhu proposed a reduction of the covering approximation space that does not rely on any prior given concept or decision. Unfortunately, it can only reduce absolutely redundant knowledge. To reduce relatively redundant knowledge with respect to a given concept or decision, the problem of relative reduction is studied in this paper. The condition under which an element of a covering is relatively reducible is discussed. By deleting all relatively reducible elements of a covering approximation space, one obtains the relative reduction of the original space. Moreover, the covering lower and upper approximations in the reduced space are the same as in the original space; that is, removing the relatively reducible elements does not decrease the classification ability of a covering approximation space. In addition, combining absolute and relative reduction, an algorithm for knowledge reduction of covering approximation spaces is developed. It can reduce not only absolutely redundant knowledge but also relatively redundant knowledge, which is significant for the subsequent steps of data mining.
Jun Hu, GuoYin Wang
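The absolute reduction the paper builds on can be sketched directly: a block of a covering is absolutely redundant when it equals a union of other blocks, and deleting such blocks until none remain yields the reduct. A minimal sketch, with function names that are ours rather than the paper's:

```python
def is_reducible(block, covering):
    """A block is absolutely redundant iff it equals the union of the
    other blocks of the covering that are contained in it."""
    union = set()
    for other in covering:
        if other != block and other <= block:
            union |= other
    return union == set(block)

def absolute_reduct(covering):
    """Delete absolutely redundant blocks until none remain."""
    cov = set(covering)
    while True:
        red = next((b for b in cov if is_reducible(b, cov)), None)
        if red is None:
            return cov
        cov.discard(red)

cov = {frozenset({1}), frozenset({2}), frozenset({1, 2}), frozenset({3})}
print(absolute_reduct(cov))  # {1, 2} = {1} ∪ {2}, hence removed
```

A relative reduction, the paper's contribution, would additionally discard blocks that are redundant with respect to a given target concept; that test is not reproduced here.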

Formal Description of the Cognitive Process of Memorization

Memorization is a key cognitive process of the brain because almost all human intelligence functions on the basis of it. This paper presents a neuroinformatics theory of memory and a cognitive process of memorization. Cognitive informatics foundations and functional models of memory and memorization are explored toward a rigorous explanation of memorization. The cognitive process of memorization is studied, revealing how and when memory is created in long-term memory. On the basis of the formal memory and memorization models, the cognitive process of memorization is rigorously described using Real-Time Process Algebra (RTPA). This work is one of the fundamental inquiries into the mechanisms of the brain and natural intelligence according to the Layered Reference Model of the Brain (LRMB) developed in cognitive informatics.
Yingxu Wang

Intelligent Processing of an Unrestricted Text in First Order String Calculus

First Order String Calculus (FOSC), introduced in this paper, is a generalization of First Order Predicate Calculus (FOPC). The generalization consists in treating unrestricted strings, which may contain variable symbols and a nesting structure, similarly to the predicate symbols in FOPC. As a logic programming technology, FOSC, combined with a string unification algorithm and the resolution principle, eliminates the need to invent logical atoms. An important aspect of the technology is the possibility of applying text-pattern matching directly in logical reasoning. In this way the semantics of a text can be defined by string examples, which merely demonstrate the concepts, rather than by previously formalized mathematical knowledge. The advantages of avoiding this prior formalization are demonstrated. We investigate the knowledge representation aspects, the algorithmic properties, the brain simulation aspects, and the application aspects of FOSC theories in comparison with those of FOPC theories. FOSC serves as the formal basis of the logic programming language Sampletalk, introduced in our earlier publications.
Andrew Gleibman
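The core mechanism, treating strings with embedded variables as the atoms of reasoning, can be illustrated with a one-sided pattern matcher in which a variable binds to any sub-sequence of tokens. This is only matching against ground text: full two-sided string unification, as FOSC's resolution requires, is substantially harder. The `Var` and `match` names are ours.

```python
from collections import namedtuple

Var = namedtuple("Var", "name")  # a variable symbol inside a string pattern

def match(pattern, text, bindings=None):
    """Match a pattern (list of literal tokens and Var symbols) against a
    ground token list; a Var binds to any sub-sequence, with backtracking."""
    if bindings is None:
        bindings = {}
    if not pattern:
        return bindings if not text else None
    head, rest = pattern[0], pattern[1:]
    if isinstance(head, Var):
        if head.name in bindings:            # already bound: check the prefix
            val = bindings[head.name]
            if text[:len(val)] == val:
                return match(rest, text[len(val):], bindings)
            return None
        for i in range(len(text) + 1):       # try every split point
            trial = dict(bindings)
            trial[head.name] = text[:i]
            result = match(rest, text[i:], trial)
            if result is not None:
                return result
        return None
    if text and text[0] == head:             # literal token must agree
        return match(rest, text[1:], bindings)
    return None

print(match(["the", Var("X"), "is", Var("Y")],
            ["the", "safety", "valve", "is", "open"]))
```

Here the semantics of the pattern is given purely by example strings, with no predicate symbols invented: `X` ends up bound to `["safety", "valve"]` and `Y` to `["open"]`.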

Knowledge Reduction in Concept Lattices Based on Irreducible Elements

As one of the important problems of knowledge discovery and data analysis, knowledge reduction can make the discovery of implicit knowledge in data easier and its representation simpler. In this paper, a new approach to knowledge reduction in concept lattices is developed based on irreducible elements, and the characteristics of attributes and objects are also analyzed. Furthermore, algorithms for finding attribute and object reducts are provided, respectively. The algorithm analysis shows that this approach to knowledge reduction involves less computation and is more tractable than current methods.
Xia Wang, Wenxiu Zhang
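The notion of an irreducible element used here is order-theoretic: in a finite lattice, an element is join-irreducible when it has exactly one lower cover. A minimal sketch over an explicitly given order relation follows; the helper names are ours, and the paper works with the meet- and join-irreducible concepts of a concept lattice rather than an abstract lattice.

```python
def lower_covers(x, elems, leq):
    """Elements immediately below x: strictly below, with nothing between."""
    below = [y for y in elems if leq(y, x) and y != x]
    return [y for y in below
            if not any(z != y and leq(y, z) for z in below)]

def join_irreducibles(elems, leq):
    """Join-irreducible elements of a finite lattice: exactly one lower cover
    (this excludes the bottom element, which has none)."""
    return [x for x in elems if len(lower_covers(x, elems, leq)) == 1]

# diamond lattice: 0 < a, b < 1, with a and b incomparable
order = {("0", "a"), ("0", "b"), ("0", "1"), ("a", "1"), ("b", "1")}
leq = lambda p, q: p == q or (p, q) in order
print(join_irreducibles(["0", "a", "b", "1"], leq))  # ['a', 'b']
```

In the diamond, the top element is the join of a and b and is therefore reducible; reduction approaches of this kind keep only the attributes and objects that generate irreducible concepts.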

A Knowledge Representation Tool for Autonomous Machine Learning Based on Concept Algebra

Concept algebra is an abstract mathematical structure for the formal treatment of concepts and their algebraic relations, operations, and associative rules for composing complex concepts, which provides a denotational mathematical means for knowledge system representation and manipulation. This paper presents an implementation of concept algebra as a set of simulations in Java. A visualized knowledge representation tool for concept algebra is developed, which enables machines to learn concepts and knowledge autonomously. A set of eight relational operations and nine compositional operations of concept algebra is implemented in the tool to rigorously manipulate knowledge via concept networks. The knowledge representation tool is capable of presenting concepts and knowledge systems in multiple ways in order to simulate and visualize dynamic concept networks during machine learning based on concept algebra.
Yousheng Tian, Yingxu Wang, Kai Hu
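As a rough illustration of the kind of structure such a tool manipulates, a concept can be modeled minimally as a pair of object and attribute sets, with one relational and one compositional operation. This simplification is ours: Wang's concept algebra defines concepts over the full OAR model with richer relation sets, and the operation bodies below are FCA-style stand-ins, not the paper's exact definitions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Concept:
    # simplified: objects O and attributes A only; the OAR model also
    # carries internal and external relation sets
    O: frozenset
    A: frozenset

def is_subconcept(c1: Concept, c2: Concept) -> bool:
    """Relational operation (simplified): c1 is more specific than c2
    when it has no more objects and no fewer attributes."""
    return c1.O <= c2.O and c2.A <= c1.A

def compose(c1: Concept, c2: Concept) -> Concept:
    """Compositional operation (simplified): merge two concepts by
    uniting their object and attribute sets."""
    return Concept(c1.O | c2.O, c1.A | c2.A)

bird = Concept(frozenset({"robin", "sparrow"}), frozenset({"animal", "flies"}))
animal = Concept(frozenset({"robin", "sparrow", "dog"}), frozenset({"animal"}))
print(is_subconcept(bird, animal))  # True
```

A concept network is then a graph whose nodes are such concepts and whose edges are the relational operations that hold between them.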

Dyna: A Tool for Dynamic Knowledge Modeling

This paper presents the design and implementation of an ontology construction support tool. The Inferential Modeling Technique (IMT) (Chan, 2004), a technique for modeling the static and dynamic knowledge elements of a problem domain, provided the basis for the design of the tool. Existing tools lack support for modeling dynamic knowledge as defined by the IMT. Therefore, the focus of this work is the development of a Protégé (Gennari, 2003) plug-in, called Dyna, which supports dynamic knowledge modeling and testing. Within Dyna, the Task Behaviour Language (TBL) supports formalized representation of the task behaviour component of dynamic knowledge. The interpreter for TBL also enables the task behaviour representation to be run and tested, thus enabling verification and testing of the model. Dyna also supports storing the dynamic knowledge models in XML and OWL so that they can be shared and re-used across systems. The tool is applied to construct an ontology model in the domain of petroleum contamination remediation selection.
Robert Harrison, Christine W. Chan

Rough Sets and Functional Dependencies in Data: Foundations of Association Reducts

We investigate the notion of an association reduct. Association reducts represent data-based functional dependencies between sets of attributes, where the preference is for the smallest possible sets to determine the largest possible sets. We compare the notion of an association reduct to other types of reducts previously studied within the theory of rough sets. We focus particularly on modeling the inexactness of dependencies, which is crucial for many real-life data applications. We also study the optimization problems and algorithms that aim at searching for the most interesting approximate association reducts in data.
Dominik Ślęzak
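An association reduct pairs a small attribute set with a large one that it functionally determines. The exact-dependency check underlying this can be sketched as follows; the table and column names are hypothetical, and the paper's approximate reducts relax the strict equality test used here.

```python
def determines(rows, A, B):
    """True iff attribute set A functionally determines attribute set B:
    any two rows agreeing on A must also agree on B. rows are dicts."""
    seen = {}
    for r in rows:
        key = tuple(r[a] for a in A)
        val = tuple(r[b] for b in B)
        if seen.setdefault(key, val) != val:
            return False
    return True

rows = [{"machine": 1, "shift": "day",   "fault": "none"},
        {"machine": 1, "shift": "day",   "fault": "none"},
        {"machine": 2, "shift": "night", "fault": "jam"},
        {"machine": 2, "shift": "day",   "fault": "none"}]
print(determines(rows, ["machine", "shift"], ["fault"]))  # True
print(determines(rows, ["machine"], ["fault"]))           # False
```

Searching for association reducts then means scanning pairs (A, B) of attribute sets for which `determines` holds, preferring small A and large B; the paper studies the optimization problems this search gives rise to.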

Hybrid Evolutionary Algorithm for the Graph Coloring Register Allocation Problem for Embedded Systems

Memory or registers are used to store the results of a program's computation. Compared to memory, accessing a register is much faster, but registers are scarce resources in real-time embedded systems and have to be utilized very efficiently. If the register set is not sufficient to hold all program variables, certain values have to be stored in memory, and so-called spill code has to be inserted. The optimization goal is to hold as many live variables as possible in registers in order to avoid expensive memory accesses. The register allocation phase is generally more challenging in embedded systems. In this paper, we present a new hybrid evolutionary algorithm (HEA) for the graph coloring register allocation problem for embedded systems, based on a new crossover operator called crossover by conflict-free sets (CCS) and a new local search function. The objective is to minimize the total spill cost.
Anjali Mahajan, M. S. Ali
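The underlying problem can be sketched with a Chaitin-style greedy colouring of the interference graph: variables that are live at the same time interfere and must get different registers, and nodes that cannot be coloured are spilled. This is only the classical baseline that the paper's HEA improves on; the CCS crossover and local search are not reproduced here, and the function names are ours.

```python
def colour_or_spill(interference, k):
    """Greedily assign one of k registers to each node of an interference
    graph, highest-degree first; nodes with no free register are spilled."""
    order = sorted(interference, key=lambda v: (-len(interference[v]), v))
    colour, spilled = {}, []
    for v in order:
        taken = {colour[n] for n in interference[v] if n in colour}
        free = [r for r in range(k) if r not in taken]
        if free:
            colour[v] = free[0]
        else:
            spilled.append(v)
    return colour, spilled

# three mutually live variables but only two registers: one must spill
interference = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b"}}
print(colour_or_spill(interference, 2))  # ({'a': 0, 'b': 1}, ['c'])
```

An evolutionary approach instead searches over candidate colourings, scoring each by its total spill cost rather than accepting the first greedy assignment.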

Extended Pawlak’s Flow Graphs and Information Theory

The flow graph is an effective graphical tool for knowledge representation and analysis. It explores dependency relations between items of knowledge in the form of information flow quantities. However, the quantity of flow cannot exactly represent the functional dependency between items of knowledge. In this paper, we first present an extended flow graph using concrete information flow, and then give its interpretation within the framework of information theory. Subsequently, an extended flow graph generation algorithm based on attribute significance is proposed, using mutual information. In addition, to avoid over-fitting and reduce storage space, a reduction method for this extension based on an information metric is also developed.
Huawen Liu, Jigui Sun, Huijie Zhang, Lei Liu
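Measuring attribute significance by mutual information, as in the generation algorithm above, reduces to estimating I(X;Y) from co-occurrence counts. A plug-in estimator in bits is sketched below; the sample data is hypothetical.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                     # joint counts
    px = Counter(x for x, _ in pairs)        # marginal counts of X
    py = Counter(y for _, y in pairs)        # marginal counts of Y
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

print(mutual_information([(0, 0), (1, 1)] * 4))              # 1.0: fully dependent
print(mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)]))  # 0.0: independent
```

An attribute whose values carry more bits about the decision ranks as more significant, which is the ordering such a generation algorithm would exploit.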

