
2005 | Book

Advances in Computer Science - ASIAN 2004. Higher-Level Decision Making

9th Asian Computing Science Conference. Dedicated to Jean-Louis Lassez on the Occasion of His 5th Cycle Birthday. Chiang Mai, Thailand, December 8-10, 2004. Proceedings


Table of Contents

Frontmatter

Keynote Papers

Counting by Coin Tossings

This text is an informal review of several randomized algorithms that have appeared over the past two decades and have proved instrumental in extracting efficiently quantitative characteristics of very large data sets. The algorithms are by nature probabilistic and based on hashing. They exploit properties of simple discrete probabilistic models and their design is tightly coupled with their analysis, itself often founded on methods from analytic combinatorics. Singularly efficient solutions have been found that defy information theoretic lower bounds applicable to deterministic algorithms. Characteristics like the total number of elements, cardinality (the number of distinct elements), frequency moments, as well as unbiased samples can be gathered with little loss of information and only a small probability of failure. The algorithms are applicable to traffic monitoring in networks, to database query optimization, and to some of the basic tasks of data mining. They apply to massive data streams and in many cases require strictly minimal auxiliary storage.

Philippe Flajolet
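
The core trick the abstract alludes to — hash each element and track a bit pattern — can be sketched in a few lines. The following Flajolet–Martin-style cardinality estimator is illustrative only, not the exact algorithms surveyed in the talk; the function names and the single-sketch design are our simplifications (practical variants average many sketches to reduce variance):

```python
import hashlib

def rho(x: int, bits: int = 32) -> int:
    """1-based position of the least-significant 1-bit of x (bits+1 if x == 0)."""
    if x == 0:
        return bits + 1
    pos = 1
    while x & 1 == 0:
        x >>= 1
        pos += 1
    return pos

def fm_estimate(items) -> float:
    """Single-sketch Flajolet-Martin estimate of the number of distinct items."""
    bitmap = 0
    for item in items:
        h = int(hashlib.md5(str(item).encode()).hexdigest(), 16) & 0xFFFFFFFF
        bitmap |= 1 << (rho(h) - 1)
    # R = index of the lowest zero bit of the sketch; E[R] ~ log2(0.77351 * n)
    r = 0
    while (bitmap >> r) & 1:
        r += 1
    return (2 ** r) / 0.77351
```

Duplicates leave the bitmap — and hence the estimate — unchanged, which is exactly why such sketches count distinct elements in constant memory.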
On the Role Definitions in and Beyond Cryptography

More than new algorithms, proofs, or technologies, it is the emergence of definitions that has changed the landscape of cryptography. We describe how definitions work in modern cryptography, giving a number of examples, and we provide observations, opinions, and suggestions about the art and science of crafting them.

Phillip Rogaway
Meme Media for the Knowledge Federation Over the Web and Pervasive Computing Environments

Although Web technologies have enabled us to publish and browse intellectual resources, they do not enable people to re-edit and redistribute intellectual resources, including not only documents but also tools and services, published on the Web. Meme media technologies were proposed to solve this problem and to accelerate the evolution of intellectual resources accumulated over the Web. Meme media technologies will make the Web work as a pervasive computing environment, i.e., an open system of computing resources in which users can dynamically select and interoperate some of these computing resources to perform their jobs, satisfying their dynamically changing demands. Federation denotes ad hoc definition and/or execution of interoperation among computing resources that are not a priori assumed to interoperate with each other. We define knowledge federation as federation of computing resources published in the form of documents. This paper reviews and reinterprets meme media technologies from the new viewpoint of knowledge federation over the Web and pervasive computing environments. It focuses on the following four aspects of meme media technologies: (1) media architectures for re-editing and redistributing intellectual resources, (2) client-side middleware technologies for application frameworks, (3) view integration technologies for the interoperation and graphical integration of legacy applications, and (4) knowledge federation technologies for pervasive computing environments.

Yuzuru Tanaka, Jun Fujima, Makoto Ohigashi

Contributed Papers

Probabilistic Space Partitioning in Constraint Logic Programming

We present a language for integrating probabilistic reasoning and logic programming. The key idea is to use constraint-based techniques such as the constraint store and finite-domain variables. First we show how these techniques can be used to integrate a number of probabilistic inference algorithms with logic programming. We then proceed to detail a language which effects conditioning by probabilistically partitioning the constraint store. We elucidate the kinds of reasoning effected by the introduced language by means of two well-known probabilistic problems: the three prisoners and Monty Hall. In particular we show how the syntax of the language can be used to avoid the pitfalls normally associated with the two problems. An elimination algorithm for computing the probability of a query in a given store is presented.

Nicos Angelopoulos
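
The Monty Hall puzzle mentioned in the abstract is easy to check empirically. The small Monte-Carlo simulation below is illustrative only — the paper's point is to encode such conditioning declaratively in the constraint store — and confirms that switching wins with probability 2/3:

```python
import random

def monty_hall_switch_win_rate(trials=100_000, seed=1):
    """Monte-Carlo estimate of the probability that switching doors wins."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        car = rng.randrange(3)
        pick = rng.randrange(3)
        # The host opens a door that hides a goat and was not picked.
        opened = next(d for d in range(3) if d != pick and d != car)
        # Switching means taking the remaining closed door.
        switched = next(d for d in range(3) if d != pick and d != opened)
        wins += (switched == car)
    return wins / trials
```

Switching wins exactly when the initial pick missed the car, which happens with probability 2/3 — the pitfall being that naive conditioning suggests 1/2.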
Chi-Square Matrix: An Approach for Building-Block Identification

This paper presents a line of research in genetic algorithms (GAs), called building-block identification. The building blocks (BBs) are common structures inferred from a set of solutions. In the simple GA, the crossover operator plays an important role in mixing BBs. However, crossover may disrupt the BBs because the cut point is chosen at random. Therefore the BBs need to be identified explicitly so that the solutions are efficiently mixed. Let S be a set of binary solutions and s = b_1 ... b_ℓ, b_i ∈ {0, 1}, a solution. We construct a symmetric matrix whose element in row i and column j, denoted by m_ij, is the chi-square statistic of the variables b_i and b_j. The larger m_ij is, the higher the dependency between bit i and bit j. If m_ij is high, bit i and bit j should be passed together to prevent BB disruption. Our approach is validated for additively decomposable functions (ADFs) and hierarchically decomposable functions (HDFs). In terms of scalability, our approach shows a polynomial relationship between the number of function evaluations required to reach the optimum and the problem size. A comparison between the chi-square matrix and the hierarchical Bayesian optimization algorithm (hBOA) shows that the matrix computation is 10 times faster and uses 10 times less memory than constructing the Bayesian network.

Chatchawit Aporntewan, Prabhas Chongstitvatana
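
The matrix described in the abstract is straightforward to compute. The sketch below (function name and layout are ours) builds the pairwise chi-square statistic over a population of equal-length binary strings:

```python
def chi_square_matrix(population):
    """Symmetric matrix m with m[i][j] = chi-square statistic between bit
    positions i and j over a population of equal-length 0/1 strings."""
    n = len(population)
    length = len(population[0])
    m = [[0.0] * length for _ in range(length)]
    ones = [sum(int(s[i]) for s in population) for i in range(length)]
    for i in range(length):
        for j in range(i + 1, length):
            # Observed counts of the four joint outcomes (b_i, b_j).
            obs = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
            for s in population:
                obs[(int(s[i]), int(s[j]))] += 1
            chi2 = 0.0
            for a in (0, 1):
                for b in (0, 1):
                    pa = ones[i] / n if a else 1 - ones[i] / n
                    pb = ones[j] / n if b else 1 - ones[j] / n
                    expected = n * pa * pb  # counts expected under independence
                    if expected > 0:
                        chi2 += (obs[(a, b)] - expected) ** 2 / expected
            m[i][j] = m[j][i] = chi2
    return m
```

Perfectly correlated bit pairs score the maximum (the population size n), while independent pairs score near zero — the signal the paper uses to keep dependent bits together during crossover.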
Design Exploration Framework Under Impreciseness Based on Register-Constrained Inclusion Scheduling

In this paper, we propose a design exploration framework which considers impreciseness in the design specification. In high-level synthesis, imprecise information is often encountered. Two kinds of impreciseness are considered here: imprecise characteristics of functional units and imprecise design constraints. The proposed framework is based on an efficient scheduling algorithm that takes impreciseness into account, Register-Constrained Inclusion Scheduling. We demonstrate the effectiveness of our framework by exploring a design solution for a well-known benchmark, the Volterra filter. The selected solution meets the acceptability criteria while minimizing the total number of registers.

Chantana Chantrapornchai, Wanlop Surakumpolthorn, Edwin Sha
Hiord: A Type-Free Higher-Order Logic Programming Language with Predicate Abstraction

A new formalism, called Hiord, for defining type-free higher-order logic programming languages with predicate abstraction is introduced. A model theory, based on partial combinatory algebras, is presented, with respect to which the formalism is shown sound. A programming language built on a subset of Hiord, and its implementation, are discussed. A new proposal for defining modules in this framework is considered, along with several examples.

Daniel Cabeza, Manuel Hermenegildo, James Lipton
Assessment Aggregation in the Evidential Reasoning Approach to MADM Under Uncertainty: Orthogonal Versus Weighted Sum

In this paper, we revisit the evidential reasoning (ER) approach to multiple-attribute decision making (MADM) under uncertainty. The attribute aggregation problem in MADM under uncertainty is generally formulated as a problem of evidence combination. Several new aggregation schemes are then proposed and their theoretical features explored. A numerical example traditionally examined in published sources on the ER approach is used to illustrate the proposed techniques.

Van-Nam Huynh, Yoshiteru Nakamori, Tu-Bao Ho
Learnability of Simply-Moded Logic Programs from Entailment

In this paper, we study exact learning of logic programs from entailment queries and present a polynomial-time algorithm to learn a rich class of logic programs that allow local variables and include many standard programs like addition, multiplication, exponentiation, member, prefix, suffix, length, append, merge, split, delete, insert, insertion-sort, quick-sort, merge-sort, preorder and inorder traversal of binary trees, polynomial recognition, derivatives, and sum of a list of naturals. Our algorithm asks at most a polynomial number of queries, and our class is the largest of all the known classes of programs learnable from entailment.

M. R. K. Krishna Rao
A Temporalised Belief Logic for Specifying the Dynamics of Trust for Multi-agent Systems

Temporalisation is a methodology for combining logics whereby a given logic system can be enriched with temporal features to create a new logic system. TML (Typed Modal Logic) extends classical first-order logic with typed variables and multiple belief modal operators; it can be applied to the description of, and reasoning about, trust for multi-agent systems. Without the introduction of a temporal dimension, this logic may not be able to express the dynamics of trust. In this paper, adopting the temporalisation method, we combine TML with a temporal logic to obtain a new logic, so that the users can specify the dynamics of trust and model evolving theories of trust for multi-agent systems.

Chuchang Liu, Maris A. Ozols, Mehmet Orgun
Using Optimal Golomb Rulers for Minimizing Collisions in Closed Hashing

We give conditions for hash table probing which minimize the expected number of collisions. A probing algorithm is determined by a sequence of numbers denoting jumps for an item during multiple collisions. In linear probing, this sequence consists of only ones – for each collision we jump to the next location. To minimize the collisions, it turns out that one should use the Golomb ruler conditions: consecutive partial sums of the jump sequence should be distinct. The commonly used quadratic probing scheme fulfills the Golomb condition in some cases. We define a new probing scheme – Golomb probing – that fulfills the Golomb conditions for a much larger set of cases. Simulations show that Golomb probing is always better than quadratic and linear probing, and in some cases the collisions can be reduced by 25% compared to quadratic and by more than 50% compared to linear probing.

Lars Lundberg, Håkan Lennerstad, Kamilla Klonowska, Göran Gustafsson
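
A probing scheme satisfying the Golomb condition can be sketched directly: take the probe offsets from the marks of an optimal Golomb ruler, so that consecutive partial sums of the jump sequence — and hence all pairwise probe distances — are distinct. The code below is illustrative (the paper's own construction may differ); it uses the optimal order-5 ruler {0, 1, 4, 9, 11}:

```python
# Marks of the optimal Golomb ruler of order 5: all pairwise differences
# between marks are distinct, which is the collision-minimizing condition.
GOLOMB_RULER = [0, 1, 4, 9, 11]

def golomb_probe_sequence(key_hash, table_size, ruler=GOLOMB_RULER):
    """Probe positions: home slot plus Golomb-ruler marks (mod table size)."""
    home = key_hash % table_size
    return [(home + mark) % table_size for mark in ruler]
```

By contrast, linear probing's offsets 0, 1, 2, 3, ... repeat the difference 1 everywhere, which is what clusters colliding items together.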
Identity-Based Authenticated Broadcast Encryption and Distributed Authenticated Encryption

Since its introduction, broadcast encryption has attracted many useful applications. In this paper, we propose two identity-based schemes for authenticated broadcasting and distributed message authentication. The first scheme supports multiple broadcasters and allows each broadcaster to dynamically broadcast messages into an arbitrary group of receivers determined by the broadcaster. The receivers can obtain the broadcasted message using the identity of the broadcaster and their own secret decryption key; hence it ensures both confidentiality and authenticity of the message. The second scheme allows users (receivers) to send messages back to the broadcaster, where the authentication of messages is done with the identity of the user. We also provide security proofs for our schemes under the random oracle model.

Yi Mu, Willy Susilo, Yan-Xia Lin, Chun Ruan
Deniable Partial Proxy Signatures

This paper describes a proxy signature scheme where a signer can delegate a partial signing right to a party who can then sign on behalf of the original signer to generate a partial proxy signature. A partial proxy signature can be converted into a full signature with the aid of the original signer. Our proxy signature scheme has the feature of deniability, i.e., only the designated receiver can verify the partial proxy signature and the full signature associated with him, and neither is transferable. This paper also describes an application of our scheme in a deniable optimistic fair exchange.

Yi Mu, Fangguo Zhang, Willy Susilo
Formal Concept Mining: A Statistic-Based Approach for Pertinent Concept Lattice Construction

In this paper, we define formal concept mining, a method for generating and evaluating all the pertinent concepts from large transaction databases. We propose a novel, efficient formal concept mining algorithm, called Distribution Curve Self-Evaluation (DCSEA). By repeatedly self-adjusting the fitted normal distribution curve to be as close as possible to a symmetric curve, DCSEA automatically identifies all the pertinent concepts by deleting and masking non-pertinent concepts. Instead of using a global support threshold, DCSEA allows users to specify the interestingness of the output concepts with a more understandable statistic-based threshold, called the minimum significance threshold, which measures the significance of a concept's extent size (its number of objects) relative to all concept extent sizes. Experimental results show that the proposed algorithm gives high concept retrieval performance and efficient concept focusing, especially on large databases.

Taweechai Ouypornkochagorn, Kitsana Waiyamai
A Robust Approach to Content-Based Musical Genre Classification and Retrieval Using Multi-feature Clustering

In this paper, we propose a new robust content-based musical genre classification and retrieval algorithm using a multi-feature clustering (MFC) method. In contrast to previous works, this paper focuses on two practical issues: the system's dependency on the input query pattern (or portion) and on the input query length, both of which cause serious uncertainty in system performance. To solve these problems, a new approach called multi-feature clustering (MFC), based on k-means clustering, is proposed. To verify the performance of the proposed method, several excerpts of variable duration were extracted from different positions in a queried music file. The effectiveness of the system with and without MFC is compared in terms of classification and retrieval accuracy. It is demonstrated that the use of MFC significantly improves the stability of musical genre classification and retrieval, with a higher accuracy rate.

Kyu-Sik Park, Sang-Heon Oh, Won-Jung Yoon, Kang-Kue Lee
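
The MFC idea — summarize a track by the k-means centers of features drawn from several excerpts, so that matching no longer depends on which portion of the track the query comes from — can be sketched as follows. This is a toy pure-Python k-means under our own naming; the actual audio feature extraction is out of scope here:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means clustering; returns the k cluster centers."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared Euclidean).
            nearest = min(range(k), key=lambda c: sum(
                (a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        for c, members in enumerate(clusters):
            if members:  # keep the old center if a cluster empties out
                dim = len(members[0])
                centers[c] = [sum(m[d] for m in members) / len(members)
                              for d in range(dim)]
    return centers

def mfc_signature(excerpt_features, k=2):
    """MFC-style track signature: cluster the feature vectors of many
    excerpts of one track and keep only the k centers."""
    return kmeans(excerpt_features, k)
```

A query excerpt is then matched against the stored centers rather than against features of one fixed portion of the track.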
Registration of 3D Range Images Using Particle Swarm Optimization

Improvements to the application of a relatively new meta-heuristic strategy, called Particle Swarm Optimization (PSO), to the registration of 3D range images are presented. The proposed improvements include: implementation of a search intensification procedure, the von Neumann particle neighborhood topology, and the differential evolution operator. The 3D data registration results show significant improvements compared to the currently used PSO-based registration techniques, as well as techniques involving other meta-heuristic optimization strategies, such as genetic algorithms and adaptive simulated annealing.

Hai V. Phan, Margaret Lech, Thuc D. Nguyen
Zero-Clairvoyant Scheduling with Inter-period Constraints

This paper introduces two new mathematical modeling paradigms called Periodic Linear Programming and Periodic Quantified Linear Programming. The former is an extension of traditional Linear Programming, whereas the latter extends Quantified Linear Programming. We use these tools to capture the specifications of real-time embedded systems, which are characterized by uncertainty, complex timing constraints and periodicity. The strength of the modeling techniques lies in the ease with which these specifications can be represented and analyzed.

K. Subramani
A Novel Texture Synthesis Based Algorithm for Object Removal in Photographs

Natural images and photographs may contain stains or undesired objects covering significant portions of the image. Inpainting is a method to fill in such portions using information from the remaining area of the image. In this paper, we propose a novel photograph editing framework that utilizes texture synthesis techniques. The major contributions of our algorithm are: 1) a constraint-based candidate patch searching method which limits the search to neighboring regions with similar texture; 2) a Coherence Confidence metric for selecting the best-fit candidate, preventing error accumulation and propagation; 3) integration of graph-cut optimization to make the seam invisible. Experiments show that our system can efficiently handle a variety of cases, especially large regions in complex backgrounds.

Feng Tang, Yiting Ying, Jin Wang, Qunsheng Peng
Highly Efficient and Effective Techniques for Thai Syllable Speech Recognition

This paper presents a Thai syllable speech recognition system with the capability to achieve high accuracy in Thai syllable speech and Thai tone recognition. A recognition accuracy of 97.84% is achieved for Thai syllable speech recognition using the Continuous Density Hidden Markov Model (CDHMM). To provide a faster response, a beam pruning technique is applied; results show that by using this technique with an appropriate beam width, the recognition time can be reduced by more than 4 times. As Thai is a tonal language, tone recognition is crucial for distinguishing meanings of Thai syllables. To obtain high rates of tone recognition in the Thai language, the CDHMM and a mixed acoustic feature method are employed. Tone recognition rates of 97.88%, 97.36%, 98.81%, 90.67% and 100.0% are achieved for mid, low, falling, high and rising tones, respectively.

S. Tangwongsan, P. Po-Aramsri, R. Phoophuangpairoj
Robot Visual Servoing Based on Total Jacobian

In robot visual servoing, traditionally only the image Jacobian is used, which assumes that the image features vary only with the robot motion. But for a moving object, the image features are affected both by the robot motion and by the object motion. In this paper, the total Jacobian and the object Jacobian are defined. The total Jacobian matrix, consisting of the image Jacobian and the object Jacobian, is estimated using an exponentially weighted least squares algorithm; no calibration is required. The control scheme uses the image Jacobian pseudo-inverse with an added term related to the object motion. Results for both two and six degree-of-freedom systems demonstrate the success of the algorithm.

Qingjie Zhao, Zengqi Sun, Hongbin Deng
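
The control law the abstract outlines — image-Jacobian inverse plus a term compensating the object's own motion — looks like this in the simplest square-Jacobian case. This is a sketch under our own naming; the paper additionally estimates the total Jacobian online by exponentially weighted least squares, which is omitted here:

```python
def inv_2x2(J):
    """Inverse of a 2x2 matrix (assumed non-singular)."""
    (a, b), (c, d) = J
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

def servo_step(J_image, feature_error, object_feature_vel, gain=0.5):
    """One step of image-based visual servoing on a moving object.

    Feature dynamics: s_dot = J * v + s_dot_obj.  Demanding exponential
    error decay s_dot = -gain * e gives  v = J^{-1} (-gain * e - s_dot_obj),
    i.e. the usual Jacobian-inverse law plus an object-motion feedforward.
    """
    target = [-gain * e - o
              for e, o in zip(feature_error, object_feature_vel)]
    return matvec(inv_2x2(J_image), target)
```

With a stationary object the feedforward term vanishes and this reduces to classical image-based servoing.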

Invited Papers

Online Stochastic and Robust Optimization

This paper considers online stochastic optimization problems where uncertainties are characterized by a distribution that can be sampled and where time constraints severely limit the number of offline optimizations which can be performed at decision time and/or in between decisions. It reviews our recent progress in this area, proposes some new algorithms, and reports some new experimental results on two problems of fundamentally different nature: packet scheduling and multiple vehicle routing (MVR). In particular, the paper generalizes our earlier generic online algorithm with precomputation, least-commitment, service guarantees, and multiple decisions, all of which are present in the MVR applications. Robustness results are also presented for multiple vehicle routing.

Russell Bent, Pascal Van Hentenryck
Optimal Constraint Decomposition for Distributed Databases

The problem considered is that of decomposing a global integrity constraint in a distributed database into local constraints for every local site, such that the local constraints serve as a conservative approximation, i.e., satisfaction of the local constraints by a database instance guarantees satisfaction of the global constraint. Verifying local rather than global constraints during database updates reduces distributed processing costs and allows most updates, even in the presence of site and network failures. This paper focuses on the problem of deriving the best possible decompositions, both at database design and update processing time. A generic framework is formulated for finding optimal decompositions for a range of design and update-time scenarios. For the case of linear arithmetic constraints, (1) a bounded size parametric formulation of the decomposition optimization problem is introduced which has a possibly smaller search space but is proven to have the same optimum, (2) the decomposition problem is reduced to the problem of resource distribution which simplifies distributed management of constraints, and (3) autonomous optimal decompositions in subsets of local database sites are shown possible and are proven to preserve optimality under the resource bounds constraints.

Alexander Brodsky, Larry Kerschberg, Samuel Varas
Adaptive Random Testing

In this paper, we introduce an enhanced form of random testing called Adaptive Random Testing. Adaptive random testing seeks to distribute test cases more evenly within the input space. It is based on the intuition that, for non-point types of failure patterns, an even spread of test cases is more likely to detect failures using fewer test cases than ordinary random testing. Experiments are performed using published programs. Results show that adaptive random testing does outperform ordinary random testing significantly (by up to as much as 50%) for the set of programs under study. These results are very encouraging, providing evidence that our intuition is likely to be useful in improving the effectiveness of random testing.

T. Y. Chen, H. Leung, I. K. Mak
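
One concrete instantiation of the even-spreading intuition from the ART literature is fixed-size-candidate-set ART: each new test case is the one of k random candidates farthest from all already-executed tests. The sketch below uses a 1-D input domain and names of our own choosing, purely as an illustration:

```python
import random

def fscs_art_next(executed, lo, hi, k=10, rng=random):
    """Fixed-size-candidate-set ART: of k random candidates, pick the one
    whose distance to the nearest already-executed test is largest."""
    candidates = [rng.uniform(lo, hi) for _ in range(k)]
    return max(candidates,
               key=lambda c: min(abs(c - e) for e in executed))

def art_until_failure(is_failure, lo, hi, max_tests=1000, seed=0):
    """Run ART until is_failure fires; return the number of tests executed."""
    rng = random.Random(seed)
    tests = [rng.uniform(lo, hi)]
    for n in range(1, max_tests + 1):
        if is_failure(tests[-1]):
            return n
        tests.append(fscs_art_next(tests, lo, hi, rng=rng))
    return max_tests
```

Against a contiguous failure region — the "non-point" pattern the abstract refers to — the even spread tends to hit the region sooner than uniform random sampling would.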
Minimal Unsatisfiable Sets: Classification and Bounds

Proving the unsatisfiability of propositional Boolean formulas has applications in a wide range of fields. Minimal Unsatisfiable Sets (MUS) are signatures of the property of unsatisfiability in formulas and our understanding of these signatures can be very helpful in answering various algorithmic and structural questions relating to unsatisfiability. In this paper, we explore some combinatorial properties of MUS and use them to devise a classification scheme for MUS. We also derive bounds on the sizes of MUS in Horn, 2-SAT and 3-SAT formulas.

Sudeshna Dasgupta, Vijay Chandru
LPOD Answer Sets and Nash Equilibria

Logic programs with ordered disjunctions (LPODs) are natural vehicles for expressing choices that have a preference ordering. They are extensions of the familiar extended logic programs that have answer sets as semantics. In game theory, players usually prefer strategies that yield higher payoffs. Since strategies are choices, LPODs would seem to be a suitable logical formalism for expressing some game-theoretic properties. This paper shows how pure strategy normal form games can be encoded as LPODs in such a way that the answer sets that are mutually most preferred by all players are exactly the Nash equilibria. A similar result has been obtained by researchers using a different, but related, logical formalism, viz., ordered choice logic programs that were used to encode extensive games.

Norman Foo, Thomas Meyer, Gerhard Brewka
Graph Theoretic Models for Reasoning About Time

Reasoning about time is a very ancient discipline, perhaps as old as prehistoric man. These ancient humans had discovered how long to roast their hunted meat and how to dry and age the skins of animals. They learned how and when to plant seeds, and were guided by the cycles of the sun, moon and the seasons. Our ancestors knew that day followed night and night followed day, and they had some notion of duration of day and night. This basic temporal knowledge was exploited to develop a sense of planning, taking advantage of observation and experience. For example, they would have observed that deer drink at the river at a certain time of the day, or that fish are easier to catch in the early morning. Early humans could recognize the changing seasons, and adapted their behavior in order to expect and avoid some of the dangers of cold and hunger by preparing in advance.

Martin Charles Golumbic
Rule-Based Programming and Proving: The ELAN Experience Outcomes

Over the last ten years, together with the Protheo team in Nancy, we have developed the ELAN rule-based programming language and environment. This paper presents the context and outcomes of this research effort.

Claude Kirchner, Hélène Kirchner
Towards Flexible Graphical Communication Using Adaptive Diagrams

Unlike today, when the majority of diagrams are static, lifeless objects reflecting their origin in print media, the computer of the near future will provide more flexible visual interfaces in which diagrams adapt to their viewing context, support interactive exploration, and provide semantics-based retrieval and adaptation. We provide an overview of the Adaptive Diagram Research Project, whose aim is to provide a generic computational basis for this new type of diagram.

Kim Marriott, Bernd Meyer, Peter J. Stuckey
A Framework for Compiler Driven Design Space Exploration for Embedded System Customization

Designing custom solutions has been central to meeting a range of stringent and specialized needs of embedded computing, along such dimensions as physical size, power consumption, and performance that includes real-time behavior. For this trend to continue, we must find ways to overcome the twin hurdles of rising non-recurring engineering (NRE) costs and decreasing time-to-market windows by providing major improvements in designer productivity. This paper presents compiler directed design space exploration as a framework for articulating, formulating, and implementing global optimizations for embedded systems customization, where the design space is spanned by parametric representations of both candidate compiler optimizations and architecture parameters, and the navigation of the design space is driven by quantifiable, machine independent metrics. This paper describes the elements of such a framework and an example of its application.

Krishna V. Palem, Lakshmi N. Chakrapani, Sudhakar Yalamanchili
Spectral-Based Document Retrieval

Fast vector space and probabilistic methods use term counts, while slower proximity methods use term positions. We present a spectral-based information retrieval method which is able to use both term count and position information to obtain high-precision document rankings. We achieve this, in time comparable to the vector space method, by examining query term spectra rather than query term positions. The method is a generalisation of the vector space method (VSM); therefore, our spectral method can use the weighting schemes and enhancements used in the VSM.

Kotagiri Ramamohanarao, Laurence A. F. Park
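
For contrast, the vector space baseline that the spectral method generalizes can be stated in a few lines: cosine similarity over raw term-count vectors, with term positions (and hence spectra) ignored. The function name and the unweighted counts are our simplifications:

```python
import math
from collections import Counter

def vsm_rank(query, documents):
    """Rank document indices by cosine similarity of raw term-count vectors."""
    q = Counter(query.lower().split())
    q_norm = math.sqrt(sum(v * v for v in q.values()))
    scores = []
    for doc_id, text in enumerate(documents):
        d = Counter(text.lower().split())
        dot = sum(q[t] * d[t] for t in q)  # Counter returns 0 for absent terms
        norm = q_norm * math.sqrt(sum(v * v for v in d.values()))
        scores.append((dot / norm if norm else 0.0, doc_id))
    return [doc_id for _, doc_id in sorted(scores, reverse=True)]
```

The spectral method keeps this overall shape but replaces the count vectors with per-term spectra, letting position information influence the same style of ranking.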
Metadata Inference for Document Retrieval in a Distributed Repository

This paper describes a simple data model for the composition and metadata management of documents in a distributed setting. We assume that each document resides at the local repository of its provider, so all providers’ repositories, collectively, can be thought of as a single database of documents spread over the network. Providers willing to share their documents with other providers in the network must register them with a coordinator, or mediator, and providers that search for documents matching their needs must address their queries to the mediator. The process of registering (or un-registering) a document, formulating a query to the mediator, or answering a query by the mediator all rely on document content annotation.

Content annotation depends on the nature of the document: if the document is atomic then an annotation provided explicitly by the author is sufficient, whereas if the document is composite then the author annotation should be augmented by an implied annotation, i.e., an annotation inferred from the annotations of the document’s components.

The main contributions of this paper are:

1. Providing appropriate definitions of document annotations;
2. Providing an algorithm for the automatic computation of implied annotations;
3. Defining the main services that the mediator should support.

P. Rigaux, N. Spyratos
A Simple Theory of Expressions, Judgments and Derivations

We propose a simple theory of expressions which is intended to be used as a foundational syntactic structure for the Natural Framework (NF). We define expressions formally and give a simple proof of the decidability of α-equivalence. We use this new theory of expressions to define judgments and derivations formally, and we give concrete examples of derivation games to show a flavor of NF.

Masahiko Sato
Reactive Framework for Resource Aware Distributed Computing

Rapid strides in technology have led to pervasive computing in a spectrum of applications such as crisis management systems, distributed critical systems, medical therapy systems, home entertainment, etc. One of the common features across this spectrum of applications has been the reactivity of the systems and the context of their environments. The environmental context has become highly sophisticated due to rapid advances in sensor technology and deployment. In this paper, we propose a reactive framework to enable the development of pervasive computing applications for different context environments. One of the novelties is the use of contexts as observables between components. Some of the observable context features are: classical communications, termination, clock time, suspension of actions based on the presence or absence of signals, location, resource parameters, etc. The new observables provide a framework for the development of appropriately flexible middleware for a spectrum of applications. Further, this leads to an implementational model that can be effectively hooked onto available component implementations with appropriate interfaces. The new observables are suspensive in nature with respect to communications and locations, and allow modeling of a variety of distributed applications, including sensor technology and wireless environments.

Rajesh Gupta, R. K. Shyamasundar
The Feature Selection and Intrusion Detection Problems

Cyber security is a serious global concern. The potential of cyber terrorism has posed a threat to national security; meanwhile the increasing prevalence of malware and incidents of cyber attacks hinder the utilization of the Internet to its greatest benefit and incur significant economic losses to individuals, enterprises, and public organizations. This paper presents some recent advances in intrusion detection, feature selection, and malware detection.

In intrusion detection, stealthy and low-profile attacks that include only a few carefully crafted packets over an extended period of time to delude firewalls and the intrusion detection system (IDS) have been difficult to detect. In protection against malware (trojans, worms, viruses, etc.), detecting polymorphic and metamorphic versions of recognized malware using static scanners is a great challenge.

We present in this paper an agent based IDS architecture that is capable of detecting probe attacks at the originating host and denial of service (DoS) attacks at the boundary controllers. We investigate and compare the performance of different classifiers implemented for intrusion detection purposes. Further, we study the performance of the classifiers in real-time detection of probes and DoS attacks, with respect to intrusion data collected on a real operating network that includes a variety of simulated attacks.

Feature selection is as important for IDS as it is for many other modeling problems. We present several techniques for feature selection and compare their performance in the IDS application. It is demonstrated that, with appropriately chosen features, both probes and DoS attacks can be detected in real time or near real time at the originating host or at the boundary controllers.

We also briefly present some encouraging recent results in detecting polymorphic and metamorphic malware with advanced static, signature-based scanning techniques.

Andrew H. Sung, Srinivas Mukkamala
On the BDD of a Random Boolean Function

The Binary Decision Diagram (BDD), the Binary Moment Diagram (BMD) and the Minimal Deterministic Automaton (MDA) are three canonical representations for Boolean functions. Exact expressions a(i) and w(i) are provided for the average and worst-case BDD size over functions B^i → B, and they are proved equal to those for the average and worst-case BMD size.

Jean Vuillemin, Fredéric Béal
Concurrent Constraint-Based Memory Machines: A Framework for Java Memory Models (Summary)

A central problem in extending the von Neumann architecture to petaflop computers with a shared memory and millions of hardware threads is defining the memory model [1, 2, 3]. Such a model must specify the behavior of concurrent (conditional) reads and writes to the same memory locations. We present a simple, general framework for the specification of memory models based on an abstract machine that uses sets of order and value constraints to communicate between threads and main memory. Memory is permitted to specify exactly those linkings (mappings from read events to write events) which result in a unique solution for the constraints, and hence force a unique pattern of bits to flow from writes to reads.

We show that this single principle accounts for almost all the “causality test cases” proposed for the Java memory model. It may be used as the basis for developing a formal memory model for the Java programming language.

An implementation of CCMs in a Prolog-based constraint language has been developed by Tom Schrijvers and Bart Demoen [4]. This paper is a summary of [5].

Vijay Saraswat
Backmatter
Metadata
Title
Advances in Computer Science - ASIAN 2004. Higher-Level Decision Making
edited by
Michael J. Maher
Copyright Year
2005
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-540-30502-6
Print ISBN
978-3-540-24087-7
DOI
https://doi.org/10.1007/b103476
