
2007 | Book

Advances and Innovations in Systems, Computing Sciences and Software Engineering


About this book

Advances and Innovations in Systems, Computing Sciences and Software Engineering includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computing Sciences, Software Engineering and Systems.

Advances and Innovations in Systems, Computing Sciences and Software Engineering includes selected papers from the conference proceedings of the International Conference on Systems, Computing Sciences and Software Engineering (SCSS 2006), which was part of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering (CISSE 2006).

All aspects of the conference were managed on-line: not only the reviewing, submission and registration processes, but also the conference itself. Conference participants - authors, presenters and attendees - needed only an internet connection and sound on their computers to contribute to and participate in this ground-breaking international conference. The on-line structure of this high-quality event allowed academic professionals and industry participants to contribute work and attend world-class technical presentations based on rigorously refereed submissions, live, without investing significant travel funds or time out of the office. Suffice it to say that CISSE received submissions from more than 70 countries, for whose researchers this opportunity presented a far more affordable, dynamic and well-planned event to attend and submit work to than a classic, on-the-ground conference.

The CISSE conference audio room provided superb audio even over low speed internet connections, the ability to display PowerPoint presentations, and cross-platform compatibility (the conferencing software runs on Windows, Mac, and any other operating system that supports Java). In addition, the conferencing system allowed for an unlimited number of participants, which in turn granted CISSE the opportunity to allow all participants to attend all presentations, as opposed to limiting the number of available seats for each session.

Table of Contents

Frontmatter
Chapter 1. An Adaptive and Extensible Web-based Interface System for Interactive Video Contents Browsing

With the growing popularity of mobile devices (including phones and portable media players) and coverage of Internet access, we tend to develop the need of consuming video content on the move. Some technologies already allow end-users to watch TV and listen to news podcasts or download music videos on their devices. However, such services are restricted to a provider’s selection of pre-formatted and linear content streams. Hence, we propose a web-based interface system that supports interactive contents navigation, making it possible for end-users to “surf” on video content like they are used to on the Web. This system is extensible to any specific domain of video contents, any web-enabled platform, and to any browsing scheme. In this paper, we will explain the architecture and design of this system, propose an application for soccer videos and present the results of its user evaluation.

Adrien Joly, Dian Tjondronegoro
Chapter 2. Design and Implementation of Virtual Instruments for Monitoring and Controlling Physical Variables Using Different Communication Protocols

In this project, software components (Java Beans) were developed that can communicate through different communication protocols with hardware elements connected to sensors and control devices for monitoring and controlling different physical variables, forming a hardware-software platform that follows the virtual instruments design pattern. The implemented communication protocols are RS232, 1-Wire and TCP/IP, together with annexed technologies such as WiFi (Wireless Fidelity) and WiMax.

A. Montoya, D. Aristizábal, R. Restrepo, N. Montoya, L. Giraldo
Chapter 3. Online Decision Support System for Dairy Farm

An online decision support system for dairy farms was created to help Lithuanian dairy farmers, scientists, dairy technology producers, students and other people interested in the dairy business. It enables them to use the newest information and technology for planning their own business.

A. Savilionis, A. Zajančkauskas, V. Petrauskas, S. Juknevičius
Chapter 4. Decision Making Strategies in Global Exchange and Capital Markets

The main objective of this paper is to present an investment decision management system for exchange and capital markets – the Double Trump model. The main problems being solved with this model are quantitative decision search problems. Computer-imitational methods are also analysed as the main means of solving the mathematical models, viewed as stochastic programming tasks, in order to reflect the problems’ characteristics. Attention is paid to revealing the analytical possibilities of the decision management system and to identifying decision methods, analysing such non-traditional problems of financial engineering as maximization of a three-dimensional utility function over the possible set of portfolio values adequate for assessing the reliability of investment decisions, and the search for a concept and mathematical methods to commensurate the profitability, reliability and riskiness of investment decisions. Solving the problems named above ensures sustainable development of investment decisions in capital and exchange markets.

Aleksandras Vytautas Rutkauskas, Viktorija Stasytyte
Chapter 5. A Simple and Efficient Solution for Room Synchronization Problem in Distributed Computing

Room synchronization problem was first introduced by Joung in 1998 and widely studied subsequently. The problem arises in various practical applications that require concurrent data sharing. The problem aims at achieving exclusive access to shared data while facilitating suitable concurrency.

Alex A. Aravind
Chapter 6. Improving Computer Access For Blind Users

This paper discusses the development of applications dedicated to blind users, with the help of reusable components. The methodology relies on component-based development. For this purpose, braille-speech widgets, adapted from classical widgets, have been studied, specified and implemented. The developed components can be used by developers to implement software for blind users. The contribution of this work to the field of assistive technology is valuable, because there are no existing tools that facilitate the creation of interfaces for blind users, and it may considerably improve computer access for this category of users.

Amina Bouraoui, Mejdi Soufi
Chapter 7. Developing a Multi-Agent System for Dynamic Scheduling Through AOSE Perspective

Agent-based computing can be considered a new general-purpose paradigm for software development, which tends to radically influence the way a software system is conceived and developed, and which calls for new agent-specific software engineering approaches. This paper presents an architecture for distributed manufacturing scheduling and follows Agent Oriented Software Engineering (AOSE) guidelines through the specification defined by the Ingenias methodology. This architecture is based on a Multi-Agent System (MAS) composed of a set of autonomous agents that cooperate in order to accomplish a good global solution.

Ana Madureira, Joaquim Santos, Nuno Gomes, Ilda Ferreira
Chapter 8. Criminal Sentencing, Intuition and Decision Support
Andrew Vincent, Tania Sourdin, John Zeleznikow
Chapter 9. An Approach for Invariant Clustering and Recognition in Dynamic Environment

An approach for invariant clustering and recognition of objects (situations) in a dynamic environment is proposed. This approach is based on the combination of clustering using an unsupervised neural network (in particular ART-2) and preprocessing of sensor information using a forward multi-layer perceptron (MLP) with error back propagation (EBP), which is supervised by the clustering neural network. Using the MLP with EBP makes it possible to recognize a pattern with relatively small transformations (shift, rotation, scaling) as a previously known cluster and to reduce the production of a large number of clusters in a dynamic environment, e.g. during movement of a robot or recognition of novelty in a security system.

Andrey Gavrilov, Sungyoung Lee
Chapter 10. Modelling non Measurable Processes by Neural Networks: Forecasting Underground Flow Case Study of the Céze Basin (Gard - France)

After a presentation of the nonlinear properties of neural networks, their applications to hydrology are described. A neural predictor is satisfactorily used to estimate a flood peak. The main contribution of the paper concerns an original method for visualising a hidden underground flow. Satisfactory experimental results were obtained that fitted well with the knowledge of local hydrogeology, opening up an interesting avenue for modelling using neural networks.

A. Johannet, P.A. Ayral, B. Vayssade
Chapter 11. Significance of Pupil Diameter Measurements for the Assessment of Affective State in Computer Users

The need to provide computers with the ability to distinguish the affective state of their users is a major requirement for the practical implementation of Affective Computing concepts. The determination of the affective state of a computer user from the measurement of some of his/her physiological signals is a promising avenue towards that goal. In addition to the monitoring of signals typically analyzed for affective assessment, such as the Galvanic Skin Response (GSR) and the Blood Volume Pulse (BVP), other physiological variables, such as the Pupil Diameter (PD) may be able to provide a way to assess the affective state of a computer user, in real-time. This paper studies the significance of pupil diameter measurements towards differentiating two affective states (stressed vs. relaxed) in computer users performing tasks designed to elicit those states in a predictable sequence. Specifically, the paper compares the discriminating power exhibited by the pupil diameter measurement to those of other single-index detectors derived from simultaneously acquired signals, in terms of their Receiver Operating Characteristic (ROC) curves.

Armando Barreto, Jing Zhai, Naphtali Rishe, Ying Gao
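
As a concrete reminder of what an ROC comparison of single-index detectors involves, the following Python sketch (not the study's code; the pupil-diameter numbers are invented) computes an ROC curve and AUC for one index from synthetic "relaxed" and "stressed" samples.

import numpy as np

def roc_curve(relaxed, stressed):
    # Sweep a threshold over all observed values; a sample is declared
    # "stressed" when the index (e.g. pupil diameter) exceeds the threshold.
    scores = np.concatenate([relaxed, stressed])
    labels = np.concatenate([np.zeros(len(relaxed)), np.ones(len(stressed))])
    fpr, tpr = [], []
    for t in np.unique(scores)[::-1]:
        pred = scores >= t
        tpr.append(pred[labels == 1].mean())    # true positive rate
        fpr.append(pred[labels == 0].mean())    # false positive rate
    return np.array(fpr), np.array(tpr)

rng = np.random.default_rng(0)
relaxed = rng.normal(3.5, 0.4, 500)     # hypothetical pupil diameters (mm), relaxed segments
stressed = rng.normal(4.1, 0.5, 500)    # hypothetical pupil diameters (mm), stressed segments

fpr, tpr = roc_curve(relaxed, stressed)
auc = (stressed[:, None] > relaxed[None, :]).mean()   # area under the ROC curve (pairwise form)
print(f"ROC points: {len(fpr)}, AUC of this single-index detector: {auc:.3f}")
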
Chapter 12. A Novel Probing Technique for Mode Estimation in Video Coding Architectures

Video compression standards operate by removing redundancy in the temporal, spatial, and even frequency domains. Temporal redundancy is usually removed by motion-compensated prediction, resulting in Inter-, Intra- and Bidirectional frames. However, video coding standards do not specify the encoding process but the bit stream. Thus one of the key tasks of any implementation of such standards is to estimate the modes of frames as well as macroblocks. In this article we propose a novel technique for this purpose.

Ashoka Jayawardena
Chapter 13. The Effects of Vector Transform on Speech Compression
B.D. Barkana, M.A. Cay
Chapter 14. Software Development Using an Agile Approach for Satellite Camera Ground Support Equipment

This work presents the development of the software that controls a set of equipment, called Ground Support Equipment (GSE), which verifies requirement fulfilment and supports integration procedures of the CBERS-3 and 4 satellites’ Multispectral Camera (MUXCAM). The software development followed an iterative spiral model with agile-method characteristics that were originally used at Opto Electronics in industrial and medical equipment projects. This approach allowed a small team, consisting of only four engineers, to quickly create the first software version, even while sharing time with the GSE’s hardware development, and to keep the project on schedule in spite of some requirement changes.

D. dos Santos Jr., I. N. da Silva, R. Modugno, H. Pazelli, A. Castellar
Priming the Pump: Load Balancing Iterative Algorithms

Load balancing iterative algorithms is an interesting problem in resource allocation that is useful for reducing total elapsed processing time through parallel processing. Load balancing means that each processor in a parallel processing environment will handle about the same computational load. It is not sufficient to allocate the same number of processes to each processor, since different processes or tasks can require different loads. For iterative algorithms, load balancing is the process of distributing the iterations of a loop to individual processes. This paper will analyze different methods used for load balancing. Each method will be measured by how well it reduces the total elapsed time and by algorithm complexity and overhead. Measured data for different load balancing methods will be included in this paper.

David J. Powers
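
To make the contrast between scheduling strategies concrete, here is a small Python sketch (not the paper's measured methods) that compares a static block partition of loop iterations with a greedy dynamic assignment, using synthetic per-iteration costs.

import heapq
import random

def static_makespan(costs, workers):
    """Split iterations into equal-sized contiguous blocks, one per worker."""
    chunk = (len(costs) + workers - 1) // workers
    loads = [sum(costs[i:i + chunk]) for i in range(0, len(costs), chunk)]
    return max(loads)

def dynamic_makespan(costs, workers):
    """Hand each iteration to whichever worker becomes free first."""
    heap = [0.0] * workers          # current finish time of each worker
    heapq.heapify(heap)
    for c in costs:
        earliest = heapq.heappop(heap)
        heapq.heappush(heap, earliest + c)
    return max(heap)

random.seed(1)
costs = [random.expovariate(1.0) for _ in range(10_000)]   # uneven iteration costs
for p in (4, 8, 16):
    print(p, "workers: static", round(static_makespan(costs, p), 1),
          "dynamic", round(dynamic_makespan(costs, p), 1))
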
An Ontology for Modelling Flexible Business Processes

With the aim of developing information systems better fitted to the main challenges raised by globalisation, we propose an ontology for the modelling of interoperable and flexible business processes. We distinguish three types of Activities a Process can be made of: whereas Procedures are defined by a sequence of Tasks and Services by a Service Description, Interactions are specified by a Goal and ruled by a Social Convention. Correlatively an Actor can have three different Statuses: Performer, Provider or Agent.

Denis Berthier
Routing Free Messages Between Processing Elements in a Hypercube with Faulty Links

An algorithm for routing free messages between processing elements in a multiprocessor system is proposed. An n-dimensional hypercube is used as the basic architecture. Only one of the processors in the hypercube is connected to an external user; this external machine is called the host processor. Bidirectional one-port links are used, some of which may be faulty at the same time. The algorithm can be applied to an arbitrary connected multiprocessor system.

Dinko Gichev
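
For readers unfamiliar with hypercube addressing, the following Python sketch is a generic illustration only, not the chapter's algorithm: it routes by correcting one differing address bit per hop and falls back to another differing dimension when the preferred link is faulty.

def route(src, dst, n, faulty_links):
    """faulty_links: set of frozenset({u, v}) pairs marking unusable edges."""
    path, node = [src], src
    while node != dst:
        moved = False
        for d in range(n):                       # prefer the lowest differing dimension
            if (node ^ dst) >> d & 1:
                nxt = node ^ (1 << d)            # flip address bit d
                if frozenset({node, nxt}) not in faulty_links:
                    path.append(nxt)
                    node = nxt
                    moved = True
                    break
        if not moved:                            # all profitable links faulty: give up
            return None
    return path

faults = {frozenset({0b000, 0b001})}             # hypothetical faulty link in a 3-cube
print(route(0b000, 0b011, 3, faults))            # [0, 2, 3]: detours around the fault
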
OPTGAME: An Algorithm Approximating Solutions for Multi-Player Difference Games

We present a new numerical tool to determine solutions of non-zero-sum multi-player difference games. In particular, we describe the computer algorithm OPTGAME (version 2.0) which solves affine-quadratic games and approximates solutions for nonlinear games iteratively by using a local linearization procedure. The calculation of these solutions (open-loop and feedback Nash and Stackelberg equilibrium solutions) is sketched, as is the determination of the cooperative Pareto-optimal solution.

Doris A. Behrens, Reinhard Neck
Rapid Development of Web Applications with Web Components

This paper provides a brief overview of Domain Model RAD, a web framework, which is used for developing dynamic web applications with a minimum amount of programming. Domain Model RAD uses Domain Model Lite to represent a domain model of a web application. Domain Model Lite is a framework that facilitates the definition and the use of domain models in Java. Domain Model RAD uses Wicket for web application pages and page sections. Wicket is a web framework that provides basic web components, to construct, in an object oriented way, more advanced web components. Domain Model RAD interprets the application model and creates default web pages from its web components that are based on the domain model.

Dzenan Ridjanovic
Mesh-adaptive methods for viscous flow problem with rotation

In this paper, new functional-type a posteriori error estimates for the viscous flow problem with a rotating term are presented. The estimates give guaranteed upper bounds of the energy norm of the error and provide reliable error indication. We describe the implementation of adaptive finite element methods (AFEM) in the framework of the functional-type estimates proposed. Computational properties of the estimates are investigated on a series of numerical examples.

E. Gorshkova, P. Neittaanmaki, S. Repin
Chapter 21. Metamodel-based Comparison of Data Models
Erki Eessaar
BEMGA: A HLA Based Simulation Modeling and Development Tool

High Level Architecture (HLA) is a general purpose architecture, developed to support reuse and interoperability across a large number of different types of distributed simulation projects.

Ersin Ünsal, Fatih Erdoğan Sevilgen
Comparison of different POS Tagging Techniques (n-gram, HMM and Brill’s tagger) for Bangla

There are different approaches to the problem of assigning each word of a text with a parts-of-speech tag, which is known as Part-Of-Speech (POS) tagging. In this paper we compare the performance of a few POS tagging techniques for Bangla language, e.g. statistical approach (n-gram, HMM) and transformation based approach (Brill’s tagger). A supervised POS tagging approach requires a large amount of annotated training corpus to tag properly. At this initial stage of POS-tagging for Bangla, we have very limited resource of annotated corpus. We tried to see which technique maximizes the performance with this limited resource. We also checked the performance for English and tried to conclude how these techniques might perform if we can manage a substantial amount of annotated corpus.

Fahim Muhammad Hasan, Naushad UzZaman, Mumit Khan
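
As a point of reference for the statistical approaches compared above, the sketch below implements only the simplest possible baseline, a unigram "most frequent tag" model trained on a tiny invented corpus; it is not one of the taggers evaluated in the chapter.

from collections import Counter, defaultdict

def train_unigram(tagged_sentences):
    counts = defaultdict(Counter)
    tag_freq = Counter()
    for sentence in tagged_sentences:
        for word, tag in sentence:
            counts[word.lower()][tag] += 1
            tag_freq[tag] += 1
    default = tag_freq.most_common(1)[0][0]            # fallback tag for unknown words
    return {w: c.most_common(1)[0][0] for w, c in counts.items()}, default

def tag(sentence, model, default):
    return [(w, model.get(w.lower(), default)) for w in sentence]

corpus = [
    [("the", "DET"), ("cow", "NOUN"), ("eats", "VERB"), ("grass", "NOUN")],
    [("the", "DET"), ("farmer", "NOUN"), ("sells", "VERB"), ("milk", "NOUN")],
]
model, default = train_unigram(corpus)
print(tag(["the", "cow", "sells", "tea"], model, default))
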
Real-Time Simulation and Data Fusion of Navigation Sensors for Autonomous Aerial Vehicles

This paper presents an integrated navigation tool developed in the framework of an advanced study on the navigation of Unmanned Aerial Vehicles. The study aimed at testing innovative navigation sensor configurations to support fully autonomous flight even during landings and other critical mission phases. The tool is composed of sensor simulation and data fusion software. The most important navigation sensors that are installed onboard an unmanned aircraft have been modeled: i.e. inertial, GPS, air data, high-accuracy altimeter, and magnetometer. Their models include every non-negligible error source that has been documented in the literature. Moreover, a specific sensor data fusion algorithm has been developed that integrates inertial sensor measurements with GPS and radar altimeter measurements. The paper reports on numerical testing of the sensor simulator and data fusion algorithm. The algorithm was coded for real-time implementation to perform hardware-in-the-loop validation and in-flight tests onboard a small Unmanned Aerial Vehicle.

Francesco Esposito, Domenico Accardo, Antonio Moccia, U. Ciniglio, F. Corraro, L. Garbarino
Swarm-based Distributed Job Scheduling in Next-Generation Grids

The computational Grid paradigm is now commonly used to define and model the architecture of a distributed software and hardware environment for executing scientific and engineering applications over wide area networks. Resource management and load-balanced job scheduling are a key concern when implementing new Grid middleware components to improve resource utilization. Our work focuses on an evolutionary approach based on swarm intelligence, and precisely on the ant-colony-based meta-heuristic, to map the problem-solving capability of social insects onto the above resource scheduling and balancing problem, achieving an acceptable near-optimal solution at substantially reduced complexity. The Grid resource management framework will be implemented as a multi-agent system where all the agents communicate with each other through the network and cooperate according to ant-like local interactions, so that load balancing and Grid makespan/flowtime optimization can be achieved as an emergent collective behaviour of the system. We show, by presenting some simulation results, that the approach has the potential to become appropriate for resource-balanced scheduling in Grid environments.

Francesco Palmieri, Diego Castagna
Facial Recognition with Singular Value Decomposition

This paper implements a real-time system to recognize faces. The approach is essentially to apply the concepts of vector space and subspace to face recognition. The set of known faces with m × n pixels forms a subspace, called “face space”, of the “image space” containing all images with m × n pixels. This face space best defines the variation of the known faces. The basis of the face space is defined by the singular vectors of the set of known faces. These singular vectors do not necessarily correspond to distinct features like ears, eyes and noses. The projection of a new image onto this face space is then compared to the available projections of known faces to identify the person. Since the dimension of the face subspace is much less than that of the whole image space, it is much easier to compare projections than to compare original images pixel by pixel. Based on the above idea, a Singular Value Decomposition (SVD) approach is implemented in this paper. The framework provides our system the ability to learn to recognize new faces in a real-time and automatic manner.

Guoliang Zeng
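
The "face space" idea described above can be sketched in a few lines of NumPy; the random matrix below stands in for real training images, so this is an illustration of the technique rather than the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)
m, n, k = 32 * 32, 40, 10                   # pixels per image, training faces, basis size
faces = rng.random((m, n))                  # placeholder for vectorized known face images
mean_face = faces.mean(axis=1, keepdims=True)
A = faces - mean_face                       # mean-centered training set

U, s, Vt = np.linalg.svd(A, full_matrices=False)
basis = U[:, :k]                            # leading singular vectors span the face space

def project(image):
    return basis.T @ (image - mean_face.ravel())

known_projections = (basis.T @ A).T         # one k-dimensional signature per known face

def identify(image):
    """Return the index of the known face whose projection is closest."""
    p = project(image)
    dists = np.linalg.norm(known_projections - p, axis=1)
    return int(np.argmin(dists)), float(dists.min())

probe = faces[:, 7] + 0.01 * rng.standard_normal(m)   # a noisy view of face #7
print(identify(probe))                                 # expected to report index 7
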
The Application of Mobile Agents to Grid Monitor services

Grids provide a uniform interface to a collection of heterogeneous, geographically distributed resources. In recent years, research on Grid monitoring systems has become increasingly essential and significant. In this paper we put forward a novel Mobile Agent-based Grid Monitoring Architecture (MA-GMA), which is based on the GMA from the GGF and introduces mobile agents and the cache mechanism of MDS. Based on the Open Grid Service Architecture (OGSA) standard, we merge the intelligence and mobility characteristics of mobile agents into the current OGSA to construct a dynamic and extensible monitoring system. Finally, we perform some experiments under different environments. As the results show, the MA-GMA proves to be effective and greatly improves monitoring performance.

Guoqing Dong, Weiqin Tong
Expanding the Training Data Space Using Bayesian Test

Expanding the training dataset is a new technique proposed recently to improve the performance of classification methods. In this paper, we propose a powerful method to perform this task. Our method is based on applying a Bayesian test based on emerging patterns to evaluate and improve the quality of the new data instances used to expand the training data space. Our experiments on a number of datasets show that our method outperforms previously proposed methods and is able to add additional knowledge to the space of data.

Hamad Alhammady
A Multi-Agent Framework for Building an Automatic Operational Profile

Since the early 1970s, researchers have proposed several models to improve software reliability. Among these, the operational profile approach is one of the most common. Operational profiles are a quantification of usage patterns for a software application. The research described in this paper investigates a novel multi-agent framework for automatically creating an operational profile for generic distributed systems after their release into the market. The operational profile in this paper is extended to comprise seven different profiles. Also, the criticality of operations is defined using a new composite metric in order to organise the testing process as well as to decrease the time and cost involved in this process. The proposed framework is considered a step towards making distributed systems intelligent and self-managing.

Hany EL Yamany, Miriam A.M. Capretz
An Efficient Interestingness based Algorithm for Mining Association Rules in Medical Databases

Mining association rules is an important area in data mining. The massively increasing volume of data in real-life databases has motivated researchers to design novel and efficient algorithms for association rules mining. In this paper, we propose an association rule mining algorithm that integrates interestingness criteria during the process of building the model. One of the main features of this approach is to capture the user background knowledge, which is monotonically augmented. We tested our algorithm and experimented with some public medical datasets and found the obtained results quite promising.

Siri Krishan Wasan, Vasudha Bhatnagar, Harleen Kaur
NeSReC: A News meta-Search Engines Result Clustering Tool
Hassan Sayyadi, Sara Salehi, Hassan AbolHassani
Automatic Dissemination of Text Information using the EBOTS system

World Wide Web contains 170 Terabytes of information [1] and storage estimates show that the new information is growing at a rate of over 30% a year. With the quanta of information growing exponentially, it is important to understand the information semantically to know what concepts are relevant and what are irrelevant. The Evolutionary Behavior Of Textual Semantics (EBOTS) system being developed at University of Arkansas at Little Rock [2] aims at the quantitative reasoning aspect of textual information. In the automatic decision-making mode, the EBOTS system can distinguish between relevant and irrelevant information, discarding irrelevant documents and accepting only relevant information to develop expertise in a particular field. This paper discusses the usefulness of Information Theory in the development of relevance criteria and the results obtained in the context of textual information.

Hemant Joshi, Coskun Bayrak
Mapping State Diagram To Petri Net: An Approach To Use Markov Theory For Analyzing Non-Functional Parameters

The quality of the architectural design of a software system has a great influence on achieving the non-functional requirements of the system. The Unified Modeling Language (UML), the industry standard as a common object-oriented modeling language, needs a well-defined semantic base for its notation. Integrating formal methods such as Petri nets (PNs) with object-oriented design concepts such as UML is useful in order to benefit from the strengths of both approaches. Formalization of the graphical notation enables automated processing and analysis tasks. In this paper we use a method for converting a State Diagram to a Generalized Stochastic Petri Net (GSPN); we then derive the embedded Continuous Time Markov Chain from the GSPN and finally use Markov Chain theory to obtain performance parameters.

H. Motameni, A. Movaghar, M. Siasifar, M. Zandakbari, H. Montazeri
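
As an illustration of the final step described above, the sketch below solves the steady-state equations pi·Q = 0, sum(pi) = 1 of a small Continuous Time Markov Chain; the 3-state generator is invented, not derived from the paper's GSPN.

import numpy as np

Q = np.array([[-3.0,  2.0,  1.0],     # infinitesimal generator: rows sum to zero
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])

# Replace one balance equation with the normalization constraint sum(pi) = 1.
A = np.vstack([Q.T[:-1], np.ones(len(Q))])
b = np.zeros(len(Q)); b[-1] = 1.0
pi = np.linalg.solve(A, b)

print("steady-state probabilities:", pi.round(4))
throughput = pi @ np.array([0.0, 3.0, 3.0])   # e.g. a rate-weighted performance measure
print("example performance index:", round(float(throughput), 4))
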
A Distributed Planning & Control Management Information System for Multi-site Organizations

This paper describes the design, development and deployment challenges facing an implementation of an enterprise-wide, distributed, web-based Planning, Budgeting and Reporting Control Management Information System for a large public utility organization. The system serves the needs of all departments of the company’s General Division of Production. The departments of the division are situated all over the Greek state, with many geographically remote plants under the control of the division. To speed up the exchange of management information between the various levels of the hierarchy regarding daily, monthly, or longer-term reports on the operational level, a portal was set up that enabled all levels of management personnel to have controlled access to constantly updated information about operations, strategic goals and actions. A new planning and budgeting system for controlling operational, investment, and personnel expenses based on the Activity-Based Costing (ABC) & Budgeting model was then integrated into the portal to provide a web-based Planning Budgeting & Reporting Control MIS. The system is capable of handling many thousands of requests per hour for internal reports, graphs, set goals etc. and allows seamless collaboration and coordination between all departments in the organizational hierarchy.

Ioannis T. Christou, Spyridon Potamianos
Supporting Impact Analysis by Program Dependence Graph Based Forward Slicing

Since software must evolve to meet typically changing requirements, source code modifications cannot be avoided. Impact analysis is one of the central and relatively demanding tasks of software maintenance. It is constantly needed while aiming at ensuring the correctness of the modifications made. Due to its importance and challenging nature, automated support techniques are required. Theoretically, forward slicing is a very suitable technique for that purpose. Therefore, we have implemented a program dependence graph (PDG) based tool, called GRACE, for it. For example, due to the typical rewritings of Visual Basic programs there is a great need to support their impact analysis. However, there were neither earlier scientific studies on slicing Visual Basic nor reported slicers for it. In the case of forward slicing there is a need to perform efficient static slicing revealing all the potential effects of the considered source code modifications. Use of PDGs helps in achieving this goal. Therefore, this paper focuses on describing automated PDG-based forward slicing for impact analysis support of Visual Basic programs. GRACE contains a parser, a PDG generator and all other necessary components to support forward slicing. Our experiences with the application of PDG-based forward slicing have confirmed the feasibility of the approach in this context. GRACE is also compared to other forward slicing tools.

Jaakko Korpi, Jussi Koskinen
An Analysis of Several Proposals for Reversible Latches

Recent work has begun to investigate the advantages of using reversible logic for the design of circuits. The majority of work, however, has limited itself to combinational logic. Researchers are just now beginning to suggest possibilities for sequential implementations. This paper performs a closer analysis of three latch designs proposed in previous work and suggests advantages and disadvantages of each.

J.E. Rice
Implementation of a Spatial Data Structure on a FPGA

Many systems exist that store and manipulate data; however, many do not have sufficient support for spatial data. Many data structures have been proposed that are intended specifically for spatial data; however, software implementations have not performed as well as hoped. This work presents a feasibility study investigating the use of an FPGA for the implementation of a structure to support spatial search and retrieval.

J.E. Rice, W. Osborn, J. Schultz
Security Management: Targets, Essentials and Implementations

We first analyze the security targets of implementing security management for today’s IT infrastructures – the information systems created by enterprises for successful business – and detail possible measures for achieving the relevant targets. Secondly, we conclude that the essentials of security management are to construct trustworthy network endpoints and to establish a trustworthy communication channel between the parties intending to communicate; two instances of accomplishing the essentials of security management are then exemplified, i.e. trustworthy smart card transactions and trustworthy SOA-based Web Services. Finally, we discuss the main aspects of implementing security management for information systems, namely the strategic steps: (1) attestation and negotiation, (2) proposing and implementing application-specific strategies, and (3) considerations for the strength and efficiency of security management.

Zhao Jing, Zheng Jianwu
Application of fuzzy set ordination and classification to the study of plant communities in Pangquangou Nature Reserve, China

Fuzzy Set Ordination (FSO) and Fuzzy C-means classification techniques were used to study the relationships between plant communities and environmental factors in Pangquangou Nature Reserve, Shanxi province of China. Pangquangou Nature Reserve, located at N37°20’-38°20’, E110°18’-111°18’, is a part of the Luliang mountain range. Eighty-nine quadrats of 10m x 20m along an elevation gradient were set up and recorded in this area. The results showed that the two methods, FSO and fuzzy C-means classification, describe the ecological relations of communities successfully. The results of FSO showed that the distribution of communities is closely related to elevation, water conditions and humidity, and also related to aspect and slope. Thirteen community types were distinguished by fuzzy C-means classification, and each of them has special characteristics. The combination of FSO and fuzzy C-means classification may be more effective in studies of community ecology.

Jin-tun Zhang, Dongpin Meng
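
For readers unfamiliar with the classification technique named above, here is a minimal fuzzy C-means iteration (standard algorithm, fuzzifier m = 2) on synthetic 2-D points; it is not the vegetation data or code from the study.

import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)           # membership rows sum to 1
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))          # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, U = fuzzy_cmeans(X, c=2)
print("cluster centres:\n", centers.round(2))
print("membership of first sample:", U[0].round(3))
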
On Searchability and LR-Visibility of Polygons

Imagine that intruders are in a dark polygonal room and move at a finite but unbounded speed, trying to avoid detection. Polygon search problem asks whether a polygon is searchable, i.e., no matter how intruders move, searcher(s) can always detect them. A polygon is LR-visible if there exist two boundary points such that the two polygonal chains divided by them are mutually weakly visible. We explore the relationship between the searchability and LR-visibility of a polygon. Our result can be used as a preprocessing step in designing algorithms related to polygon search.

John Z. Zhang
Swarm Intelligence in Cube Selection and Allocation for Multi-Node OLAP Systems

The continuous growth of OLAP users and data imposes additional stress on data management and hardware infrastructure. The distribution of multidimensional data over a number of servers allows storage and processing power to be increased without an exponential increase in financial costs. But this solution adds another dimension to the problem: space. Even in centralized OLAP, efficient cube selection is complex, but now we must also know where to materialize subcubes. This paper proposes algorithms that solve the distributed OLAP selection problem under space constraints, considering a query profile, using discrete particle swarm optimization in its normal, cooperative, multi-phase and hybrid genetic versions.

Jorge Loureiro, Orlando Belo
Developing Peer-to-Peer Applications with MDA and JXTA

Recently, the Peer-to-Peer (P2P) architecture has been used to exploit the computing power and bandwidth of networks better than a client/server architecture. In order to support the creation of P2P applications, some frameworks have been proposed, such as JXTA. However, large systems using a P2P architecture are complex to develop, maintain and evolve. Model Driven Architecture (MDA) can support the management of this complexity in the software development process through transformations of Platform Independent Models (PIM) into Platform Specific Models (PSM). In this paper, we apply an MDA approach to allow the development of applications based on JXTA. The JXTA implementation in Java is used to demonstrate our approach. We propose a model transformation definition from a UML model to a Java+JXTA model. In order to validate our approach, we present two case studies.

José Geraldo de Sousa Junior, Denivaldo Lopes
A Case Study to Evaluate Templates & Metadata for Developing Application Families

Automatic code generation of application families emerges as a solid promise to cope with the increasing demand for software in business environments. Using templates and metadata for the development of abstract solutions and subsequent automatic generation of the particular cases helps free developers from the most mechanical and tedious tasks of the implementation phase, allowing them to focus their knowledge on the expression of conceptual solutions.

In this case study, we adapted the Halstead metrics for object-oriented code, templates, and metadata (in XML format) to measure the effort required to specify and then automatically generate complete applications, in comparison with the effort required to build the same applications entirely by hand. We then used the same metrics to compare the effort of specifying and generating a second application of the same family versus the effort required to code this second application by hand.

José Lamas Ríos, Fernando Machado-Píriz
Application of Multi-Criteria to Perform an Organizational Measurement Process

Software quality has become increasingly important as a crucial factor in keeping organizations competitive. Software process measurement is an essential activity in achieving better quality and guarantees, both in the development process and in the final product. This paper presents the use of multi-criteria in a proposed model for the software measurement process, in order to make it possible to perform organizational planning for measurement, prioritize organizational metrics and define minimal acceptance percentage levels for each metric. This measurement process was based on five well known processes of measurement: CMMI-SW, ISO/IEC 15939, IEEE Std 1061, Six Sigma and PSM.

Josyleuda Melo Moreira de Oliveira, Karlson B.de Oliveira, Ana Karoline A.de Castro, Plácido R. Pinheiro, Arnaldo D. Belchior
Institutionalization of an Organizational Measurement Process

Software development is a complex activity which demands a series of factors to be controlled. In order for this to be controlled in an effective manner by project management, it is necessary to use software process measurement to identify problems and to consider improvements. This paper presents an organizational software measurement process resulting from the mapping of five relevant software measurement processes: CMMI-SW, ISO/IEC 15939, IEEE Std 1061, Six Sigma, and PSM (Practical Software Measurement). The best practices of each one were used, including relevant keys to facilitate the applicability of a measurement process focused on project management, as well as assuring the software quality.

Josyleuda Melo Moreira de Oliveira, Karlson B.de Oliveira, Arnaldo D. Belchior
Decomposition of Head Related Impulse Responses by Selection of Conjugate Pole Pairs

Currently, to obtain maximum fidelity 3D audio, an intended listener is required to undergo time consuming measurements using highly specialized and expensive equipment. Customizable Head-Related Impulse Responses (HRIRs) would remove this limitation. This paper reports our progress in the first stage of the development of customizable HRIRs. Our approach is to develop compact functional models that could be equivalent to empirically measured HRIRs but require a much smaller number of parameters, which could eventually be derived from the anatomical characteristics of a prospective listener. For this first step, HRIRs must be decomposed into multiple delayed and scaled damped sinusoids which, in turn, reveal the parameters (delay and magnitude) necessary to create an instance of the structural model equivalent to the HRIR under analysis. Previously this type of HRIR decomposition has been accomplished through an exhaustive search of the model parameters. A new method that approaches the decomposition simultaneously in the frequency (Z) and time domains is reported here.

Kenneth John Faller, Armando Barreto, Navarun Gupta, Naphtali Rishe
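
The structural model mentioned above can be sketched as follows: an HRIR approximated by a sum of delayed, scaled, damped sinusoids. The parameter values below are invented for illustration; the chapter is concerned with estimating such parameters from measured HRIRs.

import numpy as np

fs = 44100                                   # sample rate (Hz)
t = np.arange(256) / fs                      # 256-tap impulse response

def damped_sinusoid(t, delay, amp, decay, freq, phase=0.0):
    shifted = t - delay
    out = amp * np.exp(-decay * shifted) * np.sin(2 * np.pi * freq * shifted + phase)
    return np.where(shifted >= 0, out, 0.0)  # nothing before the arrival delay

# (delay s, amplitude, decay 1/s, frequency Hz) - hypothetical component values
components = [(0.0002, 1.0, 9000.0, 3500.0),
              (0.0005, 0.4, 6000.0, 1200.0),
              (0.0009, 0.2, 4000.0,  700.0)]

hrir_model = sum(damped_sinusoid(t, *p) for p in components)
print("model HRIR peak:", round(float(np.max(np.abs(hrir_model))), 3))
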
GIS Customization for Integrated Management of Spatially Related Diachronic Data

This study presents the development of an interface for the management of diachronic spatial data that describe the evolution of an area. Subsequent data that represent specific spatial characteristics for various time periods are organized and processed in a customized GIS environment. Vector and raster data (old scanned maps, air photos and satellite imagery) are related based on their spatial and temporal properties and are archived adequately. Part of the data set contains digital documentation in the form of digital photos, audio and video files, so the customization includes multimedia playback for selected geographic features that are described by these means. An extended area in Northern Greece that includes various archaeological sites along the Egnatia road (near the ancient Via Egnatia) is used as a case study.

K.D. Papadimitriou, T. Roustanis
BlogDisc: A System for Automatic Discovery and Accumulation of Persian Blogs

One of the important elements of the new generation of the Web is the emergence of blogs. Currently a considerable number of users are creating content using blogs. Although Persian blogs have a short history, they have improved significantly during this short period. Because of fundamental differences between Persian and other languages, limited work has been done to analyze Persian blogs. In this work, a system named BlogDisc for automatic discovery and accumulation of Persian blogs is developed. This system uses content as well as link structure of the blogs. As an important part of this research, we propose an algorithm to recognize blogs that are not hosted on special blog hosts.

Kyumars Sheykh Esmaili, Hassan Abolhassani, Zeinab Abbassi
Fuzzy Semantic Similarity Between Ontological Concepts

The main focus of this paper is measuring similarity in a content-based information retrieval and intelligent question-answering environment. While the measurement of semantic similarity between concepts based on the hierarchy in an ontology is well studied, measuring semantic similarity in an arbitrary ontology is still an open problem. In this paper we define a fuzzy semantic similarity measure based on information theory that exploits both the hierarchical and non-hierarchical structure in an ontology. Our work can be summarized as follows: first, each concept is defined as a semantically extended fuzzy set along its semantic paths; second, the semantic similarity between two concepts is computed with the two semantically extended fuzzy sets instead of the two concepts themselves. Our fuzzy measure synthetically considers factors such as ontological semantic relation density, semantic relation depth and different semantic relations, which can affect the value of similarity. Compared with existing measures, this fuzzy similarity measure based on shared information content reflects the latent semantic relations of concepts better.

Ling Song, Jun Ma, Hui Liu, Li Lian, Dongmei Zhang
Research on Distributed Cache Mechanism in Decision Support System

With the development of Internet technology, business decision makers put forward higher requirements on the performance of Decision Support Systems (DSS). In order to improve the query response time in DSS, this paper proposes a DSS architecture with a distributed cache mechanism, gives the working flow of the system, and introduces an admission & replacement algorithm. Experiments prove that the system performance is favorable.

Liu Hui, Ji Xiu-hua
Research on Grid-based and Problem-oriented Open Decision Support System

The characteristics of grid technology make it suitable for constructing a DSS platform and can solve the problems of developing DSS in distributed and dynamic decision-making environments. The Open Grid Service Architecture (OGSA) is a new type of grid architecture, which supports creating, maintaining and applying services. Inspired by the idea of virtual enterprise building, this paper puts forward the architecture model of a Grid-based and Problem-oriented Open Decision Support System (GPODSS) and discusses its operational process.

Xueguang Chen, Liu Xia, Zhiwu Wang, Qiaoyun Ma
Development and Analysis of Defect Tolerant Bipartite Mapping Techniques for Programmable cross-points in Nanofabric Architecture

Chemically Assembled Electronic Nanotechnology (CAEN), using a bottom-up approach for digital circuit design, has opened new dimensions for the miniaturization of electronic devices. Crossbar structures, or Nanofabrics, using silicon nanowires and carbon nanotubes are the proposed building blocks for CAEN, sized less than 20 nm and allowing at least 10¹⁰ gates/cm². Along with the decrease in size, defect rates in the above architectures increase rapidly, demanding an entirely different paradigm for increasing yields, viz. greater defect tolerance, because the defect rates can be as high as 13% or more. In this paper, we propose a non-probabilistic approach for defect tolerance and evaluate it in terms of its coverage for different sizes of fabric and different defect rates.

Mandar Vijay Joshi, Waleed Al-Assadi
Nash Equilibrium Approach to Dynamic Power Control in DS-CDMA System

This paper addresses the power control aspect of resource allocation for wireless data by employing the microeconomic concepts of utility and pricing in relation to non-cooperative game theory and the Nash Equilibrium. Specifically, an efficient algorithm based on a stochastic gradient formulation is proposed that adaptively converges to an optimal power set with higher utilities, with the pricing factor as a parameter. Both single-cell and multi-cell cases are presented in this paper. Comparative numerical and graphical results attest to the practical usefulness of the proposed power control algorithm.

J. Qasimi M, M. Tahernezhadi
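
As a generic illustration of pricing-based power control (not the chapter's stochastic-gradient algorithm or utility), the sketch below runs iterative best responses for a single cell where each user maximizes log(1 + SIR) minus a linear price on its own power; all numbers are invented.

import numpy as np

h = np.array([1.0, 0.6, 0.3, 0.15])        # hypothetical channel gains to the base station
noise, price, p_max = 0.05, 2.0, 2.0
p = np.full(len(h), 0.5)                    # initial transmit powers

def interference(p, i):
    return h @ p - h[i] * p[i] + noise      # received power from everyone else, plus noise

for _ in range(20):                          # a few rounds of sequential best responses
    for i in range(len(h)):
        # argmax_p log(1 + h_i*p / I_i) - price*p  ->  p* = 1/price - I_i/h_i (clipped)
        p[i] = np.clip(1.0 / price - interference(p, i) / h[i], 0.0, p_max)

sir = h * p / np.array([interference(p, i) for i in range(len(h))])
print("equilibrium powers:", p.round(3))     # users with weak channels shut off under pricing
print("equilibrium SIRs:  ", sir.round(3))
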
Natural Language Processing of Mathematical Texts in mArachna

mArachna is a technical framework designed for the extraction of mathematical knowledge from natural language texts. mArachna avoids the problems typically encountered in automated-reasoning-based approaches through the use of natural language processing techniques, taking advantage of the strictly formalized language characterizing mathematical texts. Mathematical texts possess a strict internal structure and can be separated into text elements (entities) such as definitions, theorems etc. These entities are the principal carriers of mathematical information. In addition, entities show a characteristic coupling between the presented information and their internal linguistic structure, well suited for natural language processing techniques. Taking advantage of this structure, mArachna extracts mathematical relations from texts and integrates them into a knowledge base. Identifying sub-elements within new elements of information with already stored mathematical concepts defines the structure of the knowledge base. As a result, mArachna generates an ontology of the analyzed mathematical texts. In response to user queries, parts of the knowledge base are visualized using OWL. In particular, mArachna aims to provide an overview of single fields of mathematics, as well as showing intra-field relations between mathematical objects and concepts. The following paper gives an overview of the theoretical basis and the technologies applied within the mArachna framework.

Marie Blanke, Sabina Jeschke, Nicole Natho, Ruedi Seiler, Marc Wilke
Humanization of E-services: Human Interaction Metaphor in Design of E-services

A possible way of introducing better e-services is to regard an e-service not as a package of functions but as a person or people offering the service. This ultimately means emulation of a particular structure of human functioning, human interaction and communication by an instrument such as a computer or a mobile phone in the most human-like way possible. It means using human schemata and scripts in programming, where the e-service provider is a “personality” who has “his” social role in human-computer interaction. This requires the use of psychological principles that are common in human goal-orientated behaviour, human-human interaction and natural intercourse between people. There is a need for research on human-human interaction in service and/or other situations, and for introducing the findings into the technical solutions of e-services.

Mart Murdvee
Introducing the POSSDI Process: The Process of Optimizing the Selection of the Scanned Document Images

Today, many institutions and organizations are facing a serious problem due to the tremendously increasing number and size of documents, which in turn triggers storage and retrieval problems because of continuously growing space and efficiency requirements. This problem becomes more complex with time and with the increase in the size and number of documents in an organization. Therefore, there is a growing demand to address it. This demand and challenge can be met by developing a process that enables specialized document imaging people to select the most suitable image type and scanning resolution to use when document images need to be stored. This process, if applied, attempts to solve the problem of image storage type and size to some extent. In this paper, we present a process to optimize the selection of the scanned image type and resolution to use prior to acquiring the document image that we want to store and later retrieve; in this way, we optimize the document image storage size and retrieval time.

Mohammad A. ALGhalayini, Abad Shah
Infrastructure for Bangla Information retrieval in the context of ICT for Development

In this paper, we discuss developing a search engine and information retrieval system for Bangla. Current work done in this area assumes the use of a particular type of encoding or the availability of particular facilities for the user. We wanted to come up with an implementation that did not require any special features or optimizations on the user end, and would perform just as well in all situations. For this purpose, we picked two case studies to work on in our effort to find a suitable solution to the problem. While working on these cases, we encountered several problems and had to find our way around them. We had to pick and choose from a set of software packages for the one that would best serve our needs. We also had to take into consideration user convenience in using our system, for which we had to keep in mind the diverse demographics of people that might have need for such a system. Finally, we came up with the system, with all the desired features. Some possible future developments also came to mind in the course of our work, and these are also mentioned in this paper.

Nafid Haque, M. Hammad Ali, Matin Saad Abdullah, Mumit Khan
An Improved Watermarking Extraction Algorithm

Echo hiding is one of the prevailing techniques in audio watermarking due to its good perceptual quality. However, the detection ratio of this method is relatively low and its robustness against many common signal-processing operations is not satisfactory. In this paper, an improved watermarking extraction algorithm, which is based on auto-power-cepstrum, is proposed. Computer simulation results prove that the new method achieves higher detection ratio when compared with conventional auto-complex-cepstrum based algorithm and its robustness against various signal processing manipulations, such as Mp3 compression, re-sampling, cropping, re-quantization, filtering, amplitude amplifying, noise addition and time delay, is great.

Ning Chen, Jie Zhu
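
The general idea behind echo hiding and cepstrum-based extraction can be sketched as follows; this is a simplified illustration, not the improved auto-power-cepstrum algorithm proposed in the chapter.

import numpy as np

rng = np.random.default_rng(0)
delay_bit0, delay_bit1, alpha = 50, 80, 0.4              # echo delays (samples) and strength
host = rng.standard_normal(4096)                         # stand-in for an audio frame

def embed(signal, delay):
    out = signal.copy()
    out[delay:] += alpha * signal[:-delay]                # add a single delayed echo
    return out

def real_cepstrum(x):
    spectrum = np.fft.rfft(x)
    return np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

def detect(signal):
    c = real_cepstrum(signal)
    return 0 if c[delay_bit0] > c[delay_bit1] else 1      # larger cepstral peak wins

marked = embed(host, delay_bit1)                          # hide the bit '1'
print("decoded bit:", detect(marked))
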
Building Knowledge Components to Enhance Frequently Asked Question

A web page that adopts knowledge components concepts is able to help its users deliver or obtain information and knowledge through the website itself. In this study, the adoption of knowledge components is implemented through an agent and compared to other types of FAQ. A well-structured agent and its knowledge components will benefit website users and motivate them to leverage the FAQ. This research also includes the development of the knowledge components and the agent, and an observation study of users’ perceptions. Program D works as the interpreter and is part of the methodology, which is created using the open-source Artificial Intelligence Mark-up Language (AIML) and Program D. The system implementation includes a Knowledge Warehouse, which stores and organizes the knowledge components. The goal of this study is to develop the knowledge components, implemented through an agent. The outcome of this study is measured by comparing the agent with knowledge components against a link-type FAQ and a top-down FAQ. These three types of FAQ serve the same purpose, which is to cater for frequent normal questions and ad-hoc queries from website users. It is hoped that a well-structured knowledge component could enhance the usage of Internet FAQs, where people can benefit from appropriate and relevant answers to their enquiries.

Noreen Izza Arshad, Savita K. Sugathan, Mohamed Imran M. Ariff, Siti Salwa A. Aziz
Semantic Representation of User’s Mental Trust Model

It is believed that trust will be the primary mental force in the electronic environment as it is in the current physical environment. At its core, trust is impacted by users’ propensity to trust (internal mental state), reliance on the trustee and external direct and indirect factors.

Decentralization of publication is one of the great advantages of the internet infrastructure. Web 2.0 applications aim to promote and assist online users to publish and contribute freely to Collective Intelligence. They revolve around the notion that people can add content for Collective Intelligence and that enterprises can also use the content to reduce costs and increase profits through Social Network Analysis.

This paper proposes a conceptual mental trust recognition and evaluation model and a meta-document structure to represent, distribute and store users’ trust evaluations. The proposed document design is based on a decentralized information structure that semantically represents contributed content and the contributor. The content is represented and distributed by using Atom, the Resource Description Framework (RDF) and RDF Schema. The proposed meta-document structure uses RDF Schema to semantically represent users’ internal (inner) online trust evaluation model within the Web 2.0 environment. It can be used as a blueprint to develop new vocabularies for any e-domain. However, the trust recognition model is selected due to its importance in the electronic environment.

Omer Mahmood, John D Haynes
Chapter 61. Access Concurrent Sessions Based on Quorums

This paper presents a quorum-based distributed algorithm for group mutual exclusion. In the group mutual exclusion problem, multiple processes can enter a critical section simultaneously if they belong to the same group. This algorithm assumes that only one session can be opened at any time, several processes can access the same session, and any requested session can be opened in a finite time. The message complexity of this algorithm is O(√n) for the finite projective plane of order 2 (Fano plane) and O(2√n − 1) for a grid, where n is the total number of processes.

Ousmane Thiare, Mohamed Naimi, Mourad Gueroui
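
A small Python sketch of the grid quorum construction referred to above: arranging n processes in a sqrt(n) x sqrt(n) grid and taking a process's row plus its column gives quorums of size 2*sqrt(n) - 1, any two of which intersect.

import math

def grid_quorum(pid, n):
    k = math.isqrt(n)
    assert k * k == n, "this sketch assumes n is a perfect square"
    row, col = divmod(pid, k)
    return {row * k + c for c in range(k)} | {r * k + col for r in range(k)}

n = 16
q3, q10 = grid_quorum(3, n), grid_quorum(10, n)
print(sorted(q3))                                     # 2*sqrt(16) - 1 = 7 processes
print("pairwise intersection:", sorted(q3 & q10))     # never empty for a grid
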
Chapter 62. A Dynamic Fuzzy Model for Processing Lung Sounds

This paper presents a dynamic fuzzy filter, with internal feedback, that performs the task of separation of lung sounds, obtained from patients with pulmonary pathology. The filter is a novel generalized TSK fuzzy model, where the consequent parts of the fuzzy rules are Block-Diagonal Recurrent Neural Networks. Extensive experimental results, regarding the lung sound category of coarse crackles, are given, and a performance comparison with a series of other fuzzy and neural filters is conducted, underlining the separation capabilities of the proposed filter.

P.A. Mastorocostas, D.N. Varsamis, C.A. Mastorocostas, C.S. Hilas
Chapter 63. A Formal Specification in JML of Java Security Package

The Java security package allows a programmer to add security features to Java applications. Although the package provides a complex application programming interface (API), its informal description, e.g., Javadoc comments, is often ambiguous or imprecise. Nonetheless, the security of an application can be compromised if the package is used without a concrete understanding of the precise behavior of the API classes and interfaces, which can be attained via formal specification. In this paper, we present our experiences in formally specifying the Java security package in JML, a formal behavior interface specification language for Java. We illustrate portions of our JML specifications and discuss the lessons that we learned, from this specification effort, about specification patterns and the effectiveness of JML. Our specifications are not only a precise document for the API but also provide a foundation for formally reasoning and verifying the security aspects of applications. We believe that our specification techniques and patterns can be used to specify other Java packages and frameworks.

Poonam Agarwal, Carlos E. Rubio-Medrano, Yoonsik Cheon, Patricia J Teller
Chapter 64. Enterprise Integration Strategy of Interoperability

In this new computing age of high complexity, a common weakness in the interoperability between business and IT leaves IT far behind the direction business is taking; poor business responsiveness and IT governance make it even harder to achieve the enterprise goal. To cope with this common issue, we introduce enterprise interoperability to integrate the metadata between the business, service and information layers; this creates visibility of vertical alignment within the enterprise architecture and uses metadata configuration to construct the mappings between the layers.

Raymond Cheng-Yi Wu, Jie Lu
Chapter 65. A Method for Consistent Modeling of Zachman Framework Cells

Enterprise Architecture has been at the center of attention since the late 90s as a comprehensive and leading solution for the development and maintenance of information systems. An enterprise is considered a set of elaborate physical and logical processes in which information flow plays a crucial role. The term Enterprise Architecture encompasses a collection of different views within the enterprise which constitute a comprehensive overview when put together. Such an overview cannot be organized without incorporating a logical structure called an Enterprise Architecture Framework. Among the various proposed frameworks, the Zachman Framework (ZF) is one of the most prominent means of conceptualization. The main problem faced in using ZF is the lack of coherent and consistent models for its cells. Several distinctive solutions have been proposed to eliminate the problem, yet none succeeds in thoroughly covering all the cells of ZF. In this paper, we propose an integrated language based on Model Driven Architecture (MDA) in order to obtain compatible models for all cells in ZF. The proposed method was examined in practice, revealing its advantages and the efficiency gained in comparison to previously studied techniques.

S. Shervin Ostadzadeh, Fereidoon Shams Aliee, S. Arash Ostadzadeh
Chapter 66. Beyond User Ranking: Expanding the Definition of Reputation in Grid Computing

Shopping around for a good service provider in a Grid Computing environment is no less challenging than the traditional shopping around in non-virtual marketplace. A client may consult a service broker for providers that can meet specific QoS requirements (e.g., CPU speed), and the broker may return a list of candidate providers that satisfy the client's demands. If this computing platform is backed up by some reputation system, the list of providers is then sorted based on some reputation criterion, which is commonly the user rating. We argue in this paper that judging the reputation of a provider based on user rating is not sufficient. The reputation should additionally reflect how trustworthy that provider has been with respect to complying with the finalized SLA (using a metric called conformance) and how consistent it has been with respect to honouring its compliance levels (using a metric called fidelity). Accordingly, we perceive the reputation as a vector of three dimensions: user rating, conformance, and fidelity. In this paper, we define these metrics, explain how to compute them formally, and how to use them in the reputation-enabled framework that we describe.
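A rough sketch of the three-dimensional reputation vector described above is given below. The formulas are illustrative assumptions of ours (simple averages and a spread-based consistency measure), not the metrics defined in the chapter; only the idea of reporting (rating, conformance, fidelity) instead of a single score is retained.

```python
from statistics import mean, pstdev

def reputation_vector(user_ratings, sla_compliance):
    """Illustrative provider reputation.
    user_ratings: ratings in [0, 1] given by past clients.
    sla_compliance: per-transaction compliance levels in [0, 1]
    (1.0 = the finalized SLA was fully honoured)."""
    rating = mean(user_ratings)
    conformance = mean(sla_compliance)        # how well SLAs were met on average
    fidelity = 1.0 - pstdev(sla_compliance)   # how consistently that level was held
    return (rating, conformance, fidelity)

print(reputation_vector([0.9, 0.8, 1.0], [0.95, 0.90, 0.97]))
```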

Said Elnaffar
Chapter 67. A Comparative Study for Email Classification

Email has become one of the fastest and most economical forms of communication. However, the increase in email users has resulted in a dramatic increase in spam emails during the past few years. In this paper, email data was classified using four different classifiers (Neural Network, SVM, Naïve Bayesian, and J48). The experiment was performed with different data sizes and different feature sizes. The final classification result is '1' if the message is ultimately spam and '0' otherwise. This paper shows that the simple J48 classifier, which builds a binary tree, can be efficient for datasets that can be classified as a binary tree.
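The comparison can be prototyped along the following lines. This is a hedged sketch, not the study's setup: the feature matrix is synthetic, and scikit-learn's DecisionTreeClassifier (CART) stands in for J48 (C4.5), with an MLP standing in for the neural network.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for an email feature matrix (label 1 = spam, 0 = ham).
X, y = make_classification(n_samples=1000, n_features=50, n_informative=10, random_state=0)

classifiers = {
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="linear"),
    "J48-like decision tree": DecisionTreeClassifier(),   # CART, not exactly J48/C4.5
    "Neural network": MLPClassifier(max_iter=1000),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name:25s} accuracy = {scores.mean():.3f}")
```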

Seongwook Youn, Dennis McLeod
Chapter 68. Noise Reduction for VoIP Speech Codecs Using Modified Wiener Filter

Noise reduction is essential to achieve acceptable QoS in VoIP systems. This paper proposes a Wiener filter-based noise reduction scheme in which a logistic function of the estimated SNR at each frequency bin is used to optimize the filter gain. The proposed noise reduction scheme is applied as pre-processing before speech encoding. For various noisy conditions, PESQ evaluation is performed to assess the performance of the proposed method. In this paper, G.711, G.723.1, and G.729A are used as test VoIP speech codecs. The PESQ results show that the proposed noise reduction scheme outperforms the noise suppression of the IS-127 EVRC and the noise reduction of the ETSI standard for the advanced distributed speech recognition front-end.
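A per-bin gain of this flavour could look like the sketch below. The logistic parameters (k, snr0_db) and the exact way the logistic term modifies the classical Wiener gain are placeholders of ours, not the parameterization used in the paper.

```python
import numpy as np

def wiener_gain(snr_prior):
    """Classical per-bin Wiener gain."""
    return snr_prior / (1.0 + snr_prior)

def logistic_weighted_gain(snr_prior, k=0.5, snr0_db=5.0):
    """Illustrative variant: weight the Wiener gain by a logistic function of the
    per-bin SNR (in dB), so low-SNR bins are attenuated more aggressively."""
    snr_db = 10.0 * np.log10(np.maximum(snr_prior, 1e-10))
    weight = 1.0 / (1.0 + np.exp(-k * (snr_db - snr0_db)))
    return weight * wiener_gain(snr_prior)

def enhance_frame(X, noise_psd):
    """Apply the modified gain to one noisy spectrum frame X given a noise PSD estimate."""
    snr = np.maximum(np.abs(X) ** 2 / np.maximum(noise_psd, 1e-10) - 1.0, 1e-3)
    return logistic_weighted_gain(snr) * X
```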

Seung Ho Han, Sangbae Jeong, Heesik Yang, Jinsul Kim, Won Ryu, Minsoo Hahn
Chapter 69. A Formal Framework for “Living” Cooperative Information Systems

This paper constructs a high-level Abstract State Machine (ASM) model of our conceptual software architecture for “living” cooperative information systems founded in living systems theory. For practical execution, we use AsmL, the Abstract State Machine Language developed at Microsoft Research and integrated with Visual Studio, to refine the ASM model to an executable system model for evaluation.

Shiping Yang, Martin Wirsing
Chapter 70. Crime Data Mining

Solving crimes is a complex task and requires a lot of experience. Data mining can be used to model crime detection problems. The idea here is to try to capture years of human experience in computer models via data mining. Crimes are a social nuisance and cost our society dearly in several ways. Any research that can help in solving crimes faster will pay for itself. According to the Los Angeles Police Department, about 10% of the criminals commit about 50% of the crimes. Here we look at the use of clustering algorithms in a data mining approach to help detect crime patterns and speed up the process of solving crime. We use k-means clustering with some enhancements to aid the identification of crime patterns. We applied these techniques to real crime data from a sheriff's office and validated our results. We also used a semi-supervised learning technique for knowledge discovery from the crime records and to help increase the predictive accuracy. Our major contribution is the development of a weighting scheme for attributes, to deal with limitations of various out-of-the-box clustering tools and techniques. This easy-to-implement data mining framework works with the geo-spatial plot of crime and helps to improve the productivity of detectives and other law enforcement officers. It can also be applied to counter-terrorism for homeland security.
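Attribute weighting can be grafted onto an off-the-shelf k-means as sketched below; this is a generic illustration under our own naming, and how the weights are actually chosen for crime attributes is the paper's contribution and is not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

def weighted_kmeans(X, weights, n_clusters=5, random_state=0):
    """Attribute-weighted k-means: scaling column j by sqrt(w_j) makes the
    Euclidean distance used by k-means equal to the weighted distance
    sum_j w_j * (x_j - c_j)**2."""
    w = np.asarray(weights, dtype=float)
    Xw = X * np.sqrt(w)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state).fit(Xw)
    centers = km.cluster_centers_ / np.sqrt(w)   # map centers back to original units
    return km.labels_, centers
```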

Shyam Varan Nath
Chapter 71. Combinatorial Hill Climbing Using Micro-Genetic Algorithms

This paper introduces a new hill-climbing operator, MGAC, for GA optimization of combinatorial problems, and proposes two implementation techniques for it. The MGAC operator uses a small second-level GA with a small population that evolves for a few generations and serves as the engine for finding better solutions in the neighborhood of those produced by the main GA. The two implementations are tested on a power systems problem, the Unit Commitment Problem, and compared with three other methods: a GA with classic hill-climbers, Lagrangian Relaxation, and Dynamic Programming. The results show the superiority of the proposed MGAC operator.
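The shape of such a micro-GA local search is sketched below, assuming a bitstring encoding and a placeholder fitness; the population size, generation count and mutation rate are illustrative values of ours, and the real operator for Unit Commitment handles problem constraints that are not modelled here.

```python
import random

def fitness(bits):                 # placeholder objective, not the Unit Commitment cost
    return sum(bits)

def mutate(bits, p):
    return [b ^ (random.random() < p) for b in bits]

def micro_ga_climb(seed, generations=10, pop_size=5, p_mut=0.05):
    """MGAC-style local search: a tiny second-level GA explores the neighbourhood
    of a solution produced by the main GA and returns the best individual found."""
    pop = [seed] + [mutate(seed, p_mut) for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:2]
        children = [mutate(random.choice(parents), p_mut) for _ in range(pop_size - 1)]
        pop = [pop[0]] + children          # elitism: always keep the current best
    return max(pop, key=fitness)

best = micro_ga_climb([random.randint(0, 1) for _ in range(30)])
```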

Spyros A. Kazarlis
Chapter 72. Alternate Paradigm for Navigating the WWW Through Zoomable User Interface

Web browsing has become extremely important in every field of life, whether education, business or entertainment. With a simple mouse click, a user navigates through a number of web pages. This immediacy of traversing information links makes it difficult to maintain an intuitive sense of where one is and how one got there. A zooming browser was designed in Java to explore an alternate paradigm for navigating the WWW. Instead of having a single page visible at a time, multiple pages and the links between them are depicted on a large zoomable information surface. Links are shown in a hierarchy so that the user can see the relationship of web pages with their parent and child nodes. The browser also maintains the history of links traversed.

Sumbul Khawaja, Asadullah Shah, Kamran Khowaja
Chapter 73. A Verifiable Multi-Authority E-Voting Scheme for Real World Environment

In this paper, we propose a verifiable multi-authority e-voting scheme which satisfies all the requirements of large-scale general elections. We use blind signatures for voters' anonymity and a threshold cryptosystem to guarantee fairness of the voting process. Our scheme supports all types of elections easily without increasing the complexity of the scheme. Moreover, our scheme allows open objection, which means a voter can complain at each stage while his privacy remains protected. Furthermore, the simplicity and low computational complexity of the protocol make it practical for general use.
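For the blind-signature building block only, a textbook-RSA sketch is given below. This is our own toy illustration of how an authority can sign a ballot without seeing it (key sizes and the ballot encoding are placeholders); the paper's actual protocol, including its threshold cryptosystem, is more involved.

```python
# Textbook RSA blind signature (toy key, Python 3.8+ for modular inverse via pow).
p, q = 61, 53
n, e, d = p * q, 17, 2753          # d = e^-1 mod (p-1)(q-1)

ballot = 42                         # encoded vote, must be < n
r = 5                               # blinding factor, gcd(r, n) = 1

blinded = (ballot * pow(r, e, n)) % n               # voter blinds the ballot
signed_blinded = pow(blinded, d, n)                 # authority signs without seeing it
signature = (signed_blinded * pow(r, -1, n)) % n    # voter removes the blinding

assert pow(signature, e, n) == ballot               # anyone can verify the signature
```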

T. Taghavi, M. Kahani, A.G. Bafghi
Chapter 74. Stochastic Simulation as an Effective Cell Analysis Tool

Stochastic simulation is today a powerful tool to foresee possible dynamics of strict subsets of the real world. In recent years, it has been successfully employed in simulating cell dynamics with the aim of discovering exogenic quantities of chemicals able to deflect typical diseased simulation paths into healthy ones. This paper gives a broad overview of the stochastic simulation environment and offers an example of its possible use on a pathway triggered by DNA damage.

Tommaso Mazza
Chapter 75. Bond Graph Causality Assignment and Evolutionary Multi-Objective Optimization

Causality assignment is an important task in physical modeling with bond graphs. Traditional causality assignment algorithms have specific aims and particular purposes; however, they may fail if a bond graph has loops or contains junction causality violations. Some assignment algorithms focus on the generation of differential algebraic equations to take into account junction violations caused by nonlinear multi-port devices and are not suitable for general bond graphs. In this paper, we present a formulation of the causality assignment problem as a constrained multi-objective optimization problem. Previous solution techniques for this problem include multi-objective Branch-and-Bound and the Pareto archived evolution strategy, both of which are highly complex and time-consuming. A new solution technique called gSEMO (global Simple Evolutionary Multi-objective Optimizer) is used here to solve the causality assignment problem with very promising results.
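The gSEMO skeleton itself is short; a generic version is sketched below under the assumption of a bitstring encoding and a toy bi-objective function. The actual objectives for causality assignment (counting causality conflicts and constraint violations on the bond graph) are not modelled here.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def gsemo(objectives, n_bits, iterations=2000):
    """Generic gSEMO loop: keep an archive of non-dominated solutions, pick one at
    random, flip each bit with probability 1/n, and update the archive."""
    x = [random.randint(0, 1) for _ in range(n_bits)]
    archive = {tuple(x): objectives(x)}
    for _ in range(iterations):
        parent = list(random.choice(list(archive)))
        child = [b ^ (random.random() < 1.0 / n_bits) for b in parent]
        fc = objectives(child)
        if not any(dominates(fo, fc) for fo in archive.values()):
            archive = {s: f for s, f in archive.items() if not dominates(fc, f)}
            archive[tuple(child)] = fc
    return archive

# Toy bi-objective: trade off the number of ones against the number of zeros.
pareto = gsemo(lambda b: (sum(b), len(b) - sum(b)), n_bits=12)
```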

Tony Wong, Gilles Cormier
Chapter 76. Multi-criteria Scheduling of Soft Real-time Tasks on Uniform Multiprocessors Using Fuzzy Inference

Scheduling algorithms play an important role in the design of real-time systems. Due to the high processing power and low price of multiprocessors, real-time scheduling on such systems is more interesting, yet more complicated. Uniform multiprocessor platforms consist of different processors with different speeds or processing capacities; on such systems the same piece of code may require different amounts of time to execute on different processing units. It has been proved that there is no optimal online scheduler for uniform parallel machines. In this paper a new fuzzy-based algorithm for scheduling soft real-time tasks on uniform multiprocessors is presented. The performance of this algorithm is then compared with that of the EDF algorithm. It is shown that our proposed approach outperforms EDF in several respects, since it usually results in a higher success ratio, better utilizes the processors and produces a more balanced schedule.

Vahid Salmani, Mahmoud Naghibzadeh, Mohsen Kahani, Sedigheh Khajouie Nejad
Chapter 77. A Finite Element Program Based on Object-Oriented Framework for Spatial Trusses

Spatial truss structures are very popular in architecture and civil engineering. These structures are composed of single structural elements of small size; as a result, spatial trusses can be easily manufactured, transported and assembled in practice. The aim of this work is to develop a Java software package for the linear simulation of spatial truss structures using the finite element method. In this program, the element-oriented matrix notation is used in contrast to the node-oriented description of element quantities, and it is possible to visualize the model as well as the simulation results. The functionality of the software is demonstrated by means of several application examples. The results and visualizations of the numerical examples confirm that the presented object-oriented finite element analysis program for spatial trusses can be used effectively.

Vedat Togan, Serkan Bekiroglu
Chapter 78. Design for Test Techniques for Asynchronous NULL Conventional Logic (NCL) Circuits

Conventional ATPG algorithms would fail when applied to asynchronous circuits due to the absence of a global clock and presence of more state holding elements that synchronize the control and data paths, leading to poor fault coverage. This paper presents three DFT implementations for the asynchronous NULL Conventional Logic (NCL) paradigm, with the following salient features: 1) testing with commercial DFT tools is shown to be feasible; 2) this yields a high test coverage; and 3) minimal area overhead is required. The first technique incorporates XOR gates for inserting test points; the second method uses a scan latch scheme for improving observability; and in the third scheme, scan latches are inserted in the internal gate feedback paths. The approaches have been automated, which is essential for large systems; and are fully compatible with industry standard tools.

Venkat Satagopan, Bonita Bhaskaran, Waleed K. Al-Assadi, Scott C. Smith, Sindhu Kakarla
Chapter 79. Ant Colony based Algorithm for Stable Marriage Problem

This paper introduces an ant colony system (ACS), a distributed algorithm applied to the Stable Marriage Problem (SM). The stable marriage problem is an extensively studied combinatorial problem with many practical applications. It is well known that at least one stable matching exists for every stable marriage instance; however, the classical Gale-Shapley [2] algorithm produces a marriage that greatly favors the men at the expense of the women, or vice versa. In the proposed ACS, a set of cooperating agents called ants works together to find stable matchings such as man-optimal, woman-optimal, egalitarian and sex-fair stable matchings, so the ACS is a novel method for solving the Stable Marriage Problem. Our simulation results show the effectiveness of the proposed ACS.
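For reference, the classical Gale-Shapley proposal algorithm mentioned in the abstract (the man-optimal baseline the ACS is contrasted with) is sketched below; the ACS itself, which searches for other stable matchings, is not reproduced.

```python
def gale_shapley(men_prefs, women_prefs):
    """Classical man-proposing Gale-Shapley; returns the man-optimal stable matching."""
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_prefs.items()}
    free = list(men_prefs)                 # men not yet matched
    next_choice = {m: 0 for m in men_prefs}
    engaged = {}                           # woman -> man
    while free:
        m = free.pop()
        w = men_prefs[m][next_choice[m]]   # next woman on m's list
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:
            free.append(engaged[w])        # w trades up, her old partner is free again
            engaged[w] = m
        else:
            free.append(m)                 # w rejects m
    return {m: w for w, m in engaged.items()}

matching = gale_shapley(
    {"a": ["X", "Y"], "b": ["Y", "X"]},
    {"X": ["b", "a"], "Y": ["a", "b"]},
)
```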

Ngo Anh Vien, Nguyen Hoang Viet, Hyun Kim, SeungGwan Lee, TaeChoong Chung
Chapter 80. Q-Learning based Univector Field Navigation Method for Mobile Robots

In this paper, a Q-Learning based univector field method is proposed for a mobile robot to accomplish obstacle avoidance and the desired robot orientation at the target position. The univector field method guarantees the desired posture of the robot at the target position, but it does not navigate the robot around obstacles. To solve this problem, a modified univector field is used and trained by Q-learning. When the robot following the field toward the desired posture collides with obstacles, the univector fields at the collision positions are modified according to the reinforcement of the Q-learning algorithm. With the proposed navigation method, the robot navigation task in a dynamically changing environment becomes easier by using double action Q-learning [8] to train the univector field instead of ordinary Q-learning. Computer simulations and experiments on an obstacle-avoiding mobile robot demonstrate the effectiveness of the proposed scheme.
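The tabular update that such training rests on is shown below as a minimal sketch; states, actions and the reward signal (collision versus reaching the goal posture) are placeholders of ours, and only the ordinary Q-learning rule is shown, not the double action variant used in the paper.

```python
import random
from collections import defaultdict

alpha, gamma, epsilon = 0.1, 0.9, 0.1      # illustrative learning parameters
Q = defaultdict(float)                      # (state, action) -> estimated value

def choose_action(state, actions):
    """Epsilon-greedy action selection over the current Q estimates."""
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state, actions):
    """One ordinary Q-learning backup for an observed transition."""
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
```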

Ngo Anh Vien, Nguyen Hoang Viet, HyunJeong Park, SeungGwan Lee, TaeChoong Chung
Chapter 81. Statistical Modeling of Crosstalk Noise in Domino CMOS Logic Circuits

Domino logic circuits have been aggressively explored for vulnerabilities due to crosstalk noise. In these circuits, statistical modeling of crosstalk noise is a promising approach because crosstalk noise is highly unpredictable: technology trends push process variations to their extreme, and shrinking feature sizes lead to unevenness in device geometries. We present a general model for crosstalk noise that accounts for variance in cross-coupling capacitance and variation in MOS device channel width, and progressively refine it to obtain the most accurate circuit analysis model for deriving the crosstalk distribution. The derived statistical model is validated with 1000 runs of Monte Carlo simulation.

Vipin Sharma, Waleed K. Al-Assadi
Chapter 82. A Decision Making Model for Dual Interactive Information Retrieval

A new task in Interactive Information Retrieval (IIR) is considered: optimization of information retrieval taking into account its impact on the quality of interaction with the user. Dual IIR (DIIR) is defined, and an integer programming model for DIIR is given.

Vitaliy Vitsentiy
Chapter 83. Business Rules Applying to Credit Management
Vladimir Avdejenkov, Olegas Vasilecas
Chapter 84. Information System in Atomic Collision Physics

Fundamental aspects of scientific research in the field of atomic physics are discussed in this paper from the point of view of an information system that would cover the most important phases of research. Such an information system should encompass the complexity of scientific research, incorporating data scattered across various books, articles, research centers, databases, etc. We started from scratch with a principal analysis of basic research processes and of the data that represent needs and condensed research experience. The particular problem of searching for data is specifically discussed, and the main idea of the newly proposed approach is described. We developed a prototype of the information system to be used by researchers in various research phases. The search for data is web-based, as this is the standard way of providing easy data access.

V.M. Cvjetković, B.M. Marinković, D. Šević
Chapter 85. Incremental Learning of Trust while Reacting and Planning

The general idea of the proposed approach is to integrate simple reactive intelligence acquired by experimentation with planning and learning processes. The autonomous agent [1] can be considered a representative of an intelligent entity located in the real world; it is expected to express rational behavior and to possess the ability to learn with respect to its goals. The main objective of this paper is to construct a cognitive model of an agent capable of rational behavior in a dynamic environment. Concepts such as the goal, reactivity, and planning are investigated in the context of an agent that undertakes decisions and actions in a completely or partly unknown environment. We also propose the integration of reactive and planning decision-selection mechanisms by applying the concept of trust to the agent's decisions on the basis of reinforcement. When designing our agent, we applied a bottom-up approach, aiming to present some of the relevant research in this area. The primary advantage of this approach is shown by the improved performance of the agent during the execution of the given task. The effectiveness of the proposed solution has been initially tested in a simulated environment (the evasive maneuver problem).

W. Froelich, M. Kisiel-Dorohinicki, E. Nawarecki
Chapter 86. Simulation of Free Feather Behavior

We present a general framework for simulating the behavior of free, feather-like objects inside a dynamically changing flow field. Free feathers demonstrate beautiful dynamics as they float, flutter, and twirl in response to lift and drag forces created by their motion relative to the flow. To simulate the movement in 2D, we adopt the thin strip model to account for the effects of gravity, lift and inertial drag. To achieve 3D animations, we implement two methods. For the first approach, we extend the thin strip model and use either flow primitives or noise functions to construct a time-varying flow field, from which external forces are extracted to update the thin strip computation. For the second approach, we implement a physically based simulation of the flow field and adopt the momentum-exchange method to evaluate the body force on the feather. As a result, the natural flutter, tumble and gyration dynamics emerge, and vortices are created, all in response to local surface-flow interactions without the imposition of the thin strip model.

Xiaoming Wei, Feng Qiu, Arie Kaufman
Chapter 87. Evolutionary Music Composer integrating Formal Grammar

In this paper, an autonomous music composition tool is developed using Genetic Algorithms. The production is enhanced by integrating simple formal grammar rules. A formal grammar is a collection of descriptive or prescriptive rules (or both) for analyzing or generating sequences of symbols; in music, these symbols are musical parameters such as notes and their attributes. The composition is conducted in two stages. The first stage generates and identifies musically sound patterns (motifs). In the second stage, methods to combine different generated motifs and their transpositions are applied; these combinations are evaluated and, as a result, musically fit phrases are generated. Four musical phrases are generated at the end of each program run. The generated music pieces are translated into Guido Music Notation (GMN) and have an alternate representation in the Musical Instrument Digital Interface (MIDI) format. The Autonomous Evolutionary Music Composer (AEMC) was able to create interesting pieces of music that were both innovative and musically sound.

Yaser M.A. Khalifa, Jasmin Begovic, Badar Khan, Airrion Wisdom, M. Basel Al-Mourad
Chapter 88. A New Algorithm and Asymptotical Properties for the Deadlock Detection Problem for Computer Systems with Reusable Resource Types

We study the classical problem of deadlock detection for systems with n processes and d reusable resource types, where d ≪ n. We present a novel algorithm for the problem. The algorithm enjoys two properties. First, its cost is n/log(n) times smaller than that of the well-known Dijkstra's algorithm when d = O(log(n)). Secondly, its data structures are simple and easy to maintain; in particular, the algorithm employs no graph- or tree-based data structures. We also derive a linear-time algorithm when d and the resource requests are bounded by constants. The linear-time algorithm is asymptotically optimal. The algorithms are applicable to improving the Banker's algorithm for deadlock avoidance. Categories and Subject Descriptors: D.4.1 Operating Systems: Process Management; General Terms: Deadlock, algorithms, performance
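For orientation, the classical detection sweep for reusable resources that such algorithms improve upon is sketched below; this is the standard textbook baseline, not the chapter's new algorithm.

```python
import numpy as np

def detect_deadlock(allocation, request, available):
    """Classical detection for n processes and d reusable resource types:
    repeatedly find a process whose outstanding request fits in 'available',
    assume it finishes and releases its allocation, and mark it; any process
    never marked is deadlocked."""
    allocation, request = np.asarray(allocation), np.asarray(request)
    work = np.asarray(available, dtype=float).copy()
    n = allocation.shape[0]
    finished = [False] * n
    progress = True
    while progress:
        progress = False
        for i in range(n):
            if not finished[i] and np.all(request[i] <= work):
                work += allocation[i]
                finished[i] = True
                progress = True
    return [i for i in range(n) if not finished[i]]   # indices of deadlocked processes

# Two processes each holding one resource type and requesting the other's: deadlock.
print(detect_deadlock(allocation=[[1, 0], [0, 1]], request=[[0, 1], [1, 0]], available=[0, 0]))
```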

Youming Li, Robert Cook
Chapter 89. On Path Selection for Multipath Connection

Multipath connection, which utilizes the multiple paths between network hosts in parallel, has been used to improve network performance, security and reliability. Path selection is a critical decision in a multipath connection network: different selections result in significantly different performance. In this paper, we present several heuristic algorithms, including a genetic algorithm, to solve the path selection problem in a multipath connection environment. The genetic algorithm is chosen for its flexibility and extensibility when the context of the problem changes. We define two objective functions and two constraints for this problem. The performance results of the proposed algorithms on a simulated network topology as well as a real-world network topology are presented. It is observed that the genetic algorithm can produce satisfactory results within reasonable execution time.

Yu Cai, C. Edward Chow
Chapter 90. Some Results on the Sinc Signal with Applications to Intersymbol Interference in Baseband Communication Systems

Some useful results related to the Sinc signal are presented. They are derived using Fourier Series Decomposition and Parseval’s Identity. A simple convergence analysis is provided. These results should be useful in many practical situations involving band-limited or time-limited signals. This is illustrated by examples dealing with bandwidth requirements in Baseband Data Communication Systems in the presence of Additive Noise, Intersymbol Interference, and Timing Problems. Using simple trigonometric identities, a larger generalized set of new infinite series results related to the sinc signal is also obtained.
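As a hedged illustration of the kind of identity involved (a standard worked example of ours, not a result reproduced from the chapter), Parseval's theorem applied to a rectangular pulse of width T yields an integral of the squared sinc:

```latex
\[
x(t)=\operatorname{rect}\!\left(\tfrac{t}{T}\right)
\;\xrightarrow{\ \mathcal{F}\ }\;
X(f)=T\,\operatorname{sinc}(fT),\qquad
\operatorname{sinc}(u)=\frac{\sin(\pi u)}{\pi u},
\]
\[
\int_{-\infty}^{\infty}|x(t)|^{2}\,dt=\int_{-\infty}^{\infty}|X(f)|^{2}\,df
\;\Longrightarrow\;
T=T^{2}\!\int_{-\infty}^{\infty}\operatorname{sinc}^{2}(fT)\,df
\;\Longrightarrow\;
\int_{-\infty}^{\infty}\operatorname{sinc}^{2}(fT)\,df=\frac{1}{T}.
\]
```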

Zouhir Bahri
Chapter 91. Multi-Focus Image Fusion Using Energy Coefficient Matrix

An image fusion algorithm based on an Energy Coefficient Matrix (ECM) is presented in this paper. The energy coefficient matrix is computed and shown to indicate the clarity of a pixel at a particular physical location. Based on this clarity value, a fusion decision map is created; this decision map indicates which pixel to choose at a particular physical location in the wavelet domain. The proposed scheme is compared with two other well-known techniques based on the Discrete Wavelet Transform (DWT) and the Discrete Wavelet Frame Transform (DWFT). Experimental results show that the performance of the proposed algorithm is superior to that of the DWT- and DWFT-based ones.
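A pixel-level sketch of clarity-driven fusion is given below. Note the assumption: the clarity measure here is a simple local high-frequency energy computed directly in the spatial domain, whereas the paper computes its ECM on wavelet coefficients; only the "pick the sharper source per location" idea is illustrated.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_multifocus(img_a, img_b, window=7):
    """Fuse two registered multi-focus images by taking each pixel from the
    source with the larger local high-frequency energy (a clarity proxy)."""
    def clarity(img):
        high = img.astype(float) - uniform_filter(img.astype(float), window)  # remove local mean
        return uniform_filter(high ** 2, window)                              # local energy
    decision = clarity(img_a) >= clarity(img_b)    # fusion decision map
    return np.where(decision, img_a, img_b)
```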

Adnan Mujahid Khan, Mudassir Fayyaz, Asif M. Gillani
Chapter 92. Measuring Machine Intelligence of an Agent-Based Distributed Sensor Network System

A measure of machine intelligence facilitates comparing alternatives having different complexity. In this paper, a method for measuring the machine intelligence quotient (MIQ) of human-machine cooperative systems is adapted and applied to measure the MIQ of an agent-based distributed sensor network system. Results comparing the MIQ of different agent-based scenarios are presented for the distributed sensor network application. The MIQ comparison is contrasted with the average sensor network field life, a key performance indicator, achieved with each scenario in Monte Carlo simulations.

Anish Anthony, Thomas C. Jannett
Chapter 93. Image Processing for the Measurement of Flow Rate of Silo Discharge

In this work, silo discharge was viewed as a complex fluid flow, in order to perfect a new technique for the measurement of flow rate. Flow rate was investigated using a non-intrusive method measuring the evolution of the free-surface profile during the discharge flow. The method consists of recording, via a CCD sensor, the evolution of the free surface illuminated by laser planes, and then obtaining, through image processing, the free-surface position and shape over time.

Cédric Degouet, Blaise Nsom, Eric Lolive, André Grohens
Chapter 94. A Blind Watermarking Algorithm Based on Modular Arithmetic in the Frequency Domain

Robustness is an important issue in watermarking; achieving robustness together with a blind watermark-recovery algorithm remains especially challenging. This paper presents a combined DWT and DCT still-image blind watermarking algorithm. A two-level DWT is performed on the original image, the low-frequency sub-band is divided into blocks, the DCT is performed on every block, the DCT coefficients of every block are sorted in Zig-Zag order, and the DCT low-frequency coefficient is selected as the embedding point. The watermark signals are embedded into the selected points using modular arithmetic. Watermark recovery is the inverse of the embedding process: from the result of the modular arithmetic, the value of the embedded watermark bit can be estimated. The algorithm is compared with a pure DWT-based scheme. Experimental results show that the proposed algorithm is robust to many attacks such as JPEG compression, additive noise, cropping, median filtering, rotation, and resizing. The proposed algorithm is also shown to provide good results in terms of image imperceptibility.
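The modular-arithmetic flavour of blind embedding and recovery can be illustrated with a quantization-index-modulation style rule on a single transform coefficient, as sketched below; the step size and the exact embedding rule are our own placeholders, not the rule defined in the paper.

```python
import numpy as np

def embed_bit(coeff, bit, delta=16.0):
    """Embed one watermark bit by forcing the coefficient into a quantization
    cell whose index has the bit's parity (blind embedding)."""
    q = np.floor(coeff / delta)
    if int(q) % 2 != bit:
        q += 1                      # move to the nearest cell with the right parity
    return (q + 0.5) * delta        # place the value at the cell centre

def extract_bit(coeff, delta=16.0):
    """Blind recovery: parity of the cell index, no original image needed."""
    return int(np.floor(coeff / delta)) % 2

c = 123.7
for b in (0, 1):
    assert extract_bit(embed_bit(c, b)) == b
```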

Cong Jin, Zhongmei Zhang, Yan Jiang, Zhiguo Qu, Chuanxiang Ma
Chapter 95. Determination of Coordinate System in Short-Axis View of Left Ventricle

With the increasing rate of myocardial infarction (MI) in men and women, it is important to develop a diagnostic tool to determine the effect of MI on the mechanics of the heart and to minimize the effect of heart muscle damage on overall cardiac performance. After a myocardial infarct, the left ventricle of the heart enlarges to compensate for a weak heart muscle. The enlarged and weakened heart gives rise to the clinical syndrome of heart failure. In order to maximize the mechanical performance of the weakened heart, regional ventricular loading and contraction must be understood. To isolate regional wall mechanics, a floating centroid for the left ventricle must be calculated. This is easy in the normal heart, where the left ventricle approximates a single radius of curvature; however, in heart failure there are irregular shape changes that complicate this calculation. The conventional method used for centroid calculation employs a center-of-mass (COM) determination of the whole left ventricle. This method has many shortcomings when applied to an enlarged and irregular left ventricle. This paper proposes a new algorithm for centroid calculation based on iterative majorization to locate the centroid.
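One classical iterative-majorization scheme of this type is the Weiszfeld iteration for the geometric median, sketched below on a set of contour points; the paper's exact objective for the left-ventricle centroid may differ, so this only illustrates the majorization step itself.

```python
import numpy as np

def iterative_majorization_centroid(points, iters=100, eps=1e-9):
    """Weiszfeld-style majorization for the point minimizing the sum of
    distances to the given points (the geometric median)."""
    points = np.asarray(points, dtype=float)
    c = points.mean(axis=0)                    # start from the ordinary center of mass
    for _ in range(iters):
        d = np.linalg.norm(points - c, axis=1)
        w = 1.0 / np.maximum(d, eps)           # majorization weights
        c_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(c_new - c) < eps:
            break
        c = c_new
    return c

print(iterative_majorization_centroid([[0, 0], [2, 0], [1, 3], [1, 1]]))
```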

Gaurav Sehgal, Dr. Gabrielle Horne, Dr. Peter Gregson
Chapter 96. On-line Modeling for Real-Time, Model-Based, 3D Pose Tracking

Model-based object-tracking can provide mobile robotic systems with real-time 6-dof pose information of a dynamic target object. However, model-based trackers typically require the model of the target to be known a priori. This paper presents a novel method capable of building an approximate 3D geometric model of a target object in an on-line mode, fast enough for real-time use by a model-based object tracker. The algorithm constructs a 3D tessellated model and uses projective texture mapping to model the target object's surface features.

Hans de Ruiter, Beno Benhabib
Chapter 97. Grid Enabled Computer Vision System for Measuring Traffic Parameters

In this paper, we propose an application for traffic flow analysis including vehicle detection and classification, vehicle speed estimation, and accident detection. The application is implemented to work in a grid environment with support for data management, job submission and execution, and scene recording with remote cameras. The user accesses the Grid infrastructure via a client application and can easily manage the system; management includes selecting streams from different cameras, acquiring results from the analyses, and visualizing the obtained results. We implemented a set of image-processing algorithms for video sequence analysis. Finally, we evaluate several aspects of the proposed system in a set of experiments, including an analysis of the usefulness of streaming large video data packets and speed-up possibilities.

Ivica Dimitrovski, Gorgi Kakasevski, Aneta Buckovska, Suzana Loskovska, Bozidar Proevski
Chapter 98. Physically Constrained Neural Network Models for Simulation

We present a method for combining measurements of a system with mathematical descriptions of its behavior. The approach is the opposite of data assimilation, where data is used to correct the results of a model based on differential equations; here, differential equations are used to correct interpolation results. The method may be interpreted as a regularization technique able to handle the ill-posed character of a neural network regression problem. Significant examples illustrate the numerical behavior and show that the proposed method is computationally effective.
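A toy version of "differential equations correcting interpolation" is sketched below. To keep it self-contained we use a grid of unknown values and a finite-difference residual of u'' + u = 0 as the physics penalty, so the whole problem reduces to linear least squares; the paper instead regularizes a neural-network regression, which this sketch does not reproduce.

```python
import numpy as np

# Fit values u on a grid to sparse, noisy samples of u(x) = sin(x) while
# penalizing the residual of the ODE u'' + u = 0.
n = 50
x = np.linspace(0, 2 * np.pi, n)
h = x[1] - x[0]
rng = np.random.default_rng(0)
idx = rng.choice(n, 12, replace=False)                 # sparse, noisy measurements
y = np.sin(x[idx]) + 0.1 * rng.standard_normal(12)

A = np.zeros((12, n)); A[np.arange(12), idx] = 1.0      # sampling operator
D = np.zeros((n - 2, n))                                # discrete u'' + u at interior nodes
for i in range(1, n - 1):
    D[i - 1, i - 1:i + 2] = [1 / h**2, -2 / h**2 + 1.0, 1 / h**2]

lam = 1.0                                               # strength of the physics penalty
M = np.vstack([A, np.sqrt(lam) * D])
b = np.concatenate([y, np.zeros(n - 2)])
u = np.linalg.lstsq(M, b, rcond=None)[0]                # physically constrained fit
```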

J. E. Souza de Cursi, A. Koscianski
Backmatter
Metadata
Title
Advances and Innovations in Systems, Computing Sciences and Software Engineering
edited by
Khaled Elleithy
Copyright Year
2007
Publisher
Springer Netherlands
Electronic ISBN
978-1-4020-6264-3
Print ISBN
978-1-4020-6263-6
DOI
https://doi.org/10.1007/978-1-4020-6264-3
