
2006 | Book

Advances in Systems, Computing Sciences and Software Engineering

Proceedings of SCSS05

Edited by: Tarek Sobh, Khaled Elleithy

Publisher: Springer Netherlands


About this book

This book includes the proceedings of the International Conference on Systems, Computing Sciences and Software Engineering (SCSS'05). The proceedings are a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of computer science, software engineering, computer engineering, systems sciences and engineering, information technology, parallel and distributed computing, and web-based programming. SCSS'05 was part of the International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CISSE'05) (www.cisse2005.org), the world's first engineering/computing and systems research e-conference. CISSE'05 was the first high-caliber research conference in the world to be conducted completely online in real time via the Internet. CISSE'05 received 255 research paper submissions, and the final program included 140 accepted papers from more than 45 countries. The concept and format of CISSE'05 were very exciting and ground-breaking. The PowerPoint presentations, final paper manuscripts, and time schedule for live presentations over the web were available for three weeks prior to the start of the conference for all registrants, so they could choose the presentations they wanted to attend and think about questions they might want to ask. The live audio presentations were also recorded and became part of the permanent CISSE archive, which also includes all PowerPoint presentations and papers. SCSS'05 provided a virtual forum for presentation and discussion of state-of-the-art research on systems, computing sciences and software engineering.

Table of Contents

Frontmatter
An Elastic Display Method for Visualizing and Navigating a Large Quantity of Alarms in a Control Room of a Nuclear Power Plant

In a conventional control room of a nuclear power plant, a great number of tiled alarms are generated, especially under a plant upset condition. As the conventional control room evolves into an advanced one, the annunciator-based tile display for alarm status is required to be removed and replaced by a computer-based tile display. When this happens, navigating and acknowledging tiled alarm information becomes a burdensome task for plant operators, because it places an additional load on them. In this paper, a display method, Elastic Tile Display, is proposed, which can be used to visualize and navigate a large quantity of tiled alarms effectively. We expect the method to help operators navigate alarms at little cost to their attention resources and acknowledge them in a timely manner.

Sang Moon Suh, Gui Sook Jang, Geun Ok Park, Hee Yun Park, In Soo Koo
An EAI Technology Framework

EAI is a consistently significant aspect of the enterprise computing area. Furthermore, recent advances in Web Services and integration servers provide a promising push for EAI issues. However, few investigations have focused on a general view of EAI technologies and solutions. To provide a clear perspective on EAI, a technology framework, ABCMP, is presented in this paper. ABCMP attempts to describe the structure of diverse EAI technologies, which allows a general comprehension of the EAI issue. Moreover, process topics related to ABCMP are also discussed to provide a wider vision of the technology framework.

Jing Liu, Lifu Wang, Wen Zhao, Shikum Zhang, Ruobai Sun
A Fuzzy Algorithm for Scheduling Soft Periodic Tasks in Preemptive Real-Time Systems

Most research concerning real-time system scheduling assumes scheduling constraints to be precise. However, in the real world, scheduling is a decision-making process that involves vague constraints and uncertain data. Fuzzy constraints are particularly well suited for dealing with imprecise data. This paper proposes a fuzzy scheduling approach to real-time system scheduling in which the scheduling parameters are treated as fuzzy variables. A simulation is also performed and the results are compared with both the EDF and LLF scheduling algorithms, the two algorithms most commonly used for scheduling real-time processes. It is concluded that the proposed fuzzy approach is very promising and has the potential to be considered in future research.
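Since the paper benchmarks against EDF, a minimal sketch of preemptive Earliest-Deadline-First scheduling may help fix terms. The unit-time simulation, the example task set, and the deadlines-equal-periods assumption are all illustrative choices, not taken from the paper:

```python
import heapq

def edf_schedule(tasks, horizon):
    """Simulate preemptive EDF over `horizon` unit time steps.
    Each task is (period, wcet); relative deadlines equal periods.
    Returns the task index run at each step (None = idle)."""
    timeline = []
    ready = []  # min-heap of [absolute_deadline, task_index, remaining_time]
    for t in range(horizon):
        # release a new job of each task at its period boundaries
        for i, (period, wcet) in enumerate(tasks):
            if t % period == 0:
                heapq.heappush(ready, [t + period, i, wcet])
        if ready:
            job = ready[0]          # job with the earliest deadline
            timeline.append(job[1])
            job[2] -= 1             # run it for one time unit
            if job[2] == 0:
                heapq.heappop(ready)
        else:
            timeline.append(None)
    return timeline
```

With tasks (period 4, execution 1) and (period 5, execution 2), the schedule repeats the pattern task 0, task 1, task 1, idle — the shorter-deadline job always preempts.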

Mojtaba Sabeghi, Mahmoud Naghibzadeh, Toktam Taghavi
Parallel Construction of Huffman Codes

For decades, different algorithms have been proposed for constructing Huffman codes. In this paper we propose a detailed, time-efficient, three-phase parallel algorithm for generating Huffman codes on the CREW PRAM model exploiting n processors, where n is equal to the number of symbols in the input alphabet. First, the codeword length for each symbol is computed concurrently with a direct parallelization of the Huffman tree construction algorithm, eliminating the complexity of dealing with the original tree-like data structure. Then the Huffman codes corresponding to the symbols are generated in parallel based on a recursive formula. The performance of the proposed algorithm depends directly on the height of the corresponding Huffman tree. It achieves an O(n) time complexity in the worst case, which is rarely encountered in practice.

S. Arash Ostadzadeh, M. Amir Moulavi, Zeinab Zeinalpour, B. Maryam Elahi
Semantic Description of Multimedia Content Adaptation Web Services

Multimedia content adaptation can be performed by third-party services that can be implemented using web services. In order to use these web services, one has to be able to find them on the Web. However, today, finding the right web service for a specific task (multimedia content adaptation in our case) is difficult. This is mainly because of the type of description used to advertise the web services.

Current standards used to describe web services are syntactic. This syntactic description of a web service is later used for searching for the service; however, it does not allow a high rate of success during service discovery. These limitations of the current standards can be overcome by using semantic description methods. Semantic description gives the meanings and relationships of the terms and concepts used in describing a web service, which improves its discovery. In order to achieve this, we have developed an ontology for multimedia content adaptation web services.

Lemma Surafel, Dawit Beckele, Girma Berhe Lab.
Grid Computing Communication Strategies for Cross Cluster Job Execution

With the advent of grid computing, some jobs are executed across several clusters. In this paper, we show that (1) cross-cluster job execution can suffer from network contention on the inter-cluster links, and (2) traffic on the inter-cluster links can be reduced. With our new all-to-all communication strategies and host file mapping, it is possible to reduce the volume of data sent between two clusters for a job of size n. Our measurements confirm this reduction in data volume on the inter-cluster link and show a reduction in runtime. With these improvements it is possible to increase the size of grids without significantly compromising performance.

Jens Mache, Chris Allick, André Pinter, Damon Tymon Tyman
Glue Code Synthesis for Distributed Software Programming

In this paper, we propose a method for synthesizing the glue code for distributed programming. The goal of this method is to completely automate the synthesis of code for handling distributed computing issues, such as remote method calls and message passing. By using this method, the software for migrating the objects, synchronizing the communication, and supporting remote method calls can be automatically generated. From the programmer’s point of view, remote accesses to objects are syntactically and semantically indistinguishable from local accesses. The design of the system is discussed and the whole system is based on the Linda notation. A prototype has been developed using JavaSpaces for code synthesis in Java. Experiments show that this method can help developers generate the code for handling distributed message passing and remote procedure calls.

Jian Liu, Farokh B. Bastani, I-Ling Yen
Interactive Elicitation of Relation Semantics for the Semantic Web

This paper presents the workflow and architecture of the Relation Semantics Elicitation Prototype (RSEP). RSEP has been developed to address the limitations of the Description Logic based constructs of the Web Ontology Language (OWL).

Cartik R. Kothari, David J. Russomanno, Phillip N. Tran
An Improved Configuration Similarity Retrieval Model

Modern information technology has equipped the world with more powerful telecommunication and mobility. We interact with various information systems daily, especially those dealing with global positioning systems. Geographical information systems are an important element of those positioning systems. Hence, spatial retrieval is becoming more and more important in telecommunication systems, fleet management, vehicle navigation, robot automation, and satellite signal processing. Nowadays, spatial querying is no longer done with lengthy and complicated syntax; it is easily done with sketching and speech. Spatial queries increasingly deal with daily routines such as searching for buildings and routes, analyzing customer preferences, and so on. This kind of query is called a configuration or structural spatial query, and it needs a powerful retrieval technique, as it involves a high volume of database accesses to locate the suitable objects. Configuration query retrieval belongs to the content-based retrieval family, which is also an active research area in spatial databases. This research developed an enhanced configuration similarity retrieval model using a single measure, in contrast with the multi-measure models in the domain of spatial retrieval.

Lau Bee Theng, Wang Yin Chai
Automatic Generation of Software Component Wizards based on the Wizard Pattern

When a software component is used, it is often necessary to set initial values for many of its attributes. To set these initial values appropriately, the user of the component must ascertain which attributes need to be initialized and set them programmatically to suitable initial values. The work involved in this sort of initialization can be alleviated by attaching a wizard interface to the target component itself and setting the initial values visually from the wizard. However, there are large development costs associated with devising suitable initial value candidates and producing a new wizard to use these initial values for each individual component. In this paper, we propose a system whereby application programs that use a target component are subjected to dynamic analysis to discover which attributes and initial values are set most often during the running of the component. The proposed system generates and attaches to the component a wizard that supports application programmers in initializing the component visually using these initial values. The proposed system can be regarded as a system for applying the Wizard pattern to each component automatically. Experiments have shown that the attributes and initial values chosen for initialization by generated wizards closely resemble the expectations of the components' original developers. We have thus confirmed that the proposed system can bring about a substantial reduction in wizard development costs.

Hironori Washizaki, Shinichi Honiden, Rieko Yamamoto, Takao Adachi, Yoshiaki Fukazawa
Content Based Image Retrieval Using Quadrant Motif Scan

Image retrieval based on Quadrant Motif Scan (QMS) is proposed in this paper. Motif traces from image pixels are the core idea used to extract feature vectors, which are then used for distinguishing images by region-based comparisons. We exploit recursive quadrant segmentation in images and derive a representative motif for stratified regions. In this sense, a parent region is segmented into sub-regions until a predefined stratum threshold is reached. The matching data for each region contains its motif code plus the result of uniformity detection. Through a credit setting, the similarity mechanism proceeds over corresponding regions from two images in a top-down manner. Dynamic parameter adjustments based on relevance feedback can help pursue the best retrieval results. In addition, a peck inspection technique is added to the QMS matching metric to enhance performance. Experimental results reveal effectiveness and efficiency comparable to the Motif Cooccurrence Matrix (MCM) method, with invariance to image scaling.

Tsong-Wuu Lin, Chung-Shen Hung
Parallel Implementation of MPEG-4 Encoding over a Cluster of Workstations

“Parallel Implementation of MPEG-4 Encoding over a Cluster of Workstations” is a research project and a project proposal to reduce the delay involved in the MPEG-4 encoding process. As a case study we take Amrita Vishwa Vidyapeetham, a multi-campus Deemed University, one of its kind in India. All the campuses are connected through a satellite network provided by ISRO (Indian Space Research Organization). A number of e-learning classes, guest lectures, and meetings are conducted across the satellite network, with the video being compressed in the MPEG-4 standard. The project optimizes the system by parallelizing the encoding process. For this, a cluster of machines is created and the video frames are distributed among its nodes, following the SPMD (Single Program Multiple Data) model. Dedicated nodes within the cluster perform the encoding process, while one node distributes the video to be encoded over these nodes and another collects the encoded video and places it back in the original sequence. These two nodes can also be utilized for the encoding process.

R Karthik Sankar, Shivsubramani.K. Moorthy, K.P Soman
Different Strategies for Web Mining

In the past decade, the amount of information available on the Internet has expanded astronomically. The amount of raw data now available on the Internet presents an opportunity for businesses and researchers to derive useful knowledge from it by utilising the concepts of data mining. An area of research within data mining, often referred to as web mining, has emerged to extract knowledge from the Internet. Existing algorithms have been applied to this new domain, and new algorithms are being researched to address indexing and knowledge requirements. Three main areas have emerged in web mining: mining the content of web pages, mining the structure of the web, and mining the usage patterns of clients. This paper provides an overview of web mining, with an in-depth look at each of the three areas just mentioned.

Michael Kernahan, Luiz F. Capretz
Coalesced QoS: A Pragmatic Approach to a Unified Model for KLOS

Advances in software and hardware technologies have given operating systems the ability to process data and handle various concurrent processes. This increased ability has been one of the driving forces behind the proliferation of mechanisms in operating systems to satisfy the performance requirements of applications with predictable resource allocation. As different classes of applications require different resource management policies, one needs to look into ways to satisfy all classes of applications. Conventional general-purpose operating systems have been developed for a single class of best-effort applications and are hence inadequate to support multiple classes of applications. We present an abstract architecture for the support of Quality of Service (QoS) in the Kernel-Less Operating System (KLOS). We propose new semantics for the resource management paradigm, based on the notion of Quality of Service. By virtue of these new semantics, it is possible to provide the support required by KLOS to various components of the operating system, such as the memory manager, processor time, and I/O management. The mechanisms required within an operating system to support this paradigm are described, and the design and implementation of a prototypical kernel which implements them is presented. Various notions of negotiation rules between the application and the operating system are discussed, along with a feature which allows users to express their requirements; the model assures users of the selected parameters and returns feedback about the way it meets those requirements. This QoS model presents a design paradigm that allows the internal components to be rearranged dynamically, adapting the architecture to the high performance of KLOS.
The benefits of our framework are demonstrated by building a simulation model to represent how the various modules of an operating system, and the interface between the processes and the operating system, can be tailored to provide Quality of Service guarantees.

Ashish Chawla, Yerraballi Ramesh, Amit Vasudevan
Method of Key Vectors Extraction Using R-Cloud Classifiers

A novel method for reducing a training data set in the context of nonparametric classification is proposed. The new method is based on the method of R-clouds. The advantages of the recently introduced R-cloud classification method are investigated. A data set reduction approach using a Rvachev-function-based representation of the separating boundary is proposed. The R-cloud method was found instructive and practical in a number of engineering problems related to pattern classification. The method of key vectors extraction uses the property of the normal R-cloud boundary to evaluate the distance from a sample to the separating boundary.

A.A. Bougaev, A.M. Urmanov, A.M. Gross, L.H. Tsoukalas
Semantic Web Knowledge Management

The aim of the Semantic Web is to allow access to more advanced knowledge management by organizing knowledge in conceptual spaces according to its meaning. Semantic Web agents will use the technologies of the Semantic Web to identify and extract information from Web sources. However, the implementation of the Semantic Web has a problem, namely, that it ignores the different types of already-existing resources in the current implementation of the Web. As a result, many of these resources will not be included in the conceptual spaces of the Semantic Web. This research introduces a framework for a catalog that allows multiple access points to different resources and allows agents to discover the sought-after knowledge. This catalog can be established through the creation of a framework that supports computer-to-computer communication, knowledge management, and information retrieval in general.

Malik F Salesh
Augmented Color Recognition by Applying Erasure Capability of Reed-Solomon Algorithm

Color-image data is generally more complex than its black-and-white counterpart in terms of the number of variables in the channels and the “color constancy” issue. Most two-dimensional (2D) codes, such as QR Code, PDF417, Data Matrix and MaxiCode, operate in the black-and-white domain rather than the color domain, usually owing to the technical difficulty of handling color variables. Conventional 2D codes adopt the Reed-Solomon (RS) algorithm for their error control codes, mainly to recover from damage such as stains, scratches, and spots, while the highlight and shadow issues in color codes are much more complex. In this paper we propose a new decoding approach that applies erasures recoverable by the RS algorithm in color codes to resolve the color recognition issue effectively. When using the erasure concept for color constancy, the strategy for marking erasures during the color recognition processing steps is crucial. Our algorithm ultimately mitigates the overall color recognition load in decoding color codes. Consequently, our new erasure decision algorithm prevents color recognition failures within the erasure correction capability of the RS algorithm, and it can easily be applied along with other color constancy algorithms to increase overall decoding performance using only RS erasures.
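The distinction the paper relies on is that an erasure (a symbol known to be unreliable, such as an ambiguous color cell) is cheaper to correct than an error at an unknown position: an RS code can correct twice as many erasures as errors. Full RS decoding is involved, so as a minimal illustration of the principle only (not the paper's algorithm), a single XOR parity symbol already suffices to recover one erasure:

```python
def encode_with_parity(symbols):
    """Append one XOR parity symbol to a list of byte values.
    A single erasure (bad symbol at a KNOWN position) can then be
    recovered; a single error at an UNKNOWN position could only
    be detected, not corrected."""
    parity = 0
    for s in symbols:
        parity ^= s
    return symbols + [parity]

def recover_erasure(codeword, erased_pos):
    """Reconstruct the symbol at `erased_pos` by XOR-ing all others."""
    value = 0
    for i, s in enumerate(codeword):
        if i != erased_pos:
            value ^= s
    return value
```

Marking an unreliable color as an erasure, rather than guessing at it and risking an error, is what lets the decoder exploit this doubled correction capability.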

Jaewon Ahn, Cholho Cheong, Tack-Don Han
Decoupling of Collaboration-Based Designs

Collaboration-based design is a well-known method for constructing complex software systems [1, 12, 13]. A collaboration implements one feature of the system. Because collaborations are developed independently, they can easily produce methods with identical signatures even though no overriding is intended [7, 10]. This paper differentiates between accidental and intended overriding and proposes a solution to the problem arising from overriding method signatures between collaborations. The goal of our solution is clarity, measured by ease of use for developers.

Dr. Osama Izzat Salameh, Dima Jamal Daman
A Location Service for Pervasive Grids

Grid computing environments are being extended in order to offer features that are typically found in pervasive computing environments. In particular, Grid environments have to allow mobile users to access their services and resources, and they have to adapt services depending on the mobile user's location and context. This paper presents a two-layer location service that locates mobile users in both intra-Grid and extra-Grid architectures. The lower layer of the location service locates mobile users within a single intra-Grid environment, i.e., mobile users can move among different areas of the physical environment and the Grid can provide its services according to the user's position. The upper layer locates mobile users in an extra-Grid, which is composed of a distributed federation of intra-Grids, by collecting location information coming from the lower layers. The location service has been developed on top of the standard OGSA architecture.

M. Ciampi, A Coronato, G De Pietro, M Esposito
Extending an existing IDE to create non-visual interfaces

This paper describes how to modify an existing programming workbench to make it useful and efficient for developing interfaces for blind and partially sighted people. This work is based on abstract data types and components.

Amina Bouraoui
Performance Comparison of Two Identification Methods for Analysis of Head Related Impulse Responses

Head-Related Impulse Responses (HRIRs) are used in signal processing to model the synthesis of spatialized audio, which is used in a wide variety of applications, from computer games to aids for the vision impaired. They represent the modification to sound due to the listener's torso, shoulders, head and pinnae, or outer ears. As such, HRIRs are somewhat different for each listener, and their measurement requires expensive specialized equipment. Therefore, the development of a method to obtain customized HRIRs without specialized equipment is extremely desirable. In previous research on this topic, Prony's modeling method was used to obtain an appropriate set of time delays and a resonant frequency to approximate measured HRIRs. During several recent experimental attempts to improve on this previous method, a noticeable increase in percent fit was obtained using the Steiglitz-McBride iterative approximation method. In this paper we report on the comparison between these two methods and the statistically significant advantage found in using the Steiglitz-McBride method for the modeling of most HRIRs.

Kenneth John Faller II, Armando Barreto, Navarun Gupta, Naphtali Rishe
Reversers: a programming language construct for reversing out of code

This paper proposes a new programming language construct called a reverser for situations in which a subroutine performs actions that it must reverse if it encounters a failure. The reversal code stacks up as more actions are performed. A failure invokes all the reversals in LIFO order; success invokes none of them. The reverser construct avoids a common situation in the Linux source code that is currently programmed with goto statements.

Raphael Finkel
Hand-written Character Recognition Using Layered Abduction

Even though automated hand-written character recognition can be highly accurate, most of these systems are unable to apply context to improve results, unlike human readers. This paper describes an approach to automated hand-written character recognition that seeks explicit features amongst the input data and applies layered abduction to derive an explanation that accounts for the input in terms of English characters. Layered abduction is used because it can provide top-down guidance to improve accuracy. The approach taken here results in more than 96% accuracy for hand-written printed character recognition in a limited domain.

Richard Fox, William Hartmann
Dynamic Pricing Algorithm for E-Commerce

E-Commerce has developed into a major business arena during the past decade, and many sales activities are handled by computers. Intelligent algorithms are being developed to further automate the sales process, which reduces labor costs as well as business operational expenses. This paper describes an automatic sales-price determination algorithm for online markets in a competitive environment. It dynamically adjusts the sales price to maximize profit and minimize sales time. The algorithm gathers the sales prices of competing online stores and optimally adjusts the advertised price. It is particularly useful when a store carries numerous items, so that manual price adjustments are laborious and costly. The algorithm has been applied to DVD movie sales in the online market and has been shown to shorten the sales time and increase the profit.
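The abstract does not disclose the pricing rule itself; a deliberately simplified stand-in for one competitive repricing step (undercut the cheapest rival, subject to a cost floor) might look like the following, where the margin and undercut parameters are invented for illustration:

```python
def adjust_price(cost, competitor_prices, margin=0.10, undercut=0.01):
    """Hypothetical pricing rule in the spirit of the abstract:
    price just below the cheapest competitor to shorten sales time,
    but never below cost plus a minimum margin (protects profit)."""
    floor = cost * (1 + margin)
    if not competitor_prices:
        return floor
    target = min(competitor_prices) * (1 - undercut)
    return max(target, floor)
```

A real system, as the abstract notes, would rerun such a step continuously as competitor prices are re-scraped.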

Samuel B. Hwang, Sungho Kim
Runtime support for Self-Evolving Software

Software development and deployment are based on a well-established two-stage model: program development and program execution. Since the beginning of software development, meta-programming has proven useful in several applications. Even though interpreted languages have successfully exploited the ability to generate programs, meta-programming is still far from mainstream. The two-stage model is largely due to the computing metaphors adopted for defining programs, and it is deeply entrenched within the whole software management infrastructure. In this paper we explore how a runtime, multi-staged, self-contained meta-programming system based on the notion of partial application can provide suitable support for programs capable of evolving over time. To make the discussion more concrete, we discuss two scenarios where the suggested infrastructure is used: software installation and robot control by means of programs embedding knowledge into their code rather than into data structures.

Tarek Sobh, Khaled Elleithy
A Mobile Location Algorithm Using Clustering Technique for NLoS Environments

Owing to the mass demand for wireless communication application services, mobile location technologies have drawn much attention from governments, academia, and industry around the world. In wireless communication, one of the main problems facing accurate location is non-line-of-sight (NLoS) propagation. To address the problem, we present a new location algorithm with a clustering technique that utilizes the geometrical features of the cell layout, time-of-arrival (ToA) range measurements, and three base stations. The mobile location is estimated by solving for the optimal solution of the objective function based on the high-density cluster. A simulation study was conducted to evaluate the performance of the algorithm for different NLoS errors. The results of our experiments demonstrate that the proposed algorithm is significantly more effective in location accuracy than the linear line of position algorithm and the Taylor series algorithm, and it also satisfies the location accuracy demand of E-911.
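For orientation, the noise-free core of ToA positioning with three base stations can be sketched by linearizing the three range circles (subtracting the first circle equation from the other two yields a 2x2 linear system). This is only the textbook baseline the paper improves on; its clustering-based NLoS handling is not reproduced here:

```python
def locate_toa(stations, ranges):
    """Estimate a 2-D position from exact ToA ranges to three base
    stations. Subtracting the first circle equation
    (x-x1)^2 + (y-y1)^2 = r1^2 from the other two cancels the
    quadratic terms, leaving two linear equations in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = stations
    r1, r2, r3 = ranges
    # 2(xi-x1)x + 2(yi-y1)y = r1^2 - ri^2 + xi^2 - x1^2 + yi^2 - y1^2
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero if the stations are collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

NLoS propagation inflates the measured ranges, so the three circles no longer meet at a point; handling that inconsistency is precisely what the paper's clustering of candidate intersections addresses.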

Cha-Hwa Lin, Juin-Yi Cheng
A Simplified and Systematic Technique to Develop and Implement PLC Programs for a Control Process

Programmable logic controllers (PLCs) have made it possible to precisely control large process machines and driven equipment with less physical wiring and lower installation costs than required with standard electromechanical relays, pneumatic timers, drum switches, and so on. The programmability allows for fast and easy changes in relay ladder logic to meet the changing needs of the process or driven equipment without the need for expensive and time-consuming rewiring. In this paper, the target of developing, simulating, and implementing PLC software for the automation of a water treatment plant is achieved. The algorithm developed in this paper may be used to develop PLC-based software for any control process.

Development of Mathematical Model of Blast Furnace Smelting

The mathematical description of heat exchange, gas dynamics, and the physicochemical phenomena taking place in a blast furnace, in their correlations, is considered, along with some of its applications to the study of the processes defining the reduction of metals from multicomponent iron ores.

Andrey N. Dmitriew
Research and researcher implications in sustainable development projects: Multi-agent Systems (MAS) and Social Sciences applied to Senegalese examples
Alassane Bah, Jean-Max Estay, Christine Fourage, Ibra Toure
The Importance of modeling metadata for Hyperbook

This paper provides a review of modelling metadata for an adaptive Hyperbook. We analyse the adaptivity and the conceptual model for representing and storing a Hyperbook, the languages and tools for metadata, and the architecture of the Hyperbook.

Abdoulmajid Hakki
Cluster-Based Mining of Microarray Data in PHP/MYSQL Environment

Extracting biological significance from a large microarray dataset using data mining clustering techniques is an important process in bioinformatics. In this paper, a microarray dataset (a 504 x 227 matrix) made available by the SAMSI institute was used as the base sample to develop a new demo web-based clustering system that exploits the improved efficiency and functionality of PHP/MySQL technology. The clustering algorithms and the robustness of PHP/MySQL produced categorized microarray data that can be associated with diseases, with improved visualizations.

E. Udoh, S. Bhuiyan
Grid and Agent based Open DSS

The paper discusses an open DSS model based on grid and agent technologies for sharing and reusing decision resources. The model is oriented toward decision objects and organizes decision resources in the form of an e-market, so it improves the timeliness, openness, and intelligent capabilities of the DSS and also enhances its ability to handle distribution issues. The structure and the operating process of the model are presented in detail, and its advantages compared with the traditional DSS are also discussed.

Zhiwu Wang, Qianquian Wei, Xueguang Chen
The Design and Implementation of an Electronic Made-to-Measure System

Electronic Made-to-Measure (eMTM) is a new mode of garment production. This paper analyses the state of eMTM, introduces the functional components of the eMTM infrastructure, designs its workflow, and then gives one feasible solution. The research work of this paper will play a good role in the future research and popularization of eMTM in China.

SHI Xiu-jin, Li SUN, CHEN Jia-xun
Ants in text document clustering

Ant systems are flexible to implement and offer scalability because they are based on multi-agent cooperation. The aim of this publication is to show the universal character of this solution and its potential for implementation in a wide range of applications. The increasing demand for effective methods of managing large document collections is sufficient stimulus for research on new applications of ant-based systems in the area of text document processing. The author defines the ACO (Ant Colony Optimization) meta-heuristic, which is the basis of the method he developed. A presentation of the details of the ant-based document clustering method forms the main part of the publication.

Lukasz Machnik
Optimal Control in a Monetary Union: An Application of Dynamic Game Theory to a Macroeconomic Policy Problem

We develop a dynamic game model to study the optimal control of the economies in a two-country monetary union under strategic interactions between macroeconomic policy-makers. In this union, governments of participating countries pursue national goals when deciding on fiscal policies, whereas the common central bank's monetary policy aims at union-wide objective variables. For a symmetric demand shock, we derive numerical solutions of the resulting dynamic game. The different solution concepts for this game serve as models of a conflict between national and supra-national institutions (noncooperative Nash equilibrium) on the one hand and coordinated policy-making (cooperative Pareto solutions) on the other. We show that there is a trade-off between instruments' and targets' deviations from desired paths; moreover, the volatility of output and inflation increases when private agents react more strongly to changes in actual inflation.

Reinhard Neck, Doris A. Behrens
Schema Matching in the Context of Model Driven Engineering: From Theory to Practice

-Recently, software engineering has had to cope with the development, maintenance and evolution complexity of software systems. Among the proposed solutions for managing this complexity, Model Driven Engineering (MDE) has been accepted and implemented as one of the most promising. In this approach, models become the hub of development, separating platform-independent aspects from platform-dependent aspects. Among the more important issues in MDE are model transformation and model matching (or schema matching). In this paper, we propose an approach to take schema matching into account in the context of MDE. A schema matching algorithm is provided and implemented as a plug-in for Eclipse. We discuss an illustrative example to validate our approach.

Denivaldo Lopes, Slimane Hammoudi, Zair Abdelouahab
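As a rough illustration of the kind of name-based matching such an algorithm might perform (the paper's actual algorithm is not reproduced here; the element names and similarity threshold below are illustrative assumptions), a minimal sketch:

```python
import difflib

def match_schemas(source, target, threshold=0.7):
    """Naive name-based schema matching: pair each source element with
    the most string-similar target element, keeping pairs above a
    similarity threshold. A simplified stand-in, not the paper's method."""
    mapping = {}
    for s in source:
        # Find the target element whose lowercased name is most similar.
        best = max(target, key=lambda t: difflib.SequenceMatcher(
            None, s.lower(), t.lower()).ratio())
        score = difflib.SequenceMatcher(None, s.lower(), best.lower()).ratio()
        if score >= threshold:
            mapping[s] = best
    return mapping

src = ["customerName", "zipCode"]
tgt = ["CustomerName", "postalCode", "zip_code"]
print(match_schemas(src, tgt))  # {'customerName': 'CustomerName', 'zipCode': 'zip_code'}
```

Real schema matchers combine name similarity with type and structural heuristics; this sketch shows only the name-similarity step.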
FOXI - Hierarchical Structure Visualization

This paper presents a novel approach to hierarchical structure visualization. The main goal of the presented approach is the ability to display hierarchies of unbounded size in a limited display area while maintaining a high level of orientation and fast navigation in the resulting hierarchical structure. A high level of orientation is achieved by specific hierarchy visualization techniques in which each level of the hierarchy is abstracted from its substructures, and the limited visualization area leads to high-speed navigation through zooming of hierarchy substructures.

Robert Chudy, Jaroslav Kadlec
A Hands-Free Non-Invasive Human Computer Interaction System

Conventional Human Computer Interaction requires the use of hands for moving the mouse and pressing keys on the keyboard. As a result paraplegics are not able to use computer systems unless they acquire special Human Computer Interaction equipment. In this paper we describe a system that aims to provide paraplegics the opportunity to use computer systems without the need for additional invasive hardware. Our system uses a standard web camera for capturing face images showing the user of the computer. Image processing techniques are used for tracking head movements, making it possible to use head motion in order to interact with a computer. The performance of the proposed system was evaluated using a number of specially designed test applications. According to the quantitative results, it is possible to perform most HCI tasks with the same ease and accuracy as in the case that a touch pad of a portable computer is used. Currently our system is being used by a number of paraplegic users.

F. Frangeskides, A. Lanitis
A Model for Anomalies of Software Engineering

We present a model for the anomalies of software engineering that cause system failures. Our model enables us to develop a classification scheme in terms of the types of errors that are responsible, as well as to identify the layers of a system architecture at which control mechanisms should be applied.

Gertrude N. Levine
A Qualitative Analysis of the Critical Path of Communication Models for Next Performant Implementations of High-Speed Interfaces

Recent high-speed networks provide new features such as DMA and programmable network cards. However, standard network protocols, like TCP/IP, still assume a more classical network architecture, usually adapted to Ethernet networks. In order to use high-speed networks efficiently, protocol implementers should use the new features provided by recent networks. This article provides an advanced study of both hardware and software requirements for high-speed network protocols. A focus is placed on the programming model and the steps involved in the transfer's critical path.

Ouissem Ben Fredj, Éric Renault
Detailed Theoretical Considerations for a Suite of Metrics for Integration of Software Components

This paper defines two suites of metrics, which cater to static and dynamic aspects of component assembly. The static metrics measure complexity and criticality of component assembly, wherein complexity is measured using the Component Packing Density and Component Interaction Density metrics. Further, four criticality conditions, namely Link, Bridge, Inheritance and Size criticalities, have been identified and quantified. The complexity and criticality metrics are combined into a Triangular Metric, which can be used to classify the type and nature of applications. Dynamic metrics are collected during the runtime of a complete application; they are useful for identifying super-components and for evaluating the utilisation of components. In this paper both static and dynamic metrics are evaluated using Weyuker's set of properties. The result shows that the metrics provide a valid means to measure issues in component assembly.

V. Lakshmi Narasimhan, B. Hendradjaya
Study on a Decision Model of IT Outsourcing Prioritization

-In today's knowledge-based society, IT outsourcing has been used increasingly in many enterprises as a policy instrument for changing the way publicly funded services are provided. A full IT outsourcing strategy is an extreme one, and although there are many arguments for it, a selective IT outsourcing strategy often results in greater flexibility and better services. Identifying the prioritization of IT objects is therefore very important for IT outsourcing. However, most researchers have so far paid attention mainly to the decision of Application Service Provider (ASP) selection. Therefore, this paper proposes a decision model, which uses the fuzzy gray matter-element space theory (FHW) method, to help enterprises prioritize IT outsourcing objects. With this decision model, enterprises can make decisions about selective IT outsourcing. Finally, the paper gives an example as an illustration of the practicability of the model.

XIE Xiang, GUAN Zhong-liang
An Efficient Database System for Utilizing GIS Queries

In this paper, we propose a new technique to answer GIS queries quickly. The technique uses phone zones as the main indexes to narrow down the search process. We build a modest GIS software system to handle different GIS queries efficiently. Using our new software, many benefits can be gained, including search efficiency, a simple interface, display of spatial relationships, and presentation of sales and service territories.

Fawaz A. Masoud, Moh'd Belal Al-Zoubi
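The zone-as-index idea can be sketched as a simple two-step lookup: first narrow to a phone zone, then scan only that zone's entries. The data layout, zone codes and place names below are invented for illustration and are not taken from the paper:

```python
from collections import defaultdict

def build_zone_index(places):
    """Group places by phone zone so a query first narrows the search
    to a single zone before scanning candidate records."""
    index = defaultdict(list)
    for name, zone, category in places:
        index[zone].append((name, category))
    return index

def query(index, zone, category):
    """Answer a query by scanning only the requested zone's bucket."""
    return [name for name, cat in index[zone] if cat == category]

places = [
    ("Main St. Clinic", "02", "clinic"),
    ("City Garage", "02", "garage"),
    ("North Clinic", "03", "clinic"),
]
idx = build_zone_index(places)
print(query(idx, "02", "clinic"))  # ['Main St. Clinic']
```

The benefit is that query cost scales with the size of one zone rather than the whole dataset, which is the narrowing effect the abstract describes.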
Effective Adaptive Plans

- Many iterative search processes, or adaptive plans, that aim to find an optimal solution in a given problem domain suggest that an optimal search process has an exponential character. Plans that consist of multiple strategies running in parallel, such as bandit searches, aim to demonstrate this pattern in probabilistic distributions of finding a best observed strategy amongst a number of alternatives. This paper introduces a hypothetical adaptive plan that consists of three strategies. One strategy guarantees a better result with each iteration, one has comparable results, and one guarantees worse results. The idea behind this approach is the suspicion that every adaptive plan can basically be mapped to these three base strategies, and that the exponential character of an optimal plan is a trait of its recurrent character.

Kees P. Pieters
A Morphosyntactical Complementary Structure for Searching and Browsing

This paper is a proposal for the construction of a pseudo-net built with precisely defined tokens describing the content and structure of the original WWW. This construction is derived by morphosyntactical analysis and should be structured with a post-processing mechanism. An in-depth analysis of the requirements and hypotheses to be stated to accomplish this goal is also provided. The derived structure could act as an alternate thematic organization of the network, with a compacted version of the original web material. This paper does not describe or study the post-processing approaches; instead, they are posed as future work. A statistical analysis is presented here, evaluating the degree of understanding of a hand-made structure built with some tokens derived under the hypotheses presented here. A comparison with the keyword approach is also provided.

M. D. De López Luise
Remote Monitoring and Control System of Physical Variables of a Greenhouse through a 1-Wire Network†

In this project, a platform (hardware and software) was designed and implemented for controlling and monitoring, via the Internet, the physical variables of a greenhouse such as temperature and luminosity. For this, a new type of microcontroller, TINI (Tiny InterNet Interfaces), was used, which can act as a small server with additional advantages such as being programmable in the Java language and supporting many communication protocols, like the 1-Wire protocol. Because of the way the platform was designed and implemented, this technology could be incorporated at very low cost in PYMES (small and medium companies) dedicated, for example, to the production of flowers. An additional value of the platform is its adaptability to other applications such as laboratories, manufacturing monitoring and control, and monitoring systems, among others.

N. Montoya, L. Giraldo, D. Aristizábal, A. Montoya
Multimedia Content's Metadata Management for Pervasive Environment

One of the major challenges of a pervasive environment is the need for adaptation of content to suit a client’s specific needs and choices such as the client's preferences, the characteristics of the client device, the characteristics of the network to which the client is currently connected, as well as other related factors. In order for the adaptation to be efficient while satisfying the client's requirements and maintaining the semantics and quality of the content, the adaptation system needs to have adequate information regarding the content to be adapted, the client's profile, the network profile and others. The information regarding the content can be found from the content metadata. This work addresses the issue of content metadata management in a pervasive environment in relation to content adaptation. For this purpose, a distributed architecture for the management of metadata of multimedia content is proposed. The proposed architecture consists of components for storage, retrieval, update, and removal of metadata in the system. It also provides interfaces to external components through which metadata can be accessed. In addition, it proposes ways to specify, in the metadata, restrictions on the adaptations that may be applied on the content. This enables the content creator to impose constraints on adaptations that may potentially cause the loss of critical information or a decrease in the quality of the adapted content.

Fitsum Meshesha, Dawit Bekele, Jean-Marc Pierson
Data Loss Rate versus Mean Time To Failure in Memory Hierarchies

A common metric used for the assessment of overall reliability in a memory hierarchy is the Mean Time To Failure (MTTF), but it does not take into account the time data are stored at each level. We propose another metric, the Data Loss Rate (DLR), for this purpose. We derive a recurrent formula for computing the DLR and validate it by computer simulation.

Novac Ovidiu, Gordan Mircea, Mihaela Novac
Towards Living Cooperative Information Systems for Virtual Organizations Using Living Systems Theory

In order to conceive and design cooperative information systems to better adapt to the dynamics of modern organizations in the changing environment, this paper explores the requirements and approaches to build “living” cooperative information systems for virtual organizations – both in terms of system flexibility and co-evolution of organization and information systems. The object of our case study is the Beijing Organizing Committee for the Games of the XXIX Olympiad. To meet the requirements of “living” cooperative information systems in the context of virtual organizations, we propose a unified peer-to-peer architecture based on the foundational concepts and principles of Miller’s Living Systems Theory, which is a widely accepted theory about how all living systems “work”. In our system framework, every peer belongs to one of six organizational levels, i.e. peers, groups, organizations, communities, societies, and supranational systems. Each level has the same types of components but different specializations. In addition, every peer has the same critical subsystems that process information. The case study shows how our architecture effectively supports the changing organization, dynamic businesses, and decentralized ownerships of resources.

Shiping Yang, Martin Wirsing
Research and Application of an Integrated System
—Textile Smart Processing Using ANN Combined with CBR

In this paper, the disadvantages and advantages of artificial neural networks (ANNs) and Case-based Reasoning (CBR) are briefly introduced. The capacity of a network can be improved through the mechanism of CBR in a dynamic processing environment, and the limitation of CBR (that it cannot complete its reasoning process and propose a solution to a given task without the intervention of experts) can be offset by the strong self-learning ability of ANNs. The combination of these two artificial intelligence techniques not only helps to control quality and enhance efficiency, but also shortens the design cycle and saves cost, which plays an important role in promoting the intelligent level of the textile industry. At the same time, using an ANN prediction model, the sensitive process variables that affect the processing performance and quality of yarn and fabric can be determined; these are often adjusted when solving new problems to form the desired techniques.

Xianggang YIN, Weidong YU*
Proposed Life-Cycle Model for Web-Based Hypermedia Applications Development Methodologies

-The development of WBHAs is moving fast due to an explosive increase in Internet/WWW use. Furthermore, the ability to use Internet/WWW technologies in closed environments (e.g., intranets and extranets) has also helped the spread of WBHA development. However, the classical life-cycle model reveals several limitations in terms of describing the process of developing WBHAs. Recently, several design methods such as RMM [15], HDM [1], OOHDM [25] and EORM [3] have been proposed for the design requirements of WBHAs. However, none of them has addressed the life-cycle of WBHA development. Hence, in this paper, we first study different WBHA architectures, then we identify a list of requirements for WBHA development. After that, we show how the waterfall model lacks the capability to satisfy these requirements. Finally, we propose a new life-cycle model for WBHA development.

Shaziz Arshad, M. Shoaib, A. Shah
Reverse Engineering Analysis for Microcontrollers' Assembly Language Projects

The problem of reverse engineering assembly language projects for microcontrollers in embedded systems is approached in this paper. A tool for analyzing projects is described which starts from the source files of the project to be analyzed, grouped in a project folder, and from a configuration file, and generates diagrams describing the program's functionality. The tool is useful for the programmer and for the project manager in different phases of a project: code review, design review and development. It is also useful for understanding and documenting older projects.

M. Popa, M. Macrea, L. Mihu
Composition Linear Control in Stirred Tank Chemical Reactors

This work studies composition loop control in continuous chemical reactors with simple structures, due to its wide acceptance in the chemical industry. A linear cascade composition control (master/slave) is proposed, designed with basic control structures based on Laplace tools. Two configurations are designed and evaluated in a dynamic model of a continuous stirred tank. A stability analysis shows that, for such configurations, the system settling time is reduced by a factor of 7 to 8 compared to the settling time without loop control. Besides, the system shows good performance in reaching the requested reference. Implementation of such control configurations can solve the composition loop control problem.

A. Regalado Méndez, J. Álvarez Ramírez
Intelligent Analysis of the Contamination in the City of Santiago, Chile

Air contamination is one of the biggest problems affecting countries in almost every part of the world. The increase in the quantities of polluting gases has been verified on a world scale, and each day it becomes more obvious that the answer to these problems should concentrate on the search for intelligent solutions. In Chile, the law establishes the obligation to develop decontamination plans in areas where the levels of pollutants systematically exceed environmental norms, and prevention plans where these norms are in danger of being exceeded. During the autumn-winter season the population of Santiago's city centre is affected by a sudden increase in the levels of air contamination. This situation is known as a critical episode of atmospheric contamination, and it takes place when high levels of concentration of pollutants are registered during a short period of time. These episodes originate from the convergence of a series of meteorological factors that impede the ventilation of Santiago's basin, together with an increase in emissions prior to the episode. According to the existing by-law, the criterion for declaring critical episodes of contamination refers to the ICAP index, which is generated from the data of the gas and particle measurement network fed into a prognostic model developed by the physicist J. Cassmassi [44], [47], [55]. In this way, the authority makes decisions regarding the critical episodes that can affect Santiago's city centre depending on the predictions given by the model. Our research work is framed in the line of looking for intelligent methodologies for prediction and control of environmental problems in Santiago's city centre and its Metropolitan Area.

Delgado C. Miguel, Sanchez F. Daniel, Zapata C. Santiago, Escobar R. Luis, Godoy M. Jimmy
Designing Human-Centered User Interfaces for Technical Systems

This contribution emphasizes the appropriateness of using human experience for designing rule-based user displays for controlling technical systems. To transform human (operator) experience into a computer presentation, suitable means are needed. The proposed method and technique for designing user-machine interfaces for system control uses fuzzy logic to convert human experience into a computer representation and to compute values for animating graphical objects on the user display. This modest contribution investigates and clarifies the reasons for considering the appropriateness of fuzzy logic in designing rule-based human-machine interfaces for technical system control.

Salaheddin Odeh
Service Selection Should Be Trust- and Reputation-Based

While existing Web Services standards provide the basic functionality needed to implement Web Service-based applications, wider acceptance of the Web Service paradigm requires improvements in several areas. A natural extension of the existing centralized approach to service discovery and selection is to rely on the community of peers (clients that can share their experiences obtained through previous interactions with service providers) as the source for both service discovery and service selection. We show that distributed algorithms for trust maintenance and reputation acquisition can effectively supplant the current centralized approach, even in cases where not all possible information is available at any given time. Some preliminary results demonstrate the feasibility of the trust- and reputation-based service selection approach.

Vojislav B. Mišić
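One way such a selection might work is to blend a client's own trust values with reputation reports gathered from peers and pick the highest-scoring provider. The provider names, scores and blending rule below are illustrative assumptions, not the paper's algorithm:

```python
def select_service(own_trust, peer_reports, alpha=0.6):
    """Score each provider as a weighted blend of the client's own trust
    (weight alpha) and the mean reputation reported by peers; fall back
    to own trust when no peer reports are available."""
    scores = {}
    for provider, trust in own_trust.items():
        reports = peer_reports.get(provider, [])
        reputation = sum(reports) / len(reports) if reports else trust
        scores[provider] = alpha * trust + (1 - alpha) * reputation
    # Select the provider with the highest combined score.
    return max(scores, key=scores.get)

own = {"svcA": 0.9, "svcB": 0.7}
peers = {"svcA": [0.4, 0.5], "svcB": [0.9, 0.95]}
print(select_service(own, peers))  # svcB
```

Here svcB wins (0.6*0.7 + 0.4*0.925 = 0.79) over svcA (0.6*0.9 + 0.4*0.45 = 0.72): peer experience outweighs the client's own limited history, which is the point the abstract makes about community-sourced selection.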
A Short Discussion about the Comparison between Two Software Quality Approaches: the Mafteah/MethodA Method and the Software Capability Maturity Model Integration
Conference Paper for International Joint Conferences on Computer, Information, and Systems Sciences, and Engineering (CIS2E 05)

Two different methods for obtaining software programs with predictable quality are compared by positioning these models in a product/process and confirmation/improvement framework. The Mafteah/MethodA method can be placed in the confirmation segment, whereas the software Capability Maturity Model can be positioned in the improvement/process quadrant.

G.R. Kornblum
Using Association Rules and Markov Models to Predict Next Access in Web Usage Mining

Predicting the next request of a user as they visit Web pages has gained importance as Web-based activity increases. A large amount of research has been done on trying to correctly predict the pages a user will request. This task requires the development of models that can predict a user's next request to a web server. In this paper, we propose a method for constructing first-order and second-order Markov models of Web site access prediction based on past visitor behavior, and compare it with the association rules technique. In these approaches, sequences of user requests are collected by the session identification technique, which distinguishes the requests for the same web page in different browsing sessions. We report experimental studies using real server logs to compare the methods and show their degree of precision.

Siriporn Chimphlee, Naomie Salim, Mohd Salihin Bin Ngadiman, Witcha Chimphlee
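A first-order Markov model of the kind described above can be sketched in a few lines: count page-to-page transitions over past sessions, then predict the most frequent successor. The session data and page names below are invented for illustration:

```python
from collections import defaultdict

def build_first_order_model(sessions):
    """Count page-to-page transitions across user sessions to form a
    first-order Markov model (transition frequency table)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for cur, nxt in zip(session, session[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(counts, page):
    """Predict the next request as the most frequently observed
    successor of the current page; None if the page is unseen."""
    followers = counts.get(page)
    if not followers:
        return None
    return max(followers, key=followers.get)

sessions = [
    ["home", "products", "cart"],
    ["home", "products", "reviews"],
    ["home", "products", "cart", "checkout"],
]
model = build_first_order_model(sessions)
print(predict_next(model, "products"))  # cart (seen twice vs. reviews once)
```

A second-order model would key the counts on pairs of consecutive pages instead of single pages, trading more context for sparser statistics.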
New Protocol Layer for Guaranteeing Trustworthy Smart Card Transaction
Zheng Jianwu, Liu Mingsheng, Liu Hui
Metadata Guided Statistical Information Processing

Metadata may be used for the handling of statistical information. Some metadata standards have already emerged as guidelines for information processing within statistical information systems. Conceptual models for metadata representation have to address, besides the dataset itself, additional data objects occurring in connection with a dataset. A unified description framework for such data objects is discussed with respect to metadata handling. Basic ideas on the integration and translation of metadata standards are given with a focus on the framework. Here, principles of ontology engineering play a key role as a starting point.

Wilfried Grossmann, Markus Moschner
Combinatorial Multi-attribute Auction (CMOA) Framework for E-auction
A paper for the CIS2E 05 Conference, December 10–20, 2005. Aloysius Edoh, City University London and University of East London

The traditional auction protocols (e.g., Dutch and English auctions) that eBay, Amazon and Yahoo use, although considered success stories [9], have a limited negotiation space. The combinatorial auction (CA) and the multi-attribute auction (MAA) [10], [17] have been developed to address these shortcomings, but even these do not allow users to negotiate more than one attribute at a time. As an answer to these limitations, a new e-auction protocol has been created to enable agents to negotiate on many attributes and combinations of goods simultaneously. This paper therefore shows how the automated hybrid auction was created to reduce computational and bid evaluation complexity, based on the CMOA bidding language and Social Construction of Technology (SCOT) principles. SCOT states that (i) the 'relevant social group', (ii) their 'interpretative flexibility', and (iii) the workability/functionality of the technology must be considered for the development of a system such as an e-auction as an alternative to eBay. SCOT holds that technologies emerge out of a process of choice and negotiation between 'relevant social groups' [15], [8], in this case the bidders, auctioneers, sellers and auction house. This paper represents a collaboration of studies in progress: the Combinatorial Multi-attribute Auction as an online auction compared to existing e-auction protocols such as Amazon and eBay, and the application of intelligent software and Agent UML in e-auction.

Tarek Sobh, Khaled Elleithy
Measuring of Spectral Fractal Dimension

There were great expectations in the 1980s in connection with the practical applications of mathematical processes built mainly upon the mathematical basis of fractal dimension. Results were achieved early on in several fields: examination of material structure, simulation of chaotic phenomena (earthquakes, tornadoes), modelling real processes with the help of information technology equipment, and the definition of the length of rivers or riverbanks. Significant results were later achieved in practical applications in the fields of information technology, certain image processing areas, data compression, and computer classification. In the present publication, the well-known algorithms for calculating fractal dimension in a simple way are introduced, as well as the new mathematical concept named by the author 'spectral fractal dimension' [8], the algorithm derived from this concept, and the possibilities of their practical usage.

J. Berke
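The author's spectral fractal dimension [8] is not reproduced here, but the classic box-counting estimator that such work builds on can be sketched as follows (grid sizes and the test image are illustrative choices):

```python
import numpy as np

def box_counting_dimension(image, sizes=(1, 2, 4, 8, 16)):
    """Classic box-counting estimate of fractal dimension for a binary
    image: count the boxes of side s containing any foreground pixel,
    then fit the slope of log N(s) against log(1/s)."""
    counts = []
    for s in sizes:
        n_boxes = 0
        for i in range(0, image.shape[0], s):
            for j in range(0, image.shape[1], s):
                if image[i:i + s, j:j + s].any():
                    n_boxes += 1
        counts.append(n_boxes)
    # The fitted slope is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled square is 2-dimensional.
filled = np.ones((32, 32), dtype=bool)
print(round(box_counting_dimension(filled), 2))  # 2.0
```

For a filled 32x32 square, N(s) = (32/s)^2, so the log-log fit recovers a slope of exactly 2; fractal sets such as coastlines yield non-integer slopes.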
Design and Implementation of an Adaptive LMS-based Parallel System for Noise Cancellation

- When a desired signal is encompassed by a noisy environment, active noise cancellation can be applied. The presented algorithm is based on the standard Least Mean Squares (LMS) algorithm developed by Bernard Widrow. Modifications to the LMS algorithm were made in order to optimize its performance in extracting a desired speech signal from a noisy environment. The system consists of two adaptive systems running in parallel, with one having a much higher convergence rate to provide rapid adaptation in a non-stationary environment. However, the output of the faster-converging system results in distorted speech. Therefore, the second system, which runs at a lower convergence rate but regularly has its coefficients updated by the first system, provides the actual output of the desired signal. All of the algorithm development and simulation were initially performed in MATLAB, and were then implemented on a TMS320C6416 Digital Signal Processor (DSP) evaluation board to produce a real-time, noise-reduced speech signal.

Kevin S. Biswas, Jason G. Tong
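The standard LMS noise canceller the abstract starts from can be sketched in a few lines (the two-filter parallel refinement is the paper's contribution and is not reproduced; the signals, filter length and step size below are illustrative assumptions):

```python
import numpy as np

def lms_noise_canceller(primary, reference, n_taps=8, mu=0.005):
    """Standard Widrow LMS noise canceller: adapt an FIR filter so the
    filtered noise reference tracks the noise in the primary input;
    the residual error is the estimate of the desired signal."""
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary))
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # tap-delay line, newest first
        noise_est = w @ x                          # filter output: noise estimate
        e = primary[n] - noise_est                 # error = cleaned output sample
        w += 2 * mu * e * x                        # LMS weight update
        cleaned[n] = e
    return cleaned

# Synthetic demo: a sinusoid buried in noise correlated with the reference.
rng = np.random.default_rng(0)
t = np.arange(4000)
speech = np.sin(0.05 * t)                  # stand-in for the desired signal
ref = rng.standard_normal(t.size)          # noise reference input
noise = 0.6 * ref - 0.3 * np.roll(ref, 1)  # noise path seen at the primary input
primary = speech + noise
cleaned = lms_noise_canceller(primary, ref)
```

A larger `mu` converges faster but leaves more misadjustment noise, which is exactly the trade-off the paper's fast/slow parallel pair is designed to break.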
Requirements Analysis: A Review

- Many software organizations often bypass the requirements analysis phase of the software development life cycle and skip directly to the implementation phase in an effort to save time and money. The results of such an approach often lead to projects not meeting the expected deadline, exceeding the budget, and not meeting user needs or expectations. One of the primary benefits of requirements analysis is to catch problems early and minimize their impact with respect to time and money. This paper is a literature review of the requirements analysis phase and the multitude of techniques available to perform the analysis. It is hoped that by compiling the information into a single document, readers will be better positioned to understand the requirements engineering process, and analysts will have a compelling argument as to why it should be employed in modern-day software development.

Joseph T. Catanio
Nuclear Matter Critical Temperature and Charge Balance

An iterative algorithm was developed to fit the Fisher law for heavy ion collisions with distinct charge balance, obtaining different critical temperatures in agreement with recent theoretical and experimental results. This confirms the influence of charge balance on the caloric curve of nuclear matter.

A. Barranón-Cedillo, J.A. López Gallardo, F.L. de Castillo-Alvarado
Activation-Adjusted Scheduling Algorithms for Real-Time Systems

Scheduling in real time is an important problem due to its role in practical applications. Among the scheduling algorithms proposed in the literature, static priority scheduling algorithms have less run-time scheduling overhead due to their logical simplicity. Rate monotonic scheduling was the first static priority algorithm proposed for real-time scheduling [1]. It has been extensively analyzed and heavily used in practice for its simplicity. One of its limitations, as shown recently in [26], is that it incurs a significant number of preemptions. The goal of this paper is to propose static priority scheduling algorithms with reduced preemptions. We present two frameworks, called off-line activation-adjusted scheduling (OAA) and adaptive activation-adjusted scheduling (AAA), from which many static priority scheduling algorithms can be derived by appropriately implementing the abstract components. The proposed algorithms reduce the number of unnecessary preemptions and hence (i) increase processor utilization in real-time systems and (ii) increase task schedulability. We conducted a simulation study for selected algorithms derived from the frameworks, and the results indicate that the algorithms reduce preemptions significantly. The appeal of our algorithms is that they generally achieve a significant reduction in preemptions while keeping the simplicity of static priority algorithms intact.

Alex A. Aravind, Jeyaprakash Chelladurai
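The rate monotonic baseline the abstract builds on admits a one-line schedulability check, the classic Liu and Layland utilization bound (the task set below is an invented example; the paper's OAA/AAA frameworks themselves are not reproduced):

```python
def rm_schedulable(tasks):
    """Sufficient Liu & Layland test for rate monotonic scheduling:
    n periodic tasks given as (execution_time, period) pairs are
    schedulable under RM if total utilization U <= n * (2**(1/n) - 1)."""
    n = len(tasks)
    utilization = sum(c / p for c, p in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

# Three tasks with U = 0.25 + 0.20 + 0.20 = 0.65, under the n=3 bound (~0.7798).
print(rm_schedulable([(1, 4), (1, 5), (2, 10)]))  # True
```

The test is sufficient but not necessary: task sets above the bound may still be schedulable and need exact response-time analysis, and the bound says nothing about preemption counts, which is the dimension the paper's activation-adjusted algorithms target.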
Backmatter
Metadata
Title
Advances in Systems, Computing Sciences and Software Engineering
edited by
Tarek Sobh
Khaled Elleithy
Copyright year
2006
Publisher
Springer Netherlands
Electronic ISBN
978-1-4020-5263-7
Print ISBN
978-1-4020-5262-0
DOI
https://doi.org/10.1007/1-4020-5263-4
