
2008 | Book

Advances in Computer and Information Sciences and Engineering


About this book

Advances in Computer and Information Sciences and Engineering includes a set of rigorously reviewed world-class manuscripts addressing and detailing state-of-the-art research projects in the areas of Computer Science, Software Engineering, Computer Engineering, and Systems Engineering and Sciences.

Advances in Computer and Information Sciences and Engineering includes selected papers from the conference proceedings of the International Conference on Systems, Computing Sciences and Software Engineering (SCSS 2007) which was part of the International Joint Conferences on Computer, Information and Systems Sciences and Engineering (CISSE 2007).

Table of Contents

Frontmatter
A New Technique for Unequal-Spaced Channel-Allocation Problem in WDM Transmission System

To support multiple high-speed channels in long-haul fiber-optic transmission systems, wavelength-division multiplexing (WDM) is currently being deployed to achieve high capacity. It allows information to be transmitted on different channels at different wavelengths. However, four-wave mixing (FWM) is one of the major problems that must be taken into account when designing a high-capacity long-haul WDM transmission system. Recently, unequal-spaced channel-allocation techniques have been studied and analyzed to reduce FWM crosstalk. Finding a solution with the proposed channel-allocation technique requires two parameters: the minimum channel spacing and the number of channels used in the WDM system. To obtain good results, the minimum channel spacing has to be selected carefully.
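The core idea of unequal spacing, choosing channel positions so that every pairwise spacing is distinct and FWM products therefore fall between channels rather than on them, can be sketched as a greedy Golomb-ruler-style search. This is an illustration of the general technique, not the authors' actual allocation procedure; the function name and parameters are invented for the example.

```python
def unequal_spacing(num_channels, min_spacing):
    """Greedily place channels on a grid of min_spacing units so that
    all pairwise spacings are distinct (Golomb-ruler-style).  Distinct
    spacings keep four-wave-mixing products off the channel positions."""
    slots = [0]           # occupied grid slots
    diffs = set()         # pairwise slot differences seen so far
    candidate = 1
    while len(slots) < num_channels:
        new_diffs = {candidate - s for s in slots}
        if not (new_diffs & diffs):   # all new spacings are distinct
            diffs |= new_diffs
            slots.append(candidate)
        candidate += 1
    return [s * min_spacing for s in slots]

# Four channels with a 50 GHz minimum spacing land at 0, 50, 150, 350.
print(unequal_spacing(4, 50))
```

Note how the allocation trades spectrum for FWM suppression: the total band grows faster than linearly in the number of channels, which is why the minimum spacing must be chosen carefully.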

A.B.M. Mozzammel Hossain, Md. Saifuddin Faruk
An Algorithm to Remove Noise from Audio Signal by Noise Subtraction

This paper proposes an algorithm for removing noise from an audio signal. Filtering is achieved by recording the pattern of the noise signal. To validate the algorithm, we implemented it in MATLAB 7.0. We investigated the effect of the proposed algorithm on human voice and compared the results with existing related work, most of which employs simple algorithms that distort the voice signal. The results show that the proposed algorithm supports Voice over IP communication with less noise than similar existing algorithms.
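The general "record the noise pattern, then subtract it" approach can be sketched as classic spectral subtraction: estimate the noise magnitude spectrum from a noise-only recording, subtract it from the signal spectrum, and resynthesise with the original phase. This is a minimal sketch of the standard technique, assuming the paper's method resembles it; it is not the authors' exact algorithm.

```python
import numpy as np

def noise_subtract(signal, noise_sample):
    """Spectral-subtraction sketch: estimate the noise magnitude spectrum
    from a recorded noise-only segment, subtract it from the signal's
    magnitude spectrum (flooring at zero), and rebuild the waveform
    using the noisy signal's phase."""
    spec = np.fft.rfft(signal)
    noise_mag = np.abs(np.fft.rfft(noise_sample, n=len(signal)))
    clean_mag = np.maximum(np.abs(spec) - noise_mag, 0.0)
    clean_spec = clean_mag * np.exp(1j * np.angle(spec))
    return np.fft.irfft(clean_spec, n=len(signal))
```

Flooring at zero is the simplest choice; practical systems use an oversubtraction factor and a small spectral floor to avoid "musical noise" artifacts.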

Abdelshakour Abuzneid, Moeen Uddin, Shaid Ali Naz, Omar Abuzaghleh
Support Vector Machines based Arabic Language Text Classification System: Feature Selection Comparative Study

Feature selection (FS) is essential for effective and accurate text classification (TC) systems. This paper investigates the effectiveness of five commonly used FS methods for our Arabic-language TC system. The evaluation uses an in-house collected Arabic TC corpus. The experimental results are presented in terms of macro-averaged precision, macro-averaged recall, and macro-averaged F1 measure.

Abdelwadood Moh’d Mesleh
Visual Attention in Foveated Images

In this paper, we present a new visual attention system that is able to detect attentive areas in images with non-uniform resolution. Since one of the goals of visual attention systems is to simulate human perception, and the human visual system processes foveated images, visual attention systems should be able to identify salient regions in such images. We test the system with two types of images: real-world and artificial. The real-world images are cloth images containing defects, which the system should detect. In the artificial images, one element differs from the others in only one feature, and the system should detect it.

Abulfazl Yavari, H.R. Pourreza
Frequency Insensitive Digital Sampler and Its Application to the Electronic Reactive Power Meter

This paper presents new reactive power (RP) measurement algorithms and their realization in electronic hardware. The proposed Walsh function (WF) based algorithms simplify the multiplication procedure used to evaluate the reactive components of the instantaneous power signal. The advantages of the proposed algorithms have been verified by experimental studies. One advantage is that, in contrast to known existing methods that involve phase-shifting the input signal, the proposed technique does not require delaying the current signal by π/2 with respect to the voltage signal. Another advantage is the computational saving of the proposed algorithms, which comes from the Walsh Transform (WT) based signal-processing approach. It is shown that immunity of the measurement results to input frequency variation can be obtained if the sampling interval of the instantaneous power is adapted to the input power frequency. The validity and effectiveness of the suggested algorithms have been tested using simulation tools developed in the MATLAB environment.

Adalet N. Abiyev
Non-Linear Control Applied to an Electrochemical Process to Remove Cr(VI) from Plating Wastewater

In this work the process to reduce hexavalent chromium (Cr(VI)) in an electrochemical continuous stirred tank reactor is optimized. Chromium(VI) is toxic and potentially carcinogenic to living beings. It is produced by many industries and commonly discharged without any pretreatment. The electrochemical reactor by itself cannot satisfy the Mexican environmental legislation, which sets a maximum chromium discharge limit of 0.5 mg L⁻¹. In order to comply with this restriction, a proportional non-linear control law is implemented. Through numeric simulations, it is observed that the proposed control law has an acceptable performance, because the

A. Regalado-Méndez, D. Tello-Delgado, H. O. García-Arriaga
Semantics for the Specification of Asynchronous Communicating Systems (SACS)

The objective of this paper is to describe the formal definitions for the Specification of Asynchronous Communicating Systems (SACS). This is a process algebra descended from the synchronous variant of the Calculus of Communicating Systems (CCS) known as the Synchronous Calculus of Communicating Systems (SCCS). To this end, we present Structured Operational Semantics (SOS) for the constructs of SACS using Labelled Transition Systems (LTS) to describe the behaviour of processes. We also discuss semantic equivalences for SACS, especially bisimulation, which provides a method for verifying the behaviour of a specified process.

A.V.S. Rajan, S. Bavan, G. Abeysinghe
Application Multicriteria Decision Analysis on TV Digital

In domains (such as digital TV, smart homes, and tangible interfaces) that represent new paradigms of interactivity, deciding on the most appropriate interaction design solution is a challenge. No existing work explores how to consider users’ experience with technology to analyze the best solution(s) for the interaction design of a mobile TV application. In this study we applied usability tests to evaluate prototypes of an application for mobile digital TV. The main focus of this work is to develop a multicriteria model to aid decision making in the definition of a suitable prototype. We used M-MACBETH and HIVIEW as modeling tools.

Ana Karoline Araújo de Castro, Plácido Rogério Pinheiro, Gilberto George Conrado de Souza
A Framework for the Development and Testing of Cryptographic Software

With greater use of the Internet and electronic devices in general for communications and monetary transactions, it is necessary to protect people’s privacy with strong cryptographic algorithms. This paper describes a framework for the development and testing of cryptographic and mathematical software. The authors argue that this type of software needs a development and testing framework better tailored to its needs than other, more general approaches. The proposed framework uses the symbolic mathematics package Maple as a rapid prototyping tool, test oracle, and aid in test case refinement. Finally, we test our hypothesis by evaluating systems developed with and without the framework, using quantitative techniques and a multi-component qualitative metric.

Andrew Burnett, Tom Dowling
Transferable Lessons from Biological and Supply Chain Networks to Autonomic Computing

Autonomic computing undoubtedly represents the solution for dealing with the complexity of modern computing systems, driven by ever increasing user needs and requirements. The design and management of autonomic computing systems must be performed both rigorously and carefully. Valuable lessons in this direction can be learned from biological and supply chain networks. This paper identifies and discusses but a few transferable lessons from biological and supply chain networks to autonomic computing. Characteristics such as structural and operational complexity and the agent information processing capabilities are considered for biological and supply chain networks. The relevance of the performance measures and their impact on the design, management and performance of autonomic computing systems are also considered. For example, spare resources are often found in biological systems. On the other hand, spare resources are frequently considered a must-not property in designed systems, due to the additional costs associated with them. Several of the lessons include the fact that a surprisingly low number of types of elementary agents exist in many biological systems, and that architectural and functional complexity and dependability are achieved through complex and hierarchical connections between a large number of such agents. Lessons from supply chains include the fact that, when designed and managed appropriately, complexity becomes a value-adding property that can bring system robustness and flexibility in meeting the users’ needs. Furthermore, information-theoretic methods and case-study results have shown that an integrated supply chain requires in-built spare capacity if it is to remain manageable.

Ani Calinescu
Experiences from an Empirical Study of Programs Code Coverage

The paper is devoted to functional and structural testing of programs. Experimental results for a set of programs are presented. The experiments cover the selection of functional tests, analysis of function and line coverage, and optimization of test suites. The comparison of code coverage results and the selection of the most effective tests are discussed in relation to the test-first approach to program development. The types of code not covered by the tests are classified into different categories.

Anna Derezińska
A Secure and Efficient Micropayment System

In this paper, we analyze some key issues concerning systems for processing Internet micropayments. At the heart of the matter is the trade-off between security and cost considerations. A set of criteria is proposed, against which a number of previously proposed and currently operational micropayment systems are reviewed. We also propose a model, called WebCoin, embodying a number of innovations that may enhance micropayment systems currently in use.

Anne Nguyen, Xiang Shao
An Empirical Investigation of Defect Management in Free/Open Source Software Projects

Free/Open Source software (F/OSS) is a novel way of developing and deploying software systems on a global basis. Collaborative efforts of developers and users towards identifying and correcting defects, as well as requesting and enhancing features, form the major foundation of this innovative model. An empirical study using a questionnaire survey has been conducted among F/OSS developers to gain insight into defect management in various F/OSS projects. The present paper focuses on exploring the extent of use of defect tracking systems among F/OSS projects and evaluating the perception of F/OSS developers about the effectiveness of user participation and the efficiency of defect resolution. The findings indicate that defect management activities vary significantly among F/OSS projects depending on project size, type of user participation, and severity of defects. These findings can help in future work to determine the impact of such variations in defect management on the overall quality of F/OSS products, using defect data extracted from publicly accessible F/OSS repositories.

Anu Gupta, Ravinder Kumar Singla
A Parallel Algorithm that Enumerates all the Cliques in an Undirected Graph

In this paper a new algorithm is presented that guarantees to enumerate all existing cliques, including the maximal cliques, based on set operations. The first advantage of the algorithm is that it can be parallelized to improve performance. The second is that it guarantees to find all the cliques. The algorithm cannot be considered radically different from some existing algorithms, except that it uses simple set operations explicitly to find the cliques, making the computation simpler. This also allows us to verify the results.
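Enumerating every clique (not only the maximal ones) with explicit set operations can be sketched as follows. Each clique is extended only by common neighbours larger than its largest vertex, so each clique is produced exactly once, and extensions of different seeds are independent, which is what makes the approach parallelizable. This is an illustrative sketch of the set-based idea, not the paper's exact algorithm.

```python
def all_cliques(adj):
    """Enumerate every clique in an undirected graph via set operations.
    adj maps each vertex to the set of its neighbours.  A clique is
    extended only by common neighbours greater than its largest vertex,
    so each clique appears exactly once; work on different seed
    vertices is independent and could be distributed across workers."""
    cliques = [[v] for v in sorted(adj)]          # every vertex is a clique
    frontier = [([v], adj[v]) for v in sorted(adj)]
    while frontier:
        clique, common = frontier.pop()
        for w in sorted(common):
            if w > clique[-1]:                    # extend in one direction only
                ext = clique + [w]
                cliques.append(ext)
                frontier.append((ext, common & adj[w]))
    return cliques
```

The intersection `common & adj[w]` is the "simple set operation" at the heart of the method: it shrinks the candidate set at every extension, so no adjacency check is ever repeated.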

A. S. Bavan
Agent Based Framework for Worm Detection

Worms constitute a major source of Internet delay. A new worm is capable of replicating itself to vulnerable systems in a very short time, infecting thousands of computers across the Internet before any human response. A new worm floods the Internet and halts most Internet-related services, which damages the Internet economy. Therefore, detecting new worms is considered crucial and should be given the highest priority. Most research effort has been dedicated to modeling worm behavior; recently, defending against worms has been receiving more interest, but defense against Internet worms is still an open problem. The purpose of this paper is to describe a framework for a multiagent-based system for detecting new worms, auto-generating their signatures, and distributing these signatures. This goal can be achieved through a set of distributed agents residing on computers, routers, and servers.

A new worm floods the Internet at very high speed, while the human role in detecting a new worm and generating its signature takes a long time. This gives worms a good chance to flood the whole Internet before any reaction. An autonomous, reliable, adaptive, responsive, and proactive system is needed to detect new worms without human intervention. These features characterize agents. A framework for an automated multiagent-based system for worm detection and signature generation, deployed on routers, computers, and servers, is proposed in this paper.

A.M. El-Menshawy, H.M. Faheem, T. Al-Arif, Z. Taha
Available Bandwidth Based Congestion Avoidance Scheme for TCP: Modeling and Simulation

Available bandwidth, the capacity of the tightest link along a data path established by a virtual-circuit service paradigm, is proposed as the basis of an efficient congestion avoidance scheme for TCP deployed on a grid system. To achieve our goal, we modified the congestion avoidance segment of the Reno version of TCP, multiplicative decrease, by replacing it with a scheme based on available bandwidth. The proposed scheme was modeled using finite automata theory and simulated in MATLAB 7.0 across a 10-hop data path with 1 Gbps per link. To estimate available bandwidth, the probing size was set to 100 packets using a non-intrusive probing method. We observed that our estimation algorithm converged, at most, by the third iteration, and at steady state the TCP current window size increased by 3000-4000% at a 95% confidence interval.
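The substitution described above, keeping Reno's additive increase but replacing multiplicative decrease with a window derived from the estimated available bandwidth, can be sketched as a window-update rule. The function and its parameters are illustrative assumptions (one plausible reading is resetting the window to the bandwidth-delay product of the tightest link), not the authors' exact scheme.

```python
def next_cwnd(cwnd, loss, avail_bw_bps, rtt_s, mss_bytes=1460):
    """Sketch of an available-bandwidth-based congestion window update.
    On loss, instead of Reno's halving, reset the window (in segments)
    to the bandwidth-delay product implied by the estimated available
    bandwidth; otherwise grow additively by one segment per RTT."""
    if loss:
        bdp_segments = (avail_bw_bps * rtt_s) / (8 * mss_bytes)
        return max(1.0, bdp_segments)
    return cwnd + 1.0  # standard additive increase
```

With a 1 Gbps bottleneck and a 10 ms RTT, the post-loss window is about 856 segments rather than half the previous window, which is the kind of behavior that lets the steady-state window sit far above Reno's on high-capacity grid links.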

A. O. Oluwatope, G. A. Aderounmu, E. R. Adagunodo, O. O. Abiona, F. J. Ogwu
On the modeling and control of the Cartesian Parallel Manipulator

The Cartesian Parallel Manipulator (CPM), proposed by Han Sung Kim and Lung-Wen Tsai [1], consists of a moving platform that is connected to a fixed base by three limbs. Each limb is made up of one prismatic and three revolute joints, and all joint axes are parallel to one another. In this way, each limb provides two rotational constraints to the moving platform, and the combined effects of the three limbs lead to an over-constrained mechanism with three translational degrees of freedom. The manipulator behaves like a conventional X-Y-Z Cartesian machine due to the orthogonal arrangement of the three limbs.

In this paper, the dynamics of the CPM is derived using the Lagrangian multiplier approach to give a more complete characterization of the model dynamics. The dynamic equation of the CPM has a form similar to that of a serial manipulator, so the vast control literature developed for serial manipulators can be easily extended to this class of manipulators. Based on this approach, four control algorithms are formulated: simple PD control with reference position and velocity only, PD control with gravity compensation, PD control with full dynamic feedforward terms, and computed torque control. Simulations are then performed using MATLAB and Simulink to evaluate the performance of the four control algorithms.

Ayssam Y. Elkady, Sarwat N. Hanna, Galal A. Elkobrosy
Resource Allocation in Market-based Grids Using a History-based Pricing Mechanism

In an ad-hoc Grid environment where producers and consumers compete for providing and employing resources, trade handling in a fair and stable way is a challenging task. Dynamic changes in the availability of resources over time makes the treatment yet more complicated. Here we employ a continuous double auction protocol as an economic-based approach to allocate idle processing resources among the demanding nodes. Consumers and producers determine their bid and ask prices using a sophisticated history-based dynamic pricing strategy and the auctioneer follows a discriminatory pricing policy which sets the transaction price individually for each matched buyer-seller pair. The pricing strategy presented generally simulates human intelligence in order to define a logical price by local analysis of the previous trade cases. This strategy is adopted to meet the user requirements and constraints set by consumers/producers. Experimental results show waiting time optimization which is particularly critical when resources are scarce.
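The matching and discriminatory pricing described above can be sketched as a simple continuous double auction round: bids and asks are sorted, the best pair is matched while the bid covers the ask, and each matched pair gets its own transaction price. The midpoint rule below is one plausible discriminatory policy chosen for illustration; the paper's history-based strategy sets prices from past trade cases instead.

```python
def match_orders(bids, asks):
    """Continuous-double-auction sketch with discriminatory pricing:
    each matched buyer-seller pair trades at its own price (here the
    bid-ask midpoint, an illustrative choice), rather than at one
    uniform clearing price for all pairs."""
    trades = []
    bids = sorted(bids, reverse=True)   # highest bid first
    asks = sorted(asks)                 # lowest ask first
    while bids and asks and bids[0] >= asks[0]:
        bid, ask = bids.pop(0), asks.pop(0)
        trades.append((bid, ask, (bid + ask) / 2))
    return trades
```

Unmatched orders simply remain in the books, which is what makes the protocol continuous: new bids and asks arriving later can still trade against them.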

Behnaz Pourebrahimi, S. Arash Ostadzadeh, Koen Bertels
Epistemic Structured Representation for Legal Transcript Analysis

HTML based standards and the new XML based standards for digital transcripts generated by court recorders offer more search and analysis options than the traditional CAT (Computer Aided Transcription) technology. The LegalXml standards are promising opportunities for new methods of search for legal documents. However, the search techniques employed are still largely restricted to keyword search and various probabilistic association techniques. Rather than keyword and association searches, we are interested in semantic and inference-based search. In this paper, a process for transforming the semi-structured representation of the digital transcript to an epistemic structured representation that supports semantic and inference-based search is explored.

Tracey Hughes, Cameron Hughes, Alina Lazar
A Dynamic Approach to Software Bug Estimation

There is an increase in the number of software projects developed in a globally distributed environment. As a consequence of such dispersion, it becomes much harder for project managers to make resource estimations with limited information. In order to assist in the process of project resource estimations as well as support management planning for such distributed software projects, we present a methodology using software bug history data to model and predict future bug occurrences. The algorithm works in a two-step analysis mode: Local and Global Analysis. Local analysis leverages the past history of bug counts for a specific month. On the other hand, global analysis considers the bug counts over time for each individual component. The bug prediction achieved by the algorithm is close to actual bug counts for individual components of Eclipse software.
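The two-step local/global analysis can be sketched as follows: local analysis looks at past bug counts for the same calendar month, global analysis looks at the component's counts over all time, and the two estimates are combined. The data layout, the averaging, and the equal-weight blend are illustrative assumptions, not the authors' exact model.

```python
def predict_bugs(history, month):
    """Two-step local/global bug-count sketch.  history maps a
    component name to a list of (month_index, bug_count) pairs.
    Local analysis averages past counts for the same calendar month;
    global analysis averages the component's counts over all time;
    the prediction blends the two with equal weight."""
    predictions = {}
    for component, counts in history.items():
        local = [c for m, c in counts if m % 12 == month % 12]
        glob = [c for _, c in counts]
        local_avg = sum(local) / len(local) if local else 0.0
        global_avg = sum(glob) / len(glob)
        predictions[component] = (local_avg + global_avg) / 2
    return predictions
```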

Chuanlei Zhang, Hemant Joshi, Srini Ramaswamy, Coskun Bayrak
Soft Biometrical Students Identification Method for e-Learning

The paper describes an approach to student identification based on soft biometrical characteristics, to be used mainly in e-learning environments. The approach is designed to increase the security of the examination process from the point of view of identifying the attendees involved, and should improve the overall security of relatively weakly protected e-learning systems. The approach is called "soft" as it doesn’t require any special systems beyond software embedded in e-learning pages. The paper discusses how the approach can be applied and what kinds of methods should be used together with the proposed one to produce a complete identification system for e-learning.

Deniss Kumlander
Innovation in Telemedicine: an Expert Medical Information System based on SOA, Expert Systems and Mobile Computing

This paper presents an Expert Medical Information System (EMIS) that supports medical information management, such as electronic health records (EHR), and helps in diagnostic processes. EMIS is designed using a Service Oriented Architecture (SOA). All functionalities of EMIS are provided as services that are consumed by end clients or other services. A service layer masks the business logic and database tiers that execute the functionalities. The business logic contains an electronic health record and an expert system based on PROLOG for improving diagnostic processes. We present an illustrative example of EMIS used to diagnose hemorrhagic dengue.

Denivaldo Lopes, Bruno Abreu, Wesly Batista
Service-Enabled Business Processes: Constructing Enterprise Applications – An Evaluation Framework

Web services that aim at spontaneously integrating services from different providers to offer a user the best possible experience face considerable challenges in materialising that notion of improved experience. The interrelationships between the concepts of security, performance, and interoperability, while critical to any web service, pose the biggest test and constitute the key features of the proposed evaluation framework. The key objective of this framework is to establish a means of evaluating how close a web service can get to achieving best practice without one of the three key features overriding the others.

Christos K. Georgiadis, Elias Pimenidis
Image Enhancement Using Frame Extraction Through Time

Image enhancement within machine vision systems has been performed in a variety of ways, with the end goal of producing a better scene for further processing. The majority of approaches for enhancing an image scene and producing higher-quality images have been explored using only a single image. An alternate approach, yet to be fully explored, is the use of image sequences, utilizing multiple frames of information to generate an enhanced image for processing within a machine vision system.

This paper describes a new approach to image enhancement for controlling lighting characteristics within an image called Frame Extraction Through Time (FETT). Using multiple image frames of data, FETT can be used to perform a number of lighting enhancement tasks such as scene normalization, nighttime traffic surveillance, and shadow removal for video conferencing systems.

Elliott Coleshill, Alex Ferworn, Deborah Stacey
A Comparison of Software for Architectural Simulation of Natural Light

This paper reports a study of daylighting simulation in the architectural design process, through the evaluation of two simulation tools: ECOTECT 5.5 (as an interface to RADIANCE) and RELUX2006 VISION. The study was developed from the architect’s point of view and in the context of the design process. The evaluation was conducted taking into account criteria such as User Interface, Geometry, Output, Daylight Parameters, Material Description, Processing, Validation, and User Support. Among these criteria, User Interface is especially important for the architect: it is the best way to overcome the natural barriers to using digital processes in architectural design, with the objective of reaching environmental comfort and energy efficiency. The methodology includes a large number of simulations in a tropical climate, aiming to test the simulation tools under different daylight conditions in Brazil. The results of this evaluation show great potential for improving the use of simulation tools in the design process, through a better understanding of these tools.

Evangelos Christakou, Neander Silva
Vehicle Recognition Using Curvelet Transform and Thresholding

This paper proposes a new algorithm for a vehicle recognition system. The recognition system is based on features extracted from the image’s curvelet transform: the standard deviation of the curvelet coefficient matrix at different scales and orientations. The curvelet transform is a multiscale transform with frame elements indexed by location, scale, and orientation parameters; it has the time-frequency localization properties of wavelets but also shows a very high degree of directionality and anisotropy. The classifier used in this paper is the k-nearest-neighbor classifier. In addition, the proposed recognition system is evaluated using information from different scales as the feature vector, allowing us to identify the scales that carry the most useful information. The results show that the recognition rate for vehicle models, when using the full scale information of the curvelet coefficient matrices at scales 2, 3, and 4, is about 95%. We gathered a data set of 300 images from 5 different classes of vehicles: PEUGEOT 206, PEUGEOT 405, Pride, RENAULT 5, and Peykan. We used 230 pictures as our training set and 70 pictures as our test set.
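The feature-extraction and classification pipeline described above (one standard deviation per curvelet coefficient matrix, then a k-nearest-neighbor vote) can be sketched as below. The curvelet transform itself is assumed to be provided by some external routine; everything else, including the function names, is illustrative.

```python
import numpy as np

def feature_vector(coeff_matrices):
    """One standard deviation per curvelet coefficient matrix
    (one matrix per scale/orientation), stacked into a feature vector.
    The curvelet transform producing the matrices is assumed given."""
    return np.array([np.std(c) for c in coeff_matrices])

def knn_classify(query, train_features, train_labels, k=3):
    """Plain k-nearest-neighbour majority vote on Euclidean distance."""
    dists = np.linalg.norm(train_features - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)
```

Using only the standard deviation per subband makes the feature vector very short (one number per scale/orientation), which is what makes it cheap to compare scales and identify the informative ones.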

Farhad Mohamad Kazemi, Hamid Reza Pourreza, Reihaneh Moravejian, Ehsan Mohamad Kazemi
Vehicle Detection Using a Multi-agent Vision-based System

In this paper we propose a multi-agent system for vehicle detection in images. The goal of this system is to localize vehicles in a given image. The developed agents are capable of detecting pre-specified shapes by processing the image. Cooperation involves communicating hypotheses and resolving conflicts between the interpretations of individual agents. Specifically, in the proposed system, eight process agents, consisting of edge, contour, wheel, LPL (License-Plate Line), LPR (License-Plate Rectangle), PCV (Plate-Candidates Verification), and vehicle symmetry agents, were developed for vehicle detection in various outdoor scenes. In the testing data, there are 500 car blobs and 100 non-car blobs. We show through experiments that our system is 90.16% effective at detecting vehicles in various outdoor scenes.

Saeed Samadi, Farhad Mohamad Kazemi, Mohamad-R. Akbarzadeh-T
Using Attacks Ontology in Distributed Intrusion Detection System

In this paper we discuss utilizing methods and techniques of the semantic web in intrusion detection systems. We study the use of an ontology in a distributed intrusion detection system for extracting semantic relations between computer attacks and intrusions. We used the Protégé software to build an ontology specifying computer attacks and intrusions. Our distributed intrusion detection system is a network containing a number of systems, each with its own individual intrusion detection system, plus a special central system that contains our proposed attack ontology. Every time any system detects an attack or a new suspected situation, it sends a detection report to the central system. With this ontology, the central system can extract the semantic relationships among computer attacks and suspected situations in the network, make better decisions about them, and consequently reduce the rate of false positives and false negatives in intrusion detection systems.

F. Abdoli, M. Kahani
Predicting Effectively the Pronunciation of Chinese Polyphones by Extracting the Lexical Information

One of the difficult tasks in Natural Language Processing (NLP) is the sense ambiguity of characters or words in text, such as polyphones, homonyms, and homographs. The paper addresses the ambiguity issues of Chinese polyphonic characters and disambiguation approaches for them. The Sinica ASCED is used as the dictionary for matching Chinese words. Furthermore, we propose two voting schemes, preference scoring and winner-take-all scoring, for resolving these issues. The approach of unifying several methods is discussed as a way to achieve better performance. The final precision rate of the experimental results reaches 92.72%.
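The two voting schemes named above can be contrasted with a small sketch. In winner-take-all scoring, each disambiguation method casts one vote for its single best pronunciation; in preference scoring, each method contributes its full score for every candidate. The score format and combination rules below are illustrative assumptions, not the authors' exact formulas.

```python
def winner_take_all(method_scores):
    """Each method casts one vote for its single best candidate."""
    tally = {}
    for scores in method_scores:            # one dict per method
        best = max(scores, key=scores.get)  # that method's top choice
        tally[best] = tally.get(best, 0) + 1
    return max(tally, key=tally.get)

def preference_vote(method_scores):
    """Each method contributes its full score for every candidate;
    the highest combined score wins."""
    tally = {}
    for scores in method_scores:
        for cand, s in scores.items():
            tally[cand] = tally.get(cand, 0.0) + s
    return max(tally, key=tally.get)
```

The two schemes can disagree: two methods that barely prefer one pronunciation can outvote a third that strongly prefers another under winner-take-all, while preference scoring lets the strong preference win.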

Feng-Long Huang, Shu-Yu Ke, Qiong-Wen Fan
MiniDMAIC: An Approach for Causal Analysis and Resolution in Software Development Projects

Handling problems and defects in software development projects is still a difficult matter in many organizations. The problem analyses, when performed, usually do not focus on the problems’ sources and root causes. As a result, bad decisions are made, and the problem is not solved or can even be aggravated by rework, dissatisfaction, and cost increases due to lack of quality. These difficulties make it hard for organizations that adopt the CMMI model to implement the Causal Analysis and Resolution (CAR) process area in software projects, as projects usually have to deal with very limited resources. In this context, this work proposes an approach, called MiniDMAIC, for analyzing and resolving defect and problem causes in software development projects. The MiniDMAIC approach is based on Six Sigma’s DMAIC methodology and the Causal Analysis and Resolution process area from CMMI Level 5.

Márcia G. S. Gonçalves
Light Vehicle Event Data Recorder Forensics

While traffic crash reconstruction focuses primarily on interpreting physical evidence, the proper generation and preservation of digital data from Event Data Recorders (EDRs) can provide invaluable evidence to crash reconstruction analysts. However, data collected from the EDR can be difficult to use and authenticate, as exemplified through the analysis of a General Motors 2001 Sensing and Diagnostic Module (SDM). Fortunately, advances in the digital forensics field and memory technology can be applied to EDR analysis in order to provide more complete and usable results. This paper presents a developmental model for EDR forensics, centered on the use of existing digital forensic techniques to preserve digital information stored in automobile event data recorders.

Jeremy S. Daily, Nathan Singleton, Beth Downing, Gavin W. Manes
Research of Network Control Systems with Competing Access to the Transfer Channel

An original method for modelling a network control system as a continuous-discrete system with random structure is presented in this article. The method makes it possible to take into account the stochastic character of data transfer via the channel, to determine the stability regions of control systems, and to choose the most effective algorithms for controlling distributed technological systems. Moreover, the application of this method to the analysis and calculation of the stability regions of network control systems with competing access to the data transfer channel is demonstrated using the example of standard Ethernet.

G. V. Abramov, A. E. Emelyanov, M. N. Ivliev
Service-Oriented Context-Awareness and Context-Aware Services

This paper describes the design, implementation, deployment, and performance evaluation of a Service-Oriented Wireless Context-Aware System (SOWCAS), a distributed pervasive system comprising computers, a context-aware middleware server, and mobile clients that supports context-awareness and high adaptation to context changes for distributed heterogeneous indoor and outdoor mobile ubiquitous systems and environments. The client SOWCAS application runs on different mobile platforms and exploits the modern wireless communication facilities provided by state-of-the-art mobile devices. The client architecture is designed and implemented with attention to supportability features (i.e., understandability, maintainability, scalability, and portability) and real-time constraints, using service-oriented and object-oriented design principles and design patterns. The SOWCAS server provides basic and composite services for mobile clients and handles indoor and outdoor context information. The implementations were tested at the Fatih University campus and are presented in the paper as typical mobile context-aware scenarios from the campus life of students and academics.

H. Gümüşkaya, M. V. Nural
Autonomous Classification via Self-Formation of Collections in AuInSys

This paper describes a mechanism, called SIF, for autonomous and unsupervised hierarchical classification of information items in AuInSys, a pro-interactive information system in which each information element is represented by a compact software agent. The classification is carried out through the self-organization of information items, which form collections incrementally. The information items compete with each other through the self-interactions of their agents, which leads to the incremental grouping of items into collections. The proposed mechanism provides automatic, more flexible, and dynamic partitioning of information compared with that used in traditional file systems.

Hanh H. Pham
Grid Computing Implementation in Ad Hoc Networks

The development of ubiquitous computing and mobility opens challenges for the implementation of grid computing in ad hoc network environments. In this paper, a new grid computing implementation for ad hoc networks is proposed. The proposed addition to the ad hoc network protocol suite offers an easy and effective way to exploit the computing power of the network nodes. The model is implemented in the NS-2 network simulator, making it possible to investigate its performance and tune the network grid parameters.

Aksenti Grnarov, Bekim Cilku, Igor Miskovski, Sonja Filiposka, Dimitar Trajanov
One-Channel Audio Source Separation of Convolutive Mixture

Methods based on one-channel audio source separation are more practical than multi-channel ones in real-world applications. In this paper we propose a new method to separate audio signals from a single convolutive mixture. The method operates in the subband domain to blindly segregate the mixture and is composed of three stages. In the first stage, the observed mixture is divided into a finite number of subbands by filtering with a parallel bank of FIR band-pass filters. The second stage employs empirical mode decomposition (EMD) to extract intrinsic mode functions (IMFs) in each subband; we then obtain independent basis vectors by applying principal component analysis (PCA) and independent component analysis (ICA) to the vectors of IMFs in each subband. In the third stage we perform a subband synthesis process to reconstruct the fullband separated signals. Experimental results produced with the proposed technique show that it successfully separates speech and interfering sound from a single mixture.
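Stage one (the analysis filter bank) can be sketched with a windowed-sinc FIR band-pass design. The authors' actual filter design is not specified in the abstract, so the Hamming-windowed sinc and the tap count below are assumptions, not the paper's filters.

```python
import math

def bandpass_taps(f1, f2, numtaps=101):
    # Windowed-sinc FIR band-pass filter; f1, f2 are normalized cutoff
    # frequencies in cycles/sample (0 < f1 < f2 < 0.5). Assumed design,
    # not the one used in the paper.
    m = (numtaps - 1) / 2
    taps = []
    for n in range(numtaps):
        t = n - m
        if t == 0:
            h = 2 * (f2 - f1)
        else:
            h = (math.sin(2 * math.pi * f2 * t)
                 - math.sin(2 * math.pi * f1 * t)) / (math.pi * t)
        # Hamming window to control stopband side lobes
        w = 0.54 - 0.46 * math.cos(2 * math.pi * n / (numtaps - 1))
        taps.append(h * w)
    return taps

def fir_filter(x, taps):
    # Direct-form convolution, output same length as input (zero-padded edges)
    out = []
    for i in range(len(x)):
        acc = 0.0
        for k, h in enumerate(taps):
            if 0 <= i - k < len(x):
                acc += h * x[i - k]
        out.append(acc)
    return out
```

A bank of such filters with adjacent (f1, f2) bands yields the subband signals that the EMD stage would then decompose.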

Jalal Taghia, Jalil Taghia
Extension of Aho-Corasick Algorithm to Detect Injection Attacks

In this paper we propose an extension to the Aho-Corasick algorithm to detect the injection of characters introduced by a malicious attacker. We show how this is achieved without significantly increasing the size of the finite-state pattern matching machine of the Aho-Corasick algorithm. Moreover, the machine detects a match only if the number of stuffed characters is within a specified limit, so that the number of false positives remains low. A comparison of CPU time consumption between the Aho-Corasick algorithm and the proposed algorithm is provided. It was found that the proposed algorithm can outperform Aho-Corasick while ignoring the stuffed characters and detecting a valid match.
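The baseline machine the paper extends can be sketched as follows. This is a standard Aho-Corasick implementation (goto, fail, and output functions), not the authors' stuffed-character extension, which additionally tolerates a bounded number of injected characters between pattern symbols.

```python
from collections import deque

def build_automaton(patterns):
    # goto function as a list of per-state dicts; fail links; output sets
    goto = [{}]
    out = [set()]
    for pat in patterns:
        s = 0
        for ch in pat:
            if ch not in goto[s]:
                goto.append({})
                out.append(set())
                goto[s][ch] = len(goto) - 1
            s = goto[s][ch]
        out[s].add(pat)
    fail = [0] * len(goto)
    q = deque(goto[0].values())          # depth-1 states fail to the root
    while q:
        s = q.popleft()
        for ch, t in goto[s].items():
            q.append(t)
            f = fail[s]
            while f and ch not in goto[f]:
                f = fail[f]
            fail[t] = goto[f].get(ch, 0)
            out[t] |= out[fail[t]]       # inherit matches ending at the fail state
    return goto, fail, out

def search(text, patterns):
    # Returns (start_index, pattern) for every occurrence in text
    goto, fail, out = build_automaton(patterns)
    s = 0
    hits = []
    for i, ch in enumerate(text):
        while s and ch not in goto[s]:
            s = fail[s]
        s = goto[s].get(ch, 0)
        for pat in out[s]:
            hits.append((i - len(pat) + 1, pat))
    return hits
```

For example, `search("ushers", ["he", "she", "his", "hers"])` reports "she", "he", and "hers" in a single pass over the text.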

Jalel Rejeb, Mahalakshmi Srinivasan
Use of Computer Vision during the process of Quality Control in the classification of grain

In this work, the data collected result from the development of a computer vision system used during the quality control stage of the process of classifying grain sizes. This is important because the machines may need to be adjusted if produce samples are found to have differing percentages of grain quality during the process. The study focused on the machine process involved in the production of rice for small and medium-sized companies, referred to in this study as S&MC. The production process comprises several stages; the first is the acquisition of images of the grains of rice, which are then condensed to obtain better results in the segmentation process. The segmentation process uses images that seek only the outline of the grains. These images were developed further, and algorithms were programmed to segment the whole and three-quarter-size rice grains, using characterization and classification techniques based on the morphological properties of the grains. Finally, 90.92% of the analyzed images were segmented and classified correctly.

Rosas Salazar Juan Manuel, Guzmán Ruiz Mariana, Valdes Marrero Manuel Alejandro
Theoretical Perspectives for E-Services Acceptance Model

Web-based e-service user adoption and continuance require an understanding of consumer contextual factors. In this context, an e-service refers to an (interactive) service provided on a website. This paper examines specific factors such as user experience and perception, motivation, support, control, and usage frequency in the online/offline user task environment, and their effect on web-based e-service adoption and continuance.

Kamaljeet Sandhu
E-Services Acceptance Model (E-SAM)

This paper reports on developing a framework for a Web-based E-Services Acceptance Model (E-SAM). The paper argues that user experience, user motivation, perceived usefulness, and perceived ease of use may influence user acceptance of web-based e-services. The model was examined at a university where students and staff use services that have moved from a paper-based to a web-based e-service system. The findings from the data analysis suggest that, in user acceptance of web-based e-services, user experience is strongly related to perceived ease of use, and perceived usefulness to user motivation. The bonding of these variables, and the lack of strong relationships between the other components, suggest the model's applicability to e-services and highlight a research framework.

Kamaljeet Sandhu
Factors for E-Services System Acceptance: A Multivariate Analysis

This study investigates factors that influence the acceptance and use of e-Services. The research model includes factors such as user experience, user motivation, perceived usefulness and perceived ease of use in explaining the process of e-Services acceptance, use and continued use. The two core variables of the Technology Acceptance Model (TAM), perceived usefulness and perceived ease of use, are integrated into the Electronic Services Acceptance Model (E-SAM).

Kamaljeet Sandhu
A Qualitative Approach to E-Services System Development

This paper reports a case study of user activities in electronic services systems development. The findings of this study explain the acceptance of an e-services system being implemented for processing students' admission applications on a university website. The users' interaction with the systems development process provides useful information on the characteristics of e-services. The ease-of-use construct in doing the task is an important feature of e-services; the other attributes are the user control characteristics of an e-services system.

Kamaljeet Sandhu
Construction of Group Rules for VLSI Application

Cellular automata (CA) have been applied to building parallel processing systems, cryptographic hashing functions, and VLSI technology. Recently, Das et al. reported a characterization of reachable/non-reachable CA states. Their algorithm offers this characterization only under a null boundary CA (NBCA). For simplicity of hardware implementation, however, a periodic boundary CA (PBCA) is suitable for constructing cost-effective structures such as a linear feedback shift register (LFSR) because of its circular property. Kang et al. therefore provided an algorithm for deciding whether a CA is a group CA under a periodic boundary condition, but they did not provide an algorithm for constructing group CA rules. Thus, this paper presents an efficient evolutionary algorithm for constructing group CA rules under a periodic boundary condition. We expect that the proposed algorithm will be useful in a variety of applications, including cryptography and VLSI technology.
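To make the group-CA notion concrete: under a periodic boundary, a rule is a group (reversible) rule exactly when its global update map is a bijection on the set of configurations. The brute-force check below, for elementary CA on small lattices, only illustrates that definition; it is not the paper's evolutionary construction algorithm.

```python
from itertools import product

def step(cells, rule):
    # One synchronous update of an elementary CA with a periodic boundary.
    # `rule` uses Wolfram numbering: bit index = 4*left + 2*center + right.
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = 4 * left + 2 * center + right
        out.append((rule >> idx) & 1)
    return tuple(out)

def is_group_rule(rule, n):
    # A rule is a group CA on n cells iff its global map is a bijection,
    # i.e. all 2**n configurations have distinct images.
    images = {step(c, rule) for c in product((0, 1), repeat=n)}
    return len(images) == 2 ** n
```

For example, rule 204 (identity) and rule 170 (shift) are group rules for any lattice size, while rule 0 maps everything to the zero configuration and is not.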

Byung-Heon Kang, Dong-Ho Lee, Chun-Pyo Hong
Implementation of an Automated Single Camera Object Tracking System Using Frame Differencing and Dynamic Template Matching

In the present work the concepts of dynamic template matching and frame differencing have been used to implement a robust automated single-object tracking system. In this implementation a monochrome industrial camera is used to grab the video frames and track a moving object. Using frame differencing on a frame-by-frame basis, a moving object, if any, is detected with high accuracy and efficiency. Once the object has been detected, it is tracked with an efficient template matching algorithm. The templates used for matching are generated dynamically, which ensures that a change in the pose of the object does not hinder the tracking procedure. To automate the tracking process, the camera is mounted on a pan-tilt arrangement synchronized with the tracking algorithm. When the object being tracked moves out of the viewing range of the camera, the pan-tilt setup is automatically adjusted to move the camera so as to keep the object in view. Here the camera motion is limited by the speed of the stepper motors which drive the pan-tilt setup. The system is capable of handling the entry and exit of an object. Such a tracking system is cost-effective; it can be used as an automated video conferencing system and also has application as a surveillance tool.
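The frame-differencing step can be illustrated with a minimal sketch: threshold the pixel-wise absolute difference of consecutive frames and take the bounding box of the changed region. This is a toy version on plain 2-D lists of grayscale values; the paper's camera control and template matching are not reproduced.

```python
def frame_difference(prev, curr, thresh=25):
    # Pixel-wise absolute difference, thresholded to a binary motion mask.
    # `thresh` is an assumed noise threshold, not a value from the paper.
    h, w = len(curr), len(curr[0])
    return [[1 if abs(curr[y][x] - prev[y][x]) > thresh else 0
             for x in range(w)] for y in range(h)]

def motion_bbox(mask):
    # Bounding box (x0, y0, x1, y1) of moving pixels, or None if no motion;
    # this region would seed the dynamically generated template.
    pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
    if not pts:
        return None
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    return min(xs), min(ys), max(xs), max(ys)
```

In a full tracker, the box returned here would be cropped from the current frame and used as the next matching template.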

Karan Gupta, Anjali V. Kulkarni
A Model for Prevention of Software Piracy through Secure Distribution

Although many anti-piracy methods are prevalent in the software market today, software piracy is constantly on the rise. This can be attributed to the fact that these methods (a) are very expensive to implement, (b) are easy to defeat, or (c) are simply not very convenient to use. Anti-piracy methods usually follow a lock-and-key policy, in which the software to be protected is locked using some encryption method (the lock), and this lock requires a key to be unlocked. In software parlance, the key is called the registration code and the mechanism the registration mechanism. The registration mechanism may be implemented in many ways: in software, in hardware, or in a combination of both. The way it is implemented makes the protection scheme more or less vulnerable to piracy, and also affects how convenient the software is to use. Some mechanisms are very convenient for the user but more vulnerable to piracy; others are least vulnerable to piracy but very inconvenient for the user. Different vendors choose different ways to implement the lock-and-unlock mechanism to make their software as hard as possible to pirate. In this paper, we discuss several anti-piracy methods and the security issues related to them, and present a model that allows the prevention of software piracy through a secure mechanism for software distribution.

Vaddadi P. Chandu, Karandeep Singh, Ravi Baskaran
Performance Enhancement of CAST-128 Algorithm by modifying its function

There has been tremendous development in the field of cryptography, which manipulates plain text so that it becomes unreadable and less vulnerable to hackers and crackers. In this regard, we have made an attempt to modify the CAST-128 algorithm, a secret-key block cipher, so as to enhance its performance by modifying its function; the result is not only secure but also reduces the total time taken for encryption and decryption. This paper attempts to improve performance without violating the memory requirements, security, and simplicity of the existing CAST-128 algorithm. The proposed modification is limited to a change in the implementation of the function F of CAST-128's Feistel network. Because the change in the total time taken for encryption and decryption cannot be observed in a software implementation, we have implemented a VHDL application to show the differences in delay.

Krishnamurthy G.N, Ramaswamy V, Leela G.H, Ashalatha M.E
A Concatenative Synthesis Based Speech Synthesiser for Hindi

This paper presents a speech synthesis system based on hidden Markov models and decision trees, following the approach generally implemented for speech recognition systems. A front-end analysis of the database is performed, and the derived feature vectors are modeled with phonetic hidden Markov models, which are then clustered using a decision-tree approach that models the context of the phones being considered. The individual phone segments form an acoustic leaf sequence which divides the various contexts into equivalence classes. During synthesis, the phone sequence to be synthesised is translated into an acoustic leaf sequence by posing the questions associated with different nodes of the decision tree to the immediate context of each phone. The sequence of terminal nodes thus obtained is concatenated to obtain the desired pronunciation.

Kshitij Gupta
Legibility on a Podcast: Color and typefaces

As in printed matter, black text on a white background is the most legible color combination on small computer screens such as media players.

Lennart Strand
The sensing mechanism and the response simulation of the MIS hydrogen sensor

The Pd0.96Cr0.04 alloy-gated metal-insulator-semiconductor (MIS) hydrogen sensor has been studied. A new sensing mechanism is proposed in which the sensor response is related to protons. The capacitance-voltage (CV) curve, the conductance-voltage (GV) curve, and the sensor response are simulated.

Linfeng Zhang, Erik McCullen, Lajos Rimai, K. Y. Simon Ng, Ratna Naik, Gregory Auner
Visual Extrapolation of Linear and Nonlinear Trends: Does the Knowledge of Underlying Trend Type Affect Accuracy and Response Bias?

The purpose of these experiments was to examine the ability of experienced and inexperienced participants to predict future curve points on time-series graphs. To model real-world data, the graphed data represented different underlying trends and included different sample sizes and levels of variability. Six trends (increasing and decreasing linear, exponential, and asymptotic) were presented on four graph types (histogram, line graph, scatterplot, suspended bar graph). The overall goal was to determine which types of graphs lead to better extrapolation accuracy. Participants viewed graphs on a computer screen and extrapolated the next data point in the series. Results indicated higher accuracy when variability was low and sample size was high. Extrapolation accuracy was higher for asymptotic and linear trends presented on scatterplots and histograms. Interestingly, although inexperienced participants made the expected underestimation errors, participants who were aware of the types of trends they would be presented with made overestimation errors.

Lisa A. Best
Resource Discovery and Selection for Large Scale Query Optimization in a Grid Environment

Current peer-to-peer (P2P) methods employing distributed hash tables (DHTs) for resource discovery in a Grid environment suffer from two problems: (i) the risk of network congestion due to the messages sent to update data on resources, and (ii) the risk of the churn effect if a large number of nodes want to update their data at the same time. These problems pose major challenges in a large-scale dynamic Grid environment. In this paper we propose a method of resource discovery and selection for large-scale query optimization in a Grid environment. The resource discovery extends the P2P system Pastry; DHTs are used to store only static data on resources, while the dynamic properties of these resources are retrieved during resource selection by a monitoring tool. The first contribution of our approach is to delay the monitoring of resources to the resource selection phase, which avoids the global monitoring of the system that is often employed in current resource discovery systems. Second, the method is executed in a decentralized way in order to help the database optimizer better discover and select resources.

Mahmoud El Samad, Abdelkader Hameurlain, Franck Morvan
Protecting Medical Images with Biometric Information

Medical images are private to the doctor and patient, so digital medical images should be protected against unauthorized viewers. One way to protect them is to use cryptography to encrypt the images. This paper proposes a method for encrypting medical images with a traditional symmetric cryptosystem, using biometrics to protect the cryptographic key. Both the encrypted image and the cryptographic key can be transmitted securely over public networks, and only the person who owns the biometric information used in the key protection can decrypt the medical image.
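The overall encrypt/transmit/decrypt flow can be sketched with a toy symmetric stream cipher (SHA-256 used in counter mode as a keystream generator). This is only an illustration of symmetric encryption of image bytes; it is not the cryptosystem or the biometric key-protection scheme used by the authors, and it is not a production-grade cipher.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Derive n pseudo-random bytes from the key via SHA-256 in counter mode
    out = bytearray()
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:n])

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR stream cipher: the same call both encrypts and decrypts
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))
```

In the paper's setting, `key` would itself be recovered only by presenting the enrolled biometric; here it is simply a byte string.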

Marcelo Fornazin, Danilo B.S. Netto Jr., Marcos Antonio Cavenaghi, Aparecido N. Marana
Power Efficiency Profile Evaluation for Wireless Communication Applications

With the advances in semiconductor technologies, wireless communication applications are an emerging class of applications on battery-powered devices. The multitude and complexity of components that implement a large spectrum of communication protocols on these devices require closer evaluation and understanding with respect to power efficiency. As there is very little analysis of the relationship between wireless-specific components and their power profile in the context of mobile wireless communication, we investigate the landscape of wireless communication efficiency methods. We then apply power efficiency metrics to different wireless systems and analyze the experimental data obtained through a monitoring application. The evaluation starts with different wireless communication systems (WLAN, Bluetooth, GSM) and then analyzes power consumption and efficiency in several scenarios on the WLAN communication system.

Marius Marcu, Dacian Tudor, Sebastian Fuicu
Closing the Gap between Enterprise Models and Service-oriented Architectures

In 2005, four German universities created a research program to improve and standardize their administrative processes. To this end, a large number of processes were analyzed and core functions identified. As a result, automatable core functions have been implemented as web services using the service-oriented architecture (SOA) paradigm. To facilitate reuse, this functionality has been documented in a service catalog. However, the real advantage of SOA does not emerge until these services become configurable at the business level. We introduce a modeling grammar to model service function interfaces and standardized business object types, allowing the redesign of enterprise system functionality at the business level. We illustrate the usefulness of our approach with a pilot study conducted at the University of Münster (Germany).

Martin Juhrisch, Werner Esswein
Object Normalization as Contribution to the area of Formal Methods of Object-Oriented Database Design

This article gives an overview of the current status of formal techniques for object database design. It discusses why relational normalization cannot easily be used in object databases, introduces various proposals for object normal forms, and presents the authors' own evaluation of this area.

Vojtěch Merunka, Martin Molhanec
A Novel Security Schema for Distributed File Systems

Distributed file systems are key tools that enable collaboration in any environment consisting of more than one computer. Security is a term that covers several concepts related to protecting data. Authentication refers to identifying the parties involved in using or providing file services. Authorization is the process of controlling actions in the system, both at the user level, such as reading and writing, and at the administrative level, such as setting quotas and moving data between servers. Communication security addresses the integrity and privacy of messages exchanged over networks vulnerable to attack. Storage security concerns the safety of data "at rest" on disks or other devices. These, however, are mechanistic approaches to security; at a higher level, the trust of all parties in each other and in the system components controls the security decisions. This paper presents an approach to security in distributed file systems; its main objective is to provide in-depth knowledge of the base concepts of distributed file systems, such as fault tolerance, fault recovery, scalability, and, specifically, security.

Bager Zarei, Mehdi Asadi, Saeed Nourizadeh, Shapour Jodi Begdillo
A Fingerprint Method for Scientific Data Verification

This article discusses an algorithm (called "UNF") for verifying digital data matrices. The algorithm is now used in a number of software packages and digital library projects. We discuss the details of the algorithm and offer an extension for the normalization of time and duration data.
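The normalize-then-hash idea behind such fingerprints can be sketched as follows. This is a simplified illustration, not the actual UNF specification (which prescribes its own rounding, encoding, and digest rules): each value is rounded to a fixed number of significant digits, rendered in a canonical form, and the joined canonical forms are hashed.

```python
import hashlib

def normalize(value, digits=7):
    # Canonicalize a numeric value: round to `digits` significant figures
    # and render in a fixed exponential notation, so that numerically equal
    # data produce identical strings regardless of representation.
    if value == 0:
        return "+0.e+0"
    return "%+.*e" % (digits - 1, value)

def fingerprint(column, digits=7):
    # Hash the newline-joined canonical forms of a data column
    canon = "\n".join(normalize(v, digits) for v in column)
    return hashlib.sha256(canon.encode("utf-8")).hexdigest()
```

Because normalization happens before hashing, a column stored as `0.1 + 0.2` and one stored as `0.3` yield the same fingerprint, while genuinely different data do not.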

Micah Altman
Mobile Technologies in Requirements Engineering

This paper presents the current capabilities of mobile technologies to support the requirements engineering phase of software development projects. After a short insight into the state of the art of requirements engineering, different areas of application of mobile requirements engineering processes and tools, such as Arena-M or the Mobile Scenario Presenter, are presented. The paper concludes with a critical assessment of the benefit of current mobile tools in real-life requirements engineering.

Gunnar Kurtz, Michael Geisser, Tobias Hildenbrand, Thomas Kude
Unsupervised Color Textured Image Segmentation Using Cluster Ensembles and MRF Model

We propose a novel, robust, unsupervised color image content understanding approach that segments a color image into its constituent parts automatically. The aim of this work is to produce precise segmentation of color images using color and texture information along with neighborhood relationships among image pixels, which provides more accurate segmentation. Here, unsupervised means the automatic discovery of classes or clusters in images rather than the generation of class or cluster descriptions from training image sets. Specifically, we implement a robust unsupervised SVFM-model-based color medical image segmentation tool using cluster ensembles and an MRF model, along with wavelet transforms for increasing the content sensitivity of the segmentation model. Cluster ensembles are also utilized as a robust technique for finding the number of components in an image automatically. A statistical model-based approach estimates the maximum a posteriori (MAP) solution to identify the different objects/components in a color image. The approach utilizes a Markov Random Field model to capture the relationships among neighboring pixels and integrates that information into the Expectation-Maximization (EM) model-fitting MAP algorithm. The algorithm simultaneously calculates the model parameters and segments the pixels iteratively in an interleaved manner, converging to a solution where the model parameters and pixel labels are stabilized within a specified criterion. The experimental results reveal that the proposed tool finds the correct number of objects or components in a color image and produces more accurate and faithful segmentation. Finally, we compare our results with another well-known segmentation approach.

Mofakharul Islam, John Yearwood, Peter Vamplew
An Efficient Storage and Retrieval Technique for Documents Using Semantic Document Segmentation (SDS) Approach

Today, many institutions and organizations face serious problems due to the tremendously increasing number and size of documents, which in turn creates storage and retrieval problems because of continuously growing space and efficiency requirements. The problem becomes more complex as the number and size of documents in an organization grow, so there is a growing demand for more efficient storage and retrieval techniques for electronic documents. Various techniques reported in the literature by different investigators attempt to solve this problem to some extent, but most still face efficiency problems when the number and size of documents increase rapidly, and efficiency degrades further when the documents in a system are reorganized and stored again. Handling these problems requires special, efficient storage and retrieval techniques for this type of information retrieval (IR) system. In this paper, we present an efficient storage and retrieval technique for electronic documents based on the Semantic Document Segmentation (SDS) approach. This approach minimizes the storage size of the documents and, as a result, makes retrieval efficient.

Mohammad A. ALGhalayini, ELQasem ALNemah
A New Persian/Arabic Text Steganography Using “La” Word

With the expansion of communication, there is sometimes a need for hidden communication. Steganography is one of the methods used for the hidden exchange of information: it hides information within a cover medium such as an image or sound. In this paper a new method for steganography in Persian and Arabic texts is presented. We use the special form of the "La" word, created by connecting the "Lam" and "Alef" characters, for hiding the data. To hide bit 0 we use the normal form of the word "La" ("لـا"), produced by inserting the Arabic extension character between "Lam" and "Alef". To hide bit 1 we use the special form of the word "La" ("لا"), which has a unique code in Unicode standard texts: FEFB in Unicode hex notation. This method is not limited to electronic documents (e-documents) and can also be used on printed documents. The approach can be categorized among feature coding methods.
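The encoding described above is concrete enough to sketch. The snippet below assumes, for simplicity, that the hidden bits are carried by a plain stream of "La" words separated by spaces; in a real cover text these words would appear among ordinary words, and the decoder would skip everything else just as the `else` branch does here.

```python
LAM, ALEF, TATWEEL = "\u0644", "\u0627", "\u0640"
LA_NORMAL = LAM + TATWEEL + ALEF   # Lam + extension character + Alef: hides bit 0
LA_LIGATURE = "\uFEFB"             # the unique Lam-Alef ligature code: hides bit 1

def embed(bits):
    # Emit one "La" word per hidden bit
    return " ".join(LA_LIGATURE if b else LA_NORMAL for b in bits)

def extract(text):
    # Scan the text, recovering one bit per "La" word; other characters carry no payload
    bits = []
    i = 0
    while i < len(text):
        if text[i] == LA_LIGATURE:
            bits.append(1)
            i += 1
        elif text.startswith(LA_NORMAL, i):
            bits.append(0)
            i += len(LA_NORMAL)
        else:
            i += 1
    return bits
```

Both forms render visually as the same word, which is what makes the channel inconspicuous.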

Mohammad Shirali-Shahreza
GTRSSN: Gaussian Trust and Reputation System for Sensor Networks

This paper introduces a new Gaussian trust and reputation system for wireless sensor networks, based on sensed continuous events, to address security issues and to deal with malicious and unreliable nodes. It represents a new approach to calculating trust between sensor nodes based on their own sensed data and the data reported by surrounding nodes. It addresses the trust issue using continuously sensed data, which differs from all other approaches, which address the issue from a communications and binary point of view.

Mohammad Momani, Subhash Challa
Fuzzy Round Robin CPU Scheduling (FRRCS) Algorithm

Scheduling determines when, and on which processor, each process runs. The scheduling objectives are achieving high processor utilization, minimizing average (or maximum) response time, maximizing system throughput, and being fair (avoiding process starvation). Many scheduling algorithms have been introduced to optimize processor utilization; one of them is Round Robin CPU scheduling (RRCS). We optimize RR using the concept of a fuzzy rule-based system and introduce a new algorithm called fuzzy rule-based round robin CPU scheduling (FRRCS). The rules are extracted from the knowledge of experts and are experimental. FRRCS is compared with RRCS, and it is shown that the FRRCS algorithm improves the waiting and response times.
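As a baseline for such comparisons, plain round-robin waiting times can be computed with a short simulation. The sketch assumes all processes arrive at t = 0; the fuzzy rule base that FRRCS uses to adapt scheduling is the authors' expert knowledge and is not reproduced here.

```python
from collections import deque

def round_robin_waits(bursts, quantum):
    # Simulate round-robin scheduling for processes arriving at t = 0.
    # Returns the per-process waiting time (finish - burst for zero arrival).
    remaining = list(bursts)
    ready = deque(range(len(bursts)))
    t = 0
    finish = [0] * len(bursts)
    while ready:
        p = ready.popleft()
        run = min(quantum, remaining[p])   # run for one quantum or until done
        t += run
        remaining[p] -= run
        if remaining[p] > 0:
            ready.append(p)                # preempted: back to the tail of the queue
        else:
            finish[p] = t
    return [finish[p] - bursts[p] for p in range(len(bursts))]
```

For bursts [5, 3, 1] and quantum 2 this yields waiting times [4, 5, 4]; a fuzzy variant would tune the quantum per process instead of keeping it fixed.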

M.H. Zahedi, M. Ghazizadeh, M. Naghibzadeh
Fuzzy Expert System In Determining Hadith Validity

There is a theoretical framework in Islamic science, named "Hadith science", that helps to distinguish a valid Hadith from an invalid one. In addition to Hadith science, "Rejal science" concentrates on examining the characters of those who narrated the Hadith. Together, these sciences can help prove the validity of a Hadith. The main objective of this paper is to determine the degree of validity of a Hadith through a fuzzy system with respect to several parameters. According to expert viewpoints, the knowledge base was designed and the essential rules extracted; the system was then implemented using expert system software. Samples taken from "KAFI", volume 1, were then inserted into the database to be assessed by means of documentary information. The results deduced from our expert system were compared with expert viewpoints; the comparison shows that our system was correct in 94% of cases.

M. Ghazizadeh, M.H. Zahedi, M. Kahani, B. Minaei Bidgoli
An Investigation into the Performance of General Sorting on Graphics Processing Units

Sorting is a fundamental operation in computing, and there is a constant need to push the boundaries of performance with different sorting algorithms. With the advent of the programmable graphics pipeline, the parallel nature of graphics processing units has been exposed, allowing programmers to take advantage of it. By transforming the way that data is represented and operated on, parallel sorting algorithms can be implemented on graphics processing units where previously only graphics processing could be performed. This programming paradigm offers potentially large speedups for such algorithms.
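One classic GPU-friendly choice is a sorting network such as bitonic sort, whose inner passes consist of independent compare-exchange operations. Whether the paper uses this particular network is an assumption; the sketch below shows the sequential form, and a GPU version would execute each inner `for i` pass as one parallel step.

```python
def bitonic_sort(a):
    # In-place bitonic sorting network; len(a) must be a power of two.
    # All compare-exchanges within one (k, j) pass are independent of each
    # other, which is what maps well onto GPU fragment/compute stages.
    n = len(a)
    assert n & (n - 1) == 0, "length must be a power of two"
    k = 2
    while k <= n:
        j = k // 2
        while j > 0:
            for i in range(n):
                l = i ^ j
                if l > i:
                    ascending = (i & k) == 0
                    if (a[i] > a[l]) == ascending:
                        a[i], a[l] = a[l], a[i]
            j //= 2
        k *= 2
    return a
```

The fixed, data-independent comparison pattern is the key property: unlike quicksort, the network's structure does not depend on the values being sorted.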

Nick Pilkington, Barry Irwin
An Analysis of Effort Variance in Software Maintenance Projects

Quantitative project management, understanding process variations, and improving overall process capability are fundamental aspects of process improvement and are now strongly propagated by all best-practice models of process improvement. Organizations are moving to the next level of quantitative management, where empirical methods are used to establish process predictability, thus enabling better project planning and management. In this paper we use empirical methods to analyze effort variance in software maintenance projects. The effort variance model established was used to identify process improvements and baseline performance.

Nita Sarang, Mukund A Sanglikar
Design of Adaptive Neural Network Frequency Controller for Performance Improvement of an Isolated Thermal Power System

An adaptive neural network control scheme for a thermal power system is described. The neural network control scheme does not require off-line training. The online tuning algorithm and neural network architecture are described, and a stability proof is given. The performance of the controller is illustrated via simulation for different changes in process parameters and for different disturbances, and is compared with a conventional proportional-integral control scheme for frequency control in thermal power systems.

Ognjen Kuljaca, Jyotirmay Gadewadikar, Kwabena Agyepong
Comparing PMBOK and Agile Project Management software development processes

The objective of this article is to compare the generic set of project management processes defined in the Project Management Body of Knowledge (PMBOK) with a number of agile project management processes. PMBOK, developed by the Project Management Institute, is structured around five process groups (initiating, planning, executing, controlling, and closing) and nine knowledge areas (integration, scope, time, cost, quality, human resource, communication, risk, and procurement management). Agile software project management, on the other hand, is based on the following principles: embrace change, focus on customer value, deliver functionality incrementally, collaborate, and reflect and learn continuously. The purpose of the comparison is to identify gaps, differences, and discrepancies. The result is that agile project management methodologies cannot be considered complete from the traditional project management point of view, since a number of processes are either missing or not described explicitly.

P. Fitsilis
An Expert System for Diagnosing Heavy-Duty Diesel Engine faults

Heavy-Duty Diesel Engines (HDDEs) support critical and high-cost services, so failure of such engines can have serious economic and health impacts. Diagnosis is necessary both during preventive maintenance and when an engine has failed. Because of their complexity, HDDEs require high expertise for fault diagnosis, and such expertise is in many cases scarce or simply unavailable. Current computerized tools for diagnosing HDDEs are tied to a particular manufacturer's products. In addition, most of them lack the functionality required to assist inexperienced technicians in completely diagnosing and repairing HDDE faults, because most can only identify faults and are not able to recommend corrective action. This paper presents an easy-to-use expert system for diagnosing HDDE faults based on Bayesian network technology. Using Bayesian networks simplified the modeling of the complex process of diagnosing HDDEs. Moreover, it enabled us to capture the uncertainty associated with engine diagnosis and to incorporate learning capabilities in the expert system.

Peter Nabende, Tom Wanyama
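
The diagnostic idea can be sketched with a toy Bayesian network; the fault, symptom and all probabilities below are invented for illustration and are not taken from the paper:

```python
# Hypothetical two-node Bayesian network: a fault F causes a symptom S
# (e.g., "injector failure" -> "white exhaust smoke"). The prior and the
# conditional probability table are illustrative numbers only.
p_fault = 0.05                       # P(F = true)
p_symptom_given = {True: 0.90,       # P(S = true | F = true)
                   False: 0.10}      # P(S = true | F = false)

def posterior_fault(symptom_observed: bool) -> float:
    """P(F | S) by exact enumeration (Bayes' rule)."""
    like_t = p_symptom_given[True] if symptom_observed else 1 - p_symptom_given[True]
    like_f = p_symptom_given[False] if symptom_observed else 1 - p_symptom_given[False]
    num = like_t * p_fault
    return num / (num + like_f * (1 - p_fault))
```

Observing the symptom raises the fault's posterior well above its prior, which is the kind of evidence-driven update that makes Bayesian networks suited to uncertain diagnosis.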
Interactive Visualization of Data-Oriented XML Documents

A great deal of data is stored and exchanged in XML nowadays. However, understanding the information included in large data-oriented XML instances may not be as simple as it is for document-oriented XML instances. Moreover, extracting the desired knowledge from complex XML data is hard or even impossible.

Although we propose two solutions for simple presentation and further analysis of the information, the main goal of the paper is to obtain feedback from researchers and professionals who need to deal with large data-oriented XML, in a discussion on data visualization, modeling and understanding.

Petr Chmelar, Radim Hernych, Daniel Kubicek
Issues in Simulation for Valuing Long-Term Forwards

This paper explores valuing long-term equity forward contracts or futures where both the underlying volatility and the interest rates are modeled as stochastic random variables. For each future, the underlying is an equity or index with no dividends. A key computational question we wish to understand is the relationship between (1) stochastically modeling the interest rates using different interest rate models while holding the volatility fixed and (2) stochastically modeling the underlying volatility using a volatility model while holding the interest rates fixed. In other words, let a "pure-X model" be a futures model where X varies stochastically while all else is fixed. Given a single future to model, this paper works towards understanding when a pure-interest-rate model is equivalent to a pure-volatility model. This paper is focused on simulation and modeling issues and does not offer economic interpretation.

Phillip G. Bradford, Alina Olteanu
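
A minimal "pure-interest-rate" simulation in the paper's sense might look as follows; the Vasicek short-rate dynamics and every parameter value are assumptions chosen only to illustrate the setup, not the authors' models:

```python
import math
import random

def simulate_forward_price(s0, sigma, r0, kappa, theta, rate_vol, T, steps, paths, seed=0):
    """Monte Carlo estimate of E[S_T] in a 'pure-interest-rate' setting:
    the equity volatility sigma is held fixed while the short rate follows
    a Vasicek process dr = kappa*(theta - r) dt + rate_vol dW."""
    rng = random.Random(seed)
    dt = T / steps
    total = 0.0
    for _ in range(paths):
        s, r = s0, r0
        for _ in range(steps):
            z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
            # Equity follows GBM whose drift is the prevailing stochastic short rate.
            s *= math.exp((r - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z1)
            # Mean-reverting short-rate step.
            r += kappa * (theta - r) * dt + rate_vol * math.sqrt(dt) * z2
        total += s
    return total / paths
```

Setting `rate_vol` to zero freezes the rate, so the estimate should recover the deterministic-rate forward price `s0 * exp(r0 * T)`, a useful sanity check before turning the stochastic pieces on one at a time.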
A Model for Mobile Television Applications Based on Verbal Decision Analysis

The emergence of Digital Television (DTV) brings the need to select an interface to be used in digital television for mobile devices. Many candidate solutions for interactive applications are possible. Usability specialists therefore prototyped interfaces that were analyzed according to users' preferences through usability tests, considering criteria and alternatives classified in accordance with users' preferences, obtained from a ranking modeled by verbal decision analysis. Results revealed that the prominence of an application's functions strongly influences ease of navigation.

Isabelle Tamanini, Thais C. Sampaio Machado, Marília Soares Mendes, Ana Lisse Carvalho, Maria Elizabeth S. Furtado, Plácido R. Pinheiro
Gene Selection for Predicting Survival Outcomes of Cancer Patients in Microarray Studies

In this paper, we introduce a multivariate approach for selecting genes for predicting survival outcomes of cancer patients in gene expression microarray studies. Combined with survival analysis for gene filtering, the method makes full use of each individual's survival information (both censored and uncensored) in selecting informative genes for survival outcome prediction. Application of our method to published data on epithelial ovarian cancer has identified genes that discriminate unfavorable and favorable outcomes with high significance (χ² = 21.933, p = 3e-06). The method can also be generalized to categorical variables for selecting gene expression signatures for predicting tumor metastasis or tumor subtypes.

Q Tan, M Thomassen, KM Jochumsen, O Mogensen, K Christensen, TA Kruse
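
As a quick sanity check, the reported p-value is consistent with a chi-square statistic on one degree of freedom (an assumption, since the abstract does not state the degrees of freedom); for df = 1 the upper-tail probability reduces to the complementary error function:

```python
import math

def chi2_pvalue_df1(x):
    """Upper-tail p-value of a chi-square statistic with 1 degree of freedom:
    P(X > x) = erfc(sqrt(x / 2))."""
    return math.erfc(math.sqrt(x / 2))

p = chi2_pvalue_df1(21.933)  # close to the 3e-06 reported in the abstract
```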
Securing XML Web Services by using a Proxy Web Service Model

XML Web Services play a fundamental role in creating distributed, integrated and interoperable solutions across the Internet. The main problem XML Web Services face is the lack of support for security in the Simple Object Access Protocol (SOAP) specification, the vital communication protocol used by XML Web Services. Our research study aims at implementing a proxy-based lightweight approach to providing security in XML Web Services. The approach provides end-to-end message-level security not only in new Web Services but also in those already running, without disturbing them. The SOAP message in a Web Services environment travels over the HTTP protocol, which is not considered secure because messages are transmitted over the network in clear text and can be easily read by protocol sniffers. Therefore, in our project we establish a secure connection between the end user and the Proxy Web Service using the Secure Sockets Layer (SSL), and the connection between the Proxy Web Service and the actual Web Service is also secured using SSL.

Quratul-ain Mahesar, Prof. Dr. Asadullah Shah
O-Chord: A Method for Locating Relational Data Sources in a P2P Environment

Due to the characteristics of Peer-to-Peer systems, SQL query processing in these systems is more complex than in traditional distributed DBMSs. In this context, semantic and structural heterogeneity of local schemas creates a real problem for locating data sources. Schema heterogeneity prevents peers from exchanging their data in a comprehensible way and can lead to incorrect answers to the localization query. In this paper, we propose the O-Chord method, which integrates domain ontology into the Chord protocol [22]. This integration provides comprehensible data exchange while carrying out efficient data source localization. Our proposed method allows the Chord protocol to select the relevant data sources to answer the localization query.

Raddad Al King, Abdelkader Hameurlain, Franck Morvan
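
The Chord substrate that O-Chord extends can be sketched as follows; the ring size, hash choice and function names are illustrative simplifications, not the authors' implementation:

```python
import hashlib

# Minimal sketch of Chord-style key location. Node IDs and keys share one
# m-bit identifier ring; a key is stored on its successor, the first node
# clockwise from the key's hashed position.
M = 8  # toy identifier space of 2**8 positions

def chord_id(name: str) -> int:
    """Hash a node name or keyword onto the identifier ring."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** M)

def successor(node_ids, key_id):
    """First node ID >= key_id on the ring, wrapping around."""
    ring = sorted(node_ids)
    for nid in ring:
        if nid >= key_id:
            return nid
    return ring[0]  # wrap past the largest ID

def locate(node_ids, keyword: str) -> int:
    """Node responsible for a (possibly ontology-normalized) keyword."""
    return successor(node_ids, chord_id(keyword))
```

In the O-Chord setting, the ontology's role would be to map semantically equivalent schema terms to a common keyword before hashing, so heterogeneous peers resolve to the same location.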
Intuitive Interface for the Exploration of Volumetric Datasets

Conventional human-computer interfaces for the exploration of volume datasets employ the mouse as an input device. Specifying an oblique orientation for a cross-sectional plane through the dataset using such interfaces requires an indirect approach involving a combination of actions that must be learned by the user. In this paper we propose a new interface model that aims to provide an intuitive means of orienting and translating a cross-sectional plane through a volume dataset. Our model uses a hand-held rectangular panel that is manipulated by the user in free space, resulting in corresponding manipulations of the cross-sectional plane through the dataset. A basic implementation of the proposed model was evaluated relative to a conventional mouse-based interface in a controlled experiment in which users were asked to find a series of targets within a specially designed volume dataset. The results of the experiment indicate that users experienced significantly less workload and better overall performance using our system.

Rahul Sarkar, Chrishnika de Almeida, Noureen Syed, Sheliza Jamal, Jeff Orchard
New Trends in Cryptography by Quantum Concepts

Communicating the key to the receiver of a cipher message is an important step in any cryptographic system. But the field of cryptanalysis is progressing in pace with cryptography, and newer and faster computers with more and more memory appear every day, increasing the vulnerability of the key distribution process and, in turn, of the secret message. People are therefore exploring totally different areas of technology in search of a secure key distribution system. Quantum cryptography is one such technology with promise for the future. This paper describes issues related to the integrity check of quantum cryptography based on qubits.

S. G. K. Murthy, M. V. Ramana Murthy, P. Ram Kumar
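
A toy simulation of BB84-style key sifting, the canonical quantum key distribution scheme, conveys the flavor; this is a generic sketch, not the integrity-check mechanism the paper proposes:

```python
import random

def bb84_sift(n_qubits, rng):
    """One BB84 round without an eavesdropper: Alice encodes random bits in
    random bases, Bob measures in random bases, and both keep only the
    positions where their bases agree (sifting)."""
    alice_bits  = [rng.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [rng.randint(0, 1) for _ in range(n_qubits)]  # 0=rectilinear, 1=diagonal
    bob_bases   = [rng.randint(0, 1) for _ in range(n_qubits)]
    # A matching basis yields Alice's bit; a mismatched basis yields a
    # uniformly random measurement outcome.
    bob_bits = [b if ab == bb else rng.randint(0, 1)
                for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b)
            for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
            if ab == bb]

key_pairs = bb84_sift(64, random.Random(42))
```

An eavesdropper measuring in random bases would disturb roughly a quarter of the sifted bits, which is what integrity checks on the sifted key are designed to detect.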
On Use of Operation Semantics for Parallel iSCSI Protocol

iSCSI [1, 2] is an emerging communication protocol enabling block data transport over TCP/IP networks. The iSCSI protocol is perceived as a low-cost alternative to the FC protocol for networked storage [8, 9, 10, 11]. This paper uses a storage architecture allowing parallel processing of iSCSI commands and presents two novel techniques for improving the performance of the iSCSI protocol: the first is an elimination technique for reducing latency caused by redundant overwrites, and the second reduces the latency caused by multiple reads to the same storage sector. The performance analysis shows that the use of operation semantics has the potential to improve performance compared to existing iSCSI storage.

Ranjana Singh, Rekha Singhal
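
The overwrite-elimination idea can be sketched as a coalescing pass over a queue of pending commands; the command representation and function below are hypothetical, not the paper's design:

```python
def coalesce(commands):
    """commands: list of ('write', lba, data) or ('read', lba) tuples.
    A later queued write to the same logical block makes earlier queued
    writes to that block redundant, and a read can be served from the
    last queued write to the same block without touching the disk."""
    last_write = {}   # lba -> index in `out` of the surviving write
    out = []
    for op in commands:
        if op[0] == 'write':
            lba = op[1]
            if lba in last_write:          # earlier write is now redundant
                out[last_write[lba]] = None
            last_write[lba] = len(out)
            out.append(op)
        else:                              # read
            lba = op[1]
            if lba in last_write:          # serve from queued data, no I/O
                continue
            out.append(op)
    return [op for op in out if op is not None]
```

Every command dropped here is one less round trip to the storage target, which is where the latency savings come from.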
BlueCard: Mobile Device-Based Authentication and Profile Exchange

In this paper we present a framework for ubiquitous authentication using commodity mobile devices. The solution is intended to be a replacement for the proliferation of physical authentication artifacts that users typically have to carry today. We describe the authentication protocol and its prototypical implementation for a solution designed for the retail industry. We also propose a means of personalizing user–service interactions with embedded user-controlled profiles.

Riddhiman Ghosh, Mohamed Dekhil
Component Based Face Recognition System

The goal of this research was to design a component-based approach to face recognition and show that this technique yields recognition rates of up to 92%. A novel graphical user interface was also developed as part of the research to showcase and control the process of face detection and component extraction and to display the recognition results. The paper essentially consists of two parts, face detection and face recognition. The face detection system takes a given image as input, from which a face is located and extracted using 2D Haar wavelets and Support Vector Machines. The face region is then used to locate and extract the individual components of the face, such as the eyes, eyebrows, lips and nose, which are then sent to the face recognition system, where the individual components are recognized using wavelets, Principal Component Analysis and error backpropagation neural networks. A pattern dimension reduction technique is used to significantly reduce the dimensionality and complexity of the task.

Pavan Kandepet, Roman W. Swiniarski
An MDA-Based Generic Framework to Address Various Aspects of Enterprise Architecture

With a trend toward becoming more and more information based, enterprises constantly attempt to surpass each other's accomplishments by improving their information activities. Building an Enterprise Architecture (EA) undoubtedly serves as a fundamental means to accomplish this goal. An EA typically encompasses an overview of the entire information system in an enterprise, including the software, hardware, and information architectures. Here, we aim to use Model Driven Architecture (MDA) to cover different aspects of Enterprise Architecture. MDA, the most recent de facto standard for software development, has been selected to address EA across multiple hierarchical levels spanning business to IT. Although MDA was not originally intended to contribute in this respect, we enhance its initial scope to take advantage of the facilities provided by this innovative architecture. The presented framework helps developers design and justify completely integrated business and IT systems, resulting in an improved project success rate.

S. Shervin Ostadzadeh, Fereidoon Shams Aliee, S. Arash Ostadzadeh
Simulating VHDL in PSpice Software

This paper describes the use of PSpice to simplify massive complex circuits, through the simulation of a VHDL-designed processor in PSpice software. The properties displayed show how a VHDL-coded program can be converted into a PSpice module. This can be treated as an assessment tool for students, which not only helps them interact with the circuitry in VHDL but also involves them much more with the practical aspects of the PSpice software.

Saeid Moslehpour, Chandrasekhar Puliroju, Christopher L Spivey
VLSI Implementation of Discrete Wavelet Transform using Systolic Array Architecture

In this paper, we introduce an architectural design for efficient hardware acceleration of the Discrete Wavelet Transform. The unit designed can be used to accelerate multimedia applications such as JPEG2000. The design is based on the systolic architecture for the Discrete Wavelet Transform, which is a fast implementation of the transform. The design utilizes various techniques, such as pipelining, data reusability, parallel execution and specific features of the Xilinx (Core Generator) Spartan II, to accelerate the transform. For performance analysis, simulators such as ModelSim 5.8c and MATLAB 7.1 were used along with Xilinx ISE. The MATLAB simulator was used to view the images obtained at intermediate stages and to check the performance at each stage.

S. Sankar Sumanth, K. A. Narayanan Kutty
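
The transform being accelerated can be illustrated in software with the simplest wavelet, the Haar filter; this is a generic one-level sketch, unrelated to the paper's hardware design:

```python
import math

def haar_dwt_1d(signal):
    """One level of the Haar discrete wavelet transform (orthonormal form):
    pairwise scaled sums (approximation) and differences (detail)."""
    assert len(signal) % 2 == 0
    s = math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_idwt_1d(approx, detail):
    """Inverse of one Haar level; reconstruction is exact."""
    s = math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / s, (a - d) / s]
    return out
```

The independence of each coefficient pair is what makes the transform amenable to the pipelined, systolic hardware the paper targets: every pair can be computed by a separate processing element in the same cycle.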
Introducing MARF: a Modular Audio Recognition Framework and its Applications for Scientific and Software Engineering Research

In this paper we introduce the Modular Audio Recognition Framework (MARF), an open-source research platform implemented in Java. MARF is used to evaluate various pattern recognition algorithms in areas such as audio and natural language text processing, may act as a library in applications, and serves as a basis for learning and extension, as it embodies good software engineering practices in its design and implementation. The paper summarizes the core framework's features and capabilities and where it is heading.

Serguei A. Mokhov
TCP/IP Over Bluetooth

Emerging communication technologies are introducing new forms of short-range wireless networks, and Bluetooth is one of them, allowing information exchange over a short range. Bluetooth is a low-cost, short-range, low-power radio technology originally developed to connect devices such as mobile phone handsets, portable computers and headsets without cables. Work on Bluetooth started around 1994 at Ericsson Mobile Communications, and version 1.0 of Bluetooth came out in 1999. Bluetooth is a fast-growing technology whose applications are increasing as research goes on. Using Bluetooth, we can establish connections carrying either synchronous traffic, such as voice, or asynchronous traffic, such as traffic over the Internet Protocol. In this paper we discuss how efficiently Bluetooth can carry TCP/IP traffic and analyse how retransmissions and delays are handled when there is an error in a packet of data. In addition, we discuss the Bluetooth layer model, how it works, and its comparison with the OSI reference model.

Umar F. Khan, Shafqat Hameed, Tim Macintyre
Measurement-Based Admission Control for Non-Real-Time Services in Wireless Data Networks

Non-real-time services are an important category of network services in future wireless networks. When mobile users access non-real-time services, they usually care about two points: whether they will be forced to terminate during their lifetime, and whether the total time to complete their data transfer is within their tolerance. A low forced-termination probability can be achieved through bandwidth adaptation, which dynamically adjusts the bandwidth allocated to a mobile user during the user's connection time. However, there has been no connection-level metric to express the total completion time. In this paper, we describe a quality-of-service metric, called the stretch ratio, to express the total completion time. We design a measurement-based call admission control scheme that uses the stretch ratio to determine whether or not to accept a new mobile user into a cell. Extensive simulation results show that the scheme not only satisfies the quality-of-service requirement of mobile users (in terms of the stretch ratio) but also achieves high utilization of the radio resource.

Show-Shiow Tzeng, Hsin-Yi Lu
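
A minimal sketch of a stretch-ratio-based admission test, with all names and the admission rule invented for illustration (the paper's exact definitions may differ):

```python
def stretch_ratio(completion_time, data_size, full_rate):
    """Ratio of the measured completion time to the ideal time the transfer
    would take at the full requested bandwidth; 1.0 means no stretching."""
    ideal_time = data_size / full_rate
    return completion_time / ideal_time

def admit(new_load, measured_ratios, target_ratio):
    """Admit the new user only if the measured average stretch ratio,
    inflated by the relative extra load the user adds, stays within
    the QoS target."""
    avg = sum(measured_ratios) / len(measured_ratios)
    return avg * (1 + new_load) <= target_ratio
```

The measurement-based flavor comes from `measured_ratios` being sampled from completed transfers in the cell rather than derived from a traffic model.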
A Cooperation Mechanism in Agent Organization

Current research in autonomous agents and multi-agent systems (MAS) has reached a level of maturity sufficient for MAS to be applied as a technology for solving problems in an increasingly wide range of complex applications. Our aim in this paper is to define a simple, extendible and formal framework for multi-agent cooperation, over which businesses may build their own frameworks for effecting cooperative business strategies using distributed multi-agent systems. It is a simple, fair and efficient model for orchestrating cooperation between multiple agents.

W. Alshabi, S. Ramaswamy, M. Itmi, H. Abdulrab
A Test Taxonomy Applied to the Mechanics of Java Refactorings

In this paper, we describe the automated production of all interactions between the mechanics of seventy-two refactorings proposed by Fowler, in the form of chains. Each chain represents the paths a specific refactoring may follow due to its dependencies on n possible other refactorings. We enumerate all possible chains and then investigate three hypotheses relating, firstly, to the number of chains generated by specific refactorings and their chain length; secondly, to the relevance and applicability of a test taxonomy proposed by van Deursen and Moonen (vD&M); and, finally, to whether certain 'server' refactorings exist when those chains are scrutinized. Two of the proposed hypotheses were supported by the data examined, suggesting a far deeper complexity to refactoring inter-relationships than first envisaged. We also investigated the possibility that chains eradicate bad code smells.

Steve Counsell, Stephen Swift, Rob M. Hierons
Classification Techniques with Cooperative Routing for Industrial Wireless Sensor Networks

Industrial environments pose harsh, noisy conditions for Wireless Sensor Networks (WSNs). A wavelet transform is used as a preprocessor for denoising the real-world data from sensor nodes. Wireless sensor networks are battery powered, so every aspect of a WSN is designed under an energy constraint. Communication is the largest consumer of energy in a WSN; hence energy consumption during communication must be reduced to the minimum possible, and this paper focuses on doing so. A cooperative routing protocol is designed for communication in a distributed environment, where data routing takes place in multiple hops and all nodes take part in communication. The main objectives are to achieve uniform dissipation of energy across all the nodes in the network and to make the classifier work in an industrial environment. The paper discusses classification techniques using ART1 and Fuzzy ART neural network models.

Sudhir G. Akojwar, Rajendra M. Patrikar
Biometric Approaches of 2D-3D Ear and Face: A Survey

The use of biometrics in human recognition is rapidly gaining in popularity. Considering the limitations of biometric systems with a single biometric trait, a current research trend is to combine multiple traits to improve performance. Among the biometric traits, the face is considered to be the most non-intrusive and the ear is the most promising candidate to be combined with the face. In this survey, existing approaches involving these two biometric traits are summarized and their scopes and limitations are discussed. Some challenges to be addressed are discussed and a few research directions are outlined.

S. M. S. Islam, M. Bennamoun, R. Owens, R. Davies
Performance Model for a Reconfigurable Coprocessor

This paper presents an analytical model for the performance of a generic reconfigurable coprocessor (RC) system. The system is characterized by a standard processor with a portion that is reconfigurable. We describe a general performance model for the speedup of a generic RC system and demonstrate how different parameters of the speedup model can affect the performance of a reconfigurable system (RS). In addition, we apply our previously developed speedup model to a system that allows preloading of functional blocks (FBs) into the reconfigurable hardware (RH). Redeveloping the speedup model with preloading taken into account yields some interesting results that can be used to improve the performance of RH in a coprocessor. Finally, we develop a performance model for a specific application, characterized by a main iterative loop in which a core operation is to be defined in an FB. Our experiments show that the minimum and maximum speedup mainly depend on the probabilities of a miss and a hit for the FB residing in the RH of the coprocessor. In addition, our simulation results for the application-specific model demonstrate how the probability of dependency degrades the achievable speedup.

Syed S. Rizvi, Syed N. Hyder, Aasia Riasat
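
An Amdahl-style sketch of such a speedup model, where hit and miss probabilities for the loaded functional block enter as a reconfiguration penalty; the formula shape and all parameter names are assumptions for exposition, not the paper's model:

```python
def rc_speedup(frac_accel, hw_gain, p_hit, reconfig_overhead):
    """Speedup of a reconfigurable-coprocessor system over pure software.
    frac_accel:        fraction of runtime mapped to the functional block
    hw_gain:           hardware-over-software gain for that fraction
    p_hit:             probability the block is already loaded in the RH
    reconfig_overhead: reconfiguration cost on a miss, as a fraction of
                       the original total runtime."""
    miss_cost = (1 - p_hit) * reconfig_overhead
    new_time = (1 - frac_accel) + frac_accel / hw_gain + miss_cost
    return 1.0 / new_time
```

The model makes the paper's qualitative point visible: as `p_hit` falls, the miss penalty eats into the accelerated fraction, so maximum speedup is obtained when the block can be preloaded.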
RFID: A New Software Based Solution to Avoid Interference

RFID (Radio Frequency Identification) interference is one of the most important issues for RFID applications. The dense-reader mode in EPCglobal Class 1 Gen2 is one solution that can reduce the interference problem. This paper presents a new software-based solution that can improve the performance of dense-reader mode by effectively suppressing unwanted interference. We use two existing systems, Reva's and Feig's, to build a complete simulation program for determining the effectiveness of the proposed solution. In our simulations, we strictly follow the EPCglobal Gen2 standard and use the Listen and Wait protocols [2]. Our simulation results demonstrate that interference can be reduced significantly by using the proposed software-based solution.

Syed S. Rizvi, Eslam M. Gebriel, Aasia Riasat
A Software Component Architecture for Adaptive and Predictive Rate Control of Video Streaming

Quality of service and transmission rate optimization in live and on-demand video streaming are very important issues in lossy IP networks. The infrastructure of the Internet exhibits variable bandwidths, delays, congestion and time-varying packet losses. Because of these attributes, video streaming applications should not only have good end-to-end transport layer performance but also robust rate control optimization mechanisms. This paper gives an overview of video streaming applications and proposes a new software architecture that controls transport QoS as well as path and bandwidth estimation. Predictive control is one of the best solutions for difficult problems in control engineering applications and can be used in the Internet environment. We therefore provide an end-to-end software architecture between the video-requesting clients, their destination servers, distant streaming servers and video broadcasters. This architecture contains an important streaming server component with an application-layer QoS manager and a transport-layer path and bandwidth estimator. The QoS manager considers parameters such as network delay, packet loss, distortion, round-trip time, channel errors, network discontinuity and session dropping probability to make video streaming more efficient and to provide the required video quality. The transport path and bandwidth estimator, on the other hand, provides transmission rates to destination servers for optimum video streaming. The paper provides and discusses a software component model of video streaming.

Taner Arsan, Tuncay Saydam
Routing Table Instability in Real-World Ad-Hoc Network Testbed

In this paper we have carried out an experimental study of the stability of routing tables in a real-world ad-hoc network. Two sets of experiments were conducted, one with a static topology, and the other with node mobility. The Ad-Hoc On-Demand Distance Vector (AODV) routing protocol was used to route packets over multiple hops. Both experiments showed that unstable wireless channels have significant effects on the stability of the routing tables.

Tirthankar Ghosh, Benjamin Pratt
Quality Attributes for Embedded Systems

Software quality attributes (QAs) such as reliability and modifiability have been used to define nonfunctional requirements of software systems for many years. More recently, they have been used as the basis for generating utility trees in the Software Engineering Institute's Architecture Tradeoff Analysis Model (ATAM). Software processes and models, such as the ATAM, are often utilized when developing embedded systems consisting of both software and hardware. In order to determine whether the QAs defined for software are adequate when working with embedded systems, trade studies performed during the development of embedded system architectures were evaluated. The results show that while many embedded system quality attributes map directly to existing software quality attributes, some, such as portability, take on a modified definition, and others, such as weight, do not normally apply to software systems. This paper presents the quality attributes for embedded systems identified as a result of the trade study evaluation.

Trudy Sherman
A Mesoscale Simulation of the Morphology of the PEDT/PSS Complex in the Water Dispersion and Thin Film: the Use of the MesoDyn Simulation Code

PEDT is an intrinsically conducting polymer. Its complex with polystyrene sulfonic acid (PSS) attracts general interest ([1] and citations therein). The dispersion of the PEDT/PSS complex is usually prepared by an oxidative oligomerization of ethylenedioxythiophene (EDT) in water solution in the presence of the soluble charge-balancing and stabilizing polyanion of PSS. The oxidation (doping) process controls the number of the supposed ionic groups, but the positions of the latter are random and not controlled (Fig. 1). The electrophoretic and viscosity characteristics of the swollen particles present in the dispersion are similar to those of nonstoichiometric polyelectrolyte complexes [2].

T. Kaevand, A. Öpik, Ü. Lille
Developing Ontology-Based Framework Using Semantic Grid

A Semantic Grid is a type of grid in which resources, services, etc. can be searched easily and efficiently, helping to find the best possible match for a desired service or resource. In a Semantic Grid it is necessary to have an ontology-based framework that can be used to discover the most appropriate resource or service. In this paper, such an ontology-based framework is defined. The framework defines the major concepts, their properties and descriptions, and the relationships between them in order to find the best possible match for the required resource or service. A B2B (business-to-business) marketplace is used to illustrate its working more clearly.

P. Venkata Krishna, Ratika Khetrapal
A Tree Based Buyer-Seller Watermarking Protocol

To restrict the unauthorized duplication and distribution of multimedia content, a content owner (seller) may insert a unique digital watermark into each copy of the content before it is sold to a buyer. If an illegal replica is later found in the market, the seller can trace the unlawful reseller (the original buyer) by extracting and checking the embedded watermark. However, an accusation against the seller cannot be ruled out, as the seller also has access to the watermarked copies. This paper proposes a new, efficient watermarking protocol based on a tree model and public-key encryption standards. The buyer places a purchase order with the sales point, which forwards it to the certification authority via the dealer. The trusted certification authority issues the watermarked image in such a way that the buyer can check its originality. This prevents the buyer from claiming that an unauthorized copy may have originated from the seller, and it gives the user the highest privilege to ensure the quality of the purchased digital content.

Vinu V Das
A Spatiotemporal Parallel Image Processing on FPGA for Augmented Vision System

In this paper we describe a spatiotemporal parallel algorithm to optimize the power consumption of image processing on FPGAs. We show how our method can significantly reduce power consumption at higher processing speeds compared with traditional spatial (pipeline) parallel processing techniques. We demonstrate a real-time image processing system on an FPGA device and calculate its power consumption. The results show that when the image is partitioned into six sections, the power consumption drops by 45% compared with previous approaches.

W. Atabany, P. Degenaar
Biometrics of Cut Tree Faces

An issue of some interest to those in the lumber and timber industry is the rapid matching of a cut log face with its mate. For example, the U.S. Forest Service experiences a considerable loss of its valuable tree properties through poaching every year. They desire a tool that can rapidly scan a stack of cut timber faces, taken in a suspect lumber mill yard, and identify matches to a scanned photograph of stump faces of poached trees. Such a tool clearly falls into the category of a biometric identifier.

We have developed such a tool and have shown that it has usefully high biometric discrimination in the matching of a stump photograph to its cut face. It has certain limitations, described in this paper, but is otherwise eminently suitable for the task for which it was created.

W. A. Barrett
A Survey of Hands-on Assignments and Projects in Undergraduate Computer Architecture Courses

Computer Architecture and Organization is an important area of the computer science body of knowledge. How to teach and learn the subjects in this area effectively has been an active research topic. This paper presents results and analyses from a survey of hands-on assignments and projects from 35 undergraduate computer architecture and organization courses which are either required or elective for the BS degree in CS. These surveyed courses are selected from universities listed among the 50 top Engineering Ph.D. granting schools by the US News & World Report 2008 rankings, and their teaching materials are publicly accessible via their course websites.

Xuejun Liang
Predicting the Demand for Spectrum Allocation Through Auctions

Spectrum, a projected scarce resource, generates high profits if utilized efficiently. The current static allocation leaves the spectrum underutilized, with fixed income. Predicting user requirements for spectrum and auctioning the spectrum helps to serve customers better while increasing income. In this research we use an automated collaborative filtering model to predict customer requirements and then allocate the spectrum through auctions (bidding for spectrum in an open market). A genetic algorithm is used to optimize the spectrum bidding problem, and we conclude that the spectrum will be used efficiently while generating more revenue through market bidding.

Y. B. Reddy
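
A toy genetic algorithm for a simplified bidding-assignment problem gives the flavor of the optimization step; the bid matrix, chromosome encoding and parameters below are all invented for illustration (real spectrum auctions have far richer constraints):

```python
import random

# Each of three bidders submits a bid per channel; a chromosome assigns a
# winning bidder to each channel, and fitness is the total revenue collected.
BIDS = [[3, 1, 4],   # bidder 0's bids on channels 0..2 (hypothetical values)
        [2, 5, 1],   # bidder 1
        [4, 2, 2]]   # bidder 2
CHANNELS, BIDDERS = 3, 3

def fitness(chromosome):
    return sum(BIDS[b][c] for c, b in enumerate(chromosome))

def evolve(generations=60, pop_size=20, seed=3):
    """Elitist GA: keep the best half, refill with one-point crossover
    children, and mutate a child's gene with 20% probability."""
    rng = random.Random(seed)
    pop = [[rng.randrange(BIDDERS) for _ in range(CHANNELS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, CHANNELS)           # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                     # mutation
                child[rng.randrange(CHANNELS)] = rng.randrange(BIDDERS)
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

For this separable toy instance the revenue-maximizing assignment awards each channel to its highest bidder, and the GA converges to it quickly.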
Component-Based Project Estimation Issues for Recursive Development

In this paper we investigate component-based issues that might affect project cost estimation. Component-based software development changes the style of software production: software is developed as a composition of reusable software components. Each component production process must be treated as a stand-alone software project needing individual management. A typical pure component-based development can be considered as decomposition/integration activities applied successively at different levels, and it therefore results in a recursive style of development. We analyze and present the results of our studies on component-based software development estimation issues from this recursive point of view.

Yusuf Altunel, Mehmet R. Tolun
Backmatter
Metadata
Title
Advances in Computer and Information Sciences and Engineering
Edited by
Tarek Sobh
Copyright year
2008
Publisher
Springer Netherlands
Electronic ISBN
978-1-4020-8741-7
Print ISBN
978-1-4020-8740-0
DOI
https://doi.org/10.1007/978-1-4020-8741-7