
2014 | Book

Modern Trends and Techniques in Computer Science

3rd Computer Science On-line Conference 2014 (CSOC 2014)

Edited by: Radek Silhavy, Roman Senkerik, Zuzana Kominkova Oplatkova, Petr Silhavy, Zdenka Prokopova

Publisher: Springer International Publishing

Book series: Advances in Intelligent Systems and Computing


About this book

This book is based on the research papers presented in the 3rd Computer Science On-line Conference 2014 (CSOC 2014).

The conference is intended to provide an international forum for discussion of the latest high-quality research results in all areas related to Computer Science. The topics addressed are the theoretical aspects and applications of Artificial Intelligence, Computer Science, Informatics and Software Engineering.

The authors present new approaches and methods for solving real-world problems, and in particular exploratory research that describes novel approaches in their fields. Particular emphasis is placed on modern trends in selected fields of interest. New algorithms and methods in a variety of fields are also presented.

This book is divided into three sections covering Artificial Intelligence, Computer Science and Software Engineering. Each section consists of new theoretical contributions and applications, which will be useful to anyone seeking new knowledge or fresh inspiration for further research.

Table of Contents

Frontmatter

Artificial Intelligence

Frontmatter
Intelligence Digital Image Watermark Algorithm Based on Artificial Neural Networks Classifier

An intelligent, robust digital image watermarking algorithm using an artificial neural network (ANN) is proposed. In the new algorithm, the original image is first divided into small N1 × N2 blocks for watermark embedding; different embedding strengths are then determined by an RBFNN classifier according to the textural features of each block after the DCT. The experimental results show that the proposed algorithm is robust against common image-processing attacks such as JPEG compression, Gaussian noise, cropping, mean filtering, Wiener filtering, and histogram equalization. The proposed algorithm also achieves a good compromise between robustness and invisibility.

Cong Jin, Shu-Wei Jin
PPSA: A Tool for Suboptimal Control of Time Delay Systems: Revision and Open Tasks

During the development of algebraic controller design in a special ring for time delay systems (TDSs), the problem of a suitable setting of the free controller parameters appeared. The first author of this contribution recently suggested the natural idea of placing the dominant characteristic numbers (poles) and zeros of the infinite-dimensional feedback control system on the basis of the desired overshoot for a simple finite-dimensional matching model, and of shifting the rest of the spectrum. However, the original procedure, called the Pole-Placement Shifting based controller tuning Algorithm (PPSA), was not developed and described entirely well. The aim of this paper is to revise the idea of the PPSA and to suggest possible ways to improve or extend the algorithm. A concise illustrative example is attached to clarify the procedure for the reader.

Libor Pekař, Pavel Navrátil
Logistic Warehouse Process Optimization Through Genetic Programming Algorithm

This paper introduces process planning, scheduling and optimization in a warehouse environment. The leading companies of the logistics warehouse industry still do not plan and schedule by automatic computer methods. Processes are planned and scheduled by an operational manager with detailed knowledge of the problem, the processed tasks and commodities, the warehouse layout, the performance of employees, the parameters of equipment, etc. This is a vast amount of information to be handled by a human, and it can be very time-consuming to plan every process and schedule the timetable. The manager is usually also influenced by stressful conditions, especially at holiday times when everyone is stocking up and the performance of the whole warehouse management goes down. The main contribution of this work is (a) to introduce a novel automatic optimization method based on the evolutionary technique called genetic programming, (b) to give a description of the tested warehouse, and (c) to present the metrics for performance measurement and to give results which establish the baseline for further research.

Jan Karasek, Radim Burget, Lukas Povoda
A New Approach to Solve the Software Project Scheduling Problem Based on Max–Min Ant System

This paper presents a new approach to solve the Software Project Scheduling Problem. This problem is NP-hard and consists in finding a worker-task schedule that minimizes the cost and duration of the whole project, such that task precedence and resource constraints are satisfied. The problem is solved with an Ant Colony Optimization algorithm using the Max–Min Ant System and the Hyper-Cube framework. We illustrate experimental results and compare with other techniques, demonstrating the feasibility and robustness of the approach while reaching competitive solutions.

Broderick Crawford, Ricardo Soto, Franklin Johnson, Eric Monfroy, Fernando Paredes
An Artificial Bee Colony Algorithm for the Set Covering Problem

In this paper, we present a new Artificial Bee Colony algorithm to solve the non-unicost Set Covering Problem. The Artificial Bee Colony algorithm is a recent metaheuristic technique based on the intelligent foraging behavior of a honey bee swarm. Computational results show that the Artificial Bee Colony algorithm is competitive in terms of solution quality with other metaheuristic approaches for the Set Covering Problem.

Rodrigo Cuesta, Broderick Crawford, Ricardo Soto, Fernando Paredes
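The abstract above does not detail the Artificial Bee Colony operators, so as a grounding illustration here is the textbook greedy cost-effectiveness heuristic for the non-unicost Set Covering Problem — the classic baseline such metaheuristics are measured against, not the authors' method; the instance data below is invented:

```python
# Greedy approximation for the (non-unicost, i.e. weighted) Set
# Covering Problem: repeatedly pick the subset with the best cost per
# newly covered element. A baseline, not the paper's ABC algorithm.

def greedy_set_cover(universe, subsets, costs):
    """subsets: list of sets; costs: parallel list of positive costs.
    Returns indices of chosen subsets that together cover `universe`."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        # Best cost-effectiveness among subsets that still cover something.
        best = min(
            (i for i in range(len(subsets)) if subsets[i] & uncovered),
            key=lambda i: costs[i] / len(subsets[i] & uncovered),
        )
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

universe = {1, 2, 3, 4, 5}
subsets = [{1, 2, 3}, {2, 4}, {3, 4}, {4, 5}]
costs = [5, 10, 3, 1]
picked = greedy_set_cover(universe, subsets, costs)
covered = set().union(*(subsets[i] for i in picked))
print(picked, covered)
```

The greedy heuristic guarantees only a logarithmic approximation factor; metaheuristics such as ABC aim to find lower-cost covers on hard benchmark instances.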
A Binary Firefly Algorithm for the Set Covering Problem

The non-unicost Set Covering Problem is a well-known NP-hard problem with many practical applications. In this work, a new approach based on a Binary Firefly Algorithm is proposed to solve this problem. The Firefly Algorithm has attracted much attention and has been applied to many optimization problems. Here, we demonstrate that it is also able to produce very competitive results when solving the portfolio of set covering problems from the OR-Library.

Broderick Crawford, Ricardo Soto, Miguel Olivares-Suárez, Fernando Paredes
Neural Networks in Modeling of CNC Milling of Moderate Slope Surfaces

Computer numerical control (CNC) achieves a high degree of automation of machine tools via pre-programmed numerical commands. The CNC milling process is widely used in industry for machining complex parts, and a description of the process is necessary for the production of precise parts. This paper introduces artificial-neural-network-based modeling of the CNC milling of moderate slope shapes. The developed neural models consist of two inputs and two outputs. The created neural models were experimentally tested on real data, and then the evaluation and comparison of all models were performed.

Ondrej Bilek, David Samek
Application of Linguistic Fuzzy-Logic Control in Technological Processes

This paper presents the use of modern numerical methods, such as Fuzzy Logic Control, for the control of fast technological processes with sampling periods of 0.01 s or less. The paper presents a real application of the Linguistic Fuzzy-Logic Control, developed at the University of Ostrava, for the control of a magnetic levitation model in the laboratory at the Institute for Research and Applications of Fuzzy Modeling and the Department of Informatics and Computers, Faculty of Science. This technology and the real models are also used as background for problem-oriented teaching realized at the department for master students and their collaborative as well as individual final projects. The paper shows how the technology can help people easily describe the control strategy from the technological point of view.

Radim Farana
Hybrid Intelligent System for Point Localization

The article introduces a hybrid intelligent system for point localization in 3D Euclidean space. Two models are presented: the first is based on neural networks and the second represents a classical approach. The classical model calculates the Euclidean distances between two points in the defined domain. For the experimental study, we proposed appropriate topologies of the systems depending on the required accuracy. First, we identified the distances between a randomly generated point and reference points in the defined domain. A neural network then uses the obtained distances as its inputs to determine the actual position of the point in the domain space. The experimental study was repeated several times, and all obtained results are mutually compared in the conclusion.

Robert Jarusek, Eva Volna, Alexej Kolcun, Martin Kotyrba
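The classical distance-based model described above can be sketched as follows; this is an illustration only (the reference points, domain and brute-force search are my own choices, not the authors'): given the measured Euclidean distances from an unknown point to known reference points, the position is recovered by a least-squares search over the domain — the mapping the neural model is trained to learn.

```python
import math, random

# Recover a point in the unit cube from its distances to four fixed
# reference points, by brute-force least-squares over a coarse grid.
# (Reference layout and grid resolution are illustrative choices.)

REFS = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

random.seed(1)
target = tuple(random.random() for _ in range(3))
measured = [dist(target, r) for r in REFS]        # simulated measurements

step = 0.05
grid = [i * step for i in range(int(1 / step) + 1)]
best = min(
    ((x, y, z) for x in grid for y in grid for z in grid),
    key=lambda p: sum((dist(p, r) - d) ** 2 for r, d in zip(REFS, measured)),
)
print(best, target)
```

The recovered grid point lands within half a grid step of the true position; a trained network replaces this exhaustive search with a direct distances-to-position mapping.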
On the Simulation of the Brain Activity: A Brief Survey

This article is a brief introduction to the issues of simulating brain activity. First, a physiological description of the human brain is given, which summarizes current knowledge and also points out its complexity. These facts were obtained through technologies intended for observing the electrical activity of the brain, for example invasive methods, electroencephalography (EEG) and functional magnetic resonance imaging (fMRI). Then, approaches to simulating brain activity are described. The first of them is a standard model, which is the basis of most current methods. The second model is based on the simulation of brain-rhythm changes. Finally, the possible utilization of complex networks to create a biological neural network is discussed.

Jaromir Svejda, Roman Zak, Roman Jasek, Roman Senkerik
Q-Learning Algorithm Module in Hybrid Artificial Neural Network Systems

The presented topic comes from the research field called Artificial Life, but it also contributes to Artificial Intelligence (AI), Robotics and potentially many other areas of research. In this paper, a new approach to the autonomous design of agent architectures is reviewed and tested. This novel approach is inspired by the inherited modularity of biological brains. When designing new brains, evolution does not directly connect individual neurons; rather, it composes new brains by connecting larger, widely reused areas (modules). In this approach, agent architectures are represented as hybrid artificial neural networks composed of heterogeneous modules, where each module can implement a different selected algorithm. Rather than describing this framework, this paper focuses on the design of one module. Such a module represents one component of the hybrid neural network and can seamlessly integrate a selected algorithm into the node. The course of designing such a module is described on the example of a discrete reinforcement learning algorithm. The requirements posed by the framework are presented, the modifications to the classical version of the algorithm are mentioned, and the resulting performance of the module is evaluated against expectations. Finally, future use cases of this module are described.

Jaroslav Vítků, Pavel Nahodil
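As a concrete illustration of the kind of discrete reinforcement learning algorithm mentioned above, here is a minimal tabular Q-learning sketch on a toy 5-state chain; the environment, rewards and hyper-parameters are all illustrative, not taken from the paper's framework:

```python
import random

# Tabular Q-learning on a 5-state chain: state 4 is terminal with
# reward 1, moving left or right shifts the state by one (clamped).

N_STATES = 5
ACTIONS = (-1, +1)                       # move left / move right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1        # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def greedy(s):
    best = max(Q[(s, a)] for a in ACTIONS)
    return random.choice([a for a in ACTIONS if Q[(s, a)] == best])

def step(s, a):
    nxt = min(max(s + a, 0), N_STATES - 1)
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0), nxt == N_STATES - 1

random.seed(0)
for _ in range(2000):                    # training episodes
    s, done = 0, False
    while not done:
        a = random.choice(ACTIONS) if random.random() < EPS else greedy(s)
        nxt, r, done = step(s, a)
        target = r if done else r + GAMMA * max(Q[(nxt, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])
        s = nxt

policy = {s: greedy(s) for s in range(N_STATES - 1)}
print(policy)                            # learned policy moves right everywhere
```

Wrapping such a learner behind a fixed input/output interface is what lets it act as one interchangeable module of a hybrid network node, as the paper describes.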
A Probabilistic Neural Network Approach for Prediction of Movement and Its Laterality from Deep Brain Local Field Potential

Prediction of neural activity relating to movement is essential to the understanding and treatment of neurodegenerative diseases and to cybernetic interfaces. Here we show that it is possible to decode deep brain local field potentials (LFPs) related to movements and their laterality (left- or right-sided visually cued movements) using a Probabilistic Neural Network (PNN) classifier. The frequency-related components of the LFPs were extracted using the wavelet packet transform (WPT). The signal features were then computed as the instantaneous power of each band using the Hilbert Transform (HT) with defined windows for the motor response. Based on the extracted features, a PNN classifier was designed and evaluated using 10-fold cross-validation to verify its robustness in predicting movements. A classification accuracy of 82.72 ± 7.2 % was achieved for distinguishing the movement condition from rest, while for the subsequent discrimination of left and right movement the accuracy reached up to 74.96 ± 10.5 %. Considering the classification performance (accuracy, sensitivity, specificity and the area under the Receiver Operating Characteristic curve, AUC), the PNN classifier performed better than chance level. The proposed modality and computational process may be a promising, effective and powerful method that opens up several possibilities for improving BMI applications, the diagnosis of chronic neurological disorders and robust monitoring systems.

Mohammad S. Islam, Khondaker A. Mamun, Muhammad S. Khan, Hai Deng
Patterns and Trends in the Concept of Green Economy: A Text Mining Approach

The term ‘green economy’ has recently become a topical issue that has engaged the attention of governments, international bodies and the media. The understanding of this concept and the corresponding policy focus are carved in various ways depending on the body that is engaged. There exist varied definitions of the ‘green economy’, with many associating it directly with agriculture since it has the ‘green’ connotation. However, despite the varied definitions, one principle that stands out most is the term “sustainable development”, or simply “sustainability”, which has three pillars: social, economic and environmental sustainability. Based on in-depth knowledge of the concept of the green economy and the commitment of governments and other international organizations, several policy documents and articles have been published on the web for global consumption. This paper uses the web-mining algorithms built into the R programming language to mine over 402 English articles on the internet on the green economy. It identifies relevant terms and patterns, reveals frequently associated words and gives a conglomerate understanding of the concept. It also brings out the most active participants in the green economic drive and seeks to find whether any of the three pillars of sustainability appear among the most frequent terms.

Eric Afful-Dadzie, Stephen Nabareseh, Zuzana Komínková Oplatková
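The paper performs its mining with R's text-mining facilities; purely to illustrate the frequent-term step of such a pipeline, the following Python sketch counts term frequencies over a few invented snippets (the documents and stop-word list are made up):

```python
from collections import Counter
import re

# Toy term-frequency extraction: tokenize, drop stop words, count.

STOP = {"the", "of", "and", "a", "to", "is", "on", "in"}

docs = [
    "The green economy is the economy of sustainable development.",
    "Sustainable development rests on social, economic and environmental pillars.",
    "Green growth and the green economy engage governments worldwide.",
]

def terms(text):
    return [w for w in re.findall(r"[a-z]+", text.lower()) if w not in STOP]

tf = Counter(t for d in docs for t in terms(d))
print(tf.most_common(3))
```

On a real corpus, the same counts feed association analysis (frequently co-occurring terms) and checks for whether the three sustainability pillars surface among the top terms.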
Utilization of the Discrete Chaotic Systems as the Pseudo Random Number Generators

This paper investigates the utilization of discrete dissipative chaotic systems as chaotic pseudo-random number generators (CPRNGs). Several discrete chaotic maps are simulated, statistically analyzed and compared within this initial research study.

Roman Senkerik, Michal Pluhacek, Ivan Zelinka, Zuzana Kominkova Oplatkova
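A typical example of the discrete chaotic maps studied as CPRNGs is the logistic map; the sketch below is a generic illustration (the specific maps and parameter settings analyzed in the paper are not given in the abstract):

```python
# A CPRNG built on the logistic map x_{n+1} = r * x_n * (1 - x_n),
# iterated in its fully chaotic regime r = 4. Seed and burn-in length
# are illustrative choices.

def logistic_prng(seed=0.123456, r=4.0, burn_in=100):
    """Yield pseudo-random floats in (0, 1)."""
    x = seed
    for _ in range(burn_in):          # discard the initial transient
        x = r * x * (1 - x)
    while True:
        x = r * x * (1 - x)
        yield x

gen = logistic_prng()
sample = [next(gen) for _ in range(1000)]
print(min(sample), max(sample))
```

Statistical analysis of such a stream (uniformity, correlation, cycle length) is exactly the kind of comparison the paper carries out across several maps.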
MIMO Pseudo Neural Networks for Iris Data Classification

This research deals with a novel approach to classification: the synthesis of a complex structure which serves as a classifier. Compared to previous research, this paper synthesizes multi-input–multi-output (MIMO) classifiers. Classical artificial neural networks (ANNs) were the inspiration for this work. The proposed technique creates a relation between inputs and outputs as a whole structure, together with numerical values which can be regarded as weights in an ANN. Analytic Programming (AP) was utilized as the tool of synthesis, by means of evolutionary symbolic regression. The Iris data set (a well-known benchmark for classifiers) was used for testing the proposed method. Differential Evolution was used for the main procedure and also for the meta-evolution version of analytic programming.

Zuzana Kominkova Oplatkova, Roman Senkerik

Computer Science

Frontmatter
Compliance Management Model for Interoperability Faults Towards Governance Enhancement Technology

The objective of this research is to propose a software compliance management model for interoperability faults due to regulatory non-compliance in IT industries. Enterprise software is exercised to minimize the risks of different types of non-compliance. The framework's activities and procedures are kept in adherence to the guidelines and regulatory laws of the information-related business or industries. Entities that do not adhere to the standards and fail to follow the enumerated regulations are analyzed for non-compliance. Non-compliances in procedure-oriented processes and coding are mapped to the associated risks, with their severity and impact on the chosen applications. The interoperability fault is tolerated by customized rules based on the criticality of the application. Conformance to the requirement specifications pertaining to process, people, product and quality is verified as a distributed system to manage the non-compliances. The existing information governance can be improved by the proposed GET technique.

Kanchana Natarajan, Sarala Subramani
Reducing Systems Implementation Failure: A Conceptual Framework for the Improvement of Financial Systems Implementations within the Financial Services Industries

The financial industry continues to change, becoming more global, complex and important to economies all around the world. The industry continues to be in flux, and the world financial crisis has resulted in changes that have altered the industry for good. The need for agile, accurate and detailed financial systems has never been so important. This research discusses the issues associated with implementing financial systems within financial services companies; a conceptual framework has been built that will help reduce the risk of implementation failure in future financial systems implementations. Financial experts can use the framework to reduce system implementation risk and help deliver projects on time and to budget whilst meeting the functionality requirements of stakeholders.

Derek Hubbard, Raul Valverde
Merging Compilation and Microarchitectural Configuration Spaces for Performance/Power Optimization in VLIW-Based Systems

The rediscovery of the VLIW architecture in the field of embedded multimedia applications introduces new challenges for computing paradigms historically oriented towards Instruction Level Parallelism and performance optimization. In this work we perform an extensive multi-objective analysis which includes the VLIW compiler as part of the configuration space, avoiding any explicit distinction between micro-architectural parameters and compilation strategies. After performing a high-level estimation of power/performance trade-offs by compiling and simulating some common application kernels, we qualitatively and quantitatively analyze how the available design space can be greatly affected by the interaction of compiler behavior, processor-related features and the memory subsystem.

Davide Patti, Maurizio Palesi, Vincenzo Catania
Numerical Solution of Ordinary Differential Equations Using Mathematical Software

The differential equation is a mathematical tool widely used for describing various linear and nonlinear systems and behaviour in nature as well as in industry. The numerical solution of differential equations is a basic tool of the modelling and simulation procedure. There are various types of numerical methods; the ones described in this contribution come from the Taylor series, and a big advantage of all of them is their easy programmability. Moreover, some of them are included as built-in functions in mathematical software such as Mathematica or MATLAB. The goal of this contribution is to show how the proposed Euler and Runge-Kutta methods can be programmed and implemented in MATLAB, and to examine these methods on various examples. The compared parameters are accuracy and also speed of computation.

Jiri Vojtesek
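The Euler and Runge-Kutta methods discussed above are indeed easy to program; the following sketch (in Python rather than MATLAB, as a neutral illustration) compares both on the test equation dy/dt = -y, y(0) = 1, whose exact solution is e^(-t):

```python
import math

# Fixed-step Euler and classical fourth-order Runge-Kutta (RK4)
# integrators for a scalar ODE dy/dt = f(t, y).

def euler(f, y0, t0, t1, n):
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

def rk4(f, y0, t0, t1, n):
    h, y, t = (t1 - t0) / n, y0, t0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

f = lambda t, y: -y
exact = math.exp(-1.0)
err_euler = abs(euler(f, 1.0, 0.0, 1.0, 10) - exact)
err_rk4 = abs(rk4(f, 1.0, 0.0, 1.0, 10) - exact)
print(err_euler, err_rk4)
```

With the same 10 steps, Euler's first-order error is around 2e-2 while RK4's fourth-order error is several orders of magnitude smaller — the accuracy-versus-computation trade-off the contribution examines.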
Global Dynamic Window Approach for Autonomous Underwater Vehicle Navigation in 3D Space

The marine world is becoming more crowded and full of different objects that move unpredictably in the ocean space. The problem of increasing the capability of the management systems of any kind of underwater robot is highly relevant, building on the development of new methods for dynamic analysis, pattern recognition, artificial intelligence and adaptation. Among the large number of navigation methods, the Dynamic Window Approach is worth noting. It was originally presented by Fox et al. and implemented in indoor office robots. In this paper the Dynamic Window Approach is adapted for the marine world and extended to manipulate the vehicle in a 3D environment. The algorithm avoids obstacles and reaches targets in an efficient way. It was tested in the MATLAB environment and assessed as an effective obstacle avoidance approach for marine vehicles.

Inara Tusseyeva, Yong-Gi Kim
UAC: A Lightweight and Scalable Approach to Detect Malicious Web Pages

Attackers mostly target users with vulnerable browsers, conducting client-side attacks through various exploitation means, where dynamic client-side JavaScript is most instrumental. In this paper, we present UAC (URL Analyzer and Classifier), a novel lightweight and browser-independent solution that leverages static analysis combined with run-time emulation to identify malicious web pages. UAC performs a multi-facet inspection of each web page, which includes DOM parsing to identify suspicious DOM elements (including hidden iframes and malicious links), JavaScript analysis to detect obfuscated and malicious behavior using function-call profiling based on supervised learning, tracking of dynamic domain redirections, and scanning for suspicious patterns. An active hunt for potential URLs seeding web pages is conducted using an integrated web crawler to cover the maximum cyber space for a given URL. The solution is employed as a Low Interaction Honeyclient in a Distributed Honeynet System, where scalability is addressed using a hash-based redundancy check.

Harneet Kaur, Sanjay Madan, Rakesh Kumar Sehgal
A Preciser LP-Based Algorithm for Critical Link Set Problem in Complex Networks

The critical link set problem in a network is to find a certain number of links (or edges) whose removal will degrade the connectivity of the network to the maximum extent. It is a fundamental problem in the evaluation of the vulnerability or robustness of a network, because network performance highly depends on topology. Since it is an NP-complete problem, an LP-based (linear-programming-based) approximation algorithm is proposed in this paper to find the critical link set in a given network. The algorithm is evaluated on a real-world network and on random networks generated by the ER model and the BA model. The experimental results show that the algorithm achieves better precision than the best-known HILPR algorithm, at a polynomial-time extra cost.

Xing Zhou, Wei Peng
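The paper's LP-based algorithm is not reproduced in the abstract; as a grounding illustration of the problem itself, here is a naive greedy baseline that removes, one edge at a time, the link whose deletion most degrades pairwise connectivity (feasible only for tiny graphs; the example network is invented):

```python
from collections import defaultdict, deque

# Pairwise connectivity = number of node pairs still joined by a path;
# the greedy baseline removes the edge minimising this quantity.

def connected_pairs(nodes, edges):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, pairs = set(), 0
    for s in nodes:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        while q:                       # BFS over one component
            u = q.popleft()
            for w in adj[u]:
                if w not in comp:
                    comp.add(w)
                    q.append(w)
        seen |= comp
        pairs += len(comp) * (len(comp) - 1) // 2
    return pairs

def greedy_critical_links(nodes, edges, k):
    edges, removed = list(edges), []
    for _ in range(k):
        best = min(edges, key=lambda e: connected_pairs(
            nodes, [x for x in edges if x != e]))
        edges.remove(best)
        removed.append(best)
    return removed

nodes = [0, 1, 2, 3, 4, 5]
# Two triangles joined by the single bridge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(greedy_critical_links(nodes, edges, 1))
```

On this toy graph the greedy baseline correctly singles out the bridge; the LP relaxation in the paper scales this idea to networks where trying every edge subset is hopeless.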
Modeling Intel 8085A in VHDL

In this paper we present a model of a completely functional Intel 8085A processor in VHDL, built from scratch. The majority of the work is based on the specification of the 8085A, with some changes considered better for the implementation. All of the processor's building blocks are modeled and integrated. An interface to the memory and I/O address space is also provided. Since each instruction is distinguished by a unique operation code, the final product is a processor capable of successfully executing an assembler program loaded in memory as a sequence of operation codes.

Blagoj Jovanov, Aristotel Tentov
A Novel Texture Description for Liver Fibrosis Identification

In this study, the proposed texture description method is applied to obtain a description of ultrasound images of hepatic parenchyma. The resulting performance characteristics for distinguishing fibrotic from normal liver are shown. The diagnostic performance is assessed with two different approaches and two sets of parameters: CO-LBP 50 × 50, CO-RLBP 50 × 50, CO-LBP 75 × 75 and CO-RLBP 75 × 75. We find that the CO-RLBP method outperforms the CO-LBP method in overall accuracy.

Nan-Han Lu, Meng-Tso Chen, Chi-Kao Chang, Min-Yuan Fang, Chung-Ming Kuo
Topology Discovery in Deadlock Free Self-assembled DNA Networks

In this paper we present a novel approach to topology discovery and defect mapping in nano-scale self-assembled DNA networks. The large-scale randomness and irregularity of such networks make it necessary to achieve deadlock freedom without the availability of a topology graph or any other kind of centralized algorithm to configure network paths. Results show how the proposed distributed approach preserves some important properties (coverage, defect tolerance, scalability), reaching segment-based deadlock freedom while avoiding centralized tree-based broadcasting and hardware-hungry per-node solutions that are not feasible in such a limited nanoscale scenario. Finally, we quantitatively evaluate a non-optimised gate-level hardware implementation of the required control logic, which demonstrates a relatively acceptable impact ranging from 10 to about 17 % of the budget of transistors typically available at each node using such technology.

Davide Patti, Andrea Mineo, Salvatore Monteleone, Vincenzo Catania
Automated Design of 5 GHz Wi-Fi FSS Filter

This article presents a technique for the analysis and automated design of frequency selective surfaces. The approach makes it possible to automate the whole process of filter design and frees users from needing detailed knowledge of filter design theory. The whole automation process is implemented in MATLAB. The optimisation of a band-stop filter for Wi-Fi communication at 5 GHz serves as a practical example; the goal is therefore to design a band-stop filter which ideally does not transmit the mentioned band of frequencies. The geometry of a double-layer Jerusalem cross serves as the structure to be optimized.

Pavel Tomasek
Obstacle Detection for Robotic Systems Using Combination of Ultrasonic Sonars and Infrared Sensors

Obstacle detection has become one of the most important tasks in robotic system development. DistanceDetector is a device which can detect large obstacles by utilizing a combination of ultrasonic sonars and infrared sensors. This paper describes the DistanceDetector, reveals its hardware and firmware structure and the technologies used, and provides a simple use-case scenario.

Peter Janku, Roman Dosek, Roman Jasek
Automatic Sensor Configuration for Creating Customized Sensor Network

A sensor network is expected to provide effective delivery of its services by taking appropriate action based on one or more situations that it senses in the environment. However, due to the dynamism of application requirements and user context, it is often necessary to re-configure the services of a sensor network to meet specific application needs. This paper is an attempt to enable the dynamic adaptation of sensor network services using a web-based database consisting of sensors' MAC addresses, vendor IDs and data frames. We propose a semantics-based architecture in which ASC connects to a sensor, matches the received data against the database and connects the sensor with the mobile device. ASC works over full-duplex communication to observe sensors as well as control mobile devices. A wide range of Android and Java libraries are capable of manipulating embedded systems from smart-phones. Applications lie in the fields of education, security and surveillance, environmental research and the military. In this paper, we present the structure and functioning of ASC for Bluetooth devices and its applications.

Ketul B. Shah, Young Lee
Adapting User’s Context to Understand Mobile Information Needs

The use of the user's environmental and physical context can reveal important information to enhance Mobile Information Retrieval. However, the typical mobile search process integrates all gathered information about the user's context. Such approaches do not take into account the user's intention behind the query, which decreases their reliability and effectiveness in leading to the appropriate user information need. In this paper, we study the problem of finding the set of user context information that allows disambiguating the user's query. These pieces of contextual information, which we call relevant dimensions, can help to personalize the mobile search process; the problem of finding such a set of dimensions can be regarded as a context filtering problem. To this aim we develop a context filtering approach, CFA. We propose a novel measure that directly quantifies the relevance degree of each contextual dimension, which leads to filtering the user's context by retaining only the relevant dimensions. Our experiments show that our measure can analyze real user contexts of up to 6,000 dimensions related to more than 2,000 user queries. We also show experimentally the quality of the proposed set of contextual dimensions, and the value of the measure for understanding mobile users' needs and filtering their context.

Sondess Missaoui, Rim Faiz
Web Service Based Data Collection Technique for Education System

This paper presents a web-service-based data collection technique that facilitates the data collection process for an education system. Our technique uses web services for the collection of education data. The Education Management Information System (EMIS), as a centralized data collection system, consumes the web services of regional centers to collect data from application to application. Our data collection technique is divided into two phases. The first phase collects data from education providers at regional-level centers, and the second phase collects data for EMIS from all regional-level centers using web services. The first phase is divided into four components to register contact addresses of education providers, send the data collection format (DCF) and collect filled DCFs from web addresses. Thus data is stored at the regional level and can later be collected by EMIS when required. Regional-level centers are required to give access to data using a web service. The web service can easily be consumed by EMIS using the WSDL information to collect data, which does not require any modifications to the information systems at either the regional or the central level. Data can be entered directly by education providers to be stored at regional centers, as opposed to a centralized system for data entry. Our technique thus enables platform-independent, computer-to-computer collection of data. We illustrate our technique with a case study of data collection from schools.

Ruchika Thukral, Anita Goel
Approximate Dynamic Programming for Traffic Signal Control at Isolated Intersection

Approximate dynamic programming (ADP), a new optimization technique for discrete dynamic systems, is proposed for the optimal control of a simple signalized traffic intersection. ADP combines the concepts of reinforcement learning and dynamic programming, and it is an effective and practical approach for real-time traffic signal control. This paper aims at minimizing the average number of vehicles waiting in the queue, or the vehicles' average waiting time, at an isolated intersection by using action-dependent ADP (ADHDP). The ADHDP signal controller is designed with neural networks to learn and achieve a near-optimal traffic control policy by measuring the traffic states. As shown by the comparison with other traffic control methods, the simulation results indicate that the approach is efficient in improving traffic control at a simple intersection.

Biao Yin, Mahjoub Dridi, Abdellah El Moudni
An Approach to Semantic Text Similarity Computing

The use of text similarity plays an important role in many applications in Computational Linguistics, such as Text Classification and Information Extraction and Retrieval. Besides, several tasks require computing the similarity between two short segments of text. In this work, we propose a sentence similarity computing approach that takes account of both the semantic and the syntactic information contained in the sentences. The proposed method can be applied in a variety of applications, such as text knowledge representation and discovery. Experiments on a set of sentence pairs show that our approach yields a similarity measure with considerable correlation to human judgment.

Imen Akermi, Rim Faiz
Object Recognition with the Higher-Order Singular Value Decomposition of the Multi-dimensional Prototype Tensors

In this paper an extension of object recognition based on the Higher-Order Singular Value Decomposition (HOSVD) to the 4th dimension is discussed. HOSVD-based object recognition expands the concept of recognition in pattern spaces spanned by the PCA decomposition of vector patterns into higher dimensions. However, contrary to PCA, in the HOSVD the bases of the pattern space are tensors rather than 1D vectors. Nevertheless, previously presented works on HOSVD recognition were limited to images with only scalar-valued pixels. In the proposed framework images are allowed to contain multi-dimensional pixels, which adds an additional dimension to the pattern tensor. The proposed method opens new possibilities for HOSVD-based recognition of color or other multi-valued images. Experimental results show improved accuracy as compared to scalar-valued data, as well as fast execution time.

Bogusław Cyganek
A Quality Driven Approach for Provisioning Context Information to Adaptive Context-Aware Services

The growing adoption of the Service Oriented Architecture (SOA) for provisioning services and the proliferation of Internet-enabled handheld devices are changing the services landscape. Users are increasingly demanding services that can adapt to their current context. In this paper, we propose a framework for provisioning context information to adaptive services. The framework relies on Context Level Agreements (CLAs) negotiated between context-consumers (adaptive services) and context-providers by means of a context broker. The CLA specifies the context information and the agreed-upon level of quality-of-context (QoC) that the context-provider shall deliver. We describe the components of the framework and the CLA negotiation process. One advantage of the approach is that context-providers can provide several types of context information at different QoC levels. Moreover, the publish/subscribe model allows the broker to be aware of significant variations in QoC offerings and, consequently, to monitor the execution of CLAs.

Elarbi Badidi
Studying Informational Sensitivity of Computer Algorithms

This study is focused on the informational sensitivity of an algorithm, defined as the impact of different fixed-length inputs on the value of the algorithm's complexity function. In addition to classic worst-case complexity, this characteristic provides a supplementary tool for a more detailed and more "real-world" approach to studying algorithms. A statistical measure of informational sensitivity is calculated based on statistical analysis of results obtained from multiple runs of the same program implementation of the algorithm in question with random inputs. The theory is illustrated by an example algorithm that solves the travelling salesman problem by the branch-and-bound method using the Concorde package. For a sample of different input graphs with 1,000-10,000 vertices, the statistical measures of informational sensitivity were found and confidence ranges for the complexity function were constructed. It was shown that the complexity function of this particular algorithm is highly sensitive to fixed-size inputs.
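The core measurement procedure described above can be sketched in a few lines. The example below uses insertion sort rather than the branch-and-bound TSP solver from the paper, counts comparisons as a stand-in for the complexity function, and reports the coefficient of variation as one possible sensitivity statistic; all three choices are illustrative assumptions:

```python
import random
import statistics

def insertion_sort_ops(a):
    """Insertion sort that counts comparisons: a proxy for the value of the
    complexity function on one concrete input."""
    a, ops = list(a), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            ops += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return ops

def informational_sensitivity(n=100, runs=200, seed=42):
    """Run the algorithm on many random fixed-length inputs and summarise
    the spread of the measured cost via the coefficient of variation."""
    rng = random.Random(seed)
    costs = []
    for _ in range(runs):
        a = list(range(n))
        rng.shuffle(a)
        costs.append(insertion_sort_ops(a))
    mean = statistics.mean(costs)
    return mean, statistics.stdev(costs) / mean
```

A high coefficient of variation signals an algorithm whose cost depends strongly on the content, not just the size, of its input, which is the property the paper quantifies.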

Anastasia Kiktenko, Mikhail Lunkovskiy, Konstantin Nikiforov
Binary Matchmaking for Inter-Grid Job Scheduling

Inter-Grid is a composition of small interconnected Grid domains, each with its own local broker. The main question is how to implement cross-Grid job scheduling that achieves stability and load balancing while maintaining the local policies of the interconnected Grids. Existing Inter-Grid methodologies are based either on centralised meta-scheduling or on decentralised scheduling carried out by local brokers without proper coordination. The question is how to perform matchmaking between a particular local job and the workers of a remote domain. Performing matchmaking remotely would result in computational overhead when many domains ask one domain for matches. Performing matchmaking locally requires transmission of the remote domain's resource information set, which would result in high data traffic. This position paper introduces a coordinated scheduling technique for broker-based Inter-Grid architectures. The resource information set of each Grid domain is stored in a binary form, and matchmaking is carried out in the local domain using fast logical operations. Our preliminary results show that the proposed technique achieves a 26× speedup in the matchmaking process compared to the Condor negotiator, and a reduction of up to 99.92 % in resource information size compared to Condor ClassAd.
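The abstract leaves the binary encoding unspecified; one plausible reading is a bit vector per worker, with matchmaking reduced to a single AND plus a comparison. A hedged sketch under that assumption (the attribute names and bit layout are invented for illustration and are not the paper's scheme):

```python
# Each resource attribute (e.g. OS, architecture, memory tier) gets one bit.
ATTR_BITS = {"linux": 1 << 0, "x86_64": 1 << 1, "mem_8gb": 1 << 2, "gpu": 1 << 3}

def encode(attrs):
    """Pack a set of attribute names into a single integer bit vector."""
    mask = 0
    for a in attrs:
        mask |= ATTR_BITS[a]
    return mask

def match(job_req, worker_caps):
    """A worker matches when every required bit is set in its capability
    vector: one AND and one comparison instead of ClassAd-style evaluation."""
    return job_req & worker_caps == job_req

def find_workers(job_req, workers):
    """Scan a remote domain's compact binary resource-information set and
    return the indices of all matching workers."""
    return [i for i, caps in enumerate(workers) if match(job_req, caps)]
```

Because each worker collapses to a handful of bytes, shipping a whole domain's resource set is cheap, which is consistent with the large size reduction the paper reports against ClassAd descriptions.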

Abdulrahman Azab
Complex Objects Remote Sensing Forest Monitoring and Modeling

In this paper the concept of integrated modeling and simulation processes for a Complex Natural and Technological Object (CNTO) is presented. The main goal of the investigation is the practical application of the proposed modeling, and remote sensing forest monitoring is proposed by the authors as the application area. The methodical foundations of integrated modeling and simulation, the process of CNTO operation, and the technology of remote sensing forest monitoring are considered. Particular attention is paid to the continuity between the model and the object when solving practical issues. Moreover, the results of CNTO remote sensing forest monitoring make it possible to adapt the models of this system to a changing environment in accordance with forest management.

Boris V. Sokolov, Vyacheslav A. Zelentsov, Olga Brovkina, Victor F. Mochalov, Semyon A. Potryasaev
Building a Non-monotonic Default Theory in GCFL Graph-Version of RDF

The article describes the idea of a graph-based representation of clauses. This approach follows Richards' idea of graph-based clausal-form knowledge representation. Moreover, it enables building the graph-based formal system GCFL (Graph-based Clausal Form Logic), which can not only illustrate knowledge bases graphically, but also allows us to obtain consequents of a knowledge base in a special graph-based way. The article continues this idea by creating a graph-based formal system for generating revisable theories, following Reiter's well-known default principle of building non-monotonic theories.

Alena Lukasová, Martin Žáček, Marek Vajgl
An Intranet Grid Computing Tool for Optimizing Server Loads

The article describes the principles of the developed Intranet grid computing used in the corporate sector as a tool for optimizing computing loads on a server deployed for production planning and scheduling. ICT development allows companies to install higher computing performance at lower costs. This trend is particularly evident in investments in personal computers, laptops, or smartphones. Investments in the backbone infrastructure (servers, networks) are governed by a different philosophy: for this area, a very rigorous consideration of the system performance parameters to be used is important, in many cases due to software licensing policy. The well-known result is a problem of high loads on servers, contrasted with almost negligible computing loads on the user side.

Petr Lukasik, Martin Sysel
Discovering Cheating in Moodle Formative Quizzes

The introduction of modern information technologies into the educational process provides new opportunities for students and teachers. However, apart from their indisputable contributions, modern information technologies and their broad usage bring forth some new challenges. This paper presents an approach which we use to detect cheating during formative assessment in the Moodle environment. We describe the process of obtaining data from a Moodle backup archive, its transformation and, consequently, its evaluation. We then show what possibilities the evaluated data give us to identify potential cheaters. Some ideas for enhancement of the cheating detection process are also discussed.

Jan Genci
Mobile Video Quality Assessment: A Current Challenge for Combined Metrics

The rapid development of mobile devices such as smartphones and tablets has caused growing interest in video transmission and display dedicated to mobile devices. Considering the typical distortions introduced mainly by video compression and transmission errors, their influence on perceived video quality is not necessarily very similar to the subjective evaluation of still images or videos presented on typical computers equipped with monitors. Therefore, there is a need to verify the usefulness of known image and video quality metrics for this purpose, together with recently proposed combined metrics leading to highly linear correlation with subjective quality evaluations. In this paper some results of such verification conducted using the LIVE Mobile Video Quality Database, as well as results of optimisation of the proposed combined metric, are presented. The obtained results are superior in comparison to other known metrics applied using a frame-by-frame approach.

Krzysztof Okarma
Face Extraction from Image with Weak Cascade Classifier

The aim of this paper is to propose an artificial-vision-based face detection approach, which could primarily be used in robotics. Three main problems arise from this expectation. The first is the computation time of the whole process. The second is the quality of the input information due to a camera with low resolution. The third is the robustness of the involved techniques regarding the implementation. The first part of the paper discusses all three problems and introduces the Haar cascade theory. The second part proposes a new noise reduction approach to improve the detection result, mostly in the eye and mouth areas. The next part shows experimental results and finds the best threshold parameter to minimize overlapping areas. The last part explains the advantages of the proposed technique.

Václav Žáček, Jaroslav Žáček, Eva Volná
Computer Aided Analysis of Direct Punch Force Using the Tensometric Sensor

This research was focused on measuring and analyzing the direct punch force of young adults. The main focus was on the differences between genders and among groups of participants with different levels of training. More than 200 participants took part in this long-term study. The collected data were analyzed and stored for future use in research. This paper presents the results of a first analysis focused on the difference in the mean maximum direct punch force of participants in different categories.

Dora Lapkova, Michal Pluhacek, Milan Adamek

Software Engineering

Frontmatter
Application of Semantic Web and Petri Calculus in Changing Business Scenario

The paper highlights the correlation between the Adaptive Business Environment and the Semantic Web, Petri dynamics, and the Adaptive Semantic Web. The business environment's use of Activity Theory and the Semantic Web helps in forming the ontology.

Diwakar Yagyasen, Manuj Darbari
Method-Level Code Clone Modification Environment Using CloneManager

The primary objective of code clone research is to provide techniques and tools for topics such as clone detection and clone management. A number of techniques have been proposed for clone detection, and even more detectors are sure to appear in the future. Only a few methods have been proposed for clone modification. One technique that helps with clone modification is refactoring, but this is not possible for all clones, as there are clones which cannot be modified. Moreover, some clones have to exist to maintain consistency. Most programmers who modify a clone need to make the modification throughout all the identical clones. We propose a method which provides a clone modification environment for the programmer, embedded as an enhancement of the clone detection tool CloneManager.

E. Kodhai, S. Kanmani
An Educational HTTP Proxy Server

In many cases, the efficiency and safety of Web access can be enhanced by the deployment of an HTTP proxy server. The first part of this paper provides an introduction to the issue of an HTTP proxy server. The second part describes the technologies used and the implementation of a multithreaded HTTP proxy server with an embedded WWW server used for the graphical user interface. In its current state, the developed proxy server can be used to monitor the WWW traffic of a local area network and, with further development of its functionality, can cover such areas as content filtering or access control.

Martin Sysel, Ondřej Doležal
The Software Analysis Used for Visualization of Technical Functions Control in Smart Home Care

The article describes the analysis of the software environment used for communication between the user and the control center, and for processing data during the creation of the visualization application environment, to achieve comfortable control of operational and technological functions in intelligent (smart) buildings. Finally, it describes the use of the application in smart houses which provide nursing and assistance services for handicapped people and for the elderly.

Jan Vanus, Pavel Kucera, Jiri Koziorek
Visualization Software Designed to Control Operational and Technical Functions in Smart Homes

To control operational and technical functions in Smart Homes using the xComfort wireless system, a visualization software application was developed. The visualization was created as a web application with operational data storage. For communication between the database used by the visualization and the active elements, a software driver was created which makes this communication possible. The visualization was made with regard to user requirements, the web interface, the ability to control the software through a mobile phone, and also with regard to easy expandability, scalability, and modularity.

Jan Vanus, Pavel Kucera, Jiri Koziorek
Using Analytical Programming and UCP Method for Effort Estimation

This article is aimed at using analytical programming and the Use Case Points method to estimate time effort in software engineering. The calculation of the Use Case Points method is strictly algorithmically defined, and it is simple and fast. Despite a lot of research in this field, there have been many attempts to calibrate the weights of the Use Case Points method. This paper describes the idea that the equation used in the Use Case Points method could be less accurate in estimation than other equations. The aim of this research is to create a new method able to generate new equations for the Use Case Points method. Analytical programming with the self-organizing migrating algorithm is used for this task. The experimental results show that this method improves the accuracy of effort estimation by 25-40 %.
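For reference, the standard Use Case Points equation that such research sets out to recalibrate multiplies the unadjusted points by the technical and environmental factors. A minimal sketch of that baseline calculation (the 20 hours per UCP is Karner's commonly cited default productivity factor, not a value from this paper):

```python
def use_case_points(uaw, uucw, tcf, ecf, hours_per_ucp=20.0):
    """Baseline UCP estimate: unadjusted actor weight (UAW) plus unadjusted
    use case weight (UUCW), scaled by the technical complexity factor (TCF)
    and the environmental complexity factor (ECF), then converted to hours."""
    uucp = uaw + uucw            # unadjusted use case points
    ucp = uucp * tcf * ecf       # adjusted use case points
    return ucp, ucp * hours_per_ucp
```

The paper's contribution can be read as evolving replacements for this fixed formula: analytical programming searches the space of candidate equations over the same inputs.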

Tomas Urbanek, Zdenka Prokopova, Radek Silhavy, Stanislav Sehnalek
Optimizing the Selection of the Die Machining Technology

The selection of a material for an engineering application, or its replacement with another material superior in terms of economics, engineering, and environmental impact, is an important stage in the design process of a product. The paper presents a modern and original method for optimizing the selection of a manufacturing process for a part, maximizing its performance and minimizing its cost, in order to attain sustainable development objectives. The work strategy involves setting the functions of the product, the matrix, and the related programs to select the optimal technology, applying the value analysis approach in order to obtain an optimal design and machining process. For the automation of calculations and ease of design work, the author developed calculation programs.

Florin Chichernea
Object-Oriented FSM-Based Approach to Process Modelling

This paper presents an approach based on a convergent combination of FSM and the Object-Oriented Approach. This convergent approach to the modelling of business requirements and software development is the main idea of the paper. The paper is divided into three parts: the motivation and discussion concern the need to connect the two areas of business requirements and software engineering; the second part presents the idea of modelling processes [3] and business situations as FSMs; and the third part maps the proposed approach to BPMN-based and UML-based models. The mapping provides interesting new findings resulting from the proposed approach. This approach is based on our experience with recent practical projects concerning business modelling and simulation in various application areas (e.g. health care, gas supply industry, regional management, administration process design of a new faculty of a university, administration process of building permission) and subsequent software development in these application areas.

Jakub Tůma, Vojtěch Merunka, Robert Pergl
Performance Analysis of Built-in Parallel Reduction’s Implementation in OpenMP C/C++ Language Extension

Parallel reduction algorithms are frequent in high-performance computing, and thus modern parallel programming toolkits and languages often offer support for these algorithms. This article discusses important implementation aspects of the built-in support for parallel reduction found in the well-known OpenMP C/C++ language extension. It shows that the implementation in the widely used GCC compiler is not efficient and suggests the usage of a custom reduction implementation that improves computational performance.

Michal Bližňák, Tomáš Dulík, Roman Jašek
User Testing and Trustworthy Electronic Voting System Design

In this contribution the user interface design for a trustworthy system is presented. The principle of Electronic Voting is discussed. The research aim was to discuss users' trust and its issues, which are connected to the design process of the prototype electronic voting system.

Petr Silhavy, Radek Silhavy, Zdenka Prokopova
Metadata
Title
Modern Trends and Techniques in Computer Science
edited by
Radek Silhavy
Roman Senkerik
Zuzana Kominkova Oplatkova
Petr Silhavy
Zdenka Prokopova
Copyright year
2014
Electronic ISBN
978-3-319-06740-7
Print ISBN
978-3-319-06739-1
DOI
https://doi.org/10.1007/978-3-319-06740-7
