
About This Book

This book constitutes the refereed proceedings of the Second International Conference, ICT Innovations 2010, held in Ohrid, Macedonia, in September 2010. The 33 revised papers presented together with 5 invited papers were carefully reviewed and selected. The papers address the following topics: internet applications and services, artificial intelligence, bioinformatics, internet, mobile and wireless technologies, multimedia information systems, computer networks, computer security, e-business, cryptography, high-performance computing, social networks, e-government, as well as GPU computing.



Invited Keynote Papers

Finite State Automata by DNA Self-assembly

Several models of finite state automata in biomolecular computing already exist in the literature, and some of these models have also been implemented in vitro, showing their possible feasibility. On the other hand, DNA self-assembly of two-dimensional arrays has been achieved with a variety of DNA tiles; moreover, algorithmic self-assembly simulations of the Sierpinski triangle and of binary counters have also been recorded. In this talk we describe implementations of a couple of these models with DNA, and we concentrate on the recent implementation of a finite state transducer (a finite state automaton with output) by Wang-like DNA tiles simulated with triple-crossover DNA molecules.

Nataša Jonoska, Nadrian C. Seeman

Length Extension Attack on Narrow-Pipe SHA-3 Candidates

In this paper we show that the narrow-pipe SHA-3 candidates BLAKE-32, BLAKE-64, Hamsi, SHAvite-3-256, SHAvite-3-512, Skein-256-256 and Skein-512-512 do not provide $n$ bits of security against the length extension attack, where $n$ is the hash output size. The actual security against the length extension attack that these functions provide is $n - k$ bits, where $k$ is an arbitrary value chosen by an attacker who is willing to perform a one-time pre-computation of $2^{k} + 1$ compression functions. The attack comes in two variants: 1. the attacker does not collect the hash values given by the user, or 2. the attacker collects the hash values given by the user. In either case, the attacker does not know the content of the hashed messages. The optimal value for this attack, from the perspective of minimizing the number of calls to the compression function and maximizing the probability of a successful attack, is $k = n/2$, thus reducing the security against the length extension attack from $n$ to $n/2$ bits.

Danilo Gligoroski
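The structural weakness behind attacks of this family is easiest to see on a plain Merkle–Damgård construction, where the final digest is the complete internal state, so anyone who knows H(m) and the length of m can keep hashing from that state. The sketch below is a toy illustration of that classic length extension idea, with an invented, deliberately non-cryptographic compression function; it is not any of the candidates named above, whose attack is the more refined variant described in the paper.

```python
import struct

BLOCK = 8  # toy block size in bytes

def compress(state: int, block: bytes) -> int:
    # Toy, NON-cryptographic compression: mix each byte into a 64-bit state.
    for b in block:
        state = ((state * 1099511628211) ^ b) & 0xFFFFFFFFFFFFFFFF
    return state

def pad(msg_len: int) -> bytes:
    # MD-style padding: 0x80, zeros, then the 8-byte message length,
    # bringing the total to a multiple of BLOCK.
    pad_len = (BLOCK - (msg_len + 9) % BLOCK) % BLOCK
    return b"\x80" + b"\x00" * pad_len + struct.pack(">Q", msg_len)

def toy_hash(msg: bytes, state: int = 0xCBF29CE484222325, done: int = 0) -> int:
    # `done` = number of bytes already absorbed into `state` (0 for a fresh hash).
    data = msg + pad(len(msg) + done)
    for i in range(0, len(data), BLOCK):
        state = compress(state, data[i:i + BLOCK])
    return state

# Length extension: knowing only H(m) and len(m), forge H(m || glue || suffix).
m = b"secret-key-and-message"
h = toy_hash(m)                      # the attacker sees only this and len(m)
suffix = b"evil"
glue = pad(len(m))                   # the padding the honest hasher appended
forged = toy_hash(suffix, state=h, done=len(m) + len(glue))
assert forged == toy_hash(m + glue + suffix)
```

The forged digest matches the honest hash of m ∥ glue ∥ suffix even though the attacker never saw m itself, only its digest and its length.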

Review of Knowledge Sharing: Conceptual Foundations for Micro-level Knowledge Sharing and Readiness-for Change Related Behaviours

In the organisational change and knowledge sharing literature, the high failure rate of change efforts is often attributed to organisations' lack of understanding of how to manage readiness for change. In this paper, the case for change readiness is motivated by the need for a further explanation of its micro-level foundations. A survey of 105 scholarly journal articles in the area of knowledge sharing research from 1994 to 2009, selected with keywords salient to knowledge sharing studies, was conducted to explore current thinking about organisational change issues. The findings reveal that there is as yet no well-established method or clear conceptual definition for exploring the phenomenon of change for knowledge sharing at both the individual and organisational levels. Based on the literature survey, a model is proposed to integrate the relevant themes that influence knowledge readiness. A discussion is presented, outlining future directions for micro-level knowledge sharing and readiness-for-change related behaviours.

Dilip Patel, Khalid Samara, Shushma Patel

Inferring Causal Interpretations of Change-Readiness Using Causal-Models: A Knowledge-Based Perspective

The ability to understand the conditions in which humans make causal judgements continues to arouse debate in cognitive science, philosophy, and even computer science. While for most organisations change is a necessary impetus to sustainability, it is difficult to directly infer cause-and-effect relationships on human readiness without understanding how humans arrive at causal inferences during a complex change situation. To explore the causal interpretations of human readiness-for-change, the research applies the systems thinking approach, utilising causal models to analyse the cause and effect of human readiness. The research contributes a knowledge-based perspective, examining the various factors affecting readiness feedback and how readiness-for-change knowledge is received and processed. The paper demonstrates the application of causal models to interpret the role of human readiness through a case study on an infectious outbreak of Clostridium difficile (C. difficile). We then propose a theory of readiness-for-change through the lens of systems thinking, cast into a knowledge-based reasoning framework.

Shushma Patel, Khalid Samara, Dilip Patel

E-Business, Emerging Trends in the European Union

E-Business is often linked with business-to-consumer (B2C) processes, but a larger and still largely unexploited potential exists in business-to-business (B2B) and government-to-business (G2B) processes. The European Commission encourages businesses and governments to increase uptake in order to promote European competitive performance. Various initiatives have been launched and have shown some impact, but the large-scale breakthrough across Europe is still ahead. There are big differences between countries and economic sectors in their approach to E-Business. Critical success factors can be extracted from those which have successfully implemented B2B and G2B initiatives on a broad scale.

Peter Sonntagbauer

Proceeding Papers

On Some Cryptographic Properties of the Polynomial Quasigroups

A polynomial quasigroup is a quasigroup that can be defined by a polynomial over a ring. The potential for using these quasigroups in cryptography relies mainly on their simple properties, easy construction and huge number. The quasigroup string transformations that are usually used in cryptographic primitives make use of the quasigroup operation as well as one of the parastrophic operations. That is why one of the most important questions posed about polynomial quasigroups concerns the nature of their parastrophic operations. In this paper we investigate the parastrophes of polynomial quasigroups whose order is a power of 2, and propose an effective algorithm for finding them.

Simona Samardjiska

Some Probabilistic Properties of Quasigroup Processed Strings Useful for Cryptanalysis

Quasigroup string transformations have already been used in the design of several cryptographic primitives. The application of quasigroup transformations as an encryption algorithm is based on a previously proved result: the distribution of $l$-tuples in an arbitrary string processed by application of a quasigroup $E^{(n)}$-transformation is uniform for $l \leq n$. In this paper, we give some probabilistic properties of quasigroup processed strings that can be used in cryptanalysis. Namely, we find the distribution of $l$-tuples in an arbitrary string processed by application of a quasigroup $E^{(n)}$-transformation for $l > n$. Since this distribution is not uniform, it can be used in a statistical attack to recover the original message, and we give an algorithm for this kind of attack. Suitable experimental results are presented as well.

Verica Bakeva, Vesna Dimitrova
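For readers unfamiliar with quasigroup string transformations: given a quasigroup (Q, *) and a fixed leader l, an E-transformation maps a string a1 a2 … to b1 b2 … with b1 = l * a1 and b_i = b_{i-1} * a_i; the inverse transformation uses a parastrophe (here the left division). A minimal sketch, where the order-4 quasigroup is an arbitrary Latin square chosen for illustration, not one from the paper:

```python
# A quasigroup of order 4 given by its Cayley table (an arbitrary Latin square).
Q = [[2, 1, 0, 3],
     [1, 2, 3, 0],
     [3, 0, 2, 1],
     [0, 3, 1, 2]]

def e_transform(leader, msg):
    # b1 = l * a1, b_i = b_{i-1} * a_i
    out, prev = [], leader
    for a in msg:
        prev = Q[prev][a]
        out.append(prev)
    return out

def d_transform(leader, cipher):
    # Inverse via the left parastrophe \ :  x \ y = z  iff  x * z = y.
    left_div = [[row.index(y) for y in range(4)] for row in Q]
    out, prev = [], leader
    for b in cipher:
        out.append(left_div[prev][b])
        prev = b
    return out

msg = [0, 1, 2, 3, 3, 2, 1, 0, 0, 0]
c = e_transform(1, msg)
assert d_transform(1, c) == msg  # the transformation is invertible
```

Applying e_transform n times in succession (with fresh leaders) yields the E^(n)-transformation whose tuple distributions the paper analyses.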

A Compositional Method for Deciding Program Termination

One of the major challenges in computer science is to put programming on a firmer mathematical basis, in order to improve the correctness of programs. This paper describes a concrete implementation of a semantic-based approach for verifying termination of open nondeterministic programs with finite data types. The presentation is focused on Erratic Idealized Algol, which represents a nondeterministic programming language that embodies many of the core ingredients of imperative and higher-order functional languages. The fully abstract game semantics of the language is used to obtain a compositional, incremental way of generating accurate models of programs. The CSP process algebra is used as a concrete formalism for representation of game models and their efficient verification. Termination of programs is decided by checking divergence-freedom of CSP processes using the FDR tool. The effectiveness of this method is presented by several examples.

Aleksandar Dimovski

Practical Consequences of the Aberration of Narrow-Pipe Hash Designs from Ideal Random Functions

In a recent note to the NIST hash-forum list, the following observation was presented: narrow-pipe hash functions differ significantly from ideal random functions $H:\{0,1\}^N \rightarrow \{0,1\}^n$ that map bit strings from a big domain, where $N=n+m,\ m\geq n$ ($n = 256$ or $n = 512$). Namely, for an ideal random function with a big domain space $\{0,1\}^N$ and a finite co-domain space $Y = \{0,1\}^n$, for every element $y \in Y$ the probability $Pr\{H^{-1}(y) = \varnothing\} \approx e^{-2^{m}} \approx 0$, where $H^{-1}(y) \subseteq \{0,1\}^N$ and $H^{-1}(y) = \{x \ |\ H(x)=y \}$ (in words, the probability that elements of $Y$ are “unreachable” is negligible). However, for the narrow-pipe hash functions, for certain message lengths (the values that cause the last padded block processed by the compression function to contain no message bits), there exists a huge non-empty subset $Y_\varnothing \subseteq Y$ with a volume $|Y_\varnothing|\approx e^{-1}|Y|\approx 0.36 |Y|$ for which it holds that for every $y \in Y_\varnothing,\ H^{-1}(y) = \varnothing$. In this paper we extend the same finding to SHA-2 and show the consequences of this aberration when narrow-pipe hash functions are employed in HMAC and in two widely used protocols: 1. the pseudo-random function defined in SSL/TLS 1.2 and 2. the Password-based Key Derivation Function No. 1, i.e. PBKDF1.

Danilo Gligoroski, Vlastimil Klima
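The e^{-1} fraction quoted above is easy to check empirically: when the final compression call behaves like a random function from a set of size m onto itself, each output is missed by all m inputs with probability (1 - 1/m)^m, which tends to e^{-1} ≈ 0.3679. A small simulation with toy, assumed sizes:

```python
import random

random.seed(7)
m = 1 << 14  # toy domain and co-domain size of the random function
# Sample a random function by drawing one image point per input.
image = {random.randrange(m) for _ in range(m)}
frac_empty = 1 - len(image) / m
# Expected fraction of outputs with an empty preimage: (1 - 1/m)^m -> e^{-1}.
assert abs(frac_empty - 0.3679) < 0.02
```

About 36% of the co-domain is unreachable, matching the |Y_∅| ≈ 0.36 |Y| volume stated in the abstract for the affected message lengths.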

Unique and Minimum Distance Decoding of Linear Codes with Reduced Complexity

Given a linear $[n,k,d]$ code, we show that for error weights up to $d/2$ the time complexity of unique decoding, as well as the time complexity of minimum distance decoding, can be significantly reduced. The proposed algorithms inspect all error patterns in the information set of the received message of weight less than $d/2$ or $d$, respectively.

Dejan Spasov, Marjan Gusev
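As background, minimum distance decoding itself is a short computation: return the codeword nearest to the received word in Hamming distance. The paper's contribution is cutting the cost of searches like the exhaustive one below by inspecting error patterns only on an information set; this sketch shows the exhaustive baseline, using the [7,4,3] Hamming code as an assumed small example.

```python
from itertools import product

# Generator matrix of the [7,4,3] Hamming code (illustrative example).
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def encode(u):
    # Codeword c = u * G over GF(2).
    return [sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def md_decode(r):
    # Exhaustive minimum distance decoding: the nearest codeword wins.
    best, best_d = None, 8
    for u in product([0, 1], repeat=4):
        c = encode(u)
        d = sum(a != b for a, b in zip(c, r))
        if d < best_d:
            best, best_d = list(u), d
    return best

u = [1, 0, 1, 1]
r = encode(u)
r[5] ^= 1  # one bit error: within the unique-decoding radius (d - 1) / 2 = 1
assert md_decode(r) == u
```

The exhaustive search costs 2^k codeword comparisons; restricting attention to low-weight error patterns on an information set, as the paper proposes, is what makes larger codes tractable.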

Comparison of the Power Consumption of the 2nd Round SHA-3 Candidates

In the paper we show that the second round candidates of the NIST hash competition differ by up to 22% in their power consumption. We perform a detailed analysis of the candidates with respect to different performance parameters. Finally, we discuss several criteria, including the power consumption and the energy per byte, to distinguish the candidates with respect to their performance.

Benedikt Westermann, Danilo Gligoroski, Svein Knapskog
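The energy-per-byte criterion mentioned above folds power draw and throughput into a single figure, E = P / T joules per byte, which is why a fast but power-hungry candidate can still beat a frugal slow one. A worked example with purely illustrative numbers, not measurements from the paper:

```python
# Energy per byte:  E [J/byte] = P [W] / T [bytes/s]
power_w = 0.9          # average power draw while hashing (illustrative)
throughput = 120e6     # hashing throughput in bytes per second (illustrative)

energy_per_byte = power_w / throughput
assert abs(energy_per_byte - 7.5e-9) < 1e-12  # 7.5 nJ per hashed byte
```

Halving the power or doubling the throughput improves this figure equally, which is what makes it a useful single-number comparison across candidates.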

Self-Heating Effects in High Performance Devices

We investigate self-heating effects in single-gate and dual-gate device structures, and in structures that have AlN (aluminum nitride) or diamond as a buried oxide layer. We also investigate the electrical and thermal enhancement and degradation, respectively, due to self-heating effects in fully-depleted SOI devices with arbitrary transport and crystallographic directions. Our simulation analysis suggests that in all these alternative device technologies self-heating is dramatically reduced in short channel devices due to the pronounced velocity overshoot effect. Moreover, the use of AlN or diamond as a buried oxide layer further reduces the current degradation due to self-heating to insignificant values, because of the drastic reduction of the thermal resistance of the buried oxide layer.

Katerina Raleva, Dragica Vasileska, Stephen M. Goodnick

Performance Analysis of Dual-Hop MIMO Systems

In this paper we study the end-to-end bit error and outage probability (OP) performance of dual-hop multiple input multiple output (MIMO) systems with Alamouti’s coding, using modified amplify-and-forward (MAF) relaying under flat Rayleigh fading channels. The bit error performance of dual-hop MIMO systems with variable gain relays is compared with dual-hop single-antenna systems and with a regenerative, i.e. decode-and-forward (DF), dual-hop MIMO system. We show that MAF MIMO systems achieve significantly lower bit error probability than dual-hop single-antenna systems and comparable performance to DF systems. The performance gap increases when dual antennas are used at the relay and the receiver. The OP performance of these systems is compared with single-antenna dual-hop and dual-antenna single-hop systems. We show a significant improvement of OP performance compared to single-antenna dual-hop systems, and comparable performance to dual-antenna single-hop systems.

Jovan Stosic, Zoran Hadzi-Velkov

Parallel Machine Translation for gLite Based Grid Infrastructures

Statistical machine translation is often criticized for slow decoding times. We address this issue by presenting a new tool for enabling Moses, a state-of-the-art machine translation system, to be run on gLite based Grid infrastructures. It implements a workflow model for equally distributing the decoding task among several worker nodes in a cluster. We report experimental results for possible speed-ups and envision how natural language processing scientists can benefit from existing Grid infrastructures for solving processing, storage and collaboration issues.

Miloš Stolić, Anastas Mišev

e-Consumer Online Behavior: A Basis for Obtaining e-Commerce Performance Metrics

Nowadays the e-Commerce and e-Business paradigms have prevailed all over the world, definitively changing the way of running businesses and shaping a new business environment. Web-based business solutions have produced a novel type of consumer, the e-Consumer, and a specific way of doing business on the Net, based upon human–computer–Web interaction. The online behavior of the e-Consumer has proven to be a key not only to the success of a given e-Commerce site, but also to obtaining significant knowledge about customers’ habits, needs, expectations, etc. Nonetheless, e-Consumer online behavior can also be used as a basis for obtaining basic Web performance metrics, necessary for assuring a relevant level of Quality of Service (QoS) through capacity planning. The paper aims to highlight some of the most important aspects of modeling e-Consumer online behavior with regard to Web performance. We propose a novel approach to modeling online behavior, based on the usage of Deterministic and Stochastic Petri Nets (DSPNs).

Pece Mitrevski, Ilija Hristoski

e-Government and e-Business in Western Balkans 2010

This study analyses the results of the e-Government and e-Business reports for Western Balkan countries in 2010, as well as results from the EBIZ4ALL project led by the Austrian Research Promotion Agency (FFG) within the COIN (Cooperation and Innovation) program. The results are evaluated according to well-known methodologies and compared with EU average results. Results of surveys on the use of e-Services by companies are provided, together with their use of electronic means for the exchange of invoices and orders. The conclusions also include an evaluation of the current situation of Western Balkan countries in the development of efficient government and Information Society infrastructure capable of interoperating in a networked world and global market.

P. Sonntagbauer, M. Gusev, S. Tomic Rotim, N. Stefanovic, K. Kiroski, M. Kostoska

Information Brokering with Social Networks Analysis

Social network services like Facebook are very popular on the internet. They are also becoming useful in companies for business tasks, where their structure is examined with social network analysis. This approach makes it possible to determine who in the social network is connected with whom, and what interests these people share, without regard to the formal organizational structure. In large organizations people often need information but do not even know that it exists in the company. So they either suffer a lack of information in business processes or keep turning to people in key positions, who get the information for them. In order to take work off the people in key positions, it would be necessary to offer information to employees even before they ask for it. This paper discusses the potential of social network analysis to be used for information brokering.

Jorge Marx Gómez, Peter Cissek

A Distributed Catalog for Digitized Cultural Heritage

Peer-to-peer networks have emerged recently as a flexible decentralized solution for handling large amounts of data without the use of high-end servers. In this paper we present a distributed catalog built on an overlay network called “Synapse”. The Synapse protocol allows the interconnection of different overlay networks, each of them being an abstraction of a “community” of virtual providers. Data from different kinds of content providers (e.g. libraries, archives, museums, universities, research centers, etc.) can be stored in, and retrieved from, one catalog. We illustrate the concept, based on the Synapse protocol, with a catalog for the digitized cultural heritage of Serbia.

Bojan Marinković, Luigi Liquori, Vincenzo Ciancaglini, Zoran Ognjanović

Toward an Integration Technology Selection Model for Information Systems Integration in Supply Chains

The need to satisfy ever more demanding customers in a scenario where deadlines and costs must shrink to maintain competitiveness, together with increased uncertainty about demand, has been leading organizations to collaborate to such a level that competition is now not between isolated enterprises, but between supply chains. The integration of information systems in such an environment is a recognized problem, aggravated by the complexity of selecting a combination of technologies to support, to the greatest possible extent, supply chain performance. This paper proposes an approach for a decision support model based on compensatory fuzzy logic, to facilitate the selection of technologies to be used for integrating the information systems in a supply chain.

Dania Pérez Armayor, José Antonio Díaz Batista, Jorge Marx Gómez

Development of an English-Macedonian Machine Readable Dictionary by Using Parallel Corpora

Dictionaries are among the most useful lexical resources. However, most dictionaries today are not in digital form. This makes them cumbersome for use by humans and impossible to integrate into computer programs. The process of digitalizing an existing traditional dictionary is an expensive and labor-intensive task. In this paper, we present a method for the development of Machine Readable Dictionaries using already available resources. A machine readable dictionary consists of simple word-to-word mappings, where a word from the source language can be mapped to several optional words in the target language. We present a series of experiments where, using parallel corpora and the open source Statistical Machine Translation tools at our disposal, we developed an English–Macedonian Machine Readable Dictionary containing 23,296 translation pairs (17,708 English and 18,343 Macedonian terms). A subset of the produced dictionary has been manually evaluated and showed an accuracy of 79.8%.

Martin Saveski, Igor Trajkovski
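As rough intuition for how translation pairs can be mined from sentence-aligned corpora, even a naive co-occurrence heuristic works on toy data; the Moses/GIZA++ style pipeline used in work like this relies on EM-based word alignment instead. The sentence pairs below are invented, transliterated examples, not corpus data from the paper.

```python
from collections import Counter

def extract_pairs(parallel, min_count=2):
    # Naive heuristic: for each source word, pick the target word it most
    # often co-occurs with across aligned sentence pairs.
    cooc, src_count = Counter(), Counter()
    for src, tgt in parallel:
        for s in set(src.split()):
            src_count[s] += 1
            for t in set(tgt.split()):
                cooc[(s, t)] += 1
    pairs = {}
    for s in src_count:
        if src_count[s] < min_count:
            continue  # too rare to trust the statistics
        pairs[s] = max((t for (s2, t) in cooc if s2 == s),
                       key=lambda t: cooc[(s, t)])
    return pairs

corpus = [("the dog sleeps", "kuceto spie"),
          ("the dog runs", "kuceto trca"),
          ("the cat runs", "mackata trca")]
pairs = extract_pairs(corpus)
assert pairs["dog"] == "kuceto"
assert pairs["runs"] == "trca"
```

Real alignment models (IBM Model 1 and successors) replace the raw counts with iteratively re-estimated translation probabilities, which resolves the ties and noise this heuristic suffers from.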

Information Retrieval Using a Macedonian Test Collection for Question Answering

Question answering systems solve many of the problems that users encounter when searching for focused information on the web and elsewhere. However, these systems cannot always adequately understand a user’s question posed in a natural language, primarily because any particular language has its own specifics that have to be taken into account in the search process. When designing a system for answering questions posed in a natural language, there is a need to create an appropriate test collection for testing the system’s performance, as well as to use an information retrieval method that will effectively answer questions from that collection. In this paper, we present a test collection we developed for answering questions in the Macedonian language. We use this collection to test the performance of the vector space model with pivoted document length normalization. Preliminary experimental results show that our test collection can be effectively used to answer multiple-choice questions in Macedonian.

Jasmina Armenska, Aleksandar Tomovski, Katerina Zdravkova, Jovan Pehcevski
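Pivoted document length normalization (Singhal et al.) replaces plain cosine length normalization with the pivoted normalizer (1 - s) + s · len/avglen, so that long documents are not systematically under-retrieved. A minimal sketch of such a scoring scheme on toy documents, assuming a standard log-tf-idf weighting rather than the authors' exact configuration:

```python
import math
from collections import Counter

def pivoted_scores(query, docs, slope=0.2):
    # Vector-space scores with pivoted document length normalization:
    # each document's weight sum is divided by (1 - s) + s * len/avglen.
    N = len(docs)
    avglen = sum(len(d) for d in docs) / N
    df = Counter()
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        norm = (1 - slope) + slope * len(d) / avglen
        s = 0.0
        for t in query:
            if t in tf:
                w = (1 + math.log(tf[t])) * math.log(N / df[t])
                s += w / norm
        scores.append(s)
    return scores

docs = [["skopje", "is", "the", "capital", "of", "macedonia"],
        ["ohrid", "is", "a", "city", "in", "macedonia"],
        ["the", "lake", "of", "ohrid"]]
scores = pivoted_scores(["capital", "of", "macedonia"], docs)
assert scores[0] == max(scores)  # the document answering the query ranks first
```

The slope parameter (around 0.2–0.3 in the original work) controls how far the normalizer deviates from treating all lengths equally.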

A Modeling Framework for Performance Analysis of P2P Live Video Streaming Systems

Client/server media streaming systems exhibit streaming limitations when the number of clients rises and the server can no longer sustain the upload load. At first, IP Multicast was proposed as a solution to this problem, but its deployment brought many practical issues in scalability and deployment that prevented it from wider use. Recently, a promising new technique emerged which is cost effective, easy to deploy, and can support thousands of simultaneous users: a peer-to-peer network of logically connected clients which form an application-level overlay network on top of the physical network. This new paradigm brings numerous advantages, but also a lot of open issues that need to be resolved. This paper presents the fundamental characteristics of p2p live video streaming systems, gives a survey of p2p video streaming applications, and presents a novel modeling framework for performance analysis of such systems as the main goal of our future research.

Zoran Kotevski, Pece Mitrevski

Correlation between Object-Oriented Metrics and Refactoring

Repeated code modification lowers code quality and impacts object-oriented system design. Object-oriented metrics have proven to be indicators of problems in system design. They have been grouped into minimal sets, known as quality models, to assess object-oriented system quality. Improvement can be gained by establishing relationships between quality characteristics and metrics computed from object-oriented diagrams. Quality models include metrics that can produce better code in object-oriented systems. Code quality can also be gained through refactoring, which is used to reduce complexity and eliminate redundant code. It is important to identify when and where to use refactoring, and there are many different approaches. This work presents an early-stage analysis and focuses on exploring whether object-oriented metrics can be used as indicators of where in the code refactoring should be applied. Through multiple iterations of successive measurement and refactoring, a relation between metric values and the need for refactoring can be inferred.

Daniela Boshnakoska, Anastas Mišev

Mobile Robot Environment Learning and Localization Using Active Perception

The human brain does not remember the images seen through the eyes just by the light influx into the retina, but by the sensory-motor interaction between the human movement (including eye movement) and the environment. We propose a system for autonomous mobile robot localization in the environment that uses the behavioral aspects of the human brain, especially the saccadic movement of the eyes. The hypothesis for the robot location in the environment is built not by the image itself, but as a collection of saccadic sensory-motor pairs of the image observations that create the mental representation of all perceived objects in the environment. This approach is implemented and tested on a dataset of images taken from a real environment. The obtained results from the localization show that the comparison of the saccadic sequences of the descriptors outperforms the naïve descriptor matching of the images.

Petre Lameski, Andrea Kulakov

Diatom Classification with Novel Bell Based Classification Algorithm

Diatoms are ideal indicators of certain physical-chemical parameters, and in the relevant literature they are classified into water quality classes (WQCs). Using information technology methods, we can classify known and new diatoms directly from measured data. In this direction, a novel method for diatom classification is proposed in this paper. The classification models are induced using modified bell fuzzy membership functions (MFs) in order to make more accurate models. An intensive comparison of the effect of the fuzzy MF distribution on classification accuracy, for both the proposed method and classical classification algorithms, is studied. Based on the evaluation results, three models are presented and discussed. The experimental results show that the proposed algorithm remains interpretable, is robust to data change and achieves the highest classification accuracy. The obtained results from the classification models are verified against existing diatom ecological preferences, and for some diatoms new knowledge is added.

Andreja Naumoski, Kosta Mitreski
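The generalized bell membership function that such classifiers start from has the standard form μ(x) = 1 / (1 + |(x - c)/a|^(2b)); the paper's contribution is a modification of this shape, which is not reproduced here. The standard form in a few lines:

```python
def bell_mf(x, a, b, c):
    # Generalized bell membership function: 1 / (1 + |(x - c)/a|^(2b)).
    # a controls the width, b the steepness of the shoulders, c the center.
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

assert bell_mf(5.0, a=2.0, b=4.0, c=5.0) == 1.0               # full membership at the center
assert abs(bell_mf(7.0, a=2.0, b=4.0, c=5.0) - 0.5) < 1e-12   # exactly 0.5 at c +/- a
assert bell_mf(20.0, a=2.0, b=4.0, c=5.0) < 0.001             # near zero far from c
```

Fitting a, b and c per feature and per class is what turns a bank of such functions into an interpretable fuzzy classification model.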

Organizations Analysis with Complex Network Theory

In this paper, we propose network measures and analytical procedures for modeling the structure and the behavior of the basic types of organizations: line, functional, line-and-staff, project and matrix organizations. In order to obtain tangible information about the connectivity between employees and the structural properties of organizations, we develop network generators for all five types of organizations. We review various roles and groups of employees within the organizational network, and we assess the social position and impact of particular employees. Besides assessing the locations of actors within an organizational network, we analyze the structure of the network to find employees who have similar roles in the organization and tend to be equivalent in terms of their potential to act in it. We also estimate how robust the organizational network is to the removal of particular communications between employees, and what percentage of communications must be removed to disconnect the organization into unconnected parts.

Todorka Banova, Igor Mishkovski, Dimitar Trajanov, Ljupco Kocarev

An Agglomerative Clustering Technique Based on a Global Similarity Metric

In this paper we address the problem of detecting communities or clusters in networks. An efficient hierarchical clustering algorithm based on a global similarity metric is introduced. The technique exploits several characteristic average values of the similarity function. An analytical result is also provided for the average similarity of vertices on an Erdős–Rényi graph in the asymptotic limit of the graph size. Finally, the performance of the algorithm is evaluated over a set of computer-generated graphs. Our analysis shows that the newly proposed algorithm is superior to the popular algorithm of Girvan and Newman, with equal or lower running time.

Angel Stanoev, Igor Trpevski, Ljupco Kocarev

Accelerating Clustering Coefficient Calculations on a GPU Using OPENCL

The growth in multicore CPUs and the emergence of powerful manycore GPUs have led to a proliferation of parallel applications, yet many applications are not straightforward to parallelize. This paper examines the performance of a parallelized implementation for calculating measurements of complex networks. We present an algorithm for calculating the clustering coefficient, a topological feature of complex networks, and compare serial, parallel CPU and parallel GPU implementations. A hash-table based structure, different from the standard representation, was used for encoding the complex network’s data, which also speeds up the parallel GPU implementation. Our results demonstrate that parallelizing the sequential implementation on a multicore CPU using OpenMP produces a significant speedup. Using OpenCL on a GPU produces an even larger speedup, depending on the volume of data being processed.

Leonid Djinevski, Igor Mishkovski, Dimitar Trajanov
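The clustering coefficient computed here has a simple serial reference form: for each node, count the links among its neighbours and divide by the number of possible links. Storing neighbour lists in hash sets, as the abstract's hash-table encoding suggests, makes each triangle test an O(1) membership lookup. A serial Python sketch of that idea (the OpenCL kernel itself is not reproduced):

```python
def clustering_coefficients(edges):
    # Local clustering coefficient per node; neighbours are kept in hash
    # sets so each triangle check "is b a neighbour of a?" costs O(1).
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    cc = {}
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            cc[v] = 0.0  # fewer than two neighbours: no possible triangle
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        cc[v] = 2.0 * links / (k * (k - 1))
    return cc

# A triangle (0, 1, 2) with a pendant node 3 attached to node 2.
cc = clustering_coefficients([(0, 1), (1, 2), (0, 2), (2, 3)])
assert cc[0] == 1.0 and cc[1] == 1.0
assert abs(cc[2] - 1 / 3) < 1e-12
assert cc[3] == 0.0
```

The per-node loops are independent, which is exactly what makes this computation a good fit for OpenMP threads or an OpenCL work-item per node.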

Selective Attack in Virus Propagation Processes

Computer viruses are still one of the main security threats on the Internet. To prevent virus outbreaks, we need to fully understand their spreading dynamics and how it can be affected. Many viruses involve inter-human contacts in their spreading, and observations have shown that these inter-contact times are often heavy-tail distributed. The contacts between humans form a logical network over which viruses spread, and the topology of this network plays an important role in the spreading dynamics. The rapidity of spreading also depends on the location of the initially infected nodes in the network. By selectively infecting the most influential nodes in the network, a virus outbreak can grow faster. We analyze this effect for networks with different topologies, using node selection based on several node centrality measures and the k-medoids graph clustering algorithm.

Miroslav Mirchev, Igor Mishkovski, Ljupco Kocarev
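The effect of seeding location can be illustrated with a toy discrete-time SI simulation: starting the infection at the highest-degree node produces a faster outbreak than starting at a peripheral one. The 8-node topology below is invented for illustration, and the hub is picked by hand, whereas the paper selects seeds via centrality measures and k-medoids clustering.

```python
import random

def si_outbreak(adj, seed_node, steps, beta, rng):
    # Discrete-time SI model: each infected node infects each susceptible
    # neighbour independently with probability beta per time step.
    infected = {seed_node}
    for _ in range(steps):
        new = set()
        for u in infected:
            for v in adj[u]:
                if v not in infected and rng.random() < beta:
                    new.add(v)
        infected |= new
    return len(infected)

# A star (hub 0) with a short chain hanging off one leaf.
adj = {0: [1, 2, 3, 4, 5], 1: [0], 2: [0], 3: [0], 4: [0],
       5: [0, 6], 6: [5, 7], 7: [6]}

rng = random.Random(42)
hub = sum(si_outbreak(adj, 0, 3, 0.5, rng) for _ in range(200)) / 200
leaf = sum(si_outbreak(adj, 7, 3, 0.5, rng) for _ in range(200)) / 200
assert hub > leaf  # seeding the most central node spreads faster on average
```

Averaged over 200 runs, the hub-seeded outbreak reaches clearly more nodes within the same number of steps, which is the selective-attack effect the paper quantifies on larger topologies.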

Object Recognition Based on Local Features Using Camera – Equipped Mobile Phone

The work presented in this paper analyses the viability of using a cell phone to guide students in literature selection. The integrated cell-phone camera is used to recognize book covers in bookstores and libraries. The chosen solution is based on a client-server architecture, and the object recognition is based on local features. Detecting, identifying, and recognizing salient regions or feature points in images is a fundamental problem for the artificial intelligence and computer vision communities. This paper focuses on the comparison, in terms of time and performance, of two promising approaches to markerless object recognition: the Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF). The study was performed using a smart cell phone with Symbian OS, and the results are reported.

Saso Koceski, Natasa Koceska, Aleksandar Krstev

HDL IP Cores Search Engine Based on Semantic Web Technologies

A few years ago, the System on Chip idea grew rapidly and ‘flooded’ the market of embedded systems. Many System on Chip designers started to write their own HDL components and made them available on the Internet. The idea of searching for a couple of pre-written cores and building your own System on Chip only by connecting them seemed time saving. We have developed a system that enables a semantic description of VHDL IP components, allows searching for specific components based on an unambiguous semantic description, and works with prebuilt VHDL IP cores. We present an application built around the system and focus on the benefits the application user gains during the process of System on Chip design.

Vladimir Zdraveski, Milos Jovanovik, Riste Stojanov, Dimitar Trajanov

Monitoring Wireless Sensor Network System Based on Classification of Adopted Supervised Growing Neural Gas Algorithm

We present a wireless sensor network system for monitoring predefined events, based on an adaptation of a popular neural network algorithm, Supervised Growing Neural Gas. Data reduction, energy savings, detection of dead nodes and event notification over the internet are implemented. The architecture of the system allows investigating the proposed algorithm and comparing its results with the Fuzzy ART neural network model. Real-time measurements of physical data when a vehicle is present in the sensed area are used to investigate the advantages and disadvantages of adapting neural networks in a WSN-based system.

Stojanco Gancev, Danco Davcev

Assessing the Performance of Assembly Tools on Simulated Sequencing Data and Their Sensitivity to GC Content

De novo assembly remains an ongoing challenge for the bioinformatics community. Many strategies have been developed; however, there is no perfect de novo assembler. This publication evaluates four assemblers run on simulated data, comparing their outputs and testing their performance on two bacterial strains with different GC content (guanine–cytosine content). The results suggest the assemblers’ sensitivity to GC content and highlight their advantages and disadvantages.

Aleksandra Bogojeska, Mihaela Angelova, Slobodan Kalajdziski, Ljupco Kocarev
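The GC content metric that the comparison turns on is straightforward to compute. A minimal sketch, with hypothetical reads standing in for actual strain data:

```python
def gc_content(seq):
    """Fraction of guanine and cytosine bases in a DNA sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

# Hypothetical reads from a high-GC and a low-GC strain.
high_gc_read = "GGCCGCATGGCC"
low_gc_read = "ATTAATGCATAT"
print(f"high-GC strain: {gc_content(high_gc_read):.2f}")
print(f"low-GC strain:  {gc_content(low_gc_read):.2f}")
```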

Durkin’s Propagation Model Based on Triangular Irregular Network Terrain

Propagation models commonly used to assess the performance of ad hoc networks take into account the mechanisms of reflection, diffraction, and scattering on the ground. However, communication between devices usually takes place over irregular terrain, so the terrain profile must be used to determine signal coverage. In this paper we lay out an extension of Durkin’s propagation model using Triangular Irregular Network (TIN) based terrain. The proposed propagation model is verified by comparing its results with those obtained using an SRTM map in the Radio Mobile software.

Marija Vuckovik, Dimitar Trajanov, Sonja Filiposka
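The first step of a Durkin-style terrain-aware model is a line-of-sight test along the terrain profile between transmitter and receiver. The sketch below shows only that obstruction check over equally spaced height samples; the full model also requires Fresnel-zone clearance and knife-edge diffraction loss, and a TIN-based variant would interpolate the profile from the triangulated surface instead of a fixed grid.

```python
def line_of_sight(profile, h_tx, h_rx):
    """Check whether the straight ray between the two antennas clears
    the terrain.

    profile: terrain heights sampled at equal spacing from Tx to Rx
    h_tx, h_rx: antenna heights above the first and last terrain samples
    """
    n = len(profile) - 1
    start = profile[0] + h_tx   # absolute height of the Tx antenna
    end = profile[-1] + h_rx    # absolute height of the Rx antenna
    for i, ground in enumerate(profile):
        ray = start + (end - start) * i / n  # ray height at sample i
        if ground > ray:
            return False  # terrain rises above the direct ray
    return True

flat = [100, 100, 100, 100, 100]
hill = [100, 100, 140, 100, 100]
print(line_of_sight(flat, 10, 10))  # True
print(line_of_sight(hill, 10, 10))  # False
```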

An Enhanced Visualization Ontology for a Better Representation of the Visualization Process

One purpose of the Top Level Visualization Ontology is to provide a common vocabulary for describing visualization data, processes, and products. However, there are two aspects in which its expressiveness is poor: the models of the visualization process and of the data used in it. The aim of this paper is to describe modifications of this ontology that lead to a better representation of the visualization process and data models and facilitate the accommodation of new models. A detailed description of the new ontology’s components is given.

Alberto Morell Pérez, Carlos Pérez Risquet, Jorge Marx Gómez

Framework to Design a Business Intelligence Solution

In the present research we propose a framework for designing a business intelligence solution based on the integration of the business and technological domains. The main contributions of the framework are: 1) an enterprise architecture that combines the approaches of the Zachman Framework and the Balanced Scorecard, 2) a mapping of the System Model layer onto the tools of the Pentaho BI Suite, and 3) an indicator for management control, based on compensatory fuzzy logic, that makes it possible to measure the performance of the strategy by compensating the indicators defined in a Balanced Scorecard.

Pablo Marin Ortega, Lourdes García Ávila, Jorge Marx Gómez
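Compensatory fuzzy logic typically aggregates truth values with a geometric mean, so that one weak indicator lowers the overall score without zeroing it out. A minimal sketch; the scorecard perspectives and scores below are hypothetical, not taken from the paper.

```python
from math import prod

def compensatory_and(*truth_values):
    """Geometric-mean conjunction of truth values in [0, 1], as used in
    compensatory fuzzy logic: weak components are compensated, not fatal."""
    return prod(truth_values) ** (1 / len(truth_values))

# Hypothetical Balanced Scorecard perspective scores in [0, 1].
scores = {"financial": 0.9, "customer": 0.8, "internal": 0.7, "learning": 0.6}
overall = compensatory_and(*scores.values())
print(f"strategy performance indicator: {overall:.3f}")
```

Note the contrast with a classical fuzzy AND (the minimum), which here would report 0.6 regardless of the other three perspectives.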

A New Methodology to Benchmark Sophistication of e-Invoicing and e-Ordering

This paper introduces new, innovative means for benchmarking the online sophistication of e-Ordering and e-Invoicing solutions. Today, conducting purchases and sales electronically is essential to companies’ operations, and companies need to decide how to implement e-Ordering and e-Invoicing, which software solutions to use, and how to integrate them successfully with their existing software. Since no such methodology exists, our primary goal is to present the methods and indicators that constitute a new benchmark for this type of software. We also give a detailed description of the introduced indicators and the means of assessing their relative importance. The methodology will be used to evaluate current solutions accessible over the Internet in the form of web services (SaaS, plug-in services, etc.). In this paper we compare results and confirm the validity of the new methodology.

Kiril Kiroski, Marjan Gusev, Magdalena Kostoska

ASGRT – Automated Report Generation System

We have reached a point in time when databases are used in almost all aspects of our lives. However, most end users have neither the knowledge nor the need to manage these databases. More importantly, they are unable to generate the ever-changing reports they need from the data in their databases. Our Applicative Solution for Generating Reports from Templates (ASGRT) aims to deal with this issue efficiently. It has a simple yet effective architectural design intended to give power to more experienced administrators and simplicity to common end users, so that they can generate reports from their databases with their own criteria and design. The presented software enables the creation of templates containing text and tags that are recognized and substituted with values retrieved from the database, thereby enabling the creation of customized reports with varying ease of use and flexibility.

Dejan Gjorgjevikj, Gjorgji Madjarov, Ivan Chorbev, Martin Angelovski, Marjan Georgiev, Bojan Dikovski
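Template-driven report generation of the kind described above can be sketched as tag substitution backed by database lookups. The `{{table.column}}` tag syntax and the schema here are illustrative assumptions, not ASGRT's actual format.

```python
import re
import sqlite3

# Illustrative in-memory database standing in for the user's data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (name TEXT, city TEXT)")
conn.execute("INSERT INTO customer VALUES ('Acme', 'Ohrid')")

def render(template, conn):
    """Replace every {{table.column}} tag with the value fetched
    from the corresponding table and column."""
    def lookup(match):
        table, column = match.group(1), match.group(2)
        row = conn.execute(f"SELECT {column} FROM {table}").fetchone()
        return str(row[0])
    return re.sub(r"\{\{(\w+)\.(\w+)\}\}", lookup, template)

report = render("Report for {{customer.name}}, {{customer.city}}.", conn)
print(report)  # Report for Acme, Ohrid.
```

A production system would of course validate tag names rather than splice them into SQL, and would support per-row iteration for multi-record reports.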

