
About this Book

These proceedings consist of 30 selected research papers based on results presented at the 10th Balkan Conference & 1st International Symposium on Operational Research (BALCOR 2011) held in Thessaloniki, Greece, September 22-24, 2011. BALCOR is an established biennial conference attended by a large number of faculty, researchers and students from the Balkan countries, as well as from other European and Mediterranean countries. Over the past decade, the BALCOR conference has facilitated the exchange of scientific and technical information on the subject of Operations Research and related fields such as Mathematical Programming, Game Theory, Multiple Criteria Decision Analysis, Information Systems, Data Mining and more, in order to promote international scientific cooperation. The carefully selected and refereed papers present important recent developments and modern applications and will serve as an excellent reference for students, researchers and practitioners in these disciplines.

Table of Contents

Frontmatter

Moderately Exponential Approximation: Bridging the Gap Between Exact Computation and Polynomial Approximation

This paper proposes a way to bring together two seemingly “foreign” domains: polynomial approximation and exact computation for NP-hard problems. We show how one can match ideas from both areas in order to design approximation algorithms achieving ratios unachievable in polynomial time (unless a very unlikely complexity conjecture is confirmed) with worst-case complexity much lower (though super-polynomial) than that of an exact computation.

Vangelis Th. Paschos

Multistart Branch and Bound for Large Asymmetric Distance-Constrained Vehicle Routing Problem

In this chapter we revise and modify an old branch-and-bound method for solving the asymmetric distance-constrained vehicle routing problem, suggested by Laporte et al. in 1987. It is based on reformulating the distance-constrained vehicle routing problem as a travelling salesman problem and using the assignment problem as a lower-bounding procedure. In addition, our algorithm uses the best-first strategy and new tolerance-based branching rules. Since our method is fast but memory consuming, it may stop before optimality is proven. We therefore introduce randomness, in the case of ties, when choosing the node of the search tree, and restart the procedure if an optimal solution is not found; in this way we obtain a multistart branch-and-bound method. As far as we know, the instances we solved exactly (up to 1,000 customers) are much larger than the instances considered for other VRP models in the recent literature. So, despite its simplicity, the proposed algorithm is capable of solving the largest instances ever solved in the literature. Moreover, the approach is general and may be used to solve other types of vehicle routing problems.

Samira Almoustafa, Said Hanafi, Nenad Mladenović
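The randomized tie-breaking that drives the multistart scheme can be sketched generically. The sketch below is not the chapter's VRP algorithm: `bound`, `branch`, `is_leaf` and `value` are hypothetical problem callbacks standing in for the assignment-relaxation machinery.

```python
import heapq
import random

def best_first_search(root, bound, branch, is_leaf, value,
                      node_limit=10000, seed=None):
    """Best-first branch and bound with randomized tie-breaking.

    `bound`, `branch`, `is_leaf`, `value` are problem-specific callbacks
    (placeholder names, not from the chapter).  Returns the incumbent,
    its value, and whether optimality was proven before the node limit.
    """
    rng = random.Random(seed)
    best, best_val = None, float("inf")
    # ties on the lower bound are broken by a random key, so each
    # restart may explore the search tree in a different order
    heap = [(bound(root), rng.random(), root)]
    expanded = 0
    while heap and expanded < node_limit:
        lb, _, node = heapq.heappop(heap)
        if lb >= best_val:
            continue  # pruned by the incumbent
        if is_leaf(node):
            if value(node) < best_val:
                best, best_val = node, value(node)
            continue
        expanded += 1
        for child in branch(node):
            heapq.heappush(heap, (bound(child), rng.random(), child))
    return best, best_val, not heap  # empty heap => optimality proven

def multistart(root, bound, branch, is_leaf, value, restarts=5):
    """Restart with fresh random tie-breaking until optimality is proven."""
    for seed in range(restarts):
        sol, val, proven = best_first_search(
            root, bound, branch, is_leaf, value, seed=seed)
        if proven:
            return sol, val
    return sol, val
```

Restarting with a fresh seed changes only the exploration order among equally bounded nodes, which is the essence of the multistart idea.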

On a Relationship Between Graph Realizability and Distance Matrix Completion

We consider a certain subclass of Henneberg-type edge-weighted graphs which is related to protein structure, and discuss an algorithmic relationship between the Distance Geometry Problem and the Euclidean Distance Matrix Completion Problem.

Leo Liberti, Carlile Lavor

Effect Oriented Planning of Joint Attacks

We consider tactical planning of a military operation on a large target scene where a number of specific targets of interest are positioned, using a given number of resources which can be, for example, fighter aircraft, unmanned aerial vehicles, or missiles. The targets could be radar stations or other surveillance equipment, with or without defensive capabilities, which the attacker wishes to destroy. Further, some of the targets are defended by, for example, Surface-to-Air Missile units, and this defensive capability can also be used to protect other targets. The attacker has knowledge of the positions of all the targets and also a reward associated with each target. We consider the problem of the attacker, whose objective is to maximize the expected outcome of a joint attack against the enemy. The decisions that can be taken by the attacker concern the allocation of the resources to the targets and the tactics to use against each target. We present a mathematical model for the attacker’s problem. The model is similar to a generalized assignment problem, but with a complex objective function that makes it intractable for large problem instances. We present approximate models that can be used to provide upper and lower bounds on the optimal value, and also provide heuristic solution approaches that successfully deliver near-optimal solutions to a number of scenarios.

Nils-Hassan Quttineh, Torbjörn Larsson, Kristian Lundberg, Kaj Holmberg

Competitive Multilevel Capacity Allocation

In this paper we consider a supply chain where the purchasing behavior of the final users of the product influences the decisions that are made. We particularly examine the effects of customers’ competition for the offered service level on the facility location decisions. We consider two types of decision makers: the producer, who tries to provide the best level of service at the facilities at minimum cost, and the customers, who make their choices in order to minimize their perceived costs. We first consider the case where customers are assigned to facilities by the producer. In such a case the producer can be considered a monopolist who dominates and tries, through the facility location decisions, to ensure the service level he considers best at minimum cost. Then we suppose that the customers are involved in a Nash-type game in their effort to ensure the best level of service for themselves, i.e. we assume that they are involved in an oligopsony. In order to take into account the effects of this competition on the facility location decisions, we formulate the problem as a bilevel programming model. Next, we suppose that there are two producers operating in the network, who constitute a duopoly. The producers compete with each other with respect to the service level they offer in order to attract customers. We propose a bilevel model with two leaders in order to take into account both the competition between producers and the competition among customers.

A. Karakitsiou

A Hybrid Particle Swarm Optimization Algorithm for the Permutation Flowshop Scheduling Problem

This paper introduces a new hybrid nature-inspired algorithmic approach, based on Particle Swarm Optimization, for successfully solving one of the most computationally complex problems, the Permutation Flowshop Scheduling Problem. The Permutation Flowshop Scheduling Problem (PFSP) belongs to the class of combinatorial optimization problems characterized as NP-hard and, thus, heuristic and metaheuristic techniques have been used to find high-quality solutions in reasonable computational time. The proposed algorithm for the solution of the PFSP, the Hybrid Particle Swarm Optimization (HybPSO), combines a Particle Swarm Optimization (PSO) algorithm, the Variable Neighborhood Search (VNS) strategy and a Path Relinking (PR) strategy. In order to test the effectiveness and the efficiency of the proposed method, we use a set of benchmark instances of different sizes.

Yannis Marinakis, Magdalene Marinaki
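As a hedged illustration of the PSO component alone (on a continuous test function, not on permutations, and without the VNS and path-relinking ingredients of HybPSO), a minimal particle swarm might look like:

```python
import random

def pso(f, dim, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization minimizing f over R^dim.

    Illustrative sketch only; the parameter values are typical defaults,
    not those of the HybPSO paper.
    """
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```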

Optimization Over Stochastic Integer Efficient Set

In this paper we study the problem of optimizing a linear function over the integer efficient solution set of a Multiple Objective Stochastic Integer Linear Programming problem (MOSILP). Once the problem is converted into a deterministic one by adapting the two-level recourse approach, a new pivoting technique is applied to generate an optimal efficient solution without having to enumerate all efficient solutions. This method combines two techniques, the L-Shaped method and the combined method developed in [Kall, Stochastic Linear Programming (1976)]. A detailed didactic example is given to illustrate the different steps of our algorithm.

Djamal Chaabane, Fatma Mebrek

Open-Pit Mining with Uncertainty: A Conditional Value-at-Risk Approach

The selection of a mine design is based on estimating net present values of all possible, technically feasible mine plans so as to select the one with the maximum value. It is a hard task to know with certainty the quantity and quality of ore in the ground. This geological uncertainty and also the future market behavior of metal prices and foreign exchange rates, which are always uncertain, make mining a high risk business. Value-at-Risk (VaR) is a measure that is used in financial decisions to minimize the loss caused by inadequate monitoring of risk. This measure does, however, have certain drawbacks such as lack of consistency, nonconvexity, and nondifferentiability. Rockafellar and Uryasev [J. Risk 2, 21–41 (2000)] introduce the Conditional Value-at-Risk (CVaR) measure as an alternative to the VaR measure. The CVaR measure gives rise to a convex optimization problem. An optimization model that maximizes expected return while minimizing risk is important for the mining sector as this will help make better decisions on the blocks of ore to mine at a particular point in time. We present a CVaR approach to the uncertainty involved in open-pit mining. We formulate investment and design models for the open-pit mine and also give a nested pit scheduling model based on CVaR. Several numerical results based on our models are presented by using scenarios from simulated geological and market uncertainties.

Henry Amankwah, Torbjörn Larsson, Björn Textorius
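For a finite set of equally weighted loss scenarios, CVaR can be evaluated directly from the Rockafellar–Uryasev representation; a minimal sketch (not the chapter's mine-planning model):

```python
def cvar(losses, alpha=0.95):
    """Conditional Value-at-Risk of an equally weighted discrete loss sample.

    CVaR_alpha is the expected loss in the worst (1 - alpha) tail,
    computed via the Rockafellar-Uryasev representation
        CVaR_alpha = min_t  t + E[max(L - t, 0)] / (1 - alpha),
    whose minimizer t* is the VaR (a tail quantile).
    """
    xs = sorted(losses)
    n = len(xs)
    best = float("inf")
    for t in xs:  # the minimum is attained at a sample point
        excess = sum(x - t for x in xs if x > t) / n
        best = min(best, t + excess / (1.0 - alpha))
    return best
```

For losses 1, 2, …, 100 and alpha = 0.95, the worst 5% of losses are 96–100, so CVaR is their mean, 98.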

Incidence Graphs of Bipartite G-Graphs

Defining graphs from groups is a widely studied area motivated, for example, by communication networks. The most popular graphs defined by a group are Cayley graphs. G-graphs correspond to an alternative construction. After recalling the main properties of these graphs and their motivation, we propose a characterization result. With the help of this result, we show that the incidence graph of a symmetric bipartite G-graph is also a G-graph, and we prove that, under some constraints, if the incidence graph of a symmetric bipartite graph is a G-graph, then the graph is also a G-graph. Using these results, we give an alternative proof of the fact that meshes of d-ary trees are G-graphs.

Cerasela Tanasescu, Ruxandra Marinescu-Ghemeci, Alain Bretto

A Tight Bound on the Worst-Case Number of Comparisons for Floyd’s Heap Construction Algorithm

In this paper a tight bound on the worst-case number of comparisons for Floyd’s well-known heap construction algorithm is derived. It is shown that at most $$2n - 2\mu (n) - \sigma (n)$$ comparisons are executed in the worst case, where μ(n) is the number of ones and σ(n) is the number of zeros after the last one in the binary representation of the number of keys n.

Ioannis Paparrizos
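The stated bound is easy to evaluate; a small sketch, where μ(n) and σ(n) are read off the binary representation of n:

```python
def floyd_worst_case_comparisons(n):
    """Worst-case comparison count of Floyd's heap construction on n keys,
    per the stated bound 2n - 2*mu(n) - sigma(n), where mu(n) counts the
    ones in binary(n) and sigma(n) the zeros after the last one."""
    b = bin(n)[2:]
    mu = b.count("1")
    sigma = len(b) - b.rfind("1") - 1  # trailing zeros after the last one
    return 2 * n - 2 * mu - sigma
```

For n = 7 (binary 111) the bound is 8; for n = 8 (binary 1000) it is 11; for a single key it is 0, as expected.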

A Parallel Implementation of the Revised Simplex Algorithm Using OpenMP: Some Preliminary Results

Linear Programming (LP) is a significant research area in the field of operations research. The simplex algorithm is the most widely used method for solving Linear Programming problems (LPs). The aim of this paper is to present a parallel implementation of the revised simplex algorithm. Our parallel implementation focuses on reducing the time taken to compute the basis inverse, since the total computational effort of an iteration of simplex-type algorithms is dominated by this computation. The inverse does not have to be computed from scratch in every iteration. In this paper, we compute the basis inverse with two well-known updating schemes, (1) the Product Form of the Inverse (PFI) and (2) a Modification of the Product Form of the Inverse (MPFI), and incorporate them into the revised simplex algorithm. Apart from the parallel implementation, this paper presents a computational study that shows the speedup of the parallel over the serial implementation on large-scale LPs. Computational results with a set of benchmark problems from Netlib, including some infeasible ones, are also presented. The parallelism is achieved using OpenMP on a shared-memory multiprocessor architecture.

Nikolaos Ploskas, Nikolaos Samaras, Konstantinos Margaritis
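The PFI updating scheme mentioned in the abstract admits a compact dense sketch (a didactic version; production codes keep sparse eta files rather than forming explicit matrix products):

```python
import numpy as np

def pfi_update(B_inv, a_q, r):
    """Product Form of the Inverse: update B^{-1} when column r of the
    basis B is replaced by the entering column a_q, without refactorizing.

    The new inverse is E @ B_inv, where E is an eta matrix (identity
    except for column r) built from alpha = B_inv @ a_q.
    """
    alpha = B_inv @ a_q
    m = len(alpha)
    E = np.eye(m)
    E[:, r] = -alpha / alpha[r]   # off-diagonal entries of the eta column
    E[r, r] = 1.0 / alpha[r]      # pivot entry
    return E @ B_inv
```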

Maximum Induced Matchings in Grids

An induced matching in a graph G is a matching such that no two of its edges are joined by an edge of G. For a connected graph G, denote by iμ(G) the maximum cardinality of an induced matching in G. In this paper we study the problem of finding a maximum induced matching in grid graphs with n lines and m columns, denoted G_{n,m}, and determine the exact value of iμ(G_{n,m}) when n or m is even.

Ruxandra Marinescu-Ghemeci

Determining the Minimum Number of Warehouses and their Space-Size for Storing Compatible Items

We present an exact procedure for determining the smallest number of warehouses needed to store compatible items, together with their space-sizes. The required floor space of every item is known. The method developed here stores compatible items in the same warehouse in order to reduce the maximum necessary space-size of each warehouse and, consequently, to determine the minimum number of warehouses needed. The problem is formulated in the context of graph theory. Compatible items stored in the same warehouse are the elements of a color class of a specific coloring of a weighted conflict graph G = (V, E, W), where the vertices of V represent the items to be stored, all pairs of non-compatible items define the edge set E, and the elements of W are the numbers assigned to the vertices of V expressing the required storing space of every corresponding item. Thus the problem reduces to finding a coloring of G that corresponds to an optimal solution.

Dimitra Alexiou, Stefanos Katsavounis
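The coloring model can be illustrated with a simple greedy heuristic: incompatible items (edges of the conflict graph) never share a color class, and each class becomes one warehouse sized by its total weight. This is only a sketch of the model, not the chapter's exact procedure.

```python
def pack_warehouses(weights, conflicts):
    """Greedy coloring of a weighted conflict graph.

    weights:   dict item -> required floor space
    conflicts: set of pairs of non-compatible items (the edge set E)

    Returns the color classes (one warehouse each) and their total weights.
    Heuristic sketch only; the chapter gives an exact procedure.
    """
    order = sorted(weights, key=weights.get, reverse=True)  # heaviest first
    classes = []  # each class is a set of mutually compatible items
    for item in order:
        for cls in classes:
            # item may join a class only if compatible with every member
            if all((item, other) not in conflicts and
                   (other, item) not in conflicts for other in cls):
                cls.add(item)
                break
        else:
            classes.append({item})  # open a new warehouse
    sizes = [sum(weights[i] for i in cls) for cls in classes]
    return classes, sizes
```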

Duality for Multiple Objective Fractional Programming with Generalized Type-I Univexity

In this paper, a multiobjective fractional subset programming problem (Problem (P)) is considered. A new class of $$\left(\mathcal{F},b,\phi,\rho,\theta \right)$$-type-I univex functions is introduced and a general dual model for (P) is presented. Based on these functions, weak, strong and converse duality theorems are derived. Almost all results presented in the literature were obtained under the assumption that the function $$\mathcal{F}$$ is sublinear in the third argument. Here, our results hold assuming only its convexity.

Ioan M. Stancu-Minasian, Andreea Mădălina Stancu

A Markov-Based Decision Model of Tax Evasion for Risk-Averse Firms in Greece

We develop a Markov-based optimization model that captures the process via which a risk-averse firm in Greece decides whether to engage in tax evasion. The firm seeks to maximize the expected utility of its wealth, the latter viewed as a function of the portion of profits which the firm attempts to conceal from the government. Our model takes into account the basic features of the Greek tax system, including random audits and tax penalties applied when the audit reveals any wrongdoing. The proposed model is used to (1) show that the parameters currently in place are conducive to tax evasion and (2) “chart” the problem’s parameter space in order to identify “virtuous” combinations (from the point of view of the government), and obtain a relationship between audit probability, tax penalty and likelihood of the firm engaging in tax evasion.

Nikolaos D. Goumagias, Dimitrios Hristu-Varsakelis

Stochastic Decentralized Control of a Platoon of Vehicles Based on the Inclusion Principle

In this paper the Stochastic Inclusion Principle is applied to decentralized Linear Quadratic Gaussian (LQG) suboptimal longitudinal control design of a platoon of automotive vehicles. Starting from a stochastic linearized platoon state model, input/state overlapping subsystems are identified and extracted after an adequate expansion. An algorithm for approximate LQG optimization of these subsystems is developed in accordance with their hierarchical lower-block-triangular (LBT) structure. Vehicle controllers obtained after contraction, which leaves the local Kalman filters uncontracted, provide high performance tracking and noise immunity. Simulation results illustrate characteristic properties of the proposed algorithm.

Srdjan S. Stanković, Milorad J. Stanojević, Dragoslav D. Šiljak

Homogeneous and Non-homogeneous Algorithms

Motivated by recent best-case analyses of some sorting algorithms, and based on the type of complexity, we partition algorithms into two classes: homogeneous and non-homogeneous algorithms. Although both classes contain algorithms with worst and best cases, homogeneous algorithms behave uniformly on all instances. This partition clarifies in a completely mathematical way the previously mentioned terms and reveals that, in classifying an algorithm as homogeneous or not, best-case analysis is equally as important as worst-case analysis.

Ioannis Paparrizos

Service Quality Evaluation in the Tourism Industry: A SWOT Analysis Approach

The quality evaluation of the provided tourism services constitutes the most important issue for the viability of this particular sector and the improvement of the total tourism product. This paper presents the results of a tourist satisfaction survey that took place on the island of Mykonos during the period May–September 2009. The final sample consists of 1,026 questionnaires that were distributed to Greek and foreign tourists during their departure from the island (harbor and airport). The main objective of this paper is to evaluate tourists’ satisfaction and identify the strong and weak points of the tourism services offered. These results may help the development of a strategic plan for the quality improvement of the overall tourism product. Beyond descriptive statistical techniques, the analysis of the collected data is based on the multicriteria method MUSA. The method is able to combine satisfaction importance and performance results and provides a SWOT (Strengths-Weaknesses-Opportunities-Threats) analysis for the whole set of tourist satisfaction criteria. The presented analytical results reveal that the main strong points of the offered tourist product are the fame and the natural beauty of the island, as well as the high level of expenses. On the other hand, the most important weak points concern the short duration of stay, as well as the low level of satisfaction in specific service quality criteria (local transport, information, and environment).

Marianna Tsitsiloni, Evangelos Grigoroudis, Constantin Zopounidis

Correcting Certain Estimation Methods for the Generalized Pareto Distribution

Generalized Pareto distributions (GPD) are widely used for modeling excesses over high thresholds. When its shape parameter is positive, the GPD has a finite upper bound that is a function of the distribution parameters. A well-known problem when estimating GPD parameters is inconsistency with the sample data, which is that one or more sample observations exceed the estimated upper bound. This paper proposes a new, general technique to overcome the inconsistency problem and improve performance of the existing GPD estimation methods. The technique is successfully applied to method-of-moments and method-of-probability-weighted-moments estimates, and, due to its flexibility, can be also applied to other estimation methods and distributions.

Jelena Jocković
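The inconsistency the paper addresses is easy to exhibit with plain method-of-moments estimates. The sketch below assumes the Hosking–Wallis parameterization F(x) = 1 − (1 − kx/a)^{1/k}, in which k > 0 yields the finite upper bound a/k that sample observations may exceed; it does not reproduce the paper's correction technique.

```python
def gpd_mom(sample_mean, sample_var):
    """Method-of-moments GPD estimates in the Hosking-Wallis parameterization.

    Moments: mean = a/(1+k), var = a^2 / ((1+k)^2 (1+2k)), so
    mean^2/var = 1 + 2k, which inverts to the estimates below.
    Returns (k, a, upper_bound); the bound is a/k when k > 0.
    """
    cv2 = sample_mean**2 / sample_var      # inverse squared coeff. of variation
    k = 0.5 * (cv2 - 1.0)
    a = 0.5 * sample_mean * (cv2 + 1.0)
    upper = a / k if k > 0 else float("inf")  # bound the sample may violate
    return k, a, upper
```

Checking whether the sample maximum exceeds `upper` reveals exactly the inconsistency the paper's correction technique is designed to repair.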

Consistent Sequences of Tests Defined by Bans

Finite probability spaces are important in such operations research problems as data mining, computer simulation, network and computer security, cryptography and many others. We consider the complexity of testing a simple hypothesis H_{0,n} against a composite alternative H_{1,n} in finite models. A way to make the computation of tests simpler is to build critical sets that depend on smallest bans (the shortest vectors which have probability zero). We prove necessary and sufficient conditions for the existence of a consistent sequence of statistical tests in which all critical sets are defined by smallest bans. The existence of such sequences of tests is equivalent to the existence of a strictly consistent sequence of tests.

Alexander Grusho, Elena Timonina

Impact Assessment Through Collaborative Asset Modeling: The STORM-RM Approach

Existing Risk Management (RM) methodologies are mainly expert driven and require a large number of interviews with security experts, which makes it rather inefficient to take into account the knowledge of all the organization’s participants. In this paper we extend the STORM-RM multi-criteria group decision-making methodology. More specifically, we propose specific asset and user models, which make use of the AHP multi-criteria decision-making methodology in order to identify the organization’s assets and calculate their potential security impacts.

Theodoros Ntouskas, Panayiotis Kotzanikolaou, Nineta Polemi
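One common way to turn AHP pairwise-comparison judgments into priority weights is the geometric-mean (logarithmic least squares) method, sketched below; the exact aggregation used within STORM-RM may differ.

```python
from math import prod

def ahp_priorities(M):
    """Priority weights from an AHP pairwise-comparison matrix M,
    where M[i][j] states how much criterion i is preferred to j,
    via row geometric means normalized to sum to one."""
    n = len(M)
    gm = [prod(M[i]) ** (1.0 / n) for i in range(n)]  # row geometric means
    s = sum(gm)
    return [g / s for g in gm]
```

For the 2x2 judgment "criterion 1 is 3 times as important as criterion 2", the weights come out 0.75 and 0.25.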

Testing the Homoskedasticity/Heteroskedasticity of the Errors Using the White Test: Pattern Classification by k-Variances and Informational Criteria

In this paper we test the homoskedasticity/heteroskedasticity of the errors of a linear regression model using the White homoskedasticity test. In the case of heteroskedasticity we use the k-variances algorithm to classify the data such that all the classes are homoskedastic. Informational criteria analogous to the Akaike and Schwarz criteria are used to choose the best classification.

Daniel Ciuiu
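The White test statistic is the LM statistic n·R² of an auxiliary regression of squared residuals on the regressors, their squares and cross-products; a sketch:

```python
import numpy as np

def white_test_stat(X, resid):
    """LM statistic of White's homoskedasticity test.

    Regress squared residuals on a constant, the regressors, their
    squares and cross-products; LM = n * R^2 of that auxiliary
    regression, asymptotically chi-square with as many degrees of
    freedom as auxiliary regressors (excluding the constant).
    """
    n, k = X.shape
    cols = [np.ones(n)]
    for i in range(k):
        cols.append(X[:, i])
        for j in range(i, k):
            cols.append(X[:, i] * X[:, j])  # squares and cross-products
    Z = np.column_stack(cols)
    e2 = resid**2
    beta, *_ = np.linalg.lstsq(Z, e2, rcond=None)
    fitted = Z @ beta
    ss_res = np.sum((e2 - fitted)**2)
    ss_tot = np.sum((e2 - e2.mean())**2)
    r2 = 1.0 - ss_res / ss_tot
    return n * r2, Z.shape[1] - 1  # statistic and auxiliary dof
```

A large statistic relative to the chi-square critical value rejects homoskedasticity, which is the branch where the k-variances classification is invoked.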

An Innovative Decision Making e-key Application for the Identification of Fish Species

The most important tool for ichthyologists, as well as biologists, fishery biologists and other relevant scientists, is an identification key, that is, an information system providing the capability to identify specimens accurately or to find information on the correct names, biology and distribution of species. Dichotomous identification keys organize fishes based on their similarities and differences. This research work focuses on the development and implementation of a new, innovative information system able to identify fish species correctly. The developed system is a fully interactive fish identification e-key which can be used in two forms, locally and remotely via the Internet, more specifically through the Telnet service. This new dichotomous classification e-key provides the capability to identify any species in a compact and easy-to-use environment which gives the user excellent operation capabilities and complete information about all included fish species. Moreover, the application provides the capability to search for a sporadic fish species and to display a list of all the fish species present in the application’s database at that time.

George Minos, Vassilis Kostoglou, Emmanouil Tolis
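A dichotomous key is naturally a binary decision tree; the taxa and characters below are invented for illustration and are not from the application's database:

```python
# A dichotomous key as a nested tuple: (question, yes_branch, no_branch);
# leaves are species names.  The questions and species here are made up
# purely to show the mechanism.
KEY = ("Body strongly compressed laterally?",
       ("Single long dorsal fin?", "species A", "species B"),
       ("Barbels present around mouth?", "species C", "species D"))

def identify(key, answer):
    """Walk the key, calling answer(question) -> bool at each fork,
    until a leaf (species name) is reached."""
    while isinstance(key, tuple):
        question, yes, no = key
        key = yes if answer(question) else no
    return key
```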

Primal-Dual Algorithms for P*(κ) Linear Complementarity Problems Based on a Kernel Function with Trigonometric Barrier Term

Recently, El Ghami et al. [Journal of Computational and Applied Mathematics, May 2011, doi:10.1016/j.cam.2011.05.036] investigated a new kernel function which differs from the self-regular kernel functions. The kernel function has a trigonometric barrier term. In this paper we generalize the analysis presented in that paper to P*(κ) Linear Complementarity Problems (LCPs). It is shown that for interior-point methods based on this function, the iteration bound for large-update methods is improved significantly, while for small-update methods the iteration bound is the best currently known for primal-dual interior-point methods. The analysis for LCPs deviates significantly from that for linear optimization, and several new tools and techniques are derived in this paper.

Mohamed El Ghami

An Approximation Algorithm for the Three Depots Hamiltonian Path Problem

In this paper, we propose an approximation algorithm for solving the three depots Hamiltonian path problem (3DHPP). The problem studied can be viewed as a variant of the well-known Hamiltonian path problem with multiple depots (cf. Demange [Mathématiques et Informatique, Gazette, 102 (2004)] and Malik et al. [Oper. Res. Lett. 35, 747–753 (2007)]). For the 3DHPP, we show the existence of a $$\frac{3}{2}$$-approximation algorithm for a broad family of metric cases which also guarantees a ratio r < 2 in the general metric case. The proposed algorithm is mainly based on extending the construction scheme already used by Rathinam et al. [Oper. Res. Lett. 38, 63–68 (2010)]. The aforementioned result is established for a variant of the three-depot problem, that is, when costs are symmetric and satisfy the triangle inequality.

Aristotelis Giannakos, M’hand Hifi, Rezika Kheffache, Rachid Ouafi