
2010 | Book

Algorithmic Aspects in Information and Management

6th International Conference, AAIM 2010, Weihai, China, July 19-21, 2010. Proceedings


About this book

While the areas of information management and management science are full of algorithmic challenges, the proliferation of data has called for the design of efficient and effective algorithms and data structures for their management and processing. The International Conference on Algorithmic Aspects in Information and Management (AAIM) is intended for original algorithmic research on immediate applications and/or fundamental problems pertinent to information management and management science, broadly construed. The conference aims at bringing together researchers in computer science, operations research, applied mathematics, economics, and related disciplines. This volume contains the papers presented at AAIM 2010: the 6th International Conference on Algorithmic Aspects in Information and Management, which was held during July 19-21, 2010, in Weihai, China. We received a total of 50 submissions. Each submission was reviewed by three members of the Program Committee or their deputies on the quality, originality, soundness, and significance of its contribution. The committee decided to accept 31 papers. The program also included two invited keynote talks. The success of the conference resulted from the input of many people. We would like first of all to thank all the members of the Program Committee for their expert evaluation of the submissions. The local organizers in the School of Computer Science and Technology, Shandong University, did an extraordinary job, for which we are very grateful. We thank the National Natural Science Foundation of China, Montana State University (USA), the University of Warwick (UK), and Shandong University (China) for their sponsorship.

Table of Contents

Frontmatter
Comparison of Two Algorithms for Computing Page Importance

In this paper we discuss the relation and the differences between two algorithms, BrowseRank and PageRank. We analyze their stationary distributions using the ergodic theory of Markov processes. We compare in detail the link graph used in PageRank and the user browsing graph used in BrowseRank. Along with the comparison, the importance of the metadata contained in the user browsing graph is explored.
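
For readers unfamiliar with the baseline, the sketch below shows the standard PageRank power iteration, whose fixed point is the stationary distribution of the link-graph random walk discussed in this abstract. It is only an illustration with our own names, toy graph, and damping value; it is not the authors' BrowseRank implementation.

```python
# Minimal sketch: standard PageRank power iteration on a toy link graph.
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=1000):
    """adj[i][j] = 1 if page i links to page j."""
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    out_deg = A.sum(axis=1)
    # Dangling pages (no out-links) are treated as linking to every page.
    P = np.where(out_deg[:, None] > 0, A / np.maximum(out_deg, 1)[:, None], 1.0 / n)
    pi = np.full(n, 1.0 / n)                      # start from the uniform distribution
    for _ in range(max_iter):
        new = damping * pi @ P + (1 - damping) / n
        if np.abs(new - pi).sum() < tol:          # converged to the stationary distribution
            break
        pi = new
    return pi

# Toy graph: 0 -> 1, 1 -> 2, 2 -> 0, 2 -> 1
print(pagerank([[0, 1, 0], [0, 0, 1], [1, 1, 0]]))
```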

Yuting Liu, Zhi-Ming Ma
The Invisible Hand for Risk Averse Investment in Electricity Generation

We consider a perfectly competitive situation consisting of an electricity market (2nd stage) preceded by investment in generating plant capacity (1st stage). The second-stage environment is uncertain at the time of investment, hence the first stage also involves trading in financial instruments, e.g., hedges against high generation costs due to rising fuel costs.

The classical Invisible Hand says that if generators and consumers act in their own best interests, the result will be to minimize the net cost (or maximize the net welfare) of the system. This holds true in the stochastic risk-neutral case, when a probability distribution of future events is known and used by all generators to evaluate their investment strategies (via two-stage stochastic programming with recourse).

Motivated by energy developments in the European Union, our interest is the case where electricity generators are risk averse, and the cost of future production is assessed via coherent risk measures instead of expectations. This results in a new kind of stochastic equilibrium framework in which (risk-neutral) probability distributions are endogenous and can only be found at equilibrium.

Our main result is that if there are enough financial products to cover every future situation, i.e., the financial market is complete, then the Invisible Hand remains in force: system equilibrium is equivalent to system optimization in risk-averse investment equilibria. Some practical implications will be discussed.
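
For orientation, a schematic risk-neutral two-stage stochastic program with recourse is written out below; the notation (x for first-stage capacity, y for second-stage dispatch under scenario ω) is ours and is only a generic sketch, not the paper's exact model. The risk-averse setting studied here replaces the expectation with a coherent risk measure.

```latex
% Generic two-stage stochastic program with recourse (illustrative notation).
\begin{align*}
\min_{x \ge 0}\quad & c^{\top}x \;+\; \mathbb{E}_{\omega}\!\left[ Q(x,\omega) \right],\\
\text{where}\quad & Q(x,\omega) \;=\; \min_{y \ge 0}\;\bigl\{\, q(\omega)^{\top}y \;:\; W y = h(\omega) - T x \,\bigr\}.
\end{align*}
```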

Daniel Ralph, Yves Smeers
Efficient Algorithms for the Prize Collecting Steiner Tree Problems with Interval Data

Given a graph G = (V, E) with a cost on each edge in E and a prize at each vertex in V, and a target set V′ ⊆ V, the Prize Collecting Steiner Tree (PCST) problem is to find a tree T interconnecting the vertices in V′ that has minimum total cost on edges and maximum total prize at vertices in T. This problem is NP-hard in general, and it is polynomial-time solvable when graphs G are restricted to 2-trees. In this paper, we study how to deal with the PCST problem with uncertain costs and prizes. We assume that edge e could be included in T by paying cost $x_e\in[c_e^-,c_e^+]$ while taking risk $\frac{c_e^+-x_e}{c_e^+-c_e^-}$ of losing e, and vertex v could be awarded prize $p_v\in[p_v^-,p_v^+]$ while taking risk $\frac{y_v-p_v^-}{p_v^+-p_v^-}$ of losing the prize. We establish two risk models for the PCST problem, one minimizing the maximum risk over edges and vertices in T and the other minimizing the sum of risks. Both models are subject to upper bounds on the budget for constructing a tree. We propose two polynomial-time algorithms for these problems on 2-trees, respectively. Our study shows that the risk models have advantages over the traditional robust optimization model, which yields NP-hard problems even if the original optimization problems are polynomial-time solvable.
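
As a concrete illustration of the two objectives, the sketch below evaluates the edge and vertex risks from the abstract for a toy tree; the data are made up and the routine is not the paper's 2-tree algorithm.

```python
# Risk quantities from the abstract: for an edge paid x_e in [c^-, c^+],
# risk = (c^+ - x_e) / (c^+ - c^-); for a vertex awarded y_v in [p^-, p^+],
# risk = (y_v - p^-) / (p^+ - p^-).  Data below are invented.

def edge_risk(x_e, c_minus, c_plus):
    return (c_plus - x_e) / (c_plus - c_minus)

def vertex_risk(y_v, p_minus, p_plus):
    return (y_v - p_minus) / (p_plus - p_minus)

edges = [(8.0, 5.0, 10.0), (3.0, 2.0, 4.0)]      # (x_e, c_e^-, c_e^+) for edges of a toy tree T
vertices = [(6.0, 4.0, 9.0), (2.5, 1.0, 3.0)]    # (y_v, p_v^-, p_v^+) for vertices of T

risks = [edge_risk(*e) for e in edges] + [vertex_risk(*v) for v in vertices]
print("min-max objective :", max(risks))   # first risk model
print("min-sum objective :", sum(risks))   # second risk model
```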

E. Álvarez-Miranda, A. Candia, X. Chen, X. Hu, B. Li
The (K,k)-Capacitated Spanning Tree Problem

This paper considers a generalization of the capacitated spanning tree, in which some of the nodes have capacity K, and the others have capacity k < K. We prove that the problem can be approximated within a constant factor, and present better approximations when k is 1 or 2.

Esther M. Arkin, Nili Guttmann-Beck, Refael Hassin
Optimal Algorithms for the Economic Lot-Sizing Problem with Multi-supplier

This paper considers the economic lot-sizing problem with multiple suppliers, in which the retailer may replenish his inventory from several suppliers. Each supplier is characterized by one of two types of order cost structures: an incremental quantity discount cost structure or a multiple set-ups cost structure. The problem is challenging due to the mix of different cost structures. By analyzing the optimal properties, we reduce the search range of the optimal solutions and develop several optimal algorithms to solve all cases of this multi-supplier problem.

Qing-Guo Bai, Jian-Teng Xu
Synthetic Road Networks

The availability of large graphs that represent huge road networks has led to a vast amount of experimental research custom-tailored for road networks. There are two primary reasons to investigate graph generators that construct synthetic graphs similar to real-world road networks: the wish to theoretically explain noticeable experimental results on these networks, and the need to overcome the commercial nature of most datasets, which limits scientific use. This is the first work that experimentally evaluates the practical applicability of such generators. To this end we propose a new generator and review the only existing one (which until now has not been tested experimentally). Both generators are examined concerning structural properties and algorithmic behavior. Although both generators prove to be reasonably good models, our new generator outperforms the existing one with respect to both structural properties and algorithmic behavior.

Reinhard Bauer, Marcus Krug, Sascha Meinert, Dorothea Wagner
Computing Exact and Approximate Nash Equilibria in 2-Player Games

The problem of computing a Nash equilibrium in a normal-form 2-player game (or bimatrix game) is PPAD-complete in general, while it can be efficiently solved in a special subclass which we call regular bimatrix games. The current best approximation algorithm, proposed in [19], achieves a guarantee of 0.3393. In this paper we design a polynomial-time algorithm for computing exact and approximate Nash equilibria for bimatrix games. The novelty of this contribution is twofold. For regular bimatrix games, it allows us to compute equilibria whose payoffs optimize any objective function and meet any set of constraints which can be expressed through linear programming, while, in the general case, it computes α-approximate Nash equilibria, where α is the maximum difference between any two payoffs in the same strategy of any player. Hence, our algorithm improves the best known approximation guarantee for the bimatrices in which α < 0.3393.
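
The sketch below computes the quantity α for a toy bimatrix game under one plausible reading of the definition (for each pure strategy of a player, the spread of that player's payoffs over the opponent's replies); both the matrices and this reading are our assumptions, not taken from the paper.

```python
# Toy computation of alpha: the largest spread of a player's payoffs within one of
# that player's pure strategies (matrices below are invented).
import numpy as np

def alpha(A, B):
    """A, B: payoff matrices of the row and the column player, respectively."""
    A, B = np.asarray(A, dtype=float), np.asarray(B, dtype=float)
    row_spread = (A.max(axis=1) - A.min(axis=1)).max()   # spreads within rows of A
    col_spread = (B.max(axis=0) - B.min(axis=0)).max()   # spreads within columns of B
    return max(row_spread, col_spread)

A = [[1.0, 0.9], [0.0, 1.0]]
B = [[0.5, 1.0], [1.0, 0.7]]
print(alpha(A, B))   # 1.0 for this toy game
```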

Vittorio Bilò, Angelo Fanelli
Where Would Refinancing Preferences Go?

We study the relation between the non-tradable shares reform and refinancing preferences. From the viewpoint of the changes in the market and policy environment brought about by the reform, we find that rights issues dominate before the reform, whereas public offerings (including private placements) dominate after the reform. This can be attributed to the larger amounts of capital that can be raised once the public offering mechanism shifted from pricing at a discount to pricing at a premium after the reform, and to the absence of any requirement that large shareholders commit to participate in public offerings.

Yajun Chai, Bo Liu
Approximating Maximum Edge 2-Coloring in Simple Graphs

We present a polynomial-time approximation algorithm for legally coloring as many edges of a given simple graph as possible using two colors. It achieves an approximation ratio of roughly 0.842 and runs in O(n³m) time, where n (respectively, m) is the number of vertices (respectively, edges) in the input graph. The previously best ratio achieved by a polynomial-time approximation algorithm was $\frac{5}{6}\approx 0.833$.

Zhi-Zhong Chen, Sayuri Konno, Yuki Matsushita
A Linear Kernel for Co-Path/Cycle Packing

Bounded-Degree Vertex Deletion is a fundamental problem in graph theory that has new applications in computational biology. In this paper, we address a special case of Bounded-Degree Vertex Deletion, the Co-Path/Cycle Packing problem, which asks to delete as few vertices as possible such that the graph of the remaining (residual) vertices is composed of disjoint paths and simple cycles. The problem falls into the well-known class of 'node-deletion problems with hereditary properties', and is hence NP-complete and unlikely to admit a polynomial-time approximation algorithm with approximation factor smaller than 2. In the framework of parameterized complexity, we present a kernelization algorithm that produces a kernel with at most 37k vertices, improving on the super-linear kernel of Fellows et al.'s general theorem for Bounded-Degree Vertex Deletion. Using this kernel and the method of bounded search trees, we devise an FPT algorithm that runs in time $O^*(3.24^k)$. On the negative side, we show that the problem is APX-hard and unlikely to have a kernel smaller than 2k by a reduction from Vertex Cover.

Zhi-Zhong Chen, Michael Fellows, Bin Fu, Haitao Jiang, Yang Liu, Lusheng Wang, Binhai Zhu
A VaR Algorithm for Warrants Portfolio

Based on the Gamma-Vega-Cornish-Fisher methodology, this paper proposes an algorithm for calculating VaR by adjusting the quantile under the given confidence level using the four moments (mean, variance, skewness and kurtosis) of the warrants portfolio return and estimating the variance of the portfolio by the EWMA methodology. Meanwhile, the proposed algorithm accounts for the attenuation of the effect of historical returns on the portfolio return of future days. An empirical study shows that, compared with the Gamma-Cornish-Fisher method and the standard normal method, the VaR calculated by Gamma-Vega-Cornish-Fisher can improve the effectiveness of forecasting the portfolio risk by virtue of considering the Gamma risk and the Vega risk of the warrants. A significance test is conducted on the calculation results by employing the two-tailed test developed by Kupiec. Test results show that the calculated VaRs of the warrants portfolio all pass the significance test at the 5% significance level.
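
As a rough illustration of the ingredients named in the abstract (a Cornish-Fisher quantile adjustment combined with an EWMA variance estimate), here is a minimal sketch; the decay factor, confidence level, and toy data are our assumptions, and it omits the warrant-specific Gamma and Vega terms of the paper's method.

```python
# Minimal Cornish-Fisher VaR with EWMA variance (illustrative only).
import numpy as np
from scipy.stats import norm, skew, kurtosis

def ewma_variance(returns, lam=0.94):
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2   # older returns decay geometrically
    return var

def cornish_fisher_var(returns, alpha=0.95, lam=0.94):
    z = norm.ppf(1 - alpha)                    # e.g. about -1.645 at 95% confidence
    s = skew(returns)
    k = kurtosis(returns)                      # excess kurtosis
    # Cornish-Fisher adjustment of the normal quantile using skewness and kurtosis.
    z_cf = (z + (z**2 - 1) * s / 6
              + (z**3 - 3*z) * k / 24
              - (2*z**3 - 5*z) * s**2 / 36)
    mu = returns.mean()
    sigma = np.sqrt(ewma_variance(returns, lam))
    return -(mu + z_cf * sigma)                # loss quantile, reported as a positive number

rng = np.random.default_rng(0)
sample = rng.normal(0.0, 0.02, size=250)       # toy daily returns
print(cornish_fisher_var(sample))
```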

Jun Dai, Liyun Ni, Xiangrong Wang, Weizhong Chen
Some Results on Incremental Vertex Cover Problem

In the classical k-vertex cover problem, we wish to find a minimum weight set of vertices that covers at least k edges. In the incremental version of the k-vertex cover problem, we wish to find a sequence of vertices such that, if we choose the smallest prefix of vertices in the sequence that covers at least k edges, this solution is close in value to that of the optimal k-vertex cover solution. The maximum ratio is called the competitive ratio. Previously, the known upper bound on the competitive ratio was 4α, where α is the approximation ratio of the k-vertex cover problem, and the known lower bound was 1.36 unless P = NP, or 2 − ε for any constant ε assuming the Unique Games Conjecture. In this paper we present some new results for this problem. Firstly, we prove that, without any computational complexity assumption, the lower bound on the competitive ratio of the incremental vertex cover problem is φ, where $\phi=\frac{\sqrt{5}+1}{2}\approx 1.618$ is the golden ratio. We then consider the restricted versions where k is restricted to one of two given values (named the 2-IVC problem) and to one of three given values (named the 3-IVC problem). For the 2-IVC problem, we give an algorithm to prove that the competitive ratio is at most φα. This incremental algorithm is also optimal for the 2-IVC problem if we are permitted to use non-polynomial time. For the 3-IVC problem, we give an incremental algorithm with ratio factor $(1+\sqrt{2})\alpha$.

Wenqiang Dai
Finding Good Tours for Huge Euclidean TSP Instances by Iterative Backbone Contraction

This paper presents an iterative, highly parallelizable approach to find good tours for very large instances of the Euclidean version of the well-known Traveling Salesman Problem (TSP). The basic idea of the approach consists of iteratively transforming the TSP instance into another one of smaller size by contracting pseudo backbone edges. The iteration is stopped if the new TSP instance is small enough for directly applying an exact algorithm or an efficient TSP heuristic. The pseudo backbone edges of each iteration are computed by a window-based technique in which the TSP instance is tiled into non-disjoint sub-instances. For each of these sub-instances a good tour is computed, independently of the other sub-instances. An edge which is contained in the computed tour of every sub-instance (of the current iteration) containing this edge is denoted a pseudo backbone edge. Paths of pseudo backbone edges are contracted to single edges which are fixed during the subsequent process.
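
The sketch below spells out the pseudo-backbone rule from the abstract: an edge is kept only if every window tour that could contain it actually uses it. Window tiling, tour computation, and path contraction are left abstract; the function and data layout are our own illustration, not the authors' implementation.

```python
# An edge is a pseudo backbone edge if it appears in the tour of *every* (overlapping)
# window that contains both of its endpoints.
def pseudo_backbone_edges(window_points, window_tours):
    """window_points: one point set per window (windows may overlap).
    window_tours: one tour per window, each a list of points in visiting order."""
    windows = [set(p) for p in window_points]
    tour_edges = [{frozenset(e) for e in zip(t, t[1:] + t[:1])} for t in window_tours]
    candidates = set().union(*tour_edges)
    backbone = set()
    for e in candidates:
        u, v = tuple(e)
        # Keep e only if every window containing both endpoints also uses e in its tour.
        if all(e in te for w, te in zip(windows, tour_edges) if u in w and v in w):
            backbone.add(e)
    return backbone

# Toy example: edge b-c is dropped because the second window's tour avoids it.
windows = [["a", "b", "c", "d"], ["b", "c", "d", "e"]]
tours   = [["a", "b", "c", "d"], ["b", "d", "c", "e"]]
print(pseudo_backbone_edges(windows, tours))
```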

Christian Ernst, Changxing Dong, Gerold Jäger, Dirk Richter, Paul Molitor
Point Location in the Continuous-Time Moving Network

We discuss two variations of the moving network Voronoi diagram. The first one addresses the following problem: given a network with n vertices and E edges, suppose there are m sites (cars, postmen, etc.) moving along the network edges and we know their moving trajectories with time information. Which site is the nearest one to a point p located on a network edge at time t′? We present an algorithm to answer this query in O(log(mW log m)) time, with O(nmW log² m + n² log n + nE) preprocessing time and O(nmW log m + E) space, where E is the number of edges of the network graph (the definition of W is in Section 3). The second variation views the query point p as a customer with walking speed v. The question is which site he can catch first. We can answer this query in O(m + log(mW log m)) time with the same preprocessing time and space as in the first case. If the customer is located at some node, then the query can be answered in O(log(mW log m)) time.

Chenglin Fan, Jun Luo
Coordinated Scheduling of Production and Delivery with Production Window and Delivery Capacity Constraints

This paper considers the coordinated production and delivery scheduling problem. We have a planning horizon consisting of z delivery times, each with a unique delivery capacity. Suppose we have a set of jobs, each with a committed delivery time, processing time, production window, and profit. The company earns the profit if the job is produced in its production window and delivered before its committed delivery time. From the company's point of view, we are interested in picking a subset of jobs to process and deliver so as to maximize the total profit subject to the delivery capacity constraint. We consider both the single delivery time case and the multiple delivery times case.

Suppose the given set of jobs is k-disjoint, that is, the jobs can be partitioned into k lists of jobs such that the jobs in each list have disjoint production windows. When k is a constant, we develop a PTAS for the single delivery case. For the multiple delivery times case, we also develop a PTAS when the number of delivery times is a constant.

Bin Fu, Yumei Huo, Hairong Zhao
Inverse 1-median Problem on Trees under Weighted l∞ Norm

The inverse 1-median problem is concerned with modifying the weights of the customers at minimum cost such that a prespecified supplier becomes the 1-median of the modified location problem. We first present the model of the inverse 1-median problem on trees. Then we propose two algorithms to solve the problem under the weighted l∞ norm with bound constraints on modifications. Based on the approach for the unbounded case, we devise a greedy-type algorithm which runs in O(n²) time, where n is the number of vertices. Based on the property of the optimal solution, we propose an O(n log n) time algorithm using binary search.

Xiucui Guan, Binwu Zhang
On the Approximability of the Vertex Cover and Related Problems

In this paper we show that the problem of identifying an edge (i, j) in a graph G such that there exists an optimal vertex cover S of G containing exactly one of the nodes i and j is NP-hard. Such an edge is called a weak edge. We then develop a polynomial-time approximation algorithm for the vertex cover problem with performance guarantee $2-\frac{1}{1+\sigma}$, where σ is an upper bound on a measure related to a weak edge of a graph. Further, we discuss a new relaxation of the vertex cover problem which is used in our approximation algorithm to obtain smaller values of σ.

Qiaoming Han, Abraham P. Punnen
Feasibility Testing for Dial-a-Ride Problems

Hunsaker and Savelsbergh have proposed an algorithm for testing the feasibility of a route in the solution to the dial-a-ride problem. The constraints that are checked are load capacity constraints, time windows, ride time bounds and wait time bounds. The algorithm has linear running time. By means of a simple example, we show in this work that their algorithm is incorrect. We also prove that by increasing the time complexity by only a logarithmic factor, a correct algorithm is obtained.

Dag Haugland, Sin C. Ho
Indexing Similar DNA Sequences

To study the genetic variations of a species, one basic operation is to search for occurrences of patterns in a large number of very similar genomic sequences. Building an indexing data structure on the concatenation of all sequences may require a lot of memory. In this paper, we propose a new scheme to index highly similar sequences by taking advantage of the similarity among the sequences. To store r sequences with k common segments, our index requires only O(n + N log N) bits of memory, where n is the total length of the common segments and N is the total length of the distinct regions in all texts. The total length of all sequences is rn + N, and any scheme to store these sequences requires Ω(n + N) bits. Searching for a pattern P of length m takes O(m + m log N + m log(rk) psc(P) + occ log n) time, where psc(P) is the number of prefixes of P that appear as a suffix of some common segment and occ is the number of occurrences of P in all sequences. In practice, rk ≤ N, and psc(P) is usually a small constant. We have implemented our solution and evaluated it using real DNA sequences. The experiments show that the memory requirement of our solution is much less than that required by the BWT built on the concatenation of all sequences. When compared to the other existing solution (RLCSA), we use less memory with faster searching time.

Songbo Huang, T. W. Lam, W. K. Sung, S. L. Tam, S. M. Yiu
Online Scheduling on Two Uniform Machines to Minimize the Makespan with a Periodic Availability Constraint

We consider the problem of online scheduling on two uniform machines where one machine is periodically unavailable. The problem is online in the sense that when a job arrives, we have to assign it to one of the two uniform machines before the next one is seen. Preemption is not allowed. The objective is to minimize the makespan. Assume that the speed of the periodically unavailable machine is normalized to 1, while the speed of the other one is s. Given a constant number α > 0, we also suppose that $T_u = \alpha T_a$, where $T_u$ and $T_a$ are the length of each unavailable time period and the length of the time interval between two consecutive unavailable time periods, respectively. In the case where s ≥ 1, we show a lower bound of $1+\frac{1}{s}$ on the competitive ratio and prove that the LS algorithm is optimal. We also show that for the problem $P2, M1PU \mid online, T_u = \alpha T_a \mid C_{\max}$, the LS algorithm proposed in [7] is optimal with a competitive ratio of 2. After that, we give some lower bounds on the competitive ratio in the case 0 < s < 1. At last, we study the special case $P2, M1PU \mid online, T_u = \alpha T_a, non\text{-}increasing\ sequence \mid C_{\max}$, where non-increasing sequence means that jobs arrive in a non-increasing order of their processing times. We show that the LS algorithm is optimal with a competitive ratio of $\frac{3}{2}$.
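
For intuition, the greedy list-scheduling (LS) sketch below assigns each arriving job to the machine on which it would finish earliest, treating machine 1 as available only during its periodic windows; the exact window handling and tie-breaking of the paper's LS may differ, and the sample data are invented.

```python
# Greedy LS-style sketch for two uniform machines, one periodically unavailable
# (our illustration; not necessarily the exact algorithm analyzed in the paper).

def finish_on_periodic_machine(start, p, T_a, T_u):
    """Speed-1 machine available on [i*(T_a+T_u), i*(T_a+T_u)+T_a).  Jobs are
    non-preemptive, so a job must fit inside one window (assumes p <= T_a)."""
    cycle = T_a + T_u
    t = start
    while True:
        i = int(t // cycle)
        window_end = i * cycle + T_a
        if t + p <= window_end:
            return t + p
        t = (i + 1) * cycle          # wait for the start of the next availability window

def ls_makespan(jobs, s, T_a, T_u):
    """Assign each job, in arrival order, to the machine that finishes it earliest."""
    free1 = free2 = 0.0              # next free time on machine 1 (periodic) and machine 2 (speed s)
    for p in jobs:
        f1 = finish_on_periodic_machine(free1, p, T_a, T_u)
        f2 = free2 + p / s
        if f1 <= f2:
            free1 = f1
        else:
            free2 = f2
    return max(free1, free2)

print(ls_makespan([3.0, 2.0, 2.0, 4.0, 1.0], s=1.5, T_a=5.0, T_u=2.0))
```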

Ming Liu, Chengbin Chu, Yinfeng Xu, Lu Wang
A New Smoothing Newton Method for Symmetric Cone Complementarity Problems

Based on a new smoothing function, a smoothing Newton-type method is proposed for the solution of symmetric cone complementarity problems (SCCP). The proposed algorithm solves only one linear system of equations and performs only one line search at each iteration. Moreover, it neither has restrictions on its starting point nor needs additional computation to keep the iteration sequence within a given neighborhood. Finally, global and Q-quadratic convergence is shown. Numerical results suggest that the method is effective.

Lixia Liu, Sanyang Liu
Approximation Algorithms for Scheduling with a Variable Machine Maintenance

In this paper, we investigate the problem of scheduling weighted jobs on a single machine with a maintenance whose starting time is prior to a given deadline and whose duration is a nondecreasing function of the starting time. We are asked to schedule not only the jobs but also the maintenance such that the total weighted job completion time is minimized. The problem is shown to be weakly NP-hard. In the case that the duration of the maintenance is a concave (and nondecreasing) function of its starting time, we provide two approximation algorithms with approximation ratios of 2 and at most $1+\sqrt{2}/2+\epsilon$, respectively.

Wenchang Luo, Lin Chen, Guochuan Zhang
Bounded Parallel-Batch Scheduling on Unrelated Parallel Machines

In this paper, we consider the bounded parallel-batch scheduling problem on unrelated parallel machines. The problems $Rm|B|F$ are NP-hard for any objective function F. For this reason, we discuss the special case with $p_{ij} = p_i$ for i = 1, 2, ⋯, m and j = 1, 2, ⋯, n. We give optimal algorithms for minimizing the total weighted completion time, the makespan, and the number of tardy jobs. And we design pseudo-polynomial time algorithms for the case with rejection penalty to minimize the makespan and the total weighted completion time plus the total penalty of the rejected jobs, respectively.

Cuixia Miao, Yuzhong Zhang, Chengfei Wang
Exact Algorithms for Coloring Graphs While Avoiding Monochromatic Cycles

We consider the problem of deciding whether a given directed graph can be vertex-partitioned into two acyclic subgraphs. Applications of this problem include testing rationality of collective consumption behavior, a subject in micro-economics. We identify classes of directed graphs for which the problem is easy and prove that the existence of a constant-factor approximation algorithm is unlikely for an optimization version which maximizes the number of vertices that can be colored using two colors while avoiding monochromatic cycles. We present three exact algorithms, namely an integer-programming algorithm based on cycle identification, a backtracking algorithm, and a branch-and-check algorithm. We compare these three algorithms both on real-life instances and on randomly generated graphs. We find that for the latter set of graphs, every algorithm solves instances of considerable size within a few seconds; however, the CPU time of the integer-programming algorithm increases with the number of vertices in the graph while that of the two other procedures does not. For every algorithm, we also study empirically the transition from a high to a low probability of a YES answer as a function of a parameter of the problem. For real-life instances, the integer-programming algorithm fails to solve the largest instance after one hour while the other two algorithms solve it in about ten minutes.
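
The feasibility check underlying this problem is simply that each color class must induce an acyclic directed subgraph; the sketch below verifies a candidate 2-coloring with a DFS cycle test. The graph and coloring are toy examples, and this is not one of the paper's three exact algorithms.

```python
# Check that a 2-coloring of a directed graph leaves both color classes acyclic.
def induces_acyclic(adj, vertices):
    """adj: dict v -> list of successors; test acyclicity of the subgraph induced by `vertices`."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {v: WHITE for v in vertices}
    def dfs(v):
        color[v] = GRAY
        for w in adj.get(v, []):
            if w not in color:
                continue                     # w lies outside the induced subgraph
            if color[w] == GRAY:             # back edge: cycle found
                return False
            if color[w] == WHITE and not dfs(w):
                return False
        color[v] = BLACK
        return True
    return all(dfs(v) for v in vertices if color[v] == WHITE)

def valid_two_coloring(adj, part0, part1):
    return induces_acyclic(adj, set(part0)) and induces_acyclic(adj, set(part1))

adj = {1: [2], 2: [3], 3: [1, 4], 4: []}         # toy digraph with the cycle 1 -> 2 -> 3 -> 1
print(valid_two_coloring(adj, {1, 2}, {3, 4}))   # True: the cycle is split across the two colors
```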

Fabrice Talla Nobibon, Cor Hurkens, Roel Leus, Frits C. R. Spieksma
Randomized Approaches for Nearest Neighbor Search in Metric Space When Computing the Pairwise Distance Is Extremely Expensive

Finding the closest object for a query in a database is a classical problem in computer science. For some modern biological applications, computing the similarity between two objects can be very time consuming; for example, it takes a long time to compute the edit distance between two whole chromosomes or the alignment cost of two 3D protein structures. In this paper, we study the nearest neighbor search problem in metric space, where the pairwise distances between objects in the database are known and we want to minimize the number of distances computed online between the query and objects in the database in order to find the closest object. We design two randomized approaches for indexing metric space databases, where objects are purely described by their distances to each other. Analysis and experiments show that our approaches only need to compute the distances to O(log n) objects in order to find the closest object, where n is the total number of objects in the database.
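
For background, the textbook-style sketch below shows a generic pivot-based nearest-neighbor search that reuses precomputed pairwise distances and the triangle inequality to prune candidates; it is in the same spirit as the setting above but is not the authors' randomized indexing schemes, and all names are ours.

```python
# Pivot-based pruning: |d(q,o) - d(o,x)| is a lower bound on d(q,x), so any x whose
# lower bound already exceeds the best distance found so far can be discarded.
import random

def nearest(query_dist, pairwise, objects, rng=random.Random(0)):
    """query_dist(o): expensive on-line distance d(q, o).
    pairwise[a][b]: precomputed distance between database objects a and b."""
    candidates = set(objects)
    best, best_d, computed = None, float("inf"), 0
    while candidates:
        o = rng.choice(list(candidates))        # pick a random remaining object as pivot
        d = query_dist(o)
        computed += 1
        candidates.discard(o)
        if d < best_d:
            best, best_d = o, d
        candidates = {x for x in candidates if abs(d - pairwise[o][x]) <= best_d}
    return best, best_d, computed
```

The counter `computed` records how many on-line distances were evaluated, which is exactly the quantity the paper's schemes bound by O(log n).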

Lusheng Wang, Yong Yang, Guohui Lin
A Primal-Dual Approximation Algorithm for the k-Level Stochastic Facility Location Problem

We present a combinatorial primal-dual 7-approximation algorithm for the k-level stochastic facility location problem, the stochastic counterpart of the standard k-level facility location problem. This approximation ratio is slightly worse than that of the primal-dual 6-approximation for the standard k-level facility location problem [3] because of the extra stochastic assumption. This new result complements the recent non-combinatorial 3-approximation algorithm for the same problem by Wang et al. [21].
Zhen Wang, Donglei Du, Dachuan Xu
Optimal Semi-online Scheduling Algorithms on Two Parallel Identical Machines under a Grade of Service Provision

This paper investigates semi-online scheduling problems on two parallel identical machines under a grade of service (GoS) provision. We consider two different semi-online versions, where the optimal offline value of the instance is known in advance or the largest processing time of all jobs is known in advance. Respectively for the two semi-online problems, we develop algorithms with competitive ratios of 3/2 and $(\sqrt{5}+1)/2$, which are shown to be optimal.

Yong Wu, Qifan Yang
Varieties of Regularities in Weighted Sequences

A weighted sequence is a string in which a set of characters may appear at each position with respective probabilities of occurrence. A common task is to identify repetitive motifs in weighted sequences with presence probability not less than a given threshold. We consider the problems of finding varieties of regularities in a weighted sequence. Based on the algorithms for computing all the repeats of every length using an iterative partitioning technique, we also tackle the all-covers problem and the all-seeds problem. Both problems can be solved in O(n²) time.

Hui Zhang, Qing Guo, Costas S. Iliopoulos
Online Uniformly Inserting Points on Grid

In this paper, we consider the problem of inserting points into a square grid, which has many background applications, including halftoning in reprographics and image processing. We consider an online version of this problem, i.e., the points are inserted one at a time. The objective is to distribute the points as uniformly as possible; precisely speaking, after each insertion the gap ratio should be as small as possible. In this paper, we give an insertion strategy with a maximal gap ratio of no more than $2\sqrt{2}\approx 2.828$, which is the first result on uniformly inserting points into a grid. Moreover, we show that no online algorithm can achieve a maximal gap ratio strictly less than 2.5 for a 3×3 grid.

Yong Zhang, Zhuo Chang, Francis Y. L. Chin, Hing-Fung Ting, Yung H. Tsin
Kernelization for Cycle Transversal Problems

We present new kernelization results for the s-cycle transversal problem for s > 3. In particular, we show a 6k² kernel for 4-cycle transversal and an $O(k^{s-1})$ kernel for s-cycle transversal when s > 4. We prove the NP-completeness of s-cycle transversal on planar graphs and obtain a 74k kernel for 4-cycle transversal on planar graphs. We also give several kernelization results for the related problem (≤ s)-cycle transversal.

Ge Xia, Yong Zhang
Online Splitting Interval Scheduling on m Identical Machines

This paper investigates online scheduling on m identical machines with splitting intervals, i.e., intervals can be split into pieces arbitrarily and processed simultaneously on different machines. The objective is to maximize the throughput, i.e., the total length of satisfied intervals. Intervals arrive over time, and knowledge of each becomes available upon its arrival. The decision on splitting and assignment for each interval is made irrevocably upon its arrival. We first show that no non-split online algorithm can have a bounded competitive ratio if the ratio of the longest to the shortest interval length is unbounded. Our main result is an online algorithm ES (for Equivalent Split) which has a competitive ratio of 2 and $\frac{2m-1}{m-1}$ for m = 2 and m ≥ 3, respectively. We further present a lower bound of $\frac{m}{m-1}$, implying that ES is optimal for m = 2.

Feifeng Zheng, Bo Liu, Yinfeng Xu, E. Zhang
Extended Tabu Search on Fuzzy Traveling Salesman Problem in Multi-criteria Analysis

The paper proposes an extended tabu search algorithm for the traveling salesman problem (TSP) with fuzzy edge weights. The algorithm considers three important fuzzy ranking criteria including expected value, optimistic value and pessimistic value, and performs a three-stage search towards the Pareto front, involving a preferred criterion at each stage. Simulations demonstrate that our approach can produce a set of near optimal solutions for fuzzy TSP instances with up to 750 uniformly randomly generated nodes.

Yujun Zheng
Efficient Exact and Approximate Algorithms for the Complement of Maximal Strip Recovery

Given two genomic maps G and H represented by a sequence of n gene markers, a strip (syntenic block) is a sequence of distinct markers of length at least two which appear as subsequences in the input maps, either directly or in reversed and negated form. The problem Maximal Strip Recovery (MSR) is to find two subsequences G′ and H′ of G and H, respectively, such that the total length of disjoint strips in G′ and H′ is maximized (conversely, the complement problem CMSR is to minimize the number of markers deleted to obtain a feasible solution). Recently, both MSR and its complement were shown to be NP-complete. A factor-4 approximation is known for the MSR problem, and an FPT algorithm is known for the CMSR problem which runs in $O(2^{3.61k}n + n^2)$ time (where k is the minimum number of markers deleted). We show in this paper that there is a factor-3 asymptotic approximation for CMSR and an FPT algorithm which runs in $O(3^k n + n^2)$ time for CMSR, significantly improving the previous bound.

Binhai Zhu
Backmatter
Metadata
Title
Algorithmic Aspects in Information and Management
Editor
Bo Chen
Copyright Year
2010
Publisher
Springer Berlin Heidelberg
Electronic ISBN
978-3-642-14355-7
Print ISBN
978-3-642-14354-0
DOI
https://doi.org/10.1007/978-3-642-14355-7
