Article

A Combinatorial 2-Approximation Algorithm for the Parallel-Machine Scheduling with Release Times and Submodular Penalties

1 School of Mathematics and Statistics, Yunnan University, Kunming 650500, China
2 School of Information Science and Engineering, Yunnan University, Kunming 650500, China
3 School of Electronic Engineering and Computer Science, Peking University, Beijing 100871, China
* Authors to whom correspondence should be addressed.
† These authors contributed equally to this work.
Mathematics 2022, 10(1), 61; https://doi.org/10.3390/math10010061
Submission received: 16 November 2021 / Revised: 11 December 2021 / Accepted: 13 December 2021 / Published: 25 December 2021

Abstract

In this paper, we consider parallel-machine scheduling with release times and submodular penalties (P|r_j, reject|C_max + π(R)), in which each job can either be accepted and processed on one of m identical parallel machines or rejected, but a penalty must be paid if a job is rejected. Each job has a release time and a processing time, and no job can be processed before its release time. The objective of P|r_j, reject|C_max + π(R) is to minimize the makespan of the accepted jobs plus the penalty of the rejected jobs, where the penalty is determined by a submodular function. This problem generalizes the multiprocessor scheduling problem with rejection, parallel-machine scheduling with submodular penalties, and the single machine scheduling problem with release dates and submodular rejection penalties. Inspired by the primal-dual method, we present a combinatorial 2-approximation algorithm for P|r_j, reject|C_max + π(R). This ratio coincides with the best known ratio for parallel-machine scheduling with submodular penalties and for the single machine scheduling problem with release dates and submodular rejection penalties.

1. Introduction

All jobs must be accepted and processed in classical scheduling problems [1,2,3,4]. However, to gain more profit, we can reject jobs that have large processing times and yield small profits. Bartal et al. [5] first addressed the multiprocessor scheduling problem with rejection (MSR), in which jobs can be rejected and a penalty must be paid for each rejected job. The objective is to minimize the makespan of the accepted jobs plus the total penalty of the rejected jobs. For the MSR, Bartal et al. [5] proposed a 2-approximation algorithm running in O(n log n) time and a polynomial-time approximation scheme (PTAS). Later, Ou et al. [6] gave an improved (3/2 + ε)-approximation algorithm running in O(n log n + n/ε) time, where ε > 0 can be any small given constant.
Variants of the MSR have been studied extensively [7,8,9]. Zhang et al. [7] considered single machine scheduling with release dates and rejection, where jobs cannot be processed before their release dates and the objective is again to minimize the makespan of the accepted jobs plus the total penalty of the rejected jobs. They proved that this problem is NP-hard, and presented a 2-approximation algorithm and a fully polynomial-time approximation scheme (FPTAS). Zhong et al. [10] considered scheduling on two parallel machines with release dates and rejection, and presented a (3/2 + ε)-approximation algorithm with time complexity O((n/ε)²), where ε is any given small positive constant. Zhang and Lu [8] considered parallel-machine scheduling with release dates and rejection and presented a 2-approximation algorithm; in particular, when m is a fixed constant, they designed an FPTAS.
A set function f(·) on J is a mapping from the subsets of J to the real numbers, i.e., f(·): 2^J → R. A set function f(·) is submodular if it satisfies f(X ∪ Y) + f(X ∩ Y) ≤ f(X) + f(Y) for all X, Y ⊆ J; this captures the property of decreasing marginal returns. Recently, submodular functions have played a key role in combinatorial optimization [9,11,12,13]. Liu and Li [14] considered parallel-machine scheduling with submodular penalties and proposed a (2 − 1/m)-approximation algorithm based on the greedy method and the list scheduling algorithm. Zhang et al. [15] considered precedence-constrained scheduling with submodular rejection on parallel machines and proposed a 3-approximation algorithm. Based on the primal-dual method, Liu and Li [16] presented a 2-approximation algorithm for single machine scheduling with release dates and submodular rejection penalties. More related results can be found in the surveys [17,18,19,20,21,22,23].
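To make the submodularity definition above concrete, the following self-contained sketch (illustrative names, not from the paper) exhaustively checks the inequality f(X ∪ Y) + f(X ∩ Y) ≤ f(X) + f(Y) on a tiny ground set, using a coverage function as the classic example of decreasing marginal returns:

```python
from itertools import combinations

def is_submodular(f, ground):
    """Exhaustively check f(X | Y) + f(X & Y) <= f(X) + f(Y) for all X, Y."""
    subsets = [frozenset(c) for r in range(len(ground) + 1)
               for c in combinations(ground, r)]
    return all(f(X | Y) + f(X & Y) <= f(X) + f(Y)
               for X in subsets for Y in subsets)

# A coverage function (size of the union of the covered elements) is a
# standard submodular function: each added set covers fewer new elements.
cover = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c"}}
f = lambda S: len(set().union(*(cover[j] for j in S))) if S else 0
print(is_submodular(f, {1, 2, 3}))                   # True
print(is_submodular(lambda S: len(S) ** 2, {1, 2}))  # False: |S|^2 is supermodular
```

The brute-force check is exponential in |ground|, so it is only a sanity tool, but it mirrors the definition exactly.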
Motivated by the optimization problems mentioned above, we consider parallel-machine scheduling with release times and submodular penalties (P|r_j, reject|C_max + π(R)), which is defined as follows.
We are given a set J = {J_1, J_2, ..., J_n} of n jobs and a set M = {M_1, M_2, ..., M_m} of m parallel machines. Each job J_j ∈ J has a processing time p_j ≥ 0 and a release time r_j ≥ 0, and a job can be processed only at or after its release time; without loss of generality, we assume min_{J_j ∈ J} r_j = 0. The penalty function π(·): 2^J → R_{≥0} is submodular and, without loss of generality, satisfies π(∅) = 0. The problem P|r_j, reject|C_max + π(R) is to find a rejected set R ⊆ J that minimizes the makespan of the accepted jobs J \ R plus the penalty π(R).
Clearly, if r_j = 0 for all J_j ∈ J, the P|r_j, reject|C_max + π(R) problem is exactly the parallel-machine scheduling with submodular penalties considered in [14]; if the rejection cost function is linear, it is exactly the parallel-machine scheduling with penalties considered in [9]; and if m = 1, it is exactly the single machine scheduling problem with release dates and submodular rejection penalty considered in [16].
A difficulty in applying the algorithm of [9] to the P|r_j, reject|C_max + π(R) problem is that the jobs have different release times, so the jobs cannot be processed immediately in the given order. To overcome this difficulty, using a traversal (enumeration) method, we determine the set of jobs with a designated release time and unify the release times of the other jobs. We then present a combinatorial 2-approximation algorithm for P|r_j, reject|C_max + π(R). This ratio coincides with the best known ratio for parallel-machine scheduling with submodular penalties and for the single machine scheduling problem with release dates and submodular rejection penalties.
The remainder of this paper is organized as follows. In Section 2, we present some terminology and fundamental lemmas. In Section 3, we provide the 2-approximation algorithm for P|r_j, reject|C_max + π(R). In Section 4, we present our conclusions.

2. Terminologies and Key Lemmas

Zhang and Lu [8] showed that the P|r_j|C_max problem can be solved by the earliest release date (ERD) rule: whenever some machine is idle and some job is available, process the unscheduled available job with the earliest release date. Thus, we have the following lemma.
Lemma 1.
For P|r_j, reject|C_max + π(R), there exists an optimal schedule in which the accepted jobs are processed by the ERD-rule on each machine.
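The ERD-rule itself is straightforward to implement. The sketch below (the function name `erd_makespan` is ours, not the paper's) schedules (release, processing) pairs on m identical machines in ERD order and returns the makespan:

```python
import heapq

def erd_makespan(jobs, m):
    """Makespan of the ERD-rule schedule: take jobs in nondecreasing order
    of release time, always assigning the next job to the machine that
    frees up first, waiting for the job's release if necessary."""
    free = [0] * m                           # min-heap of machine free times
    heapq.heapify(free)
    cmax = 0
    for r, p in sorted(jobs):                # ERD order
        start = max(heapq.heappop(free), r)  # cannot start before release
        cmax = max(cmax, start + p)
        heapq.heappush(free, start + p)
    return cmax

# The four-job, two-machine instance used later in Section 3: (r_j, p_j) pairs.
print(erd_makespan([(0, 2), (2, 1), (2, 2), (4, 0)], 2))  # 4
```

This greedy assignment to the earliest-free machine is one natural reading of the rule; it suffices for the small examples in this paper.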
For convenience, for any j ∈ {1, 2, ..., n}, write u_j = r_j + p_j and let
U_j = {J_i ∈ J | u_i > u_j}
be the set of jobs whose release time plus processing time exceeds u_j. Correspondingly, let B_j be a set of jobs such that
U_j ⊆ B_j and π(B_j) is minimized.
Then, the following lemma is obtained.
Lemma 2.
For any j { 1 , 2 , , n } , B j can be computed in polynomial time.
Proof. 
For each j ∈ {1, 2, ..., n}, the set U_j can be found in polynomial time. Obviously, if π(·) is monotonically nondecreasing, then B_j = U_j and the lemma holds.
Otherwise, we construct an auxiliary set function w(·) on the subsets of J \ U_j as follows:
w(X) = π(X ∪ U_j) − π(U_j), for all X ⊆ J \ U_j.
By the submodularity of π(·), for any two subsets X_1, X_2 ⊆ J \ U_j, we have
w(X_1) + w(X_2) = π(X_1 ∪ U_j) − π(U_j) + π(X_2 ∪ U_j) − π(U_j)
≥ π((X_1 ∪ U_j) ∪ (X_2 ∪ U_j)) + π((X_1 ∪ U_j) ∩ (X_2 ∪ U_j)) − 2π(U_j)
= π((X_1 ∪ X_2) ∪ U_j) − π(U_j) + π((X_1 ∩ X_2) ∪ U_j) − π(U_j)
= w(X_1 ∪ X_2) + w(X_1 ∩ X_2).
This implies that w(·) is a submodular function. Thus, X* = argmin_{X ⊆ J \ U_j} w(X) can be computed in polynomial time using the method in [24]. Therefore, for any X ⊆ J \ U_j, we have
w(X*) ≤ w(X), i.e., π(X* ∪ U_j) − π(U_j) ≤ π(X ∪ U_j) − π(U_j), i.e., π(B_j) ≤ π(X ∪ U_j),
where B_j = X* ∪ U_j. Thus, this lemma holds.    □
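For small instances, B_j can also be found by brute force, which is useful for sanity-checking an implementation of the polynomial-time route above (the helper below is an illustrative sketch; the names are ours):

```python
from itertools import combinations

def min_penalty_superset(pi, jobs, U):
    """Brute-force B_j: among all sets B with U as a subset of B and B a
    subset of jobs, return one minimizing pi(B). (The lemma obtains this in
    polynomial time by minimizing the submodular w(X) = pi(X | U) - pi(U).)"""
    rest = sorted(jobs - U)
    best = None
    for r in range(len(rest) + 1):
        for extra in combinations(rest, r):
            cand = U | frozenset(extra)
            if best is None or pi(cand) < pi(best):
                best = cand
    return best

# With a nondecreasing penalty, the cheapest superset of U is U itself.
pi = lambda S: [0, 6, 9, 11, 12][len(S)]   # cardinality-based penalty of Section 3
print(min_penalty_superset(pi, frozenset({1, 2, 3, 4}), frozenset({1, 2})))
# frozenset({1, 2})
```

The brute force is exponential in |J \ U_j| and therefore only a testing aid, not a substitute for submodular minimization.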
Let σ* be an optimal schedule and let R* be the rejected job set of σ*. Write A* = J \ R*, u_{j*} = max{u_j | J_j ∈ A*}, and Z* = C_max(σ*) + π(R*), where C_max(σ*) is the makespan of σ*. Then, we have the following.
Lemma 3.
There exists an optimal schedule σ* that satisfies B_{j*} ⊆ R*.
Proof. 
Let u_{j*} = max{u_j | J_j ∈ J \ R*} and U_{j*} = {J_i ∈ J | u_i > u_{j*}}; then we have
U_{j*} ⊆ R*.
By Lemma 2, B_{j*} can be found in polynomial time, where B_{j*} is a set of minimum penalty satisfying U_{j*} ⊆ B_{j*}. Since U_{j*} ⊆ R* ∩ B_{j*}, this implies that
π(B_{j*}) ≤ π(R* ∩ B_{j*}).
Since π(·) is a submodular function, we have π(R*) + π(B_{j*}) ≥ π(R* ∪ B_{j*}) + π(R* ∩ B_{j*}), i.e.,
π(R*) ≥ π(R* ∪ B_{j*}).
Now, assuming B_{j*} \ R* ≠ ∅, we prove that there exists an optimal schedule σ in which all the jobs in R* ∪ B_{j*} are rejected.
Because the processing time of any job is nonnegative, we can schedule all the jobs in A* \ B_{j*} as in σ*. This implies that the makespan C_max(σ) of the jobs in A* \ B_{j*} is no more than C_max(σ*). Thus, we have
Z* = C_max(σ*) + π(R*) ≥ C_max(σ) + π(R* ∪ B_{j*}).
Therefore, σ is an optimal schedule and this lemma holds.   □

3. Approximation Algorithm

In this section, we consider the problem P|r_j, reject|C_max + π(R) and propose a 2-approximation algorithm.
For each j ∈ {1, 2, ..., n}, we introduce an auxiliary variable α_j, which plays the role of a dual variable in the primal-dual method.
Lemma 4.
Algorithm 1 can be implemented in polynomial time.
Algorithm 1: A
(The pseudocode of Algorithm 1 is presented as a figure in the published article and is not reproduced here.)
Proof. 
Clearly, by Lemma 2, B_k can be found in polynomial time for any k ∈ {1, 2, ..., n}. Then, we consider the implementation of the while loop of Algorithm 1. Let α_j(t) and F(t) be the value of the dual variable of job J_j and the set of frozen jobs after the t-th execution of the while loop of Algorithm 1, respectively. For convenience, we define α_j(0) = 0 for all J_j ∈ J, F(0) = B_k, and R_k(0) = B_k. Note that the while loop of Algorithm 1 executes at most n times for any k ∈ {1, 2, ..., n}.
For the t-th (t ≥ 1) execution of the while loop of Algorithm 1, it is obvious that Δ_1(t) = min_{J_j ∈ J \ F(t−1)} {p_j/m} can be found in polynomial time.
Write
Δ_2(t) = min_{S ⊆ J: S \ F(t−1) ≠ ∅} [π(S ∪ B_k) − π(B_k) − Σ_{J_j ∈ S ∩ F(t−1)} α_j(t−1)] / |S \ F(t−1)|
= min_{S ⊆ J: S \ F(t−1) ≠ ∅} [w(S) + α(S)] / k(S),
where we define w(S) = π(S ∪ B_k) − π(B_k), α(S) = −Σ_{J_j ∈ S ∩ F(t−1)} α_j(t−1), and k(S) = |S \ F(t−1)| for any subset S ⊆ J. Similar to the proof of Lemma 2, w(·) is a submodular function. In particular, w(·) + α(·) is submodular because α(·) and k(·) are linear (modular) functions. Then, using the combinatorial algorithm for the ratio-of-submodular-functions minimization problem considered in [25], the value of Δ_2(t) can be found in polynomial time. Thus, S(t) = argmin_{S ⊆ J: S \ F(t−1) ≠ ∅} [π(S ∪ B_k) − π(B_k) − Σ_{J_j ∈ S ∩ F(t−1)} α_j(t−1)] / |S \ F(t−1)| can also be found in polynomial time.
Therefore, the lemma holds. □
For any k ∈ {1, 2, ..., n}, let σ_k be the feasible schedule produced by Algorithm 1 and let R_k be the rejected set of σ_k. In addition, let α_j be the value of the dual variable when job J_j is frozen. Since α_j = 0 for all J_j ∈ B_k and every other α_j (J_j ∈ J \ B_k) is set by the value min{Δ_1, Δ_2}, we have the following results:
α_j = p_j/m for each job J_j ∈ J \ R_k; α_j ≤ p_j/m for each job J_j ∈ J.
Additionally, suppose that J_j is frozen at the t̄-th execution of the while loop of Algorithm 1. Then, during the t-th execution of the while loop, we have
α_j(t) = 0 if t < t̄, and α_j(t) = α_j if t ≥ t̄.
Moreover, we have the following results.
Lemma 5.
For any k ∈ {1, 2, ..., n} and any S ⊆ J, we have
Σ_{J_j ∈ S} α_j + π(B_k) ≤ π(S ∪ B_k).
Proof. 
For any k ∈ {1, 2, ..., n}, we consider the t-th execution of the while loop of Algorithm 1. Suppose that the total number of executions of the while loop is T_k, i.e., F(T_k) = J.
For any t ∈ {1, 2, ..., T_k}, assume that J_j is added to F(t); then we have
α_j(t) = min{Δ_1(t), Δ_2(t)} ≤ Δ_2(t),
and, for each subset S ⊆ J with S \ F(t−1) ≠ ∅, we have
Σ_{J_j ∈ S} α_j(t) + π(B_k)
= Σ_{J_j ∈ S \ F(t−1)} α_j(t) + Σ_{J_j ∈ S ∩ F(t−1)} α_j(t) + π(B_k)
≤ Σ_{J_j ∈ S \ F(t−1)} Δ_2(t) + Σ_{J_j ∈ S ∩ F(t−1)} α_j(t) + π(B_k)
= Δ_2(t)·|S \ F(t−1)| + Σ_{J_j ∈ S ∩ F(t−1)} α_j(t) + π(B_k)
≤ [π(S ∪ B_k) − π(B_k) − Σ_{J_j ∈ S ∩ F(t−1)} α_j(t)] + Σ_{J_j ∈ S ∩ F(t−1)} α_j(t) + π(B_k)
= π(S ∪ B_k),
where the first inequality holds because α_j(t) = 0 ≤ Δ_2(t) for J_j ∈ S \ F(t) and α_j(t) ≤ Δ_2(t) for J_j ∈ F(t) \ F(t−1), and the second inequality follows from the definition of Δ_2(t). We reach the conclusion of this lemma. □
Lemma 6.
For any k ∈ {1, 2, ..., n}, the rejected job set R_k satisfies
π(R_k) = Σ_{J_j ∈ R_k} α_j + π(B_k).
Proof. 
For any k ∈ {1, 2, ..., n}, consider the t′-th and t″-th (t′ ≤ t″) executions of the while loop of Algorithm 1, and assume that Δ_2(t′) < Δ_1(t′) and Δ_2(t″) < Δ_1(t″). Let S(t′) and S(t″) be the job sets selected at iterations t′ and t″, respectively. Thus, we have
Σ_{J_j ∈ S(t′)} α_j(t′) + π(B_k)
= Σ_{J_j ∈ S(t′) \ F(t′−1)} α_j(t′) + Σ_{J_j ∈ S(t′) ∩ F(t′−1)} α_j(t′) + π(B_k)
= Σ_{J_j ∈ S(t′) \ F(t′−1)} Δ_2(t′) + Σ_{J_j ∈ S(t′) ∩ F(t′−1)} α_j(t′) + π(B_k)
= π(S(t′) ∪ B_k),
where the last equality follows by substituting the definition of Δ_2(t′), in which π(B_k) is a constant because B_k is fixed by Algorithm 1. Similarly, we have
Σ_{J_j ∈ S(t″)} α_j(t″) + π(B_k) = π(S(t″) ∪ B_k).
For any job J_j ∈ S(t′), since J_j is frozen at iteration t′ (or even earlier), we have α_j(t″) = α_j(t′), and
Σ_{J_j ∈ S(t′) ∪ S(t″)} α_j(t″) + π(B_k) + Σ_{J_j ∈ S(t′) ∩ S(t″)} α_j(t″) + π(B_k)
= Σ_{J_j ∈ S(t′)} α_j(t″) + π(B_k) + Σ_{J_j ∈ S(t″)} α_j(t″) + π(B_k)
= π(S(t′) ∪ B_k) + π(S(t″) ∪ B_k)
≥ π((S(t′) ∪ S(t″)) ∪ B_k) + π((S(t′) ∩ S(t″)) ∪ B_k)
≥ π((S(t′) ∪ S(t″)) ∪ B_k) + Σ_{J_j ∈ S(t′) ∩ S(t″)} α_j(t″) + π(B_k),
where the first inequality comes from the submodularity of π(·), and the second inequality follows from Lemma 5.
This implies that
Σ_{J_j ∈ S(t′) ∪ S(t″)} α_j(t″) + π(B_k) ≥ π((S(t′) ∪ S(t″)) ∪ B_k).
Since every job in S(t′) ∪ S(t″) is frozen by iteration t″, we have α_j(t″) = α_j for these jobs, and together with the reverse inequality from Lemma 5 this yields
Σ_{J_j ∈ S(t′) ∪ S(t″)} α_j + π(B_k) = π((S(t′) ∪ S(t″)) ∪ B_k).
Therefore, we have π(R_k) = Σ_{J_j ∈ R_k} α_j + π(B_k), because R_k is obtained by merging all the subsets selected in this way, and the lemma holds. □
Let σ be the schedule obtained from Algorithm 1, and let Z and Z* be the objective values of the schedule σ and the optimal schedule σ*, respectively. Then, we have the following.
Theorem 1.
Z ≤ 2Z*, and this bound is tight.
Proof. 
Let C_max(σ*) be the makespan of σ* and let R* be the rejected job set of σ*; then the objective value of σ* is
Z* = C_max(σ*) + π(R*).
Let u_{j*} = max{u_j | J_j ∈ J \ R*}. By an averaging (pigeonhole) argument, we have
C_max(σ*) ≥ Σ_{J_j ∈ A*} p_j / m,
where A* = J \ R*. By Lemma 3, without loss of generality, we assume that
B_{j*} ⊆ R*.
Consider the iteration k = j* of Algorithm 1. Let σ_{j*} and C_max(σ_{j*}) denote the schedule output in this iteration and its makespan, let R_{j*} be the rejected job set of σ_{j*}, and let {α_j}_{J_j ∈ J} be the values of the dual variables when all jobs in J are frozen. Then, we have
Z* = C_max(σ*) + π(R*)
≥ Σ_{J_j ∈ A*} p_j/m + π(R*)
= Σ_{J_j ∈ A*} p_j/m + π(R* ∪ B_{j*})
≥ Σ_{J_j ∈ A*} p_j/m + Σ_{J_j ∈ R*} α_j + π(B_{j*})
= Σ_{J_j ∈ A* ∩ A_{j*}} p_j/m + Σ_{J_j ∈ A* ∩ R_{j*}} p_j/m + Σ_{J_j ∈ R* ∩ A_{j*}} α_j + Σ_{J_j ∈ R* ∩ R_{j*}} α_j + π(B_{j*})
≥ Σ_{J_j ∈ A* ∩ A_{j*}} p_j/m + Σ_{J_j ∈ A* ∩ R_{j*}} α_j + Σ_{J_j ∈ R* ∩ A_{j*}} p_j/m + Σ_{J_j ∈ R* ∩ R_{j*}} α_j + π(B_{j*})
= Σ_{J_j ∈ A_{j*}} p_j/m + π(R_{j*}),
where A_{j*} = J \ R_{j*}; the first inequality follows from the lower bound on C_max(σ*) above, the equality π(R*) = π(R* ∪ B_{j*}) holds because B_{j*} ⊆ R*, the second inequality follows from Lemma 5, the third inequality holds because α_j ≤ p_j/m for J_j ∈ A* ∩ R_{j*} and α_j = p_j/m for J_j ∈ R* ∩ A_{j*}, and the last equality follows from Lemma 6.
Let J_q ∈ A_{j*} be the job that completes last, i.e., the completion time of J_q equals C_max(σ_{j*}), and let s_q be the starting time of J_q in σ_{j*}. Thus, we have s_q ≥ r_q and C_max(σ_{j*}) = s_q + p_q.
By the ERD-rule, if s_q > r_q, then all machines are busy in the time interval [r_q, s_q). Then, we have
m·(s_q − r_q) ≤ Σ_{J_j ∈ A_{j*}} p_j, i.e., s_q − r_q ≤ Σ_{J_j ∈ A_{j*}} p_j / m.
Otherwise, s_q = r_q, and s_q − r_q = 0 ≤ Σ_{J_j ∈ A_{j*}} p_j / m because p_j ≥ 0 for each J_j ∈ J. Therefore, in both cases, we have
s_q − r_q ≤ Σ_{J_j ∈ A_{j*}} p_j / m.
Furthermore, we have r_q + p_q ≤ u_{j*} ≤ C_max(σ*), because J_q ∈ A_{j*} and by the definition of u_{j*}. Combining these bounds, we obtain
Z ≤ Z_{j*} = C_max(σ_{j*}) + π(R_{j*}) = s_q + p_q + π(R_{j*}) = r_q + p_q + (s_q − r_q) + π(R_{j*}) ≤ C_max(σ*) + Σ_{J_j ∈ A_{j*}} p_j/m + π(R_{j*}) ≤ 2Z*.
To show that the bound is tight, we present an instance with four jobs and two parallel machines:
J_1 = (r_1, p_1) = (0, 2); J_2 = (r_2, p_2) = (2, 1); J_3 = (r_3, p_3) = (2, 2); J_4 = (r_4, p_4) = (4, 0),
and the submodular penalty function is
π(S) = 0 if |S| = 0; 6 if |S| = 1; 9 if |S| = 2; 11 if |S| = 3; 12 if |S| = 4 (i.e., S = J).
By Algorithm 1, when k = 1, the resulting schedule σ_1 rejects all the jobs, and Z_1 = π(J) = 12. When k = 2 or k = 3, both resulting schedules σ_2 and σ_3 reject J_4, process J_1 and J_2 on machine M_1, and process J_3 on machine M_2, and Z_2 = Z_3 = r_2 + C_max(σ_2) + π({J_4}) = 2 + 4 + 6 = 12 (and likewise with r_3 and σ_3). When k = 4, the resulting schedule σ_4 processes J_1, J_2, and J_4 on machine M_1 and J_3 on machine M_2, and Z_4 = r_4 + C_max(σ_4) = 4 + 4 = 8. The optimal schedule processes J_1, J_2, and J_4 on machine M_1 and J_3 on machine M_2, and Z* = 4. Thus, we have Z = min{Z_1, Z_2, Z_3, Z_4} = 8 = 2Z*.
Hence, we reach the conclusion of this theorem. □
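The optimal value Z* = 4 claimed for the tight instance can be confirmed by brute force over all rejected sets, scheduling each accepted set by the ERD-rule discussed in Section 2 (a verification sketch with illustrative names, not part of the paper):

```python
from itertools import combinations
import heapq

jobs = [(0, 2), (2, 1), (2, 2), (4, 0)]   # (r_j, p_j) for J_1..J_4
pi = [0, 6, 9, 11, 12]                    # penalty depends only on |S| here

def erd_makespan(accepted, m=2):
    """Makespan of the ERD-rule schedule of the accepted job indices."""
    free = [0] * m
    heapq.heapify(free)
    cmax = 0
    for r, p in sorted(jobs[j] for j in accepted):
        start = max(heapq.heappop(free), r)
        cmax = max(cmax, start + p)
        heapq.heappush(free, start + p)
    return cmax

# Enumerate every rejected set R and take the best makespan-plus-penalty.
z_star = min(erd_makespan(set(range(4)) - set(R)) + pi[len(R)]
             for k in range(5) for R in combinations(range(4), k))
print(z_star)  # 4: reject nothing; C_max = 4 and the penalty is 0
```

Rejecting any job already costs at least 6, so accepting all four jobs (makespan 4) is optimal, matching Z* = 4 in the proof.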

4. Conclusions

In this paper, we investigated parallel-machine scheduling with release times and submodular penalties (P|r_j, reject|C_max + π(R)), which generalizes parallel-machine scheduling with release times and rejection penalties and single machine scheduling with release dates and submodular penalties. For P|r_j, reject|C_max + π(R), we proposed a 2-approximation algorithm.
For parallel-machine scheduling with release times and rejection penalties, there exists a PTAS. For P|r_j, reject|C_max + π(R), whether a PTAS or an otherwise improved algorithm can be designed remains an open and interesting direction for future work.
The vector scheduling problem [19,22,23] is a generalization of parallel-machine scheduling in which each job J_j is associated with a d-dimensional vector. Thus, vector parallel-machine scheduling with release times and rejection penalties, which can be viewed as one generalization of P|r_j, reject|C_max + π(R), deserves to be explored; designing a 2-approximation algorithm for it may be possible but remains a challenge.
In [26], Liu et al. considered the k-prize-collecting cover problem, in which at least k points must be covered. The k-prize-collecting scheduling problem with release times and rejection penalties, which can be viewed as another generalization of P|r_j, reject|C_max + π(R), also deserves to be explored.

Author Contributions

Conceptualization, W.W. and X.L.; methodology, X.L.; validation, W.W.; formal analysis, W.W.; investigation, X.L.; resources, X.L.; writing—original draft preparation, W.W.; writing—review and editing, X.L.; visualization, W.W.; supervision, X.L.; project administration, X.L. All authors have read and agreed to the published version of the manuscript.

Funding

The work is supported in part by the Project of Yunnan Provincial Department of Education Science Research Fund (NO. 2019Y0022).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Graham, R.L. Bounds on multiprocessing timing anomalies. SIAM J. Appl. Math. 1969, 17, 416–429. [Google Scholar] [CrossRef] [Green Version]
  2. Davis, E.; Jaffe, J.M. Algorithms for scheduling tasks on unrelated processors. J. ACM 1981, 28, 721–736. [Google Scholar] [CrossRef]
  3. Hochbaum, D.S.; Shmoys, D.B. Using dual approximation algorithms for scheduling problems theoretical and practical results. J. ACM 1987, 34, 144–162. [Google Scholar] [CrossRef]
  4. Hochbaum, D.S.; Shmoys, D.B. Approximation algorithms for scheduling unrelated parallel machines. Math. Program. 1990, 46, 259–271. [Google Scholar]
  5. Bartal, Y.; Leonardi, S.; Marchetti-Spaccamela, A.; Sgall, J.; Stougie, L. Multiprocessor scheduling with rejection. SIAM J. Discret. Math. 2000, 13, 64–78. [Google Scholar] [CrossRef]
  6. Ou, J.; Zhong, X.; Wang, G. An improved heuristic for parallel machine scheduling with rejection. Eur. J. Oper. Res. 2015, 241, 653–661. [Google Scholar] [CrossRef]
  7. Zhang, L.; Lu, L.; Yuan, J. Single machine scheduling with release dates and rejection. Eur. J. Oper. Res. 2009, 198, 975–978. [Google Scholar] [CrossRef]
  8. Zhang, L.; Lu, L. Parallel-machine scheduling with release dates and rejection. 4OR-A Q. J. Oper. Res. 2016, 14, 387–406. [Google Scholar] [CrossRef]
  9. Xu, D.; Wang, F.; Du, D.; Wu, C. Approximation algorithms for submodular vertex cover problems with linear/submodular penalties using primal-dual technique. Theor. Comput. Sci. 2016, 630, 117–125. [Google Scholar] [CrossRef]
  10. Zhong, X.; Pan, Z.; Jiang, D. Parallel-machine scheduling with release dates and rejection. J. Comb. Optim. 2016, 33, 934–944. [Google Scholar] [CrossRef]
  11. Fujishige, S. Submodular Functions and Optimization, 2nd ed.; Elsevier: Amsterdam, The Netherlands, 2005. [Google Scholar]
  12. Edmonds, J. Submodular functions, matroids, and certain polyhedra. In Combinatorial Optimization; Jünger, M., Reinelt, G., Rinaldi, G., Eds.; Springer: Berlin/Heidelberg, Germany, 2003. [Google Scholar]
  13. Du, D.; Lu, R.; Xu, D. A Primal-dual approximation algorithm for the facility location problem with submodular penalties. Algorithmica 2012, 63, 191–200. [Google Scholar] [CrossRef]
  14. Liu, X.; Li, W. Approximation algorithms for the submodular load balancing with submodular penalties. Optim. Lett. 2021, 15, 2165–2180. [Google Scholar] [CrossRef]
  15. Zhang, X.; Xu, D.; Du, D.; Wu, C. Approximation algorithms for precedence-constrained identical machine scheduling with rejection. J. Comb. Optim. 2018, 35, 318–330. [Google Scholar] [CrossRef]
  16. Liu, X.; Li, W. Approximation algorithm for the single machine scheduling problem with release dates and submodular rejection penalty. Mathematics 2020, 8, 133. [Google Scholar] [CrossRef] [Green Version]
  17. Liu, X.; Li, W. Combinatorial approximation algorithms for the submodular multicut problem in trees with submodular penalties. J. Comb. Optim. 2020. [Google Scholar] [CrossRef]
  18. Liu, X.; Xing, P.; Li, W. Approximation algorithms for the submodular load balancing with submodular penalties. Mathematics 2020, 8, 1785. [Google Scholar] [CrossRef]
  19. Liu, X.; Li, W.; Zhu, Y. Single machine vector scheduling with general penalties. Mathematics 2021, 9, 1965. [Google Scholar] [CrossRef]
  20. Guan, L.; Li, W.; Xiao, M. Online algorithms for the mixed ring loading problem with two nodes. Optim. Lett. 2021, 15, 1229–1239. [Google Scholar] [CrossRef]
  21. Li, W.; Li, J.; Zhang, X.; Chen, Z. Penalty cost constrained identical parallel machine scheduling problem. Theor. Comput. Sci. 2015, 607, 181–192. [Google Scholar] [CrossRef]
  22. Li, W.; Cui, Q. Vector scheduling with rejection on a single machine. 4OR-A Q. J. Oper. Res. 2018, 16, 95–104. [Google Scholar] [CrossRef]
  23. Dai, B.; Li, W. Vector scheduling with rejection on two machines. Int. J. Comput. Math. 2020, 97, 2507–2515. [Google Scholar] [CrossRef]
  24. Iwata, S.; Fleischer, L.; Fujishige, S. A combinatorial strongly polynomial algorithm for minimizing submodular functions. J. ACM 2001, 48, 761–777. [Google Scholar] [CrossRef]
  25. Fleischer, L.; Iwata, S. A push-relabel framework for submodular function minimization and applications to parametric optimization. Discret. Appl. Math. 2003, 131, 311–322. [Google Scholar] [CrossRef] [Green Version]
  26. Liu, X.; Li, W.; Xie, R. A primal-dual approximation algorithm for the k-prize-collecting minimum power cover problem. Optim. Lett. 2021. [Google Scholar] [CrossRef]