
A contribution to the stochastic flow shop scheduling problem

Michel Gourgand, Nathalie Grangeon *, Sylvie Norre

Laboratoire d'Informatique, de Modélisation et d'Optimisation de Systèmes, Université Blaise Pascal, Clermont-Ferrand II,
LIMOS CNRS UMR 6158, BP 10125, 63173 Aubière Cedex, France

* Corresponding author. E-mail addresses: gourgand@isima.fr (M. Gourgand), grangeon@iris.univ-bpclermont.fr (N. Grangeon), norre@moniut.univ-bpclermont.fr (S. Norre).
Abstract

This paper deals with performance evaluation and scheduling problems in an m machine stochastic flow shop with unlimited buffers. The processing time of each job on each machine is a random variable exponentially distributed with a known rate. We consider permutation flow shops. The objective is to find a job schedule which minimizes the expected makespan. A classification of works about stochastic flow shops with random processing times is first given. In order to solve the performance evaluation problem, we propose a recursive algorithm based on a Markov chain to compute the expected makespan and a discrete event simulation model to evaluate the expected makespan. The recursive algorithm is a generalization of a method proposed in the literature for the two machine flow shop problem to the m machine flow shop problem with unlimited buffers. In the deterministic context, heuristics (like CDS [Management Science 16 (10) (1970) B630] and Rapid Access [Management Science 23 (11) (1977) 1174]) and metaheuristics (like simulated annealing) provide good results. We propose to adapt and to test this kind of methods for the stochastic scheduling problem. Combinations between heuristics or metaheuristics and the performance evaluation models are proposed. One of the objectives of this paper is to compare the methods with each other. Our methods are tested on problems from the OR-Library and give good results: for the two machine problems, we obtain the optimal solution, and for the m machine problems, the methods are mutually validated.
© 2003 Elsevier B.V. All rights reserved.

Keywords: Scheduling; Stochastic flow shop; Metaheuristics; Markov processes; Simulation
1. Introduction

Industrial systems are subject to random events which may disturb their working process: machine failure, operator unavailability, out-of-stock condition, change in availability date and latest completion time [28]. So, considering a system in a stochastic context is more realistic than in a deterministic one: an optimal solution obtained without taking random events into account may present no interest in a stochastic context. In this case, the objective is to find a solution which minimizes a criterion in expectation or to study the solution robustness.

For several years, the deterministic flow shop scheduling problem has received considerable attention and a lot of papers exist concerning this
problem. But work remains to be done concerning the stochastic version of this problem.

We first carried out a state of the art on the stochastic flow shop scheduling problem and proposed a classification of works for two kinds of random events: machine breakdowns and random processing times [13]. Additional surveys have been given by [11,26,29].
In a stochastic context, there is an underlying problem which is the evaluation of the criterion (for instance the expected makespan). In general, authors do not give any method for criterion evaluation. We propose a recursive algorithm based on Markov chains to compute the expected makespan and a discrete event simulation model to evaluate the expected makespan. The recursive algorithm is a generalization of the method proposed by [5] for the two machine flow shop problem to the m machine flow shop problem with unlimited buffers.

In the deterministic context, heuristics (like CDS [4] and Rapid Access [6]) and metaheuristics (like simulated annealing) provide good results. We propose to adapt and to test this kind of methods for the stochastic scheduling problem. Combinations between heuristics or metaheuristics and the performance evaluation models are proposed.
This paper is organized as follows. Section 2 presents the stochastic flow shop scheduling problem. Section 3 is a state of the art on scheduling and performance evaluation in a stochastic flow shop with random processing times. The performance evaluation problem is tackled in Section 4. Two approaches are presented: Markovian analysis and discrete event stochastic simulation. Proposed scheduling methods are given in Section 5: they are based on combinations of heuristics or metaheuristics with the Markovian model or the stochastic simulation model. Section 6 is devoted to results. We present a comparison between the Markovian model and the simulation model, then results for combinations of stochastic descent with a performance evaluation model for the two machine flow shop problem, and a comparison between heuristics and metaheuristics for the m machine flow shop problem.
2. The stochastic flow shop scheduling problem

In this part, we first describe the deterministic permutation flow shop problem and its classical assumptions, then the stochastic case is presented.

2.1. Deterministic flow shop scheduling problem

A deterministic permutation flow shop system is composed of a set of m machines and n jobs. The n jobs are processed by the machines in the same order (machine 1, machine 2, ..., machine m). Each job i (i = 1, ..., n) is processed during a time p_{ij} by each machine j (j = 1, ..., m).
Classical assumptions for the deterministic flow shop are the following:

- job release dates are known,
- machines are always available,
- processing times are deterministic and independent,
- setup times and removal times are included in processing times,
- there is no splitting,
- transportation times are negligible,
- a machine cannot process more than one job at a time,
- no job may be processed on more than one machine at a time,
- between two machines, jobs can wait in an unlimited buffer.
The deterministic flow shop scheduling problem consists in finding a job schedule which minimizes a criterion. For instance, the criterion may be the completion time of the last job (makespan), the total flow time, the tardiness, ...

States of the art on the literature about deterministic flow shop scheduling can be found for instance in [12,19,33].
2.2. Stochastic flow shop scheduling problem

In a stochastic context, the first three classical assumptions for the deterministic flow shop may be replaced by the following assumptions:
- job release dates are not known in advance,
- machines can break down,
- processing times are modeled by independent random variables.

The stochastic flow shop scheduling problem consists in finding a job schedule which minimizes a criterion in expectation.

Studying all types of random events simultaneously is too complex; indeed, they are not treated together in the literature. In order to solve the scheduling problem, two approaches are proposed: static scheduling and dynamic scheduling. When the scheduling is static, the job schedule is determined before the beginning of the processing. In the dynamic approach, the schedule is built at each event occurrence (arrival of a job, end of processing by a machine, breakdown, ...).
In this paper, we consider the m machine flow shop static scheduling problem with exponentially distributed job processing times: the processing time p_{ij} of job i on machine j follows an exponential distribution function with rate μ_{ij} given by

F_{ij}(t) = P(p_{ij} \le t) = 1 - e^{-\mu_{ij} t}, \quad t \ge 0.

The objective is to find an input job schedule which minimizes the expected makespan (noted E[C_max]).
3. State of the art

In this part, we present a short state of the art for the stochastic flow shop scheduling problem with random processing times. A detailed state of the art is given in [13], together with a state of the art on scheduling in a flow shop with breakdowns. Fig. 1 presents a classification of works on scheduling in a flow shop with random processing times according to the number of machines (2 or m), the buffer size (0 or unlimited), the distribution function of the processing times and the studied criterion.

In a two machine flow shop, when processing times are random variables generally distributed, a job schedule which minimizes a regular criterion in expectation is a permutation schedule (same job schedule on all machines) [5].
Minimizing the makespan in a deterministic two machine flow shop has a polynomial solution by using Johnson's algorithm [15]. This algorithm implements the following rule:

Job i_1 precedes job i_2 if min{p_{i_1 1}, p_{i_2 2}} ≤ min{p_{i_1 2}, p_{i_2 1}}.
Minimizing the makespan in expectation in a stochastic two machine flow shop with processing time p_{ij} of job i on machine j exponentially distributed with rate μ_{ij} has a similar solution by using Talwar's rule [30]:

Job i_1 precedes job i_2 if μ_{i_1 1} + μ_{i_2 2} ≥ μ_{i_2 1} + μ_{i_1 2}.

This rule is first presented and demonstrated for two jobs in [20] by Makino, then Talwar [30] demonstrates it for three jobs and conjectures that it is true for n jobs. In [1], Bagga demonstrates the rule for four jobs. Finally, Cunningham and Dutta [5] give the complete proof of the optimality of Talwar's rule for n jobs. From the rule, they deduce the following theorem proved by Weiss [32]:

Fig. 1. Classification of works about the stochastic flow shop problem (with random processing times) [13].
Theorem 3.1 [5,32]. Scheduling jobs in decreasing order of μ_{i1} - μ_{i2} minimizes E[C_max] in a two machine flow shop with processing times exponentially distributed with rates μ_{ij}.
By reformulating Talwar's rule, Ku and Niu [18] remark that both Johnson's and Talwar's rules state that:

Theorem 3.2 [18]. In a two machine flow shop with processing times p_{ij} exponentially distributed, job i_1 precedes job i_2 if and only if E[min{p_{i_1 1}, p_{i_2 2}}] ≤ E[min{p_{i_2 1}, p_{i_1 2}}] in order to minimize E[C_max].
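To make the ordering concrete, here is a minimal Python sketch (ours, not from the paper) of Talwar's rule as stated in Theorem 3.1; the rates in the example are invented and, as in Section 6, rates can be deduced from deterministic times by μ_ij = 1/t_ij.

# Minimal sketch (ours) of Talwar's rule: sort the jobs in decreasing order of
# mu_i1 - mu_i2 (Theorem 3.1). The rates below are invented for illustration.

def talwar_schedule(mu):
    """mu[i] = (rate of job i on machine 1, rate of job i on machine 2)."""
    return sorted(range(len(mu)), key=lambda i: mu[i][0] - mu[i][1], reverse=True)

mu = [(1 / 4.0, 1 / 7.0), (1 / 9.0, 1 / 2.0), (1 / 5.0, 1 / 5.0)]
print(talwar_schedule(mu))  # the job with the largest mu_i1 - mu_i2 comes first: [0, 2, 1]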
Kamburowski presents, in [16], a sufficient condition to minimize stochastically the makespan in a two machine stochastic flow shop with processing times exponentially distributed. An extension to the three machine flow shop is proposed in [17].
In [2], Bagga derives formulas using integral calculus for computing E[C_max] in a three machine flow shop with two jobs. In [5], Cunningham and Dutta note that there is no method to evaluate the value of the criterion E[C_max], which depends on random variables. By using Markov chains and Chapman-Kolmogorov equations, they give an iterative scheme, for the two machine flow shop, which computes E[C_max] when job processing times are exponentially distributed:
E[C_max] = \mu_{n,2}\, b_{n+1,n}

where

a_{1,1} = 1/\mu_{1,1}, \qquad b_{1,1} = 1/\mu_{1,1}^2,

a_{q,r} = \frac{\mu_{q-1,1}\, a_{q-1,r} + \mu_{r-1,2}\, a_{q,r-1}}{\mu_{q,1} + \mu_{r,2}}, \qquad
b_{q,r} = \frac{a_{q,r} + \mu_{q-1,1}\, b_{q-1,r} + \mu_{r-1,2}\, b_{q,r-1}}{\mu_{q,1} + \mu_{r,2}}, \qquad n+1 \ge q > r \ge 1,

a_{q,q} = \frac{\mu_{q-1,2}}{\mu_{q,1}}\, a_{q,q-1}, \qquad
b_{q,q} = \frac{a_{q,q} + \mu_{q-1,2}\, b_{q,q-1}}{\mu_{q,1}}, \qquad n \ge q \ge 2,

with \mu_{n+1,1} = \mu_{0,2} = 0.
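This recursion can be evaluated in O(n^2) time. The following Python sketch is our own illustration, written with the sign convention used above (so that E[C_max] = μ_{n,2} b_{n+1,n}); the single-job check at the end is an added sanity test, not data from the paper.

def expected_makespan_2m(mu1, mu2):
    """Expected makespan of the schedule (1, ..., n) in a two machine flow shop with
    exponential processing times; mu1[i], mu2[i] are the rates of job i+1 on machines 1, 2."""
    n = len(mu1)
    m1 = lambda q: mu1[q - 1] if 1 <= q <= n else 0.0   # mu_{n+1,1} = 0
    m2 = lambda r: mu2[r - 1] if 1 <= r <= n else 0.0   # mu_{0,2} = 0
    a = [[0.0] * (n + 1) for _ in range(n + 2)]
    b = [[0.0] * (n + 1) for _ in range(n + 2)]
    a[1][1] = 1.0 / m1(1)
    b[1][1] = 1.0 / m1(1) ** 2
    for q in range(1, n + 2):
        for r in range(1, min(q, n) + 1):
            if q == r == 1:
                continue
            if q == r:                      # machine 2 idle, waiting for job q
                a[q][q] = m2(q - 1) / m1(q) * a[q][q - 1]
                b[q][q] = (a[q][q] + m2(q - 1) * b[q][q - 1]) / m1(q)
            else:                           # q > r: general case
                denom = m1(q) + m2(r)
                a[q][r] = (m1(q - 1) * a[q - 1][r] + m2(r - 1) * a[q][r - 1]) / denom
                b[q][r] = (a[q][r] + m1(q - 1) * b[q - 1][r] + m2(r - 1) * b[q][r - 1]) / denom
    return m2(n) * b[n + 1][n]

print(expected_makespan_2m([0.5], [0.25]))  # single job: 1/0.5 + 1/0.25 = 6.0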
In [22], Mittal and Bagga consider the two machine flow shop with processing times exponentially distributed. The objective is to determine a schedule which minimizes E[C_max] and which ensures that the expected completion time of a job is lower than a given value. They propose an algorithm which determines such a schedule.
In [9,10], Forst studies the two machine flow shop problem with processing times exponentially distributed and costs. A deferral cost per unit of time (w_i) is assigned to each job i: if job i is completed on machine 2 at time C_{i2}, then its cost is given by w_i C_{i2}. The objective is to determine a job schedule which minimizes the total expected cost for all the jobs (noted E[C_w]). He proposes a dominance relation for the problem and dominance relations for three special cases. Dominance relations applied to a given problem instance may not yield an optimal schedule since it is unlikely to obtain a complete job schedule. But, as the number of dominance relations increases, the number of schedules which must be considered for optimality decreases. After applying dominance relations to achieve a partial ordering of the jobs, implicit enumeration techniques are typically used in the search for an optimal solution.
In [27], Prasad studies E[C_max] in a two machine flow shop in which processing times follow a geometric distribution with parameters μ_{ij} (i = 1, ..., n; j = 1, 2). The following theorem is given and demonstrated.

Theorem 3.3 [27]. The job schedule given by processing the jobs in non-decreasing order of the ratio (1 - μ_{i1})/(1 - μ_{i2}) minimizes E[C_max] in a two machine flow shop with processing times geometrically distributed.
Pinedo, in [25], shows that the problem of minimizing the expected makespan in a two machine flow shop without buffer is equivalent to an asymmetric traveling salesman problem: find a tour that minimizes the total distance for a traveling salesman where the distance matrix D = (d_{i_1 i_2}) is defined by

d_{i_1 i_2} = E[\min\{p_{i_2 1}, p_{i_1 2}\}] = \int_0^{\infty} (1 - F_{i_2 1}(t))(1 - F_{i_1 2}(t))\, dt, \qquad i_1, i_2 \ne 0,

d_{0 i_2} = d_{i_1 0} = 0.
Other papers consider the particular case in which the processing times of a job on all the machines follow the same distribution function. In a two machine flow shop, if the distribution functions are stochastically ordered, Pinedo in [25] and Jia in [14] give a theorem which provides a schedule with minimum expected makespan. In [8], Foley and Suresh demonstrate that a SEPT-LEPT schedule minimizes E[C_max].

In [25], Pinedo gives a lower bound for the expected makespan in an m machine flow shop and defines a schedule with minimum expected makespan in the case of non-overlapping processing time distribution functions.
Our state of the art is based on the states of the art of Forst [11], Pinedo [26] and Shaked [29]. We remark that a lot of works exist for the two machine flow shop scheduling problem and that the few papers about the m machine flow shop consider that the job processing times on the m machines are independent realizations of the same probability distribution. The criterion considered is the expected makespan (except in [9,10]). The majority of works concerns the input job schedule but gives no method for computing the criterion (except [5,25]).
4. Performance evaluation problem

When stochastic events disturb the system, there is an underlying problem which is the evaluation of the criterion. Indeed, in the deterministic context, the criterion of a schedule depends only on deterministic values. In the stochastic context, it depends on a great number of random variables. In order to solve this performance evaluation problem, we propose two methods: Markovian analysis and stochastic discrete event simulation.
4.1. Markovian model

The considered system is a queuing system, where n already-arrived customers are to be served by m service channels in series. It can be modeled as a Markov chain. Solving this Markov chain (with the QNAP2 software for example) gives the exact value of E[C_max]. The major disadvantage of the method is its limitation in terms of computation time, memory size, etc.

In order to avoid the use of a dedicated software (like QNAP2), we have generalized the method proposed by [5] for the two machine flow shop problem with unlimited buffer to the m machine flow shop problem. This generalization allows us to treat larger problems than with QNAP2, but it is also limited in terms of number of jobs and number of machines (which is natural in Markovian analysis).

Our assumptions are the following:

- the buffer capacities are unlimited,
- the n jobs are numbered according to the lexicographical order x = (1, 2, 3, ..., i, i+1, ..., n),
- the processing time of job i on machine j follows an exponential distribution with rate μ_{ij}.
We propose to represent the states of the system by an m-length vector

\vec{k} = (k_1, k_2, ..., k_m)   where   n+1 ≥ k_1 ≥ k_2 ≥ ... ≥ k_j ≥ k_{j+1} ≥ ... ≥ k_m ≥ 1.

In order to simplify the reading, vector \vec{k} will be noted k.

Machine j is processing job number k_j or is waiting for job number k_j:

- if k_1 < n+1, then machine 1 is processing job number k_1,
- if k_j < k_{j-1}, j = 2, ..., m, then machine j is processing job number k_j,
- if k_j = k_{j-1}, j = 2, ..., m, then machine j is idle and is waiting for job number k_j,
- if k_j = n+1, j = 1, ..., m, then all the jobs have been processed by machine j.
Fig. 2 represents the state k = (9, 8, 8, 5, 3) in a flow shop with five machines and eight jobs:

- k_1 = 9 means that there is no more job on machine 1 (all the jobs are completed by machine 1),
- k_2 = k_3 = 8 means that machine 2 is processing job number 8 and machine 3 is idle, waiting for job number 8,
- k_4 = 5 < k_3 = 8 means that machine 4 is processing job number 5,
- k_5 = 3 means that machine 5 is processing job number 3,
- the jobs are processed according to the lexicographical order, so we can deduce the job numbers that are in the buffers.
For example:

- p(k, t) where n ≥ k_1 > k_2 > ... > k_m ≥ 1 represents the probability that job k_1 is processed by machine 1, k_2 by machine 2, ..., k_m by machine m at time t,
- p(k, t) where n ≥ k_1 > k_2 > ... > k_j = k_{j+1} = ... = k_h > k_{h+1} > ... > k_m ≥ 1 represents the probability that k_1 is processed by machine 1, ..., k_{j-1} by machine j-1, k_j by machine j, k_{h+1} by machine h+1, ..., k_m by machine m, and machines j+1, ..., h are idle at time t,
- p(k, t) where n+1 = k_1 = ... = k_j > k_{j+1} > ... > k_m ≥ 1 represents the probability that all the jobs have been processed by machines 1, ..., j and that k_{j+1} is processed by machine j+1, ..., k_m by machine m at time t,
- p(k, t) where k_1 = k_2 = ... = k_j = k_{j+1} = ... = k_m = n+1 represents the probability that all the jobs have been processed by the m machines at time t.
The number N of states of the system is given by the formula

N = \sum_{i=1}^{n} \binom{i+m-1}{m-1}.
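For instance, for problem car1 of Section 6 (n = 11 jobs, m = 5 machines), N = \sum_{i=1}^{11} \binom{i+4}{4} = \binom{16}{5} - 1 = 4367, which matches the number of states reported in Table 1.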
We consider three particular states:

- k_1 = (1, 1, ..., 1), where k_j = 1 for all j = 1, ..., m, represents the initial state of the system. Job number 1 is processed by the first machine and the other machines are idle (Fig. 3),
- k_N = (n+1, n+1, ..., n+1, n), where k_j = n+1 for all j = 1, ..., m-1 and k_m = n, represents the state where job n is processed by machine m and the other machines are idle (they have processed all the jobs) (Fig. 4),
- k_{N+1} = (n+1, n+1, ..., n+1, n+1), where k_j = n+1 for all j = 1, ..., m, represents the final state of the system: all the jobs have been processed by the m machines.
Let a(k, j) be the vector such that

a(k, j) = (k_1, ..., k_{j-1}, k_j - 1, k_{j+1}, ..., k_m).

This vector a(k, j) is identical to vector k except for component k_j, which is decremented by 1.

Fig. 2. k = (9, 8, 8, 5, 3).

Fig. 3. Vector k_1.
The set of vectors a(k, j), j = 1, ..., m, includes all the previous states of k but contains some states which have no significance for our problem, i.e. the relation k_1 ≥ k_2 ≥ ... ≥ k_m is not satisfied.

Example 4.1 (for eight jobs and five machines).

k = (9, 8, 8, 5, 3)

a(k, 1) = (8, 8, 8, 5, 3);  a(k, 2) = (9, 7, 8, 5, 3);  a(k, 3) = (9, 8, 7, 5, 3);  a(k, 4) = (9, 8, 8, 4, 3);  a(k, 5) = (9, 8, 8, 5, 2).

Vectors a(k, 1), a(k, 3), a(k, 4), and a(k, 5) are previous states of state k, but a(k, 2) does not correspond to a state of the system.
We define

s(k, j) = 0 if k_j = k_{j-1}, or if there exists l > j such that k_j < k_l, or if k_j = n+1; s(k, j) = μ_{k_j, j} otherwise.

s(k, j) is the rate of the processing time of job k_j on machine j, if k_j is processed by machine j. Otherwise it is equal to 0, in the following three cases:

- k_j = k_{j-1}: k represents a state of the system, but machine j is not processing job k_j, it is waiting for it,
- there exists l > j such that k_j < k_l: k does not represent a state of the system,
- k_j = n+1: k represents a state of the system, but there is no job on machine j (all the jobs have been processed by machine j).
Theorem 4.1. In an m machine permutation flow shop, under the following assumptions:

- there are unlimited buffers between the machines,
- the n jobs are numbered according to the lexicographical sequence x = (1, 2, ..., i, i+1, ..., n),
- the job processing times are independent and exponentially distributed: the processing time of job i (i = 1, ..., n) on machine j (j = 1, ..., m) follows an exponential distribution function with rate μ_{ij},

the expected makespan E[C_max] is computed by the formula

E[C_max] = \mu_{n,m}\, b(k_N)

where

a(k_1) = 1/\mu_{1,1}, \qquad b(k_1) = 1/\mu_{1,1}^2,

a(k) = \frac{\sum_{j=1}^{m} a(a(k,j))\, s(a(k,j), j)}{\sum_{j=1}^{m} s(k,j)}, \qquad
b(k) = \frac{a(k) + \sum_{j=1}^{m} b(a(k,j))\, s(a(k,j), j)}{\sum_{j=1}^{m} s(k,j)}.
Proof. By using the proposed notation, we can write the associated Chapman-Kolmogorov equations:

p(k, t + dt) = p(k, t)\Big(1 - \sum_{j=1}^{m} s(k,j)\, dt\Big) + \sum_{j=1}^{m} p(a(k,j), t)\, s(a(k,j), j)\, dt + o(dt).   (1)

In the above equation, if we let dt \to 0, then in the limit we get

p'(k, t) = \frac{dp(k,t)}{dt} = -\sum_{j=1}^{m} s(k,j)\, p(k,t) + \sum_{j=1}^{m} p(a(k,j), t)\, s(a(k,j), j).   (2)

Fig. 4. Vector k_N.
And we have the following initial conditions:

p(k_1, 0) = 1;  p(k, 0) = 0 if k \ne k_1.

In order to solve the above set of differential equations, we introduce the Laplace-Stieltjes transform of p(k, t):

L[p(k, t)] = p^*(k, \theta) = \int_0^{\infty} e^{-\theta t}\, p(k, t)\, dt, \qquad \theta > 0.   (3)

We note that

L[p'(k, t)] = -\xi_k + \theta\, p^*(k, \theta)   (4)

where \xi_k = 1 if k = k_1 and \xi_k = 0 if k \ne k_1.

Introducing (2) and combining with (4), we obtain

\Big(\theta + \sum_{j=1}^{m} s(k,j)\Big)\, p^*(k, \theta) = \sum_{j=1}^{m} p^*(a(k,j), \theta)\, s(a(k,j), j), \qquad \forall k \ne k_1,   (5)

(\theta + \mu_{1,1})\, p^*(k_1, \theta) = 1.   (6)

From the definition of the state probabilities, p(k_{N+1}, t) is the distribution function of the makespan. Therefore, the corresponding expected makespan is

E[C_max] = \int_0^{\infty} t\, p'(k_{N+1}, t)\, dt = \int_0^{\infty} t\, \mu_{n,m}\, p(k_N, t)\, dt = -\mu_{n,m}\, \frac{d}{d\theta} p^*(k_N, \theta)\Big|_{\theta = 0}.   (7)

Let

p^*(k, \theta) = a(k) - b(k)\theta + O(\theta^2)   (8)

where a(k) and b(k) are constants. It follows from (7) and (8) that

E[C_max] = \mu_{n,m}\, b(k_N).   (9)

Combining (8) with (5) and (6), we get the following sets of relations which evaluate the constants a(k) and b(k):

(\theta + \mu_{1,1})\big(a(k_1) - b(k_1)\theta + O(\theta^2)\big) = 1,

\Big(\theta + \sum_{j=1}^{m} s(k,j)\Big)\big(a(k) - b(k)\theta + O(\theta^2)\big) = \sum_{j=1}^{m} \big(a(a(k,j)) - b(a(k,j))\theta + O(\theta^2)\big)\, s(a(k,j), j),

and we obtain

a(k_1) = 1/\mu_{1,1}, \qquad b(k_1) = 1/\mu_{1,1}^2,   (10)

a(k) = \frac{\sum_{j=1}^{m} a(a(k,j))\, s(a(k,j), j)}{\sum_{j=1}^{m} s(k,j)}, \qquad
b(k) = \frac{a(k) + \sum_{j=1}^{m} b(a(k,j))\, s(a(k,j), j)}{\sum_{j=1}^{m} s(k,j)}.   (11)

With the help of Eqs. (10) and (11), we can now evaluate the values of a(k) and b(k) for all possible values of vector k. Then, we can find b(k_N) and put it into Eq. (9) to get E[C_max] for the sequence x = (1, 2, 3, ..., i, i+1, ..., n).

Note that the expected makespan for any other sequence x' can be directly obtained from the expression of E[C_max] simply by interchanging the appropriate subscripts.
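As an illustration of how Theorem 4.1 can be evaluated in practice, the sketch below (ours, not the authors' implementation) computes E[C_max] by memoized recursion over the state vectors, using the rates μ_{ij}, the predecessor vectors a(k, j) and the service rates s(k, j) defined above; the final print is a one-job sanity check we added.

from functools import lru_cache

def expected_makespan(mu):
    """Expected makespan of the schedule (1, ..., n) via the recursion of Theorem 4.1.
    mu[i][j] = exponential rate of job i+1 on machine j+1 (jobs in schedule order)."""
    n, m = len(mu), len(mu[0])

    def s(k, j):
        # Service rate on machine j in state k; 0 if machine j is idle or k is not a state.
        kj = k[j]
        if kj == n + 1 or (j > 0 and kj == k[j - 1]) or any(kj < k[l] for l in range(j + 1, m)):
            return 0.0
        return mu[kj - 1][j]

    def pred(k, j):
        # Vector a(k, j): identical to k except that component j is decremented by 1.
        return k[:j] + (k[j] - 1,) + k[j + 1:]

    @lru_cache(maxsize=None)
    def ab(k):
        if k == (1,) * m:                                  # initial state k_1
            return 1.0 / mu[0][0], 1.0 / mu[0][0] ** 2
        out = sum(s(k, j) for j in range(m))               # total departure rate from state k
        a_sum = b_sum = 0.0
        for j in range(m):
            if k[j] - 1 < 1:
                continue
            kp = pred(k, j)
            rate = s(kp, j)                                # rate of the transition a(k, j) -> k
            if rate > 0.0:
                ap, bp = ab(kp)
                a_sum += ap * rate
                b_sum += bp * rate
        a_k = a_sum / out
        return a_k, (a_k + b_sum) / out

    k_N = (n + 1,) * (m - 1) + (n,)                        # last job on the last machine
    return mu[n - 1][m - 1] * ab(k_N)[1]

print(expected_makespan([[0.5, 0.25]]))                    # one job, two machines: 6.0

For two machines this reduces to the Cunningham-Dutta recursion of Section 3; for larger instances the recursion depth grows roughly like n times m, so an iterative evaluation (or a higher recursion limit) may be preferable.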
4.2. Stochastic simulation model

The proposed Markovian model allows us to compute the expected makespan for the m machine flow shop with unlimited buffers. In the other cases (large number of states, m machine flow shop with limited or no buffer, other criteria, etc.), we propose to use a discrete event simulation model to evaluate E[H(x)], where H is the criterion (C_max for example), together with the corresponding confidence interval. The flow shop is modeled as a terminating system: the initial state and the final state correspond to the same state, in which no job is being processed. So we can successively run several statistically independent simulations, called replications. At each
replication, a sample of job processing times is drawn and the corresponding criterion is computed. The mean of the obtained criteria is an estimation of the expected criterion. The number of replications must be large enough to ensure a good sampling of the stochastic behavior of the model.

In this part, we give the algorithm (Algorithm 1) of the stochastic simulation model for evaluating E[H(x)] for a given schedule x. We define:

- NbRep: the number of replications,
- ω_r: a sample of processing times drawn according to the distribution functions (r = 1, ..., NbRep),
- x: a schedule,
- H(x, ω_r): the criterion corresponding to x according to ω_r,
- E[H(x)]: an estimation of the expected criterion for x,
- CI: the confidence interval of the estimation of the expected criterion.

When the size of the problem allows it, the Markovian model is used to validate the simulation model (i.e. the exact value given by the Markovian model is in the confidence interval of the simulation model).
Algorithm 1 (Evaluation of an estimation of E[H(x)]).

1: for r = 1, ..., NbRep do
2:   Choose ω_r randomly
3:   H_r := H(x, ω_r)
4: end for
5: E[H(x)] := \sum_{r=1}^{NbRep} H_r / NbRep
6: Compute the confidence interval CI
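A minimal Python sketch of this estimator (our own illustration; the function names are ours and the confidence interval uses a normal approximation, which the paper does not specify):

import math
import random

def makespan(schedule, times):
    """Completion time of the last job on the last machine (permutation flow shop).
    times[i][j] = sampled processing time of job i on machine j."""
    m = len(times[0])
    finish = [0.0] * m                     # finish[j] = current completion time of machine j
    for i in schedule:
        for j in range(m):
            start = max(finish[j], finish[j - 1] if j > 0 else 0.0)
            finish[j] = start + times[i][j]
    return finish[-1]

def estimate_expected_makespan(schedule, mu, nb_rep=5000, seed=0):
    """Monte Carlo estimation of E[Cmax] (Algorithm 1); mu[i][j] is the exponential rate."""
    rng = random.Random(seed)
    samples = []
    for _ in range(nb_rep):
        omega = [[rng.expovariate(rate) for rate in job] for job in mu]
        samples.append(makespan(schedule, omega))
    mean = sum(samples) / nb_rep
    var = sum((h - mean) ** 2 for h in samples) / (nb_rep - 1)
    half_width = 1.96 * math.sqrt(var / nb_rep)            # 0.95 confidence interval
    return mean, (mean - half_width, mean + half_width)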
5. Scheduling problem

In order to solve the stochastic flow shop scheduling problem, we propose to use a combination of an iterative improvement method (a metaheuristic like simulated annealing) or a heuristic (like CDS) with a performance evaluation model (Markovian model or simulation model).

When using an iterative improvement method, we must compare, at each iteration, two schedules x and y (the current solution x and a candidate solution y). With the Markovian model, we only need to compute E[H(y)] and to compare E[H(x)] with E[H(y)] (E[H(x)] is exactly known). With the simulation model, we compare E[H(x)] and E[H(y)] as described in Algorithm 2. The same principle is used for the heuristics.
Algorithm 2 (Evaluation of an estimation of E[H(x)] and E[H(y)]).

1: for r = 1, ..., NbRep do
2:   Choose ω_r randomly
3:   H^x_r := H(x, ω_r)
4:   H^y_r := H(y, ω_r)
5: end for
6: E[H(x)] := \sum_{r=1}^{NbRep} H^x_r / NbRep
7: E[H(y)] := \sum_{r=1}^{NbRep} H^y_r / NbRep
5.1. Heuristics

In this part, we propose to adapt some existing heuristics for the deterministic flow shop scheduling problem to the stochastic one.

The proposed heuristics are adaptations of the CDS heuristic proposed by [4] and the RA heuristic proposed by [6] and presented in [33]. In the deterministic context, the CDS heuristics convert, by using an aggregation on the processing times, an m machine flow shop into (m - 1) two machine flow shops. Each two machine problem is solved by Johnson's rule [15]. In the stochastic case, we take up this principle by using Talwar's rule [30].

Two CDS heuristics exist. The first one (Algorithm 3) consists in the aggregation of the k first machines and the k last ones (k = 1, ..., m - 1). The second one (Algorithm 4) consists in the aggregation of the k first machines and the (m - k) last ones (k = 1, ..., m - 1).

The efficiency of these algorithms in the deterministic context rests on two main points: an intensive use of Johnson's rule, which is a polynomial algorithm, and the creation of m - 1 schedules among which the best one is chosen. In the stochastic context, these two points subsist.
Indeed, Talwar's rule is a polynomial algorithm and the Markovian model provides the expected makespan of any schedule. However, using the Markovian model is not always possible and it may be replaced by a stochastic simulation model.

Heuristic RA (Algorithm 5) converts, by using an aggregation on the processing times, an m machine flow shop into a two machine flow shop. The aggregation consists in weighting the processing times. Then Talwar's rule is used to build a schedule.
Algorithm 3 (Adaptation of CDS1 to the stochastic context).

1: for k = 1, ..., m - 1 do
2:   T'_{i,1} = \sum_{j=1}^{k} 1/μ_{i,j} and T'_{i,2} = \sum_{j=m-k+1}^{m} 1/μ_{i,j}
3:   Let x be the schedule obtained by applying Talwar's rule on the two machine flow shop with rates 1/T'_{i,1} and 1/T'_{i,2}.
4:   if E[C_max](x) < E[C_max](best) then
5:     best := x
6:   end if
7: end for
8: best is the solution of the heuristic
Algorithm 4 (Adaptation of CDS2 to the stochastic context).

1: for k = 1, ..., m - 1 do
2:   T'_{i,1} = \sum_{j=1}^{k} 1/μ_{i,j} and T'_{i,2} = \sum_{j=k+1}^{m} 1/μ_{i,j}
3:   Let x be the schedule obtained by applying Talwar's rule on the two machine flow shop with rates 1/T'_{i,1} and 1/T'_{i,2}.
4:   if E[C_max](x) < E[C_max](best) then
5:     best := x
6:   end if
7: end for
8: best is the solution of the heuristic
Step 4 of Algorithms 3 and 4 uses Theorem 4.1 for the Markovian model or calls Algorithm 1 for the simulation model.
Algorithm 5 (Adaptation of RA to the stochastic context).

1: T'_{i,1} = \sum_{j=1}^{m} (m - j + 1) (1/μ_{i,j})
2: T'_{i,2} = \sum_{j=1}^{m} j (1/μ_{i,j})
3: Apply Talwar's rule on the two machine flow shop with rates 1/T'_{i,1} and 1/T'_{i,2} to obtain the solution schedule of the heuristic.
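As an illustration, here is a Python sketch (ours) of Algorithm 3, assuming an evaluate(schedule) callback that returns E[C_max] from either the Markovian model or the simulation model.

def cds1_stochastic(mu, evaluate):
    """Sketch of Algorithm 3. mu[i][j] = exponential rate of job i on machine j;
    evaluate(schedule) returns E[Cmax] (Markovian model or simulation model)."""
    n, m = len(mu), len(mu[0])
    best, best_val = None, float("inf")
    for k in range(1, m):
        # Aggregate the k first machines and the k last ones (expected times add up).
        t1 = [sum(1.0 / mu[i][j] for j in range(k)) for i in range(n)]
        t2 = [sum(1.0 / mu[i][j] for j in range(m - k, m)) for i in range(n)]
        # Talwar's rule on the aggregated two machine problem with rates 1/T'_1 and 1/T'_2.
        x = sorted(range(n), key=lambda i: 1.0 / t1[i] - 1.0 / t2[i], reverse=True)
        val = evaluate(x)
        if val < best_val:
            best, best_val = x, val
    return best, best_val

Algorithm 4 is obtained by replacing the second aggregation by the (m - k) last machines, and Algorithm 5 by using the weighted sums of Steps 1 and 2 above.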
5.2. Metaheuristics

In this part, we propose to combine the performance evaluation model with the following metaheuristics:

- stochastic descent,
- inhomogeneous simulated annealing,
- kangaroo algorithm.

As we can see in [7,31], metaheuristics based on simulated annealing have been proved to converge in probability. Contrary to greedy heuristics, a feasible solution is available at any time. States of the art concerning metaheuristics are given in [23,24] for example.

These methods are based on the notion of a neighboring system. The current state x is modified by applying a transformation to obtain a new state y, called a neighbor of x. The definition of this transformation generates a function which associates to each state the set of its neighbor states. This function is called a neighboring system.
5.2.1. Combining stochastic descent and performance evaluation model

The basic algorithm is the stochastic descent (Algorithm 6), which accepts the neighbor if its criterion is equal to or better than the criterion of the current solution. This algorithm generally finds a local minimum.

Algorithm 6 (Combining stochastic descent and performance evaluation model).

1: Let x be an initial solution
2: while necessary do
3:   Choose y randomly and uniformly in the neighborhood V(x) of x
4:   if E[H(y)] ≤ E[H(x)] then
5:     x := y
6:   end if
7: end while
8: x is the solution of the metaheuristic

Step 4 uses Theorem 4.1 for the Markovian model or calls Algorithm 2 for the simulation model.

The initial solution can be randomly generated or obtained by a heuristic. "While necessary" means: maximum number of iterations reached, or E[H(x)] lower than a given value (for example, given by the user), or E[H(x)] equal to the optimal solution or to a lower bound.
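A minimal Python sketch of Algorithm 6 (ours), where evaluate stands for the performance evaluation model and neighbor for the neighboring system V:

import random

def stochastic_descent(x0, evaluate, neighbor, max_iter=1000, seed=0):
    """Sketch of Algorithm 6: accept a neighbor when its criterion does not worsen."""
    rng = random.Random(seed)
    x, fx = list(x0), evaluate(x0)
    for _ in range(max_iter):
        y = neighbor(x, rng)               # drawn uniformly in V(x)
        fy = evaluate(y)
        if fy <= fx:
            x, fx = y, fy
    return x, fx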
5.2.2. Combining inhomogeneous simulated annealing and performance evaluation model

The major disadvantage of the stochastic descent is its convergence towards a local minimum. This is bypassed by simulated annealing thanks to the possibility of accepting, with a given probability, a bad solution. We have chosen inhomogeneous simulated annealing (Algorithm 7), which has fewer parameters than the homogeneous one.

Originally, simulated annealing was used by Metropolis [21] to simulate the physical annealing process in metallurgy. It randomizes the search procedure so as to reduce the probability of becoming stuck in a poor, but locally optimal, solution. Simulated annealing is a metaheuristic that guides a local search procedure to explore the solution space. It avoids becoming trapped in local optima by accepting some worsening transitions. These moves are accepted according to a probability function which depends on a parameter called the temperature (noted T). Simulated annealing is applicable to a wide array of optimization problems and its development has received much attention. The stochastic descent is simulated annealing with null temperature: the probability of accepting a worsening transition is null.
One of the following functions (k is the number of iterations) can be used for the modification of T [3]:

- constant function: T(k) = constant,
- arithmetic function: T(k) = T(k-1) - constant,
- geometric function: T(k) = α T(k-1) (α is often constant; it is a parameter to tune),
- logarithmic function: T(k) = C_R / log(1 + k), with C_R > 0 (C_R is a parameter to tune).
Simulated annealing converges in probability toward the set of optimal solutions under some hypotheses concerning the properties of the neighboring system: the neighboring system V must satisfy the following properties:

- accessibility: for all x and y, there is a finite sequence u_0, u_1, ..., u_n of states with u_0 = x, u_n = y and u_k ∈ V(u_{k-1}) for all k,
- reversibility: y ∈ V(x) if and only if x ∈ V(y), for all x and y.
Algorithm 7 (Combining inhomogeneous simulated annealing and performance evaluation model).

1: Let T > 0 and let x be an initial solution
2: while necessary do
3:   Choose y randomly and uniformly in the neighborhood V(x) of x
4:   if E[H(y)] ≤ E[H(x)] then
5:     x := y
6:   else
7:     x := y with probability exp(-(E[H(y)] - E[H(x)])/T)
8:   end if
9:   Modification of T
10: end while
11: x is the solution of the metaheuristic.
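A Python sketch of Algorithm 7 (ours), using the geometric cooling schedule listed above; the initial temperature t0 and the factor alpha are illustrative tuning parameters.

import math
import random

def simulated_annealing(x0, evaluate, neighbor, t0=100.0, alpha=0.99, max_iter=1000, seed=0):
    """Sketch of Algorithm 7 with the geometric schedule T(k) = alpha * T(k-1)."""
    rng = random.Random(seed)
    x, fx, t = list(x0), evaluate(x0), t0
    for _ in range(max_iter):
        y = neighbor(x, rng)
        fy = evaluate(y)
        # Accept an improving move, or a worsening one with probability exp(-(E[H(y)]-E[H(x)])/T).
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
        t = alpha * t
    return x, fx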
5.2.3. Combining kangaroo algorithm and performance evaluation model

A simple way to leave a local minimum is to restart a stochastic descent from a new randomly chosen starting point. This scheme leads to the successive descents algorithm. The kangaroo algorithm (Algorithm 8) follows this scheme, but the new starting point is obtained by perturbing the local minimum. After a stochastic descent, this algorithm accepts any solution (drawn with another neighboring system) and then starts again with a new stochastic descent.

The kangaroo algorithm uses two neighboring systems: a first one, V, for the stochastic descent, and a second one, W, generally larger than V, for the
perturbation (or kangaroo jump). A is the maximum number of iterations without improvement, best corresponds to the best solution found, and k is the number of iterations since the last improvement. This algorithm has been proved to converge in probability [7] if the neighboring system W satisfies the accessibility property. The proof of the convergence lies in the fact that the kangaroo algorithm constructs a Markov chain where any state can lead to an absorbing state, and the absorbing states constitute the set of global optima.
Algorithm 8 (Combining kangaroo algorithm and performance evaluation model).

1: Let A > 0 and let x be an initial solution
2: k := 0, best := x
3: while necessary do
4:   if k < A then
5:     Choose y randomly and uniformly in the neighborhood V(x) of x
6:     if E[H(y)] ≤ E[H(x)] then
7:       if E[H(y)] < E[H(x)] then
8:         k := 0
9:         if E[H(y)] < E[H(best)] then
10:          best := y
11:        end if
12:      end if
13:      x := y
14:    else
15:      k := k + 1
16:    end if
17:  else
18:    Choose y randomly and uniformly in the neighborhood W(x) of x
19:    k := 0
20:    if E[H(y)] < E[H(best)] then
21:      best := y
22:    end if
23:    x := y
24:  end if
25: end while
26: best is the solution of the metaheuristic.
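A compact Python sketch of Algorithm 8 (ours), where neighbor_v and neighbor_w stand for the neighboring systems V and W and a_max for the parameter A:

import random

def kangaroo(x0, evaluate, neighbor_v, neighbor_w, a_max=50, max_iter=10000, seed=0):
    """Sketch of Algorithm 8: stochastic descent in V with a jump in W after a_max
    iterations without improvement."""
    rng = random.Random(seed)
    x, fx = list(x0), evaluate(x0)
    best, fbest, k = x, fx, 0
    for _ in range(max_iter):
        if k < a_max:
            y = neighbor_v(x, rng)
            fy = evaluate(y)
            if fy <= fx:
                if fy < fx:
                    k = 0
                    if fy < fbest:
                        best, fbest = y, fy
                x, fx = y, fy
            else:
                k += 1
        else:                               # kangaroo jump: accept any solution of W(x)
            y = neighbor_w(x, rng)
            fy = evaluate(y)
            k = 0
            if fy < fbest:
                best, fbest = y, fy
            x, fx = y, fy
    return best, fbest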
5.2.4. Neighboring systems

Classical neighboring systems for the flow shop problem are permutations and insertions. We propose to use this type of neighboring system (V) for the job scheduling problem:

- P_{i,i+1}: permutation of two contiguous jobs randomly chosen,
- P_{i,j}: permutation of any two jobs randomly chosen,
- I_{i,j}: insertion of one job randomly chosen.

They satisfy the accessibility and reversibility properties.
For the kangaroo algorithm, we use, for the neighboring system W, five successive applications of the neighboring system V (a sketch of these moves is given below).
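A sketch (ours) of these moves on a schedule represented as a list of job indices; jump_w illustrates one possible W obtained by applying a V move (here the insertion I_{i,j}) five times.

import random

def adjacent_swap(x, rng):                  # P_{i,i+1}: swap two contiguous jobs
    y = list(x)
    i = rng.randrange(len(y) - 1)
    y[i], y[i + 1] = y[i + 1], y[i]
    return y

def any_swap(x, rng):                       # P_{i,j}: swap any two jobs
    y = list(x)
    i, j = rng.sample(range(len(y)), 2)
    y[i], y[j] = y[j], y[i]
    return y

def insertion(x, rng):                      # I_{i,j}: remove one job and reinsert it
    y = list(x)
    job = y.pop(rng.randrange(len(y)))
    y.insert(rng.randrange(len(y) + 1), job)
    return y

def jump_w(x, rng, times=5):                # W: apply the V move five times
    for _ in range(times):
        x = insertion(x, rng)
    return x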
6. Results

We have tested the proposed methods (performance evaluation methods and scheduling methods) on flow shop problems from the OR-Library (http://mscmga.ms.ic.ac.uk/info.html). The OR-Library is a collection of test data sets for a variety of Operations Research problems. The problems retained for this paper are car1, car2, ..., car8 and reC01, reC03, ..., reC41, with from 7 to 75 jobs and from 4 to 20 machines. In this library, the data (job processing times t_{ij}) correspond to deterministic problems. In our study, the rates of the exponential distributions (μ_{ij}) are deduced from these processing times by

μ_{ij} = 1/t_{ij}.

First, we present some results for the performance evaluation problem, then we give results for the two machine flow shop scheduling problems and for the m machine flow shop scheduling problems (m > 2).
6.1. Performance evaluation problem

For the performance evaluation problem, we propose two methods: a Markovian model which computes the expected makespan and a stochastic simulation model which evaluates the expected makespan. In Table 1, we present, for each m machine flow shop problem, a comparison between the two methods for the evaluation of the
expected makespan of the lexicographical order. For the Markovian model, we give the number of states, the expected makespan and the CPU time (on an O2 Silicon Graphics workstation). For the simulation model, we have run 200,000 replications in order to have a very good estimation of the expected makespan. We give the estimation of the expected makespan, the corresponding confidence interval (at the 0.95 confidence level) and the CPU time.

Concerning the small problems (up to 20 jobs and five machines), the Markovian model gives a result in a small CPU time (from less than 1 second to 1 minute) and the obtained expected makespan is in the confidence interval of the expected makespan estimated by the simulation model. The simulation model is thus validated by the Markovian model.

The CPU time of the Markovian model increases very quickly with the number of machines and the number of jobs. This is due to the number of states of the Markov chain. From 20 jobs and 10 machines on, the Markovian model obtains no result in less than 4 hours. The CPU time of the simulation model increases more slowly.

To conclude, we propose to use the simulation model for problems reC07 to reC41 instead of the Markovian model.
6.2. Scheduling problem

For the scheduling problem, we propose methods based on the combination of heuristics and metaheuristics with performance evaluation methods. We first present results for the two machine problems and then for the m machine problems.

6.2.1. Two machine problems

We have tested the proposed scheduling methods on two machine flow shops for which we have the optimal solution by using Talwar's rule [30]. The two machine problems are car1', ..., car8', whose data are deduced from the car1, ..., car8 data. The methods tested are: the combination of stochastic descent with the Markovian model (noted SD/M) and with the simulation model (noted SD/S), with 1000 iterations for the stochastic descent and 5000 replications for the simulation model. The neighboring system is I_{i,j}.

The stochastic descent is a stochastic algorithm, so several runs of the method may not give the same result. We have run each method Nb times for each problem (Nb = 30). When combining with the simulation model, we obtain a schedule and we compute E[C_max] of this schedule with the Markovian model (Algorithm 9).
Table 1
Comparison between the Markovian model and the simulation model

Pb    | n  | m  | States (Markov) | E[C_max] (Markov) | CPU (s) | E[C_max] (Sim) | CI (±) | CPU (s)
car1  | 11 | 5  | 4367        | 11341.2 | 1   | 11341.1 | 10.3 | 16
car2  | 13 | 4  | 2379        | 11070.8 | 0   | 11076.0 | 9.9  | 15
car3  | 12 | 5  | 6187        | 12897.6 | 1   | 12893.7 | 12.5 | 18
car4  | 14 | 4  | 3059        | 12666.3 | 0   | 12671.6 | 10.8 | 17
car5  | 10 | 6  | 8007        | 12093.4 | 2   | 12098.6 | 12.2 | 18
car6  | 8  | 9  | 24309       | 14399.6 | 27  | 14417.4 | 12.5 | 22
car7  | 7  | 7  | 3431        | 10620.3 | 1   | 10613.1 | 10.1 | 14
car8  | 8  | 8  | 12869       | 13081.2 | 7   | 13081.6 | 10.7 | 19
reC01 | 20 | 5  | 53129       | 2019.6  | 65  | 2018.2  | 1.4  | 30
reC03 | 20 | 5  | 53129       | 1642.9  | 65  | 1641.7  | 1.2  | 30
reC05 | 20 | 5  | 53129       | 1916.2  | 64  | 1915.0  | 1.4  | 30
reC07 | 20 | 10 | 3.0045e+07  | (*)     | (*) | 2603.6  | 1.4  | 59
reC13 | 20 | 15 | 3.24794e+09 | (*)     | (*) | 3448.4  | 1.6  | 88
reC19 | 30 | 10 | 8.47661e+08 | (*)     | (*) | 3416.3  | 1.6  | 88
reC25 | 30 | 15 | 3.44867e+11 | (*)     | (*) | 4387.9  | 1.8  | 126
reC31 | 50 | 10 | 7.5394e+10  | (*)     | (*) | 5034.0  | 1.9  | 126
reC41 | 75 | 20 | 1.71145e+20 | (*)     | (*) | 8762.2  | 2.5  | 443

(*) No result in less than 4 hours on an O2 Silicon Graphics.
Algorithm 9 (Computation of the result by using the Markovian model).

1: for i = 1, ..., Nb do
2:   Let x be the schedule obtained by a combination with the simulation model
3:   Compute H_i = E[C_max](x) by using the Markovian model
4: end for
Table 2 presents the obtained results. For each problem and each method, we give:

- the optimal value opt in the stochastic context, obtained by applying Talwar's rule,
- the mean of the obtained E[C_max], defined by

\overline{E[C_max]} = \frac{1}{Nb} \sum_{i=1}^{Nb} H_i,

and, in parentheses, the standard deviation σ defined by

\sigma = \Big( \frac{1}{Nb} \sum_{i=1}^{Nb} \big( H_i - \overline{E[C_max]} \big)^2 \Big)^{1/2},

where H_i is the value obtained at the ith run of the stochastic descent (i = 1, ..., Nb),
- the mean CPU time (CPU) to obtain H_i.
The combination of stochastic descent with the Markovian model allows us, for each problem, to find the optimal solution and, as the standard deviation is equal to zero, the schedules obtained at each run of the method provide, for a given problem, the same makespan.

The results are obtained in a short CPU time (1 or 2 seconds on an O2 Silicon Graphics) and in a rather small number of iterations. The combination of stochastic descent with the simulation model allows us to obtain solutions close to or equal to the optimal solution in a larger CPU time (from 200 to 500 seconds on an O2 Silicon Graphics). The standard deviations are small, so even if, for a given problem, the obtained schedules are not the same, the different E[C_max] are close.

The combination with the simulation model requires more iterations than the combination with the Markovian model because the method uses only estimations of the criterion and hesitates between many schedules with close E[C_max]. Increasing the number of iterations does not improve the quality of the solution.

We can conclude that, although working on estimations of the criterion and with a small number of replications (5000), the results obtained by the combination of stochastic descent with the simulation model are of good quality and close to the optimal solution.
6.2.2. m machine problems

For the m machine flow shop scheduling problems, we propose the combination of heuristics or metaheuristics with a performance evaluation model.

6.2.2.1. Combination of heuristics with performance evaluation model. Table 3 presents results for the combination of heuristic CDS1 with the Markovian model and of heuristic CDS1 with the simulation model for different values of the number of replications.
Table 2
Stochastic descent for the stochastic two machine flow shop problem

Pb    | n  | m | opt     | SD/M: E[C_max] (σ) | CPU (s) | SD/S: E[C_max] (σ) | CPU (s)
car1' | 11 | 2 | 5471.82 | 5471.82 (0.000) | 1 | 5472.20 (0.007) | 433
car2' | 13 | 2 | 6452.01 | 6452.01 (0.000) | 2 | 6452.40 (0.012) | 516
car3' | 12 | 2 | 7397.71 | 7397.71 (0.000) | 1 | 7398.10 (0.015) | 477
car4' | 14 | 2 | 8159.07 | 8159.07 (0.000) | 2 | 8159.31 (0.005) | 554
car5' | 10 | 2 | 5324.57 | 5324.57 (0.000) | 1 | 5325.53 (0.021) | 397
car6' | 8  | 2 | 5045.06 | 5045.06 (0.000) | 1 | 5045.56 (0.021) | 321
car7' | 7  | 2 | 4220.55 | 4220.55 (0.000) | 1 | 4220.8 (0.01)   | 282
car8' | 8  | 2 | 4976.68 | 4976.68 (0.000) | 1 | 4976.78 (0.002) | 178
We considered 1000, 5000, 10,000, 20,000, 50,000, 100,000 and 200,000 replications. The purpose of this table is to study the efficiency of our comparison mechanism according to the number of replications, i.e. to know whether the combination with the simulation model selects the same schedule as the combination with the Markovian model.

The combination of a heuristic with the Markovian model compares the schedules according to the exact value of the criterion, so we run the method only once. However, combinations of a heuristic with the simulation model use an estimation of the criterion, so several runs of the method may not provide the same solution. Therefore we have run each combination of heuristic with simulation Nb times for each problem (Nb = 30). For each run of the combination with the simulation model, we have computed the expected makespan of the obtained schedule with the Markovian model (Algorithm 9). For each number of replications, we give the mean and the standard deviation of the obtained E[C_max].

From 5000 replications on, the results are reliable: the standard deviation is close or equal to zero and the mean E[C_max] is close or equal to the result obtained by the combination with the Markovian model. Results are similar for 10,000 and 20,000 replications.
Algorithm 10 (Computation of the result by using the simulation model).

1: for i = 1, ..., Nb do
2:   Let x be the schedule obtained by the combination with the simulation model
3:   Compute H_i = E[C_max](x) by using the simulation model with 200,000 replications
4: end for
In Tables 4 and 5, we present a comparison between the combinations of CDS1, CDS2 and RA with a performance evaluation model:

- CDS1/M: heuristic CDS1 with the Markovian model;
- CDS1/S: heuristic CDS1 with the simulation model;
- CDS2/M: heuristic CDS2 with the Markovian model;
- CDS2/S: heuristic CDS2 with the simulation model;
- RA/M: heuristic RA with the Markovian model.

In Table 4, results are given for problems car1, ..., car8, reC01, ..., reC03, for which we can use the Markovian model. In Table 5, results are given for problems reC05, ..., reC41, for which we can only compute an estimation by simulation of the expected makespan. In this case, E[C_max] of the obtained schedule is computed by the simulation
Table 3
Heuristic CDS1 for the m machine stochastic flow shop scheduling problem

Pb    | n  | m | CDS1/Markov | CDS1/S, 1000 rep. | CDS1/S, 5000 rep. | CDS1/S, 50,000 rep. | CDS1/S, 100,000 rep. | CDS1/S, 200,000 rep.
car1  | 11 | 5 | 9551.68  | 9551.68 (0.00)   | 9551.68 (0.00)  | 9551.68 (0.00)  | 9551.68 (0.00)  | 9551.68 (0.00)
car2  | 13 | 4 | 9725.76  | 9725.76 (0.00)   | 9725.76 (0.00)  | 9725.76 (0.00)  | 9725.76 (0.00)  | 9725.76 (0.00)
car3  | 12 | 5 | 10446.80 | 10446.80 (0.00)  | 10446.80 (0.00) | 10446.80 (0.00) | 10446.80 (0.00) | 10446.80 (0.00)
car4  | 14 | 4 | 10640.50 | 10640.50 (0.00)  | 10640.50 (0.00) | 10640.50 (0.00) | 10640.50 (0.00) | 10640.50 (0.00)
car5  | 10 | 6 | 11007.60 | 11011.40 (7.34)  | 11008.70 (1.15) | 11008.30 (1.06) | 11008.70 (1.16) | 11008.30 (1.06)
car6  | 8  | 9 | 12484.20 | 12490.20 (17.95) | 12484.20 (0.00) | 12484.20 (0.00) | 12484.20 (0.00) | 12484.20 (0.00)
car7  | 7  | 7 | 9473.71  | 9474.44 (0.56)   | 9474.25 (0.58)  | 9474.02 (0.51)  | 9474.33 (0.58)  | 9474.25 (0.58)
car8  | 8  | 8 | 12024.70 | 12028.70 (14.95) | 12024.70 (0.00) | 12024.70 (0.00) | 12024.70 (0.00) | 12024.70 (0.00)
reC01 | 20 | 5 | 1818.94  | 1818.94 (0.00)   | 1818.94 (0.00)  | 1818.94 (0.00)  | 1818.94 (0.00)  | 1818.94 (0.00)
reC03 | 20 | 5 | 1610.59  | 1611.29 (2.08)   | 1610.59 (0.00)  | 1610.59 (0.00)  | 1610.59 (0.00)  | 1610.59 (0.00)
reC05 | 20 | 5 | 1734.66  | 1734.66 (0.00)   | 1734.66 (0.00)  | 1734.66 (0.00)  | 1734.66 (0.00)  | 1734.66 (0.00)

We give the mean E[C_max] and the corresponding standard deviation (σ) in parentheses.
model with 200,000 replications in order to have a good estimation of the result and to be able to compare the different methods (Algorithm 10).

For each problem, we compare the methods with each other (the optimal value in the stochastic context is unknown) and give:

- when using the Markovian model, the obtained expected makespan E[C_max],
- when using the simulation model, the mean of the obtained expected makespans, defined by

\overline{E[C_max]} = \frac{1}{Nb} \sum_{i=1}^{Nb} H_i,

and, in parentheses, the standard deviation σ defined by

\sigma = \Big( \frac{1}{Nb} \sum_{i=1}^{Nb} \big( H_i - \overline{E[C_max]} \big)^2 \Big)^{1/2},

where H_i is the value obtained at the ith run of the method (i = 1, ..., Nb); for problems car1, ..., car8, reC01, ..., reC03, H_i is the exact
Table 4
Heuristics for the m machine stochastic flow shop scheduling problem

Pb    | n  | m | CDS1/M: E[C_max] | CDS1/S: E[C_max] (σ) | CDS2/M: E[C_max] | CDS2/S: E[C_max] (σ) | RA/M: E[C_max]
car1  | 11 | 5 | 9551.97  | 9551.97 (0.00)  | 9663.87  | 9665.98 (0.06)  | 9657.54
car2  | 13 | 4 | 9725.69  | 9725.69 (0.00)  | 9762.96  | 9762.96 (0.00)  | 9778.72
car3  | 12 | 5 | 10446.65 | 10446.65 (0.00) | 10641.15 | 10641.88 (0.01) | 10540.97
car4  | 14 | 4 | 10640.78 | 10640.78 (0.00) | 10807.25 | 10812.05 (0.11) | 10678.40
car5  | 10 | 6 | 11007.94 | 1108.72 (0.02)  | 10865.13 | 10865.13 (0.00) | 11222.56
car6  | 8  | 9 | 12484.49 | 12484.49 (0.00) | 12552.53 | 12552.53 (0.00) | 12482.78
car7  | 7  | 7 | 9473.78  | 9474.44 (0.01)  | 9446.76  | 9446.76 (0.00)  | 9473.78
car8  | 8  | 8 | 12024.45 | 12024.45 (0.00) | 12024.45 | 12026.12 (0.02) | 12067.95
reC01 | 20 | 5 | 1818.99  | 1818.99 (0.00)  | 1812.26  | 1881.84 (2.34)  | 1826.48
reC03 | 20 | 5 | 1610.60  | 1620.91 (1.08)  | 1637.77  | 1637.77 (0.00)  | 1610.60
reC05 | 20 | 5 | 1734.70  | 1734.70 (0.00)  | 1744.88  | 1744.88 (0.00)  | 1714.95
Table 5
Heuristics for the m machine stochastic flow shop scheduling problem

Pb    | n  | m  | CDS1/S: E[C_max] (σ) | CDS2/S: E[C_max] (σ) | RA/S: E[C_max]
reC07 | 20 | 10 | 2340.38 (0.07) | 2364.34 (0.07) | 2354.63
reC09 | 20 | 10 | 2343.92 (0.08) | 2418.01 (0.06) | 2361.39
reC11 | 20 | 10 | 2243.95 (0.06) | 2284.16 (0.07) | 2243.52
reC13 | 20 | 15 | 3058.85 (0.04) | 3083.36 (0.04) | 3073.52
reC15 | 20 | 15 | 2988.57 (0.05) | 3110.25 (0.05) | 3074.37
reC17 | 20 | 15 | 3094.74 (0.07) | 3058.22 (0.07) | 3091.89
reC19 | 30 | 10 | 3252.10 (0.06) | 3294.38 (0.05) | 3255.87
reC21 | 30 | 10 | 3113.84 (0.06) | 3207.23 (0.06) | 3122.11
reC23 | 30 | 10 | 3136.96 (0.06) | 3181.6 (0.04)  | 3110.41
reC25 | 30 | 15 | 4004.21 (0.04) | 4044.17 (0.04) | 4078.59
reC27 | 30 | 15 | 3769.74 (0.05) | 3847.58 (0.05) | 3800.59
reC29 | 30 | 15 | 3703.56 (0.14) | 3702.19 (0.05) | 3728.81
reC31 | 50 | 10 | 4598.55 (0.04) | 4681.99 (0.04) | 4697.52
reC33 | 50 | 10 | 4526.51 (0.05) | 4596.57 (0.04) | 4536.16
reC35 | 50 | 10 | 4572.72 (0.05) | 4645.80 (0.04) | 5462.16
reC37 | 75 | 20 | 7972.65 (0.03) | 8054.81 (0.03) | 8056.76
reC39 | 75 | 20 | 8010.8 (0.02)  | 8165.62 (0.02) | 8029.96
reC41 | 75 | 20 | 7909.03 (0.03) | 8109.84 (0.06) | 7930.14
value of E[C_max] of the schedule obtained by the ith run of the method (Algorithm 9), and for problems reC05, ..., reC41, H_i is an estimation of E[C_max] (Algorithm 10).

In Tables 4 and 5, we can see that heuristic CDS1 generally gives the best results, but the results of the three heuristics are quite similar. If we compare the combination with the Markovian model and the combination with the simulation model, we can remark that, for all the heuristics, the results obtained by the combinations with the simulation model are close to or equal to the value obtained by the corresponding combination with the Markovian model. Moreover, the standard deviations are small, so the obtained schedules are similar.
6.2.2.2. Combination of metaheuristics with performance evaluation. In Tables 6 and 7, we present results for the combination of metaheuristics with a performance evaluation model:

- SD/M: stochastic descent with the Markovian model, with 1000 iterations;
- SD/S: stochastic descent with the simulation model, with 1000 iterations and 5000 replications;
- K/S: kangaroo algorithm with the simulation model, with 10,000 iterations and 5000 replications.

In Table 6, results are given for problems car1, ..., car8, reC01, ..., reC03, for which we can use the Markovian model. In Table 7, results are given for problems reC05, ..., reC41, for which we can only have an estimation of the expected makespan.

For each method and each problem, we give:

- the mean of the obtained expected makespans E[C_max] and, in parentheses, the standard deviation σ,
- the mean CPU time (CPU) to obtain H_i.
In Table 6, we obtain similar results with SD/M and SD/S, and the standard deviations are small. SD/S needs more iterations than SD/M. This is due to the fact that SD/S considers only an estimation of the criterion, so at a given time a schedule x may be better than a schedule y and, later, the schedule y may be better than the schedule x; this means that the two schedules are similar. Considering the computation time, we can remark that the CPU time of SD/M increases quickly with the number of machines and the number of jobs, whereas the CPU time of SD/S increases slowly.

In Table 7, we present a comparison between the combinations of stochastic descent and of the kangaroo algorithm with the simulation model. The results seem to be of the same quality as in Table 6. If we compare SD/S and K/S, K/S gives better results than SD/S, but if we consider the mean CPU time, K/S does not really improve the results.

For all the problems, we obtain better results with the metaheuristics than with the heuristics.
Table 6
Stochastic descent for the m machine stochastic flow shop scheduling problem

Pb    | n  | m | SD/M: E[C_max] (σ) | CPU (s) | SD/S: E[C_max] (σ) | CPU (s)
car1  | 11 | 5 | 9513.26 (0.00)  | 825    | 9513.26 (0.01)  | 595
car2  | 13 | 4 | 9651.88 (0.00)  | 229    | 9653.31 (0.04)  | 562
car3  | 12 | 5 | 10339.89 (0.05) | 1714   | 10339.89 (0.06) | 698
car4  | 14 | 4 | 10507.93 (0.11) | 557    | 10503.13 (0.07) | 602
car5  | 10 | 6 | 10789.47 (0.00) | 2920   | 10791.01 (0.05) | 645
car6  | 8  | 9 | 12618.87 (0.00) | 26282  | 12619.72 (0.02) | 792
car7  | 7  | 7 | 9436.88 (0.00)  | 692    | 9438.19 (0.05)  | 520
car8  | 8  | 8 | 12047.04 (0.01) | 8820   | 12048.71 (0.07) | 679
reC01 | 20 | 5 | 1781.71 (0.01)  | 74955  | 1782.33 (0.19)  | 237
reC03 | 20 | 5 | 1544.39 (0.00)  | 234799 | 1545.39 (0.05)  | 237
reC05 | 20 | 5 | 1691.60 (0.01)  | 235140 | 1693.34 (0.09)  | 230
7. Conclusion

This paper deals with scheduling in a stochastic flow shop with exponentially distributed processing times.

In [5], Cunningham and Dutta propose a formula to compute the expected makespan in a two machine flow shop with exponentially distributed processing times and unlimited buffer. We have extended this formula to the m machine case.

For the stochastic flow shop scheduling problem, we have proposed methods based on the combination of heuristics (RA, CDS) and metaheuristics (stochastic descent, ...) with a stochastic simulation model and a Markovian model. The advantage of combining heuristics or metaheuristics with the Markovian model is that solutions are compared according to the exact value of their criterion. The main disadvantage of the use of the simulation model is that the criterion is only an estimation, of greater or lesser quality.

For the two machine problems, the combination of stochastic descent with the Markovian model allows us to obtain, in all studied cases, the optimal solution in a short computing time (1 second on an O2 Silicon Graphics). The combination of stochastic descent and the simulation model allows us to obtain solutions close to or equal to the optimal solution in a larger computing time (400 seconds on an O2 Silicon Graphics).

For the m machine problems, the combination of stochastic descent with a performance evaluation model gives better results than the combination of heuristics (CDS, RA) with a performance evaluation model. The combinations of stochastic descent with the simulation model and with the Markovian model provide similar results. The combination of stochastic descent with the simulation model needs about 600 seconds whatever the problem, whereas the combination of stochastic descent with the Markovian model needs from about 200 to 33,000 seconds. The increase in computing time is due to the Markovian model: the number of states of the Markov chain increases quickly with the number of jobs and even more quickly with the number of machines.

Tests on the combination of simulated annealing with a performance evaluation model are in progress. The first results are of good quality.

Further works are the following:

- to study other systems: m machine flow shop without buffer or with limited buffers, hybrid flow shop, job shop, ...,
- to take into account breakdowns in such systems,
Table 7
Stochastic descent and kangaroo algorithm for the m machine stochastic flow shop scheduling problem

Pb    | n  | m  | SD/S: E[C_max] (σ) | CPU (s) | K/S: E[C_max] (σ) | CPU (s)
reC01 | 20 | 5  | 1782.34 (0.19) | 236  | 1779.84 (0.03) | 3090
reC03 | 20 | 5  | 1545.39 (0.05) | 231  | 1545.39 (0.06) | 3300
reC05 | 20 | 5  | 1693.34 (0.08) | 231  | 1692.10 (0.08) | 3802
reC07 | 20 | 10 | 2310.94 (0.39) | 307  | 2310.63 (0.28) | 6045
reC09 | 20 | 10 | 2299.04 (0.43) | 307  | 2293.51 (0.28) | 6117
reC11 | 20 | 10 | 2195.01 (0.5)  | 307  | 2190.14 (0.28) | 6192
reC13 | 20 | 15 | 3000.18 (0.38) | 442  | 2996.9 (0.25)  | 6044
reC15 | 20 | 15 | 2984.67 (0.36) | 443  | 2978.82 (0.45) | 4245
reC17 | 20 | 15 | 2984.62 (0.45) | 443  | 2981.57 (0.21) | 4231
reC19 | 30 | 10 | 3160.64 (0.34) | 459  | 3153.1 (0.33)  | 4281
reC21 | 30 | 10 | 3022.87 (0.49) | 462  | 3013.39 (0.52) | 4318
reC23 | 30 | 10 | 3026.15 (0.41) | 461  | 3013.68 (0.37) | 4977
reC25 | 30 | 15 | 3932.09 (0.34) | 663  | 3926.31 (0.38) | 9294
reC27 | 30 | 15 | 3686.45 (0.21) | 1251 | 3673.4 (0.25)  | 9300
reC29 | 30 | 15 | 3585.33 (0.52) | 686  | 3570.46 (0.26) | 9298
reC31 | 50 | 10 | 4462.75 (0.41) | 764  | 4428.95 (0.23) | 9798
reC33 | 50 | 10 | 4408.80 (0.38) | 804  | 4375.48 (0.23) | 9784
reC35 | 50 | 10 | 4477.04 (0.38) | 763  | 4435.42 (0.30) | 9788
- to consider other probability distributions, such as uniform or geometric distributions,
- to test other stochastic methods, such as genetic algorithms and the taboo method,
- to introduce the confidence interval in the comparison mechanism,
- to define and test other neighboring systems,
- to study robustness and to define robustness indicators for the solutions obtained in a stochastic environment.
References

[1] P.C. Bagga, n jobs, 2 machines sequencing problems with stochastic service times, Operations Research 7 (1970) 184-197.
[2] P.C. Bagga, Sequencing with random service times, Technometrics 12 (1970) 327-334.
[3] C. Bonnemoy, S.B. Hamma, La méthode du recuit simulé: optimisation globale dans R^n, RAIRO APII 25 (5) (1991).
[4] H.G. Campbell, R.A. Dudek, M.L. Smith, A heuristic algorithm for the n job, m machine sequencing problem, Management Science 16 (10) (1970) B630-B637.
[5] A.A. Cunningham, S.K. Dutta, Scheduling jobs with exponentially distributed processing times on two machines of a flow shop, Naval Research Logistics Quarterly 16 (1973) 69-81.
[6] D.G. Dannenbring, An evaluation of flow shop scheduling heuristics, Management Science 23 (11) (1977) 1174-1182.
[7] G. Fleury, Méthodes stochastiques et déterministes pour les problèmes NP-difficiles, Doctorat d'informatique, Université Blaise Pascal, Clermont-Ferrand II, 1993.
[8] R.D. Foley, S. Suresh, Stochastically minimizing the makespan in flow shops, Naval Research Logistics Quarterly 31 (1984) 551-557.
[9] F.G. Forst, An analysis of the two machine static stochastic flow shop with linear completion time costs, PhD Dissertation, University of Illinois at Urbana-Champaign, 1981.
[10] F.G. Forst, Minimizing total expected costs in the two machine, stochastic flow shop, Operations Research Letters 2 (1983) 58-61.
[11] F.G. Forst, A review of the static stochastic job sequencing literature, Operations Research 21 (1984) 127-144.
[12] GOTHA, Les problèmes d'ordonnancement, RAIRO Recherche Opérationnelle/Operations Research 27 (1) (1993) 77-150.
[13] M. Gourgand, N. Grangeon, S. Norre, A review of the static stochastic flow shop scheduling problem, Journal of Decision Systems 9 (2) (2000) 183-214.
[14] C. Jia, Minimizing variation in stochastic flow shop, Operations Research Letters 23 (1998) 109-111.
[15] S.M. Johnson, Optimal two and three stage production schedules with setup times included, Naval Research Logistics Quarterly 1 (1954) 61-68.
[16] J. Kamburowski, Stochastically minimizing the makespan in two-machine flow shops without blocking, European Journal of Operational Research 112 (1999) 304-309.
[17] J. Kamburowski, On three-machine flow shops with random job processing times, European Journal of Operational Research 125 (2000) 440-449.
[18] P.S. Ku, S.C. Niu, On Johnson's two machine flow shop with random processing times, Operations Research 34 (1986) 130-136.
[19] P. Lopez, P. Esquirol, L'ordonnancement, Economica, 1999.
[20] T. Makino, On a scheduling problem, Journal of the Operations Research Society of Japan 8 (1965) 32-44.
[21] N. Metropolis, A. Rozenbluth, M. Rozenbluth, A. Teller, E. Teller, Equation of state calculations by fast computing machines, Journal of Chemical Physics 21 (1953) 1087-1092.
[22] B.S. Mittal, P.C. Bagga, A priority problem in sequencing with stochastic service times, Operations Research 14 (1977) 19-28.
[23] I.H. Osman, An introduction to metaheuristics, in: M. Lawrence, C. Wilson (Eds.), Operational Research Tutorial Papers, Operational Research Society Press, Birmingham, UK, 1997, pp. 92-122.
[24] I.H. Osman, G. Laporte, Metaheuristics: A bibliography, Operations Research 63 (1996) 513-628.
[25] M. Pinedo, Minimizing the expected makespan in stochastic flow shops, Operations Research 30 (1982) 148-162.
[26] M. Pinedo, Scheduling: Theory, Algorithms and Systems, Prentice-Hall, Englewood Cliffs, NJ, 1995.
[27] V.R. Prasad, n x 2 flow shop sequencing problem with random processing times, Operations Research 18 (1981) 1-14.
[28] F.A. Rodammer, K. Preston White, A recent survey of production scheduling, IEEE Transactions on Systems, Man and Cybernetics 6 (18) (1988).
[29] M. Shaked, J.G. Shanthikumar, Stochastic Orders and their Applications, Academic Press, Boston, 1994.
[30] T.T. Talwar, A note on sequencing problems with uncertain job times, Journal of the Operations Research Society of Japan 9 (1967) 93-97.
[31] P.J.M. van Laarhoven, A.H.L. Aarts, Simulated Annealing: Theory and Applications, Kluwer Academic Publishers, Dordrecht, 1987.
[32] G. Weiss, Multiserver stochastic scheduling, in: M.A.H. Dempster, J.K. Lenstra, A.H.G. Rinnooy Kan (Eds.), Deterministic and Stochastic Scheduling, D. Reidel, Dordrecht, 1982, pp. 157-179.
[33] M. Widmer, Modèles mathématiques pour une gestion efficace des ateliers flexibles, PhD Dissertation, Ecole Polytechnique Fédérale de Lausanne, 1990.