
Production Scheduling

Let all things be done decently and in order.


1 Corinthians 14:40

Wallace J. Hopp, Mark L. Spearman, 1996, 2000

http://www.factory-physics.com

Goals of Production Scheduling


High Customer Service: on-time delivery
Low Inventory Levels: WIP and FGI
High Utilization: of machines

[Figure: triangle showing the trade-off among Service, Inventory, and Utilization]

Meeting Due Dates: Measures

Service Level:
Typically used in make-to-order systems.
Fraction of orders filled on or before their due dates.

Fill Rate:
Typically used in make-to-stock systems.
Fraction of demands met from stock.

Lateness:
Used in shop floor control.
Completion date minus due date.
Average lateness has little meaning; a better measure is lateness variance.

Tardiness:
Used in shop floor control.
Equal to the lateness of a job if it is late, and zero otherwise.
Average tardiness is meaningful but unintuitive.

Classic Scheduling Assumptions


Classic Scheduling: (only "classic" in academia)
Benefits: optimal schedules.
Problems: unrealistic assumptions:
All jobs available at the start of the problem.
Deterministic processing times.
No setups.
No machine breakdowns.
No preemption.
No cancellation.


Classic Single Machine Results


Minimizing Average Cycle Time (and Mean Lateness):
Minimized by performing jobs in shortest process time (SPT) order.
Makespan is not affected.

Minimizing Maximum Lateness (or Maximum Tardiness):
Minimized by performing jobs in earliest due date (EDD) order.
Makespan is not affected.
If there exists a sequence with no tardy jobs, EDD will find one.

Minimizing Average Tardiness:
If all jobs share a common due date, SPT works. Otherwise:
No simple sequencing rule will work; the problem is NP-hard.
Makespan is not affected.
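These single-machine rules are just sorts. A minimal Python sketch (job names, process times, and due dates are illustrative, not from the text):

```python
# SPT minimizes average cycle time (and mean lateness);
# EDD minimizes maximum lateness. Jobs: (name, process_time, due_date).

def spt_order(jobs):
    """Shortest process time first."""
    return sorted(jobs, key=lambda j: j[1])

def edd_order(jobs):
    """Earliest due date first."""
    return sorted(jobs, key=lambda j: j[2])

def avg_cycle_time(seq):
    """Average completion time when jobs run back to back from time 0."""
    t = total = 0
    for _, p, _ in seq:
        t += p
        total += t
    return total / len(seq)

def max_lateness(seq):
    """Largest (completion time - due date) over the sequence."""
    t, worst = 0, float("-inf")
    for _, p, d in seq:
        t += p
        worst = max(worst, t - d)
    return worst

jobs = [("A", 4, 10), ("B", 1, 3), ("C", 2, 9)]
```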
Wallace J. Hopp, Mark L. Spearman, 1996, 2000

http://www.factory-physics.com

Classic Multi Machine Results


Minimizing Makespan on Two Machines: given a set of jobs that
must go through a sequence of two machines, what sequence will yield the
minimum makespan?
Makespan is sequence dependent.
Simple algorithm (Johnson 1954):
1. Sort the times of the jobs on the two machines in two lists.
2. Find the shortest time in either list and remove that job from both lists.
If the time came from the first list, place the job in the first available
position in the sequence.
If the time came from the second list, place the job in the last available
position in the sequence.
3. Repeat until the lists are exhausted.
The resulting sequence will minimize makespan.
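The steps above translate directly into code (a sketch; job labels and the dict-based interface are my own choices):

```python
# Johnson's rule for the two-machine flow shop.
# times[job] = (time on machine 1, time on machine 2).

def johnson_sequence(times):
    front, back = [], []            # sequence grows from both ends
    remaining = dict(times)
    while remaining:
        # Shortest time anywhere in either list (machine column).
        job, machine = min(
            ((j, m) for j in remaining for m in (0, 1)),
            key=lambda jm: remaining[jm[0]][jm[1]],
        )
        if machine == 0:
            front.append(job)       # short M1 time: load up quickly
        else:
            back.insert(0, job)     # short M2 time: clear out quickly
        del remaining[job]
    return front + back
```

On the three-job data of the worked example that follows (4/9, 7/10, 6/5), this returns the sequence 1-2-3.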


Johnson's Algorithm Example


Data:

Job   Time on M1   Time on M2
1     4            9
2     7            10
3     6            5

Iteration 1: the minimum time is 4 (job 1 on M1); place this job first and remove
it from the lists:

List 1: 4 (1), 6 (3), 7 (2)
List 2: 5 (3), 9 (1), 10 (2)

Johnson's Algorithm Example (cont.)


Iteration 2: the minimum remaining time is 5 (job 3 on M2); place this job last and
remove it from the lists:

List 1: 6 (3), 7 (2)
List 2: 5 (3), 10 (2)

Iteration 3: only job 2 is left; place it in the remaining (middle) position.

Final Sequence: 1-2-3
Makespan: 28
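The makespan of 28 can be checked by simulating the two machines (a sketch; the times dict mirrors the data table above):

```python
# Makespan of a two-machine flow shop for a fixed job sequence.
def makespan(seq, times):
    t1 = t2 = 0
    for job in seq:
        a, b = times[job]
        t1 += a                  # machine 1 finishes this job at t1
        t2 = max(t2, t1) + b     # machine 2 waits for the job, then runs it
    return t2

times = {1: (4, 9), 2: (7, 10), 3: (6, 5)}
```

Enumerating all 3! orders confirms that 1-2-3 achieves the minimum makespan here.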


Gantt Chart for Johnson's Algorithm Example

[Figure: Gantt chart for the sequence 1-2-3. Machine 1 processes jobs 1, 2, 3 back
to back; Machine 2 follows each job, finishing at time 28.]

A short task on M1 loads the line up quickly; a short task on M2 clears it out quickly.


The Difficulty of Scheduling Problems


Dilemma:
Too hard for optimal solutions.
Need something anyway.

Classifying Hardness:
Class P: has a polynomial-time solution.
Class NP: no polynomial-time solution is known.

Example: sequencing problems grow as n!.

[Figure: plot of e^n/10000 against 10000n^10 for n = 56 to 63; the exponential
overtakes the polynomial near n = 59.]

Compare e^n/10000 and 10000n^10:
At n = 40, e^n/10000 = 2.4 × 10^13, while 10000n^10 = 1.0 × 10^20.
At n = 80, e^n/10000 = 5.5 × 10^30, while 10000n^10 = 1.1 × 10^23.
Factorials grow faster still: 3! = 6, 4! = 24, 5! = 120, 6! = 720, 10! = 3,628,800,
13! = 6,227,020,800, and 25! = 15,511,210,043,330,985,984,000,000.
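The growth comparison is easy to reproduce (a sketch using only the two functions from the plot and Python's factorial):

```python
import math

# Growth of the two curves from the plot, plus factorial for comparison.
def exp_cost(n):
    return math.exp(n) / 10000        # e^n / 10000

def poly_cost(n):
    return 10000 * n ** 10            # 10000 * n^10
```

The polynomial dominates at n = 40, but the exponential has long since overtaken it by n = 80, and n! grows faster than either.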

Computation Times
Current situation: a computer can examine 1,000,000 sequences per second, and we
wish to build a scheduling system with a response time of no more than one minute.
How many jobs can we sequence optimally?

Number of Jobs   Computer Time
5                0.12 millisec
6                0.72 millisec
7                5.04 millisec
8                40.32 millisec
9                0.36 sec
10               3.63 sec
11               39.92 sec
12               7.98 min
13               1.73 hr
14               24.22 hr
15               15.14 day
20               77,147 years
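The table entries follow directly from n! divided by the examination rate (a sketch):

```python
import math

# Seconds needed to enumerate all n! sequences at `rate` sequences per second.
def brute_force_seconds(n, rate=1_000_000):
    return math.factorial(n) / rate
```

With a one-minute budget, 11 jobs is the limit; 20 jobs would take about 77,000 years.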


Effect of Faster Computers


Future Situation: a new computer is 1,000 times faster, i.e., it can examine 1
billion sequences per second. How many jobs can we sequence optimally now?

Number of Jobs   Computer Time
5                0.12 microsec
6                0.72 microsec
7                5.04 microsec
8                40.32 microsec
9                362.88 microsec
10               3.63 millisec
11               39.92 millisec
12               479.00 millisec
13               6.23 sec
14               87.18 sec
15               21.79 min
20               77 years


Polynomial vs. Non-Polynomial Algorithms


Polynomial Example: dispatching (sorting) time grows as n log n. Suppose, for
comparison, that it takes the same amount of time to sort 10 jobs as it does to
examine all 10! sequences. How large a problem can we solve using dispatching?

Number of Jobs   Computer Time
10               3.6 sec
11               4.1 sec
12               4.7 sec
.
20               9.4 sec
30               16.1 sec
.
80               55.2 sec
85               59.5 sec
90               63.8 sec
.
100              72.6 sec
200              167.0 sec
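The table above can be reproduced by calibrating the n log n constant to the 10-job case (a sketch; the one-minute cutoff lands at 85 jobs):

```python
import math

# n log n dispatching time, calibrated so n = 10 costs the same 3.6288 s
# as enumerating all 10! sequences at a million per second.
C = 3.6288 / (10 * math.log(10))

def dispatch_seconds(n, speedup=1):
    return C * n * math.log(n) / speedup
```

The same function, with speedup=1000, also generates the faster-computer table that follows.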


Effect of Faster Computer


Situation: a new computer is 1,000 times faster. How much does the size of the
dispatching problem we can solve increase?

Number of Jobs   Computer Time
1000             1.1 sec
2000             2.4 sec
3000             3.8 sec
.
10,000           14.5 sec
20,000           31.2 sec
30,000           48.7 sec
35,000           57.7 sec
36,000           59.5 sec
.
50,000           85.3 sec
100,000          181.4 sec
200,000          384.7 sec


Implications for Real Problems


Computation: exact algorithms for NP-hard problems are slow.
No Technology Fix: faster computers don't help much on NP-hard problems.
Scheduling is Hard: real scheduling problems tend to be NP-hard.
Scheduling is Big: real scheduling problems also tend to be quite large,
making them impossible to solve optimally.


Implications for Real Problems (cont.)


Robustness? NP-hard problems have many solutions, and presumably many good ones.
Example: a 25-job sequencing problem has 25! ≈ 1.55 × 10^25 possible sequences.
Suppose only one in a trillion of these is good. That still leaves over 15
trillion good solutions; our task is to find one of them.

Role of Heuristics: heuristics with polynomial running time can be used to obtain
good solutions. Examples include:
Simulated Annealing
Tabu Search
Genetic Algorithms
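To make the heuristic idea concrete, here is a minimal simulated-annealing sketch for single-machine total tardiness (job data, temperature, cooling rate, and iteration count are all illustrative choices, not a prescription from the text):

```python
import math
import random

# Minimal simulated annealing for single-machine total tardiness.
# jobs[j] = (process_time, due_date).

def total_tardiness(seq, jobs):
    t = total = 0
    for j in seq:
        p, d = jobs[j]
        t += p
        total += max(0, t - d)
    return total

def anneal(jobs, iters=5000, temp=10.0, cooling=0.999, seed=0):
    rng = random.Random(seed)
    seq = list(jobs)
    cur = best = total_tardiness(seq, jobs)
    best_seq = seq[:]
    for _ in range(iters):
        i, k = rng.sample(range(len(seq)), 2)
        seq[i], seq[k] = seq[k], seq[i]        # propose a pairwise swap
        new = total_tardiness(seq, jobs)
        # Accept improvements always, worsenings with Boltzmann probability.
        if new <= cur or rng.random() < math.exp((cur - new) / temp):
            cur = new
            if cur < best:
                best, best_seq = cur, seq[:]
        else:
            seq[i], seq[k] = seq[k], seq[i]    # undo the swap
        temp *= cooling
    return best_seq, best
```

Random swaps are accepted when they help, and occasionally when they hurt, which lets the search escape local optima; each step costs only polynomial time.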


The Bad News


Violation of Assumptions: most real-world scheduling problems violate the
assumptions made in the classic literature:
There are almost always more than two machines.
Process times are not deterministic.
All jobs are not ready at the start of the problem.
Process times are sequence dependent.

Problem Difficulty: most real-world production scheduling problems are NP-hard.
We cannot hope to find optimal solutions to realistically sized scheduling problems.
Polynomial approaches, like dispatching, may not work well.

The Good News


Due Dates: we can set the due dates.
Job Splitting: we can get smaller jobs by splitting larger ones.
The single-machine SPT results imply that small jobs clear out more quickly
than large jobs.
The mechanics of Johnson's algorithm imply we should start with a small job
and end with a small job.
Small jobs make for small move batches and can be combined to form larger
process batches.


The Good News (cont.)


Feasible Schedules: we do not need to find an optimal schedule, only a good
feasible one.

Focus on Bottleneck: we can often concentrate on scheduling the bottleneck
process, which simplifies the problem closer to the single-machine case.

Capacity: capacity can be adjusted dynamically (overtime, floating workers,
use of vendors, etc.) to adapt the facility (somewhat) to the schedule.


Classic Dispatching Results


Optimal Schedules: impossible to find for most real problems.
Dispatching: sorts jobs as they arrive at a machine.
Dispatching rules:
FIFO: simplest, seems fair.
SPT: actually works quite well with tight due dates.
EDD: works well when jobs are mostly the same size.
LWR: least work remaining.
SLK: slack (time until due date minus remaining work).
Critical Ratio: time until due date divided by remaining work.
Many (100?) others.

Problems with Dispatching:
Cannot be optimal (can be bad).
Tends to be myopic.
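Dispatching rules are just sort keys applied to the queue at a machine. A sketch (field names and the sample jobs are my own; the SLK and CR keys follow the standard slack and critical-ratio formulas):

```python
# Common dispatching rules expressed as sort keys over waiting jobs.
def dispatch(queue, rule, now=0.0):
    keys = {
        "FIFO": lambda j: j["arrival"],
        "SPT":  lambda j: j["process_time"],
        "EDD":  lambda j: j["due_date"],
        "LWR":  lambda j: j["work_remaining"],
        "SLK":  lambda j: j["due_date"] - now - j["work_remaining"],
        "CR":   lambda j: (j["due_date"] - now) / j["work_remaining"],
    }
    return sorted(queue, key=keys[rule])

queue = [
    {"name": "j1", "arrival": 0, "process_time": 5, "due_date": 20, "work_remaining": 8},
    {"name": "j2", "arrival": 1, "process_time": 2, "due_date": 10, "work_remaining": 9},
    {"name": "j3", "arrival": 2, "process_time": 4, "due_date": 12, "work_remaining": 3},
]
```

Each rule ranks the same queue differently, which is exactly why no single rule dominates.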


Scheduling Takeaways
Scheduling is hard!
Even simple "toy" problems generate complicated mathematics.

Scheduling can be simplified through the environment:
due date quoting
flow simplification (sequencing in place of scheduling)

Finite capacity scheduling is coming.
But in what form is unclear (shifting bottleneck, genetic algorithms,
rule-based systems, etc.).

Diagnostics are important in scheduling:
pure optimization is generally impossible
need a good interface to allow human intervention