
Building Valid and Credible Simulation Models

Ref: Law & Kelton, Chapter 5


Steps in a simulation study

[Figure: flowchart of the ten steps, drawn alongside the system view:
System -> Conceptual Model -> Simulation Program -> "Correct" Results -> Implementation.]

1. Formulate the problem and plan the study
2. Collect data and define a model
3. Is the conceptual model valid?   (steps 1-3, analysis: VALIDATION)
4. Construct a computer program and verify it   (step 4, programming: VERIFICATION)
5. Make pilot runs
6. Are the pilot results valid ("correct")?   (steps 5-9, experimental runs: VALIDATION)
7. Design experiments
8. Make production runs
9. Analyze output data
10. Document, present, and implement the results; "sell the decision"   (step 10, implementation: ESTABLISH CREDIBILITY)
What are Validation and Verification?
• Validation is the process of determining whether the
conceptual model is an accurate representation of the actual
system being analyzed. Validation deals with building the
right model.
• Verification is the process of determining whether a
simulation computer program works as intended (i.e.,
debugging the computer program). Verification deals with
building the model right.

[Figure: Real-world system -> (validation) -> Conceptual model -> Simulation program.]
Credibility
• Credibility is the process of ensuring that decision makers believe in the
  results of the conceptual model. When a simulation model and its results
  are accepted by the manager/client as being valid and are used as an aid
  in making decisions, we call the model credible.

In a Picture

[Figure: a timeline on which the model becomes verified, then validated, then
credible, with the number of persons involved, the importance, and the
difficulty increasing along the way.]
Guidelines for determining the level of detail in a simulation model

• Which aspects of the system should be included, and which can safely be ignored?


• Carefully define the
 issues to be investigated
 measures of performance
 alternative system configurations of interest

• It is important to understand the manager's needs.

• It is not necessary to model each part of the system in full detail:

  – If you are simulating the use of a bank's parking space, you may treat
    the bank itself as a single delay or waiting station, without simulating
    the operations inside in detail (see the sketch below).
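A minimal sketch of this idea in SimPy (a common Python simulation library). The
arrival rate, the time spent inside the bank, and the number of parking spaces are
illustrative assumptions, not values from the text; the bank interior is collapsed
into a single random delay:

    # Minimal sketch (assumed parameters): the bank interior is one delay.
    import random
    import simpy

    ARRIVAL_MEAN = 2.0    # minutes between arrivals (assumed)
    VISIT_MEAN = 6.0      # minutes spent inside the bank, one delay (assumed)
    N_SPACES = 20         # parking spaces (assumed)

    def customer(env, parking, waits):
        arrive = env.now
        with parking.request() as spot:      # wait for a free parking space
            yield spot
            waits.append(env.now - arrive)
            # The bank itself is a single "waiting station": one random delay,
            # with no tellers, queues, or back-office operations modeled.
            yield env.timeout(random.expovariate(1.0 / VISIT_MEAN))

    def arrivals(env, parking, waits):
        while True:
            yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
            env.process(customer(env, parking, waits))

    random.seed(42)
    env = simpy.Environment()
    parking = simpy.Resource(env, capacity=N_SPACES)
    waits = []
    env.process(arrivals(env, parking, waits))
    env.run(until=8 * 60)                    # one 8-hour day
    print(f"customers served: {len(waits)}, "
          f"mean wait for a space: {sum(waits) / len(waits):.2f} min")

If a question about teller staffing ever becomes an issue of interest, the single
delay can later be replaced by a more detailed sub-model.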
Guidelines for determining the level of detail in a simulation model

• Start with a "moderately detailed" model and add detail later as needed.
  – Simulation of a manufacturing plant:
    • Start by assuming unlimited WIP space and one product type.
    • Then add buffer-space limitations between machines and multiple product types.
    • Then add machine breakdowns, and so on.

• Use experts and sensitivity analysis to help determine the level of model detail.
  – A bottleneck machine is the one that determines the throughput of a
    production system, so it deserves more modeling attention.
Guidelines for determining the level of detail in a simulation model

• Do not include more detail in the model than is necessary to address the
  issues of interest. The model must still have enough detail to be credible.

• The level of model detail should be consistent with the type of data that
  are available.
  – Arrival times: are arrivals recorded separately for urgent vs. non-urgent
    customers? The system can be modeled in different ways depending on the answer.
  – Simulation of a new system generally needs less detail than a simulation
    used to "fine-tune" an existing system.

• In simulation studies, time and money constraints are a major factor in
  determining the amount of model detail.
Guidelines for determining the level of detail in a simulation model

• If the number of factors is large, we should determine which factors are
  really important using:
  – an analytical tool under simplifying assumptions, or
  – design of experiments with a simpler "rough-cut" simulation model
    (a minimal sketch of this follows).
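A sketch of the second approach: screening factors with a full 2^k factorial design
run against a rough-cut model. The factor names, their levels, and the
rough_cut_model function below are hypothetical placeholders, not anything
specified in the text:

    # Screening factors with a 2^k design on a rough-cut model (hypothetical).
    from itertools import product

    factors = {                      # low / high levels for each factor (assumed)
        "n_machines":   (2, 4),
        "buffer_size":  (5, 50),
        "arrival_rate": (0.8, 1.2),
    }

    def rough_cut_model(n_machines, buffer_size, arrival_rate):
        """Stand-in for a quick, simplified simulation returning throughput."""
        # A crude approximation, used only to rank the factors for screening.
        return min(n_machines * 1.0, arrival_rate) * (1 - 1.0 / (buffer_size + 1))

    # Run all 2^k combinations and estimate each factor's main effect.
    names = list(factors)
    runs = []
    for levels in product(*(factors[n] for n in names)):
        config = dict(zip(names, levels))
        runs.append((config, rough_cut_model(**config)))

    for name in names:
        hi = [y for cfg, y in runs if cfg[name] == factors[name][1]]
        lo = [y for cfg, y in runs if cfg[name] == factors[name][0]]
        effect = sum(hi) / len(hi) - sum(lo) / len(lo)   # main-effect estimate
        print(f"{name:>13}: main effect on throughput = {effect:+.3f}")

Factors with negligible main effects are candidates for being fixed or simplified
in the detailed model.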
Role of a Manager in modeling a system

– A manager of the system of interest should be aware that a successful
  simulation study requires a commitment of his or her time and resources.

– The manager must be personally involved during problem formulation.

– The manager must be involved in the model-building process; this
  increases the model's validity.

– A manager should make technical personnel available for the modeling
  effort for some period of time.

– A manager should agree to hire a consultant for modeling purposes.
Techniques for Verification of
Simulation Models
• Use good programming practice:
 Write and debug the computer program in modules or subprograms. Key
   subprograms should be written and debugged first, keeping the other
   subprograms as "dummies".
 In general, it is better to start with a "moderately detailed" model that is
   gradually made as complex as needed than to develop a complex model immediately.
• Use a "structured walk-through":
 Have more than one person read and review the computer program.
 Debug each subprogram in front of all members of the team.
• Use a "trace":
 The analyst may use a trace to print out intermediate results and compare
   them with hand calculations to see if the program is operating as intended
   (see the sketch below).
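A minimal sketch of a trace in a hand-coded single-server queueing model; the model
and its parameters are illustrative assumptions. With the TRACE flag on, every event
prints the simulation clock, the event type, and the state variables, so the first
few lines can be checked against a hand calculation:

    # Hand-coded single-server queue with an event trace (assumed example).
    import heapq, random

    TRACE = True

    def simulate(n_customers=10, mean_interarrival=2.0, mean_service=1.5, seed=1):
        random.seed(seed)
        clock, queue_len, server_busy, served = 0.0, 0, False, 0
        events = [(random.expovariate(1 / mean_interarrival), "arrival")]
        while served < n_customers:
            clock, kind = heapq.heappop(events)
            if kind == "arrival":
                if server_busy:
                    queue_len += 1                      # join the queue
                else:
                    server_busy = True                  # start service immediately
                    heapq.heappush(events,
                        (clock + random.expovariate(1 / mean_service), "departure"))
                heapq.heappush(events,                  # schedule next arrival
                    (clock + random.expovariate(1 / mean_interarrival), "arrival"))
            else:                                       # departure
                served += 1
                if queue_len > 0:
                    queue_len -= 1                      # next customer starts service
                    heapq.heappush(events,
                        (clock + random.expovariate(1 / mean_service), "departure"))
                else:
                    server_busy = False
            if TRACE:
                print(f"t={clock:7.3f}  event={kind:9s}  "
                      f"queue={queue_len}  busy={server_busy}")

    simulate()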
Techniques for Verification of
Simulation Models
• Check simulation output for reasonableness:
 Run the simulation model for a variety of input scenarios
and check to see if the output is reasonable.
 In some instances, certain measures of performance can
be computed exactly and used for comparison.
• Intermediate run:
 Run the model under simplifying assumptions for which its true
   characteristics are known or can easily be computed (see the sketch
   after this list).

• Animate:
 Using animation, the users see dynamic displays (moving pictures) of the
   simulated system.
 Since the users are familiar with the real system, they can detect
   programming and conceptual errors.
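One concrete way to run such a check, sketched below under assumed numbers: drive
the waiting-time logic with constant interarrival and service times, for which the
exact behavior is known (no customer should ever wait when the service time is
smaller than the interarrival time):

    # Sanity check under simplifying assumptions (a sketch, not from the text).
    def dd1_waits(n, interarrival, service):
        """Waiting times in a D/D/1 queue via the Lindley recursion."""
        waits, w = [], 0.0
        for _ in range(n):
            waits.append(w)
            w = max(0.0, w + service - interarrival)
        return waits

    waits = dd1_waits(n=1000, interarrival=10.0, service=6.0)
    assert max(waits) == 0.0, "verification failure: deterministic case must have no waiting"
    print("D/D/1 check passed: no waiting, utilization =", 6.0 / 10.0)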
Techniques for Verification of
Simulation Models
• Compare final simulation output with analytical results:
 The simulation response may be verified by running a simplified version of
   the simulation program for which an analytical result is known. If the
   simulation results do not deviate significantly from the known mean
   response, the true distributions can then be used.
 For example, for a queuing simulation model, queuing
theory can be used to estimate steady state responses
(e.g., mean time in queue, average utilization). These
formulas, however, assume exponential interarrival and
service times with n servers (M/M/n).

• Mean and variances:
 Write out the sample mean and sample variance for each simulation input
   probability distribution and compare them with the desired mean and
   variance (both this check and the analytical comparison above are
   sketched below).
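A sketch of both checks, with assumed arrival and service rates: the mean wait in
queue from a simple waiting-time recursion is compared with the M/M/1 formula
Wq = λ/(μ(μ − λ)), and the generated input streams are checked against their
intended means and variances:

    # M/M/1 comparison and input mean/variance check (assumed parameters).
    import random
    from statistics import mean, variance

    lam, mu = 0.8, 1.0                      # arrival and service rates (assumed)

    random.seed(7)
    interarrivals = [random.expovariate(lam) for _ in range(200_000)]
    services      = [random.expovariate(mu)  for _ in range(200_000)]

    # Simulated mean wait in queue via the Lindley recursion.
    w, total = 0.0, 0.0
    for a, s in zip(interarrivals[1:], services):
        w = max(0.0, w + s - a)
        total += w
    sim_wq = total / (len(services) - 1)

    theory_wq = lam / (mu * (mu - lam))     # M/M/1 steady-state mean wait in queue
    print(f"simulated Wq = {sim_wq:.3f},  M/M/1 theory Wq = {theory_wq:.3f}")

    # Check that each generated input stream has the intended mean and variance.
    print(f"interarrivals: mean {mean(interarrivals):.3f} (want {1/lam:.3f}), "
          f"var {variance(interarrivals):.3f} (want {1/lam**2:.3f})")
    print(f"services:      mean {mean(services):.3f} (want {1/mu:.3f}), "
          f"var {variance(services):.3f} (want {1/mu**2:.3f})")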
Techniques for Validation of
Simulation Models
• A three-step approach for developing a valid and
credible model:
1. Develop a model with high face validity:
 The objective of this step is to develop a model that, on the
surface, seems reasonable to people who are familiar with
the system under study.
 This step can be achieved through discussions with system
experts, observing the system, or the use of intuition.
 It is important for the modeler to interact with the client on
a regular basis throughout the process.
 It is important for the modeler to perform a structured walk-through of
   the conceptual model before key people to ensure the correctness of the
   model's assumptions.
Techniques for Validation of
Simulation Models

2. Test the assumptions of the model empirically:
 In this step, the assumptions made in the initial stages of model
   development are tested quantitatively. For example, if a theoretical
   distribution has been fitted to some observed data, graphical methods
   and goodness-of-fit tests are used to test the adequacy of the fit.
 Sensitivity analysis can be used to determine whether the output of the
   model changes significantly when an input distribution or the value of
   an input variable is changed. If the output is sensitive to some aspect
   of the model, that aspect must be modeled very carefully.
   (Both checks are sketched below.)
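A sketch of both checks in Python with SciPy, on hypothetical data: an exponential
distribution is fitted to stand-in interarrival observations and tested with a
Kolmogorov-Smirnov test, and a placeholder model is rerun with a 10% perturbation
of one input to gauge output sensitivity. The data and the model_output function
are illustrative assumptions:

    # Empirical checks of model assumptions (hypothetical data and model).
    import numpy as np
    from scipy import stats

    # (a) Goodness of fit: are the observed interarrival times exponential?
    observed = np.random.default_rng(0).exponential(scale=2.0, size=500)  # stand-in data
    loc, scale = stats.expon.fit(observed, floc=0)        # fit an exponential distribution
    ks_stat, p_value = stats.kstest(observed, "expon", args=(loc, scale))
    print(f"K-S statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")
    # A large p-value gives no evidence against the fitted distribution.

    # (b) Sensitivity analysis: rerun the model with a perturbed input.
    def model_output(mean_interarrival):
        """Hypothetical stand-in for one simulation run returning a key output."""
        rng = np.random.default_rng(1)
        return rng.exponential(mean_interarrival, size=10_000).mean()

    base, perturbed = model_output(2.0), model_output(2.0 * 1.10)   # +10% on the input
    print(f"output changes by {100 * (perturbed - base) / base:.1f}% "
          f"for a 10% input change")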
Techniques for Validation of
Simulation Models

3. Determine how representative the simulation output data are:


 The most definitive test of a model’s validity is determining how
closely the simulation output resembles the output from the
real system.
 The Turing test can be used to compare the simulation output with the
   output from the real system. The output data from the simulation are
   presented to people knowledgeable about the system in exactly the same
   format as the system data. If the experts can differentiate between the
   simulation outputs and the system outputs, their explanation of how they
   did so should be used to improve the model.
 Statistical methods are available for comparing the output from the
   simulation model with that from the real-world system (see the sketch
   below).
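A sketch of one such statistical comparison, using hypothetical output data (for
example, average delay per day from the real system and from independent simulation
replications): Welch's t-test and an approximate confidence interval on the
difference in means.

    # Comparing real-system and simulated outputs (hypothetical numbers).
    import numpy as np
    from scipy import stats

    real_delays = np.array([4.2, 3.9, 4.5, 4.1, 4.4, 3.8, 4.0, 4.3])  # observed system data (hypothetical)
    sim_delays  = np.array([4.0, 4.4, 4.1, 3.9, 4.2, 4.6, 4.0, 3.7])  # one value per replication (hypothetical)

    # Welch's t-test for a difference in mean delay (no equal-variance assumption).
    t_stat, p_value = stats.ttest_ind(real_delays, sim_delays, equal_var=False)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")

    # A confidence interval on the difference of means is often more informative.
    diff = real_delays.mean() - sim_delays.mean()
    se = np.sqrt(real_delays.var(ddof=1) / len(real_delays)
                 + sim_delays.var(ddof=1) / len(sim_delays))
    print(f"difference in means = {diff:.3f} +/- {1.96 * se:.3f} (approx. 95% CI)")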


Self Study

General Perspectives on Validation


