
Computers in Industry 103 (2018) 14–27


Discrete-event simulation software selection for manufacturing based on the maturity model
Alexandre Magno Castañon Guimarães, José Eugenio Leal*, Paulo Mendes
Pontifical Catholic University of Rio de Janeiro, Brazil

* Corresponding author at: R. Marques de Sao Vicente 225, DEI, Rio de Janeiro, 22453-900, Brazil.
E-mail addresses: alexcastanon@skydome.com.br (A.M.C. Guimarães), jel@puc-rio.br (J.E. Leal), pamendes@coca-cola.com (P. Mendes).
https://doi.org/10.1016/j.compind.2018.09.005

A R T I C L E  I N F O

Article history:
Received 26 June 2018
Received in revised form 15 August 2018
Accepted 10 September 2018
Available online 21 September 2018

Keywords:
Simulation
Manufacturing
AHP
Decision making process

A B S T R A C T

The main objective of this research is to develop a new method to help an industry select the right DESS (discrete-event simulation software), which helps improve the productivity of a given process. This paper addresses this issue by developing a methodology that undertakes two aspects of the problem. First, it proposes a methodology that allows companies to self-assess their current internal processes based on a maturity model to identify where they stand in the maturity continuum for simulation. Second, it applies the analytic hierarchy process (AHP) to support simulation software selection by detailing and weighting the components that are important for the specific company to meet its business objectives. To the best of our knowledge, there are no other studies that combine these two methodological tools to help decision making for DESS selection.
© 2018 Elsevier B.V. All rights reserved.

1. Introduction

According to Koch et al. [1], companies must continually adapt to the needs of the market. In order to do so effectively and efficiently, production resources must be organized in a planned manner.

Authors such as Sandanayake et al. [2], Sandanayake and Oduoza [3], Azadeh and Maghsoudi [4], Sawant and Mohite [5], Bosch-Mauchand et al. [6] and Rakiman and Bon [7] note that computer simulation is one of the most advanced and powerful tools for modeling and analyzing operational performance in companies to support continuous adaptation.

However, before computer simulation is adopted, companies should satisfy certain basic conditions. One way of assessing whether these conditions are met is through maturity models, which can be used to measure the quality of a company's processes and the extent to which its technicians are qualified to implement the software and exploit its full potential.

There is a wide range of simulation software packages available in the market. They serve a variety of applications, have different prices and features, and use different approaches and modeling strategies. This growing choice of simulation software makes the task of selecting a suitable product a difficult one. An incorrect choice of software can have undesirable consequences, such as financial losses, longer modeling times, project interruptions and a lack of suitable resources, leading to poor decisions and, consequently, poor organizational performance. Adopting the correct approach when selecting simulation software is therefore essential [8,9].

In this scenario, characterized by the increasing use of simulation software to evaluate production processes and the wide range of software packages available in the market, methods for evaluating and selecting discrete-event simulation software that is suitable for a company's particular circumstances are clearly important.

The main objective of this research is to develop a method to help an industry select the right DES software (discrete-event simulation software), one that improves the productivity of a given process.

This study presents, as its main academic contribution, a model for the evaluation and selection of simulation software with the support of a maturity model. Additionally, it captures, in a structured manner, the criteria that identify critical simulation software features for managers in manufacturing companies. To the best of our knowledge, there are no other studies that combine these two methodological tools to help make DESS selection decisions. Throughout the paper, we present literature reviews on maturity models (Section 3.2) and on software selection (Section 3.3). These reviews aim to give an overview of the state of the art of the main experiences described in the literature in each of those topics, and they support our conclusion that the combination of these two methodological tools is original.


The article is organized in six sections, including this introduction. Section 2 describes the methodology applied in the study. Section 3 consists of a review of the literature on maturity models and the selection of simulation software; the review forms the basis of the proposed methodology. Section 4 presents a method for evaluating and selecting simulation software. In Section 5, the proposed methodology is applied, and the results of the analysis of the processes and the software selection are detailed. Finally, in Section 6, the conclusions about the results and contributions made by this study are formalized.

2. Study methodology

Yin [10] stated that there are three types of case studies: exploratory, descriptive, and explanatory. While experiments try to answer research questions such as who, what, where, why, or how, which require control over the events and focus on contemporary events, a case study does not require control over the events and is more appropriate for questions of how and why. Case studies include direct observations and systematic interviews. A case study has five components: i) the questions of the study; ii) the propositions; iii) the units of analysis; iv) the logic that links the data to the propositions; and v) the criteria to interpret the findings.

The methodology used to develop the research had the following steps:

1 Define the research issue: how can companies select the most adequate DESS for the manufacturing industry?
2 Conduct a literature review on the causes of success or failure when using DESS in the manufacturing industry.
3 Based on that first review and the authors' direct observations, formulate a hypothesis on the relationship between the level of process development and the chance of successful DESS implementation.
4 Define the solution strategy: formalize a self-assessment procedure for the level of process development in the industry in the form of a maturity model. Additionally, define a multicriteria decision making methodology—the AHP method—to guide software selection.
5 Conduct a literature review on maturity models and DESS selection methodologies.
6 Build a maturity model. Define the threshold capability characteristics that describe a company successfully using DESS.
7 Build a structured framework to characterize the components of DESS according to the issues faced by the manufacturing industry.
8 Build an AHP model based on the criteria and subcriteria and their weights defined by DESS experts.
9 Apply the maturity model to four companies using a questionnaire. Identify companies that fulfill the basic requirements for using DESS.
10 Use the questionnaire to evaluate the three main software packages available in the market and translate those answers into the AHP framework.
11 Analyze the validity of the hypothesis and draw conclusions on the strengths and limitations of the methodology. The main criterion for the validity of the model is the perception of the evaluation group in the companies about the correctness of the maturity model and the software selection procedure.

Considering the definition of Yin, our approach cannot be defined as a pure case study because the first part of the research aimed at developing a two-step decision-making method: i) conduct a self-evaluation of the company with respect to the maturity of its processes and ii) evaluate the components of the software to serve as the criteria for choosing the most suitable software. Once the method has been defined, we can consider our study as two exploratory case studies: one with four units of analysis (the self-evaluation of maturity) and the other with three units of analysis (the evaluation of the software components). For each case study, the research questions are as follows: i) how does the company evaluate its processes, and ii) how does the company evaluate software components according to its objectives? For each case study, a questionnaire was built, and the target group of the questionnaires was defined. The questionnaire was evaluated in terms of the validity of the methods. Fig. 1 depicts the protocol for the case studies.

This research can be considered a design-oriented methodology (as cited in Mendes [37]). The approach deals with "how" questions with the goal of designing a model to solve a given problem (as cited in Mendes [37]). The application of the method follows the design-testing approach used in traditional empirical sciences, as stated by Eisenhardt [11] in her work on the theory of case studies. More on the research methods applied to building maturity models can be found in Mendes [37].

To apply the proposed model, four different companies operating in different market segments were selected. For each company, questionnaires were completed by the manager responsible for the operating unit, and the results were presented and discussed in a group consisting of the manager, the person responsible for the processes, and the person responsible for systems.

3. Literature review

3.1. Conditions for a successful DESS implementation

Johansen et al. (2003) present a study with the purpose of determining why DESS was less successful than predicted by many experts. They analyzed 16 industries and concluded that the main reason was the lack of the right kind of information at the right time due to inadequate practices within the organizations. They conducted their analysis from the point of view of the requirements for a successful implementation of DESS as a daily tool in the companies; however, those requirements are tied to the level of process structure in the industries, including the information system.

Ingamansson et al. (2002) present the results of a survey on the use of DESS covering 80 companies. They comment that larger companies are better able to adopt DESS than smaller firms. They stress that a successful project involves not only knowledge about simulation software but also adequate production improvement techniques, which are, in general, expected to be more developed in larger firms.

Norouzilame and Jackson (2013) focus their work on the proposal of a framework to successfully implement DESS.

Fig. 1. Protocol for a case study.



They suggest that two types of competence are required to apply DESS: knowledge of the system to be simulated and simulation expertise, which includes modeling techniques, DESS project management, and an understanding of the results of the implementation of simulation tools. An organizational structure and an adequate working process are also crucial to a successful DESS implementation in the company. Thus, the definition of a DESS team and standard work methods is important in addition to the basic competences cited above.

These last two studies show that knowledge of the industry's processes is crucial, but they do not suggest how to formalize this knowledge. We believe this gap can be filled with the maturity model.

The aim of the literature review in this section is to report on the maturity models and the methods for evaluating and selecting discrete-event simulation software described in the literature. In our study, the maturity level is evaluated in terms of the capability of using and extracting the best results from discrete-event simulation. Section 4 presents the maturity model developed for this specific purpose.

3.2. Maturity models

A concern to improve management methods and the need to win new clients led to the emergence of studies on organizational maturity. Maturity models enable the quality of an organization's processes to be measured and classified in a particular stage of development. Such models are finding application in an increasing number of areas and gaining importance as a subject of research. Some existing models already provide organizations with a measure of their degree of maturity.

Pullen [12] gives a good summarized definition: maturity models can be defined as a structured collection of elements that describe the characteristics of effective processes in different stages of development. He also suggests demarcation points between stages and methods for transitioning from one phase to another.

A vast range of maturity models developed for applications in different fields is described in the literature, including maturity models for project and process management, models based on quality management, and maturity analysis models for checking the status of business processes.

One of the first publications to deal with the issue of maturity was a book by Crosby [13], who proposed a five-level scale to evaluate process quality. Each level covers management attitude and understanding, quality organization status, problem handling, cost of quality, quality improvement actions and quality posture.

In the fields of software development and engineering, the CMM (Capability Maturity Model) and CMMI (Capability Maturity Model Integration), developed by Carnegie Mellon University, allow organizations to evaluate their software project management maturity level and capability.

The OPM3 (Organizational Project Management Maturity Model), which was launched in 1998 by the PMI (Project Management Institute), helps organizations ensure that the management of all their projects supports the macro-level business process by tying these projects to the corporate strategy.

Another model frequently cited in the literature is the PMMM (Project Management Maturity Model), which combines the structure and maturity levels of the CMM with the structure and knowledge areas of the PMBOK (Project Management Body of Knowledge). It describes the five levels of maturity required to achieve excellence in project management [14].

Other models include the MMGP (Management Projects Maturity Model) proposed by Prado et al. [15], the PEMM (Process and Enterprise Maturity Model) published by Hammer [16], the P2CMM, a condensed form of the CMM based on PRINCE2 (Projects IN Controlled Environments) developed by Lianyinga et al. [17], the OS-UMM (Open Source Usability Maturity Model) described by Raza et al. [18] and the DDSC (Demand Driven Supply Chain) developed by Mendes [37] and Mendes et al. [36].

Table 1 lists the maturity models discussed here together with their structure and the approach used in each model to evaluate the maturity level.

Clearly, there is no single, integrated methodology for evaluating an organization's maturity and performance throughout its life cycle. However, although there is no standard model, there may be a suitable model, i.e., one that is best suited to a particular organization.

3.3. Simulation software evaluation and selection methods

A range of discrete-event simulation software is available in the market. The Simulation Software Survey [19] is a useful source of information, as it summarizes the main characteristics of a variety of simulation software packages, including general description, main features, typical applications, main markets in which the software can be used, support/training, price and animation. Because of the differences between packages, none of them is suitable for every type of manufacturing problem. The most appropriate simulation software should be selected for the specific application being studied.

The starting point for the study was a review of the literature on the evaluation, comparison and selection of simulation software. The following studies were identified: Banks [20], Mackulak et al. [21], Davis and Williams [22], Hlupic et al. [23], Nikoukaran et al. [24], Tewoldeberhan et al. [25], Cochran and Chen [26], Azadeh et al. [9], Gupta et al. [8] and Sawant and Mohite [5].

Table 2 summarizes the criteria mentioned in these studies. The list is not exhaustive, and because of the large number and variety of criteria, some have been grouped together.

Table 1
Maturity models discussed in this paper.

Model | Structure | Evaluation
CMM | Hierarchical with 5 levels | Each level has its own set of requirements.
CMMI Staged | Hierarchical with 5 levels |
CMMI Continuous | Hierarchical with 6 levels |
OPM3 | Non-hierarchical | The processes cover three domains: projects, programs and portfolios.
PMMM | Hierarchical with 5 levels | Each level has its own set of requirements.
MMGP | Hierarchical with 5 levels | Knowledge, Methodology, Computerization, Organizational Structure, Human Relationships and Alignment with Strategies.
PEMM | Two matrices with 4 hierarchical levels | Processes: design, performers, process managers, infrastructure and metrics. Organization: leadership, culture, expertise and governance.
P2CMM | Hierarchical with 5 levels | Each level has its own set of requirements.
OS-UMM | Hierarchical with 5 levels | Usability methodology, design strategy, evaluation and documentation.
DDSC | Hierarchical with 5 levels | Management of supply and demand, operations and product life cycle.

Table 2
Summary of the criteria mentioned in the studies analyzed in this article.

Criteria | A B C D E F G H I J
Syntax | x
Random variable generator | x
Global attributes and variables | x
Conditional routing | x x x
Standardized/personalized reports | x x x x
Statistical analysis | x x x x x x
Generation of graphics | x x x x
Automatic data collection | x x
Module library | x x x
Graphical model construction | x x x x x
Animation | x x x x x
Optimization | x x
Technical documentation | x x x
Manufacturing resources | x
Object-oriented resources | x
General features, such as type of simulation, logic, execution time, specification of the units of time and size, and networked version. | x x x x x x
Visual features (37 items), including animation, reproduction mode, screen and icon editors, colors and effects. | x x x
Aspects related to coding, such as programming flexibility, access to source code, built-in functions and variables. | x x x x x x x x
Aspects related to efficiency, such as robustness, number of model elements, automatic recording, interaction, compilation time and queue policy. | x x x x
Modeling aids, such as quality of prompting, error messages, undo/redo commands and online help. | x x x x x
Aspects related to tests, such as logic checking, quality of error messages, ease of debugging and alarms. | x x x x x x x
Characteristics related to compatibility, such as integration with spreadsheet, text and statistical software and data management systems. | x x x x x x x x
Data input and output, such as dialog boxes, multiple outputs and inputs, graphics and reports. | x x x x x x x
Aspects related to experimentation, such as warming-up period, independent replications and speed adjustment. | x x x
Statistical characteristics, such as distribution fitting, analysis of output data and confidence interval. | x x x x
User support, such as documentation, tutorials, training, demonstration models and consultancy. | x x x
Financial and technical characteristics, such as portability, file conversion, price, ease of installation and consultancy rates. | x x x x x x
Aspects related to credibility, such as how long the company has been trading, company track record, references, supplier reputation and information sources. | x x x x x
Aspects related to tests and efficiency, such as tracing, step-by-step execution, validation and verification. | x x x
Characteristics related to use, such as type of simulation, hardware, operating system and network. | x x x x x x

A: Banks [20]; B: Mackulak et al. [21]; C: Davis and Williams [22]; D: Hlupic et al. [23]; E: Nikoukaran et al. [24]; F: Tewoldeberhan et al. [25] – phase 2; G: Cochran and Chen
[26]; H: Azadeh et al. [9]; I: Gupta et al. [8] and J: Sawant and Mohite [5].

The number of criteria and subcriteria evaluated in the literature varies from author to author: Davis and Williams [22] evaluated 14, Sawant and Mohite [5] 17, Gupta et al. [8] 204 and Hlupic et al. [23] 266. Most of the studies reviewed here use two hierarchical levels, apart from those by Banks [20] and Hlupic et al. [23], who used three levels, and Nikoukaran et al. [24], who used more than three levels.

Although criteria may have the same name, this does not mean that they are actually the same. For example, the criterion "animation" in the study by Banks [20] is intended to check ease of development, picture quality, smoothness of movement, portability for remote viewing and the interface with CAD software. In the study by Tewoldeberhan et al. [25], however, it is used to evaluate integration of animation, library of icons, screen layout, concurrent animation mode, on/off feature, 3D animation and development features. In another example, the statistical analysis in Hlupic et al. [23] covers 12 subcriteria, including the number of statistical distributions, distribution fitting, goodness-of-fit tests and confidence intervals. In Tewoldeberhan et al. [25], by contrast, statistical analysis appears only as a subcriterion (statistical distribution) in model development.

Another factor that should be considered is the level of development of computer technology at the time the criteria were included in these studies. For example, mouse, keyboard, trackball and scanner interfaces, picture quality, undo/redo commands and autorun versions have all lost their meaning nowadays, as software packages already come with an interface for these and other types of hardware, are available in autorun versions and have a variety of functions in their main menus.

4. Evaluation and selection of simulation software: the proposed methodology

It should be stressed that before computer simulation is used, companies should satisfy certain basic conditions. Processes must be defined and documented, there must be statistical data for them, and continuous improvement should be used. Processes must be structured and documented so that work can be performed in stable routines and knowledge about how to perform the activities can be accumulated.

Maturity models offer the possibility of measuring an organization's stage in relation to the quality of its processes. The maturity model proposed in this study aims to contribute to the generation of knowledge about how to situate an industry in relation to the degree of maturity of its production processes in order to be able to extract the best results from discrete-event simulation. Section 4.1 presents the maturity model developed for this specific purpose.

In this section, a methodology for evaluating and selecting discrete-event simulation software for manufacturing companies is described. The method involves two stages. In the first stage, the operational processes in the production systems are analyzed to determine whether the company satisfies the basic conditions to adopt computer simulation. This is performed by applying a maturity model.
In the second stage, the features of the simulation software are evaluated using a set of requirements weighted by users. Selection is performed with the AHP (analytic hierarchy process) method, which is used to solve multicriteria problems.

According to Chai et al. [27], the AHP method attempts to assign to each alternative a value that represents the degree of preference for this alternative and can be used to classify or select alternatives by prioritization based on a hierarchical structure. According to Gupta et al. [8] and Jadhav and Sonar [28], the AHP is the most widely used method for evaluating software.

4.1. The proposed model: maturity evaluation

The starting point in the development of the model was to determine which type of structure could be used to represent the maturity level. Two types of structure are available in the literature: hierarchical and process-oriented. In the former, the maturity level is progressive and sequential; in the latter, maturity is classified according to a set of requirements or capabilities, and progression is not necessarily sequential. In the model developed here, a hierarchical structure was adopted, as by analyzing certain capabilities it can be determined whether the process in question is sufficiently mature for computer simulation to be used.

Weckenmann and Akkasoglu [29] note that hierarchical models have four to six levels. As most of the hierarchical models considered here have five levels (apart from the PEMM, which has four) and as the scale proposed by Crosby [13] for evaluating process quality has five levels, it was decided to adopt a model with the following five levels: basic, embryonic, structured, managed and optimized.

According to the Project Management Institute (PMI) [30], for progress toward maturity to be analyzed, organizations must have a consolidated set of best practices (BPs), i.e., a set of capabilities that will help them achieve their objectives. Progression toward BPs is analyzed in terms of standardization, measurement, control and continuous improvement.

Based on these requirements (the BPs), the literature review presented here and the observations of the authors, the following six capabilities for evaluating each of the five levels for production processes were defined: knowledge of simulation, process standardization, specialist knowledge, process organization, measurement and evaluation systems, and management programs. Each capability is described below.

Knowledge of simulation (KS): This includes knowledge of modeling, model verification and validation, simulation software, stochastic studies and case studies involving the application of simulation software. This knowledge must be present in the team managing the processes in question.

Process standardization (PS): Technical documentation has been developed and implemented. The documentation aims to minimize any non-standard performance of activities, regardless of who carries them out, in order to ensure uniform processes. This capability must extend to all those involved in operational processes.

Specialist knowledge (SK): This represents the level of technical knowledge of the processes accumulated by the specialists over the years, enabling them to understand and explain how phenomena occur. Their knowledge has a direct impact on the quality of the results of simulations.

Process organization (PO): This represents the degree of control over the sequencing of processes and is required to ensure efficient, effective production flows in a company.

Measurement and evaluation (MA): The existence of qualitative and quantitative information about any activity of interest. With this information, performance problems related to the processes in question can be diagnosed and understood.

Management programs (MP): This reflects the knowledge acquired in programs implemented as a result of the internal policies and strategies required for the smooth working of the processes in a company. It is what guarantees added value, clear objectives and orderly growth, among other important factors required for the continuity of the business.

The five levels were defined as shown below.

Level 1: Basic. There is no knowledge of any kind about the simulation methodology. Processes are not repetitive, and there is no standardization. There is no incentive for process specialists to be appointed. There is little organization of processes. Measurement and evaluation systems are practically nonexistent. Neither management programs nor quality programs are adopted. At this level, only some of the basics required for a good operation are in place, and only some of these are well implemented. The environment is not stable, structured or documented. The process can be considered chaotic.

Level 2: Embryonic. The first steps toward level 3 have been taken. Managers are interested in learning about simulation methodologies. Some standards have been tentatively implemented in some processes. There are not yet any process specialists. The first attempts to organize processes have been made. Some procedures have started to be outlined. Measurements and evaluations are made in the form of manual notes, but only in some areas. Information on performance is starting to be shared, but not systematically. Neither management programs nor quality programs have yet been adopted.

Level 3: Structured. There is some knowledge of simulation; this has been acquired from presentations, technical lectures, and software vendors' websites. The processes are formally described, and standards, procedures, tools, and methods are being implemented. A team of process specialists is beginning to emerge. Activities are analyzed, measured, controlled, and planned. Quality inspections and quality maintenance procedures are in place. Engineering processes are also applied at this level. Quality management practices are being implemented.

Table 3
Relation between capabilities and maturity levels.

Maturity level | KS | PS | SK | PO | MA | MP
Basic | Nonexistent | Nonexistent | Nonexistent | Very limited | Insignificant | Nonexistent
Embryonic | Notions | Isolated attempts | Nonexistent | Basic | Basic | Nonexistent
Structured | Basic | Being implemented | Basic | Average | Average | Being implemented
Managed | Average | Implemented | Average | Advanced | Advanced | Implemented
Optimized | Advanced | Advanced implementation | Advanced | Advanced | Advanced | Advanced implementation

Level 4: Managed. A knowledge of simulation has been acquired in courses and/or training. Projects and processes are being managed, including the organization and control of products. Production is planned and scheduled to satisfy demand more effectively and productively according to an established timetable. The management team can analyze the status of processes. Some processes are chosen so that they can be controlled and managed statistically and quantitatively. Special causes of process variations are identified and analyzed. Management programs are implemented.

Level 5: Optimized. An evolution from level 4. Processes are continually improved by incremental actions and innovations. Quantitative goals are established and reviewed to improve processes. At this level, companies adopt programs such as Organizational Process Performance (OPP), Quantitative Project Management (QPM), Organizational Performance Management (OPM) and Causal Analysis and Resolution (CAR).

Table 3 shows the relation between capabilities and maturity levels in this evaluation model.

The evaluation is performed by applying a questionnaire with 33 questions. The questions are grouped by capability and are answered on a five-option perception scale varying from "I disagree completely" to "I agree completely". The maturity level for each capability is evaluated by assigning numerical values to each response in the questionnaire: a: 0; b: 2.5; c: 5.0; d: 7.5; e: 10.

The measure, which is henceforth referred to as the individual capability maturity level (ICML), is given by:

$$ICML = \left( \sum_{i=1}^{n} Q_i \,/\, n \right) \times 10 \qquad (1)$$

where:
Q_i is the numerical value associated with the response to question i;
n is the number of questions corresponding to each capability.

The resulting score can vary from 0 to 100. The range of possible scores was divided into five equal bands corresponding to the five maturity levels, as shown in Table 4, and the maturity level for each capability was determined using these bands. The final maturity level evaluated by the model (FMML) is given by the mean of the ICMLs, as all the capabilities have the same weight. The final classification is also based on the bands in Table 4.

Table 4
Classification of maturity level by score.

Maturity level | Score band
Basic | 0 ≤ ICML < 20
Embryonic | 20 ≤ ICML < 40
Structured | 40 ≤ ICML < 60
Managed | 60 ≤ ICML < 80
Optimized | 80 ≤ ICML ≤ 100

The results obtained in this way allow the organization to obtain a picture of the global maturity level of its operational processes.

From the correspondence between the maturity classification levels and the FMML scores, we can state that a company with an FMML under 40 is not in a situation in which it can achieve successful permanent use of a DESS. This does not exclude all uses of a DESS but indicates that a punctual application of simulation may be undertaken with only partial benefits. Therefore, it is recommended that the use of the simulation methodology by companies classified as basic or embryonic be moderated by, for example, applying the DESS to small, isolated cases in the process.
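To make the scoring procedure concrete, the sketch below implements Eq. (1) and the Table 4 banding in Python. Only the answer-value mapping (a–e) and the equal-width bands come from the text above; the responses and the number of questions per capability are invented for illustration.

```python
# Sketch of the maturity scoring described above (Eq. (1) and Table 4).
# The answer-to-value mapping is the paper's: a=0, b=2.5, c=5.0, d=7.5, e=10.
ANSWER_VALUE = {"a": 0.0, "b": 2.5, "c": 5.0, "d": 7.5, "e": 10.0}
BANDS = [(0, 20, "Basic"), (20, 40, "Embryonic"), (40, 60, "Structured"),
         (60, 80, "Managed"), (80, 101, "Optimized")]  # Table 4 bands

def icml(answers):
    """Eq. (1): mean answer value for one capability, scaled to 0-100."""
    values = [ANSWER_VALUE[a] for a in answers]
    return sum(values) / len(values) * 10

def classify(score):
    """Map a 0-100 score to a maturity level using the Table 4 bands."""
    for low, high, level in BANDS:
        if low <= score < high:
            return level

responses = {  # hypothetical answers grouped by capability, not survey data
    "KS": ["d", "d", "c", "e", "d"], "PS": ["c", "d", "c"],
    "SK": ["b", "b", "c"], "PO": ["d", "e", "d"],
    "MA": ["d", "c", "d"], "MP": ["c", "c", "d"],
}
icmls = {cap: icml(ans) for cap, ans in responses.items()}
fmml = sum(icmls.values()) / len(icmls)  # FMML: equal capability weights
print(icmls, fmml, classify(fmml))
```

Applied to the published ICMLs of company M1 in Table 8 (70, 63, 33, 71, 67 and 56), the same mean-and-band logic returns an FMML of 60 and the classification "managed", matching Section 5.1.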
4.2. The proposed model: software evaluation and selection

This section describes a method for evaluating and selecting the discrete-event simulation software used to develop models in industry.

The method has five stages. In the first, the criteria used to compare the software packages are defined. In the second, the criteria are organized hierarchically, as in the AHP method. In the third, the criteria and groups of criteria are compared with each other to create the weights that will be used to assign weightings to each software package. In the fourth stage, the software packages are compared pairwise for each criterion. Finally, in the fifth stage, the priorities are weighted using the weights for the criteria and groups of criteria to arrive at a priority for each software package for the company in question. The first four stages are described in Sections 4.2.1–4.2.4, and the fifth stage is illustrated in the case studies in the selected companies described in Section 5.2.

4.2.1. Definition of the criteria

The criteria adopted were based on the studies discussed in Section 3.3 and included other important criteria considered by the authors based on their experience in the manufacturing industry. The aim was to map the features that are desirable in simulation software packages in order to meet the specific requirements of the production processes in each company.

The objective of Table 5 is to show the main criteria used in the literature and how the model proposed here is structured. The criteria show that there is a concern with input and output data as well as with user support and the evaluation and testing of the models. Isolated criteria, such as entity attributes and global variables, customized and standardized reports, statistical analyses, graphs, animation and optimization, can all be subcriteria of more general criteria, such as model development and execution. Although not all the studies analyzed include cost-related criteria, it was deemed necessary to include these so that, when a simulation software package is evaluated, managers can decide whether to use this group of criteria depending on the particular situation. Finally, criteria such as portability, compatibility, network version and safety devices were considered subcriteria of general technical requirements. It was considered necessary to identify any differences between software packages in terms of the difficulty and cost of customization, as well as the requirements for running the software and the type of programming used. Six subcriteria that were considered important were therefore included: cost of customization, specific cost variables, level of customization, level of programming knowledge, flowchart-based programming and object-oriented programming. The proposed criteria were divided into seven groups, as shown in Table 5, to give a total of 56 items.

4.2.2. Hierarchical structure of the decision-making process

When there are several levels of criteria, the total priority for each alternative is obtained by multiplying the priority of the alternative for each criterion by the priority of that criterion at the next level and adding the results over all criteria. Our problem is modeled in four levels, with the objective (selection of a software package) at level 1, the groups of criteria at the second level, the criteria (subcriteria related to each group) at level 3, and the alternatives at the lowest level, as shown in Fig. 2. A brief description of the priority calculation process in the AHP is presented below.

Table 5
Criteria for evaluating simulation software.

General technical requirements: Portability / Compatibility / Network version / Safety devices
Data input and output: Import and export data / Statistical information on input data / Statistical information on output data / Automatic collection / Batch input / Interactive input / Consistency checking
Specific technical requirements, development: Graphical interface / Library of icons / Icon editor / Import icons / Import graphical background images / Editors for graphical background images / Incorporate-merge models / Coding assistant / Source code / Internal functions / Editors for internal functions / Variables and attributes / Library of programmable objects / Scheduling / Routing rules / Queuing rules / Specific cost variables / Level of learning
Specific technical requirements, execution: Configure parameters / User interaction / View an animation / Activate or deactivate animation / View instantaneous values of variables / 3D animation / Optimization / Flowchart-based programming / Object-oriented programming / Level of customization / Level of programming / Visual quality / Options for experimentation
Evaluation of efficiency: Error location and correction / Model validation / Logical consistency
Technical support: Manual / Tutorial / Online support / Demo version / Specialized training / Updates
Costs: Acquisition / Installation / Training / Customization / Technical support

Fig. 2. The hierarchical structure used to select simulation software.

We use the following notation:

S_a: alternative (software) a, a = 1, ..., n_a;
G_k: group of criteria k, k = 1, ..., n_g;
C_{j,k}: criterion j inside group k, j = 1, ..., nc_k.

For each group of criteria k, we calculate a vector $CG_k$ of the priorities of the criteria in group k,

$$CG_k = \begin{bmatrix} cg_{k,1} \\ \vdots \\ cg_{k,nc_k} \end{bmatrix}$$

and a matrix $SC^k$ whose entry $sc^k_{j,a}$ is the priority of alternative a for criterion j of group k,

$$SC^k = \begin{bmatrix} sc^k_{1,1} & \cdots & sc^k_{1,n_a} \\ \vdots & & \vdots \\ sc^k_{nc_k,1} & \cdots & sc^k_{nc_k,n_a} \end{bmatrix}$$

The product of the transpose of $CG_k$ and $SC^k$ gives the vector $SCG_k$ of the priorities of each alternative for group k:

$$SCG_k = CG_k^T \, SC^k = \begin{bmatrix} scg_{k,1} & \cdots & scg_{k,n_a} \end{bmatrix}$$

Each vector $SCG_k$ is a row of a matrix $SG$ of the priorities of the alternatives for all the groups of criteria:

$$SG = \begin{bmatrix} sg_{1,1} & \cdots & sg_{1,n_a} \\ \vdots & & \vdots \\ sg_{n_g,1} & \cdots & sg_{n_g,n_a} \end{bmatrix}$$

Further, we calculate a vector $G$ of the priorities of each group of criteria for the objective:

$$G = \begin{bmatrix} g_1 \\ \vdots \\ g_{n_g} \end{bmatrix}$$

The product of the transpose of $G$ and $SG$ gives the vector $PS$ of the final priorities of each alternative for the objective:

$$PS = G^T SG = \begin{bmatrix} ps_1 & \cdots & ps_{n_a} \end{bmatrix}$$

A hierarchical structure for the problem in question is shown in Fig. 2, where G1, G2, ..., G7 represent the groups of criteria, C1G1, C2G1, ..., Cnc1G1 are the nc_1 criteria for group G1 and so on, and S1, S2 and S3 are the simulation software packages being studied.
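In matrix form, the aggregation above reduces to two multiplications: stack the within-group priority vectors into SG, then weight the rows by G. The following is a minimal NumPy sketch with invented numbers (two groups, three alternatives; none of the values come from the paper):

```python
import numpy as np

# Invented example: ng = 2 groups, na = 3 alternatives (S1, S2, S3).
# CG[k]: priorities of the criteria inside group k (each sums to 1).
CG = [np.array([0.6, 0.4]),            # group 1 has 2 criteria
      np.array([0.2, 0.5, 0.3])]       # group 2 has 3 criteria
# SC[k]: rows = criteria of group k, columns = alternatives.
SC = [np.array([[0.5, 0.3, 0.2],
                [0.2, 0.6, 0.2]]),
      np.array([[0.4, 0.4, 0.2],
                [0.3, 0.3, 0.4],
                [0.1, 0.6, 0.3]])]

# SCG_k = CG_k^T SC^k: priority of each alternative within group k.
SG = np.vstack([CG[k] @ SC[k] for k in range(len(CG))])

# G: priorities of the groups themselves for the objective.
G = np.array([0.7, 0.3])

# PS = G^T SG: final priority of each alternative.
PS = G @ SG
print(PS, PS.sum())  # priorities sum to 1 when all inputs are normalized
```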
4.2.3. Definition of the weights of the criteria and groups of criteria

Each of the criteria in each group of criteria, and each group of criteria itself, can be evaluated by applying a questionnaire with 63 questions in the company being analyzed. Interviewees are expected to assign values to the proposed criteria and groups of criteria according to their requirements and objectives. Respondents should answer the questions by choosing one of the following options based on their perception: (a) completely indispensable, (b) partially indispensable, (c) neutral, (d) partially dispensable and (e) completely dispensable.
SG ¼ 4 :: :: :: 5 following options based on their perception: (a) completely
sgng;1 :: sgng;na indispensable, (b) partially indispensable, (c) neutral, (d) partially
dispensable and (e) completely dispensable.

Table 6
Scores used to evaluate the criteria and groups of criteria.

Evaluation Completely indispensable Partially indispensable Neutral Partially dispensable Completely dispensable
Score 1 3 5 7 9

The criteria and each group of criteria are assigned a score based on a modified version of the scale proposed by Saaty [31], as shown in Table 6.

The importance of each criterion and each group of criteria should be evaluated considering the characteristics of the production process under study and the company objectives.
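The paper does not spell out the formula that converts the Table 6 scores into the normalized weights that appear later in Tables 9–11. One reading that does reproduce those published weights is to treat a lower Table 6 score as more important and normalize the reciprocals; the sketch below applies this assumption (it is not the authors' stated formula) to company M1's group-level judgments from Section 5.2.1.

```python
# Hypothetical conversion of Table 6 answers into normalized weights.
# Assumption (not stated as a formula in the paper): a lower Table 6 score
# means a more important item, so the reciprocals are normalized to sum to 1.
TABLE6_SCORE = {"completely indispensable": 1, "partially indispensable": 3,
                "neutral": 5, "partially dispensable": 7,
                "completely dispensable": 9}

def weights_from_answers(answers):
    """answers: mapping of item name -> Table 6 option chosen by the user."""
    inv = {item: 1.0 / TABLE6_SCORE[opt] for item, opt in answers.items()}
    total = sum(inv.values())
    return {item: round(v / total, 3) for item, v in inv.items()}

# Company M1's group-level judgments as reported in Section 5.2.1.
m1_groups = {
    "General technical requirements": "neutral",
    "Data input and output": "partially dispensable",
    "Specific technical requirements: development": "completely indispensable",
    "Specific technical requirements: execution": "completely indispensable",
    "Evaluation of efficiency": "completely indispensable",
    "Technical support": "partially indispensable",
    "Costs": "neutral",
}
print(weights_from_answers(m1_groups))
# -> 0.052, 0.037, 0.258, 0.258, 0.258, 0.086, 0.052 (Table 9's group weights)
```

Rounded to three decimals, these values match the "Weight of the Group" column of Table 9, and the same reading also reproduces the group weights reported for companies M2 and M3 in Tables 10 and 11.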
4.2.4. Prioritization of the alternatives for each criterion

The next stage in the process is to assign values to the quality of the features of the discrete-event simulation software in terms of each criterion. The most suitable software according to a particular criterion is assigned the highest priority.

It was felt that a certain level of technical knowledge would be required to compare the software packages in terms of each criterion and that the representatives of the companies analyzed would not necessarily have this knowledge. However, it was also felt that only the representatives of the companies would be able to assess the importance of each criterion for their particular objectives and define the weights to be used to evaluate each software package and calculate the final priorities. This is discussed in Section 5.2.

To calculate the priorities for each software package for each criterion using the intensity scale proposed by Saaty [31], it was decided to consult specialists in discrete-event simulation who did not have any links with the suppliers of the software under evaluation.

To support the assessment of the software packages, the proceedings of conferences such as the Winter Simulation Conference (WSC), the Summer Simulation Multi-Conference (SummerSim) and the Spring Simulation Multi-Conference (SpringSim) were used as information sources. Other information sources included manuals and course books. Information from these sources was then considered along with the opinions of three simulation specialists. The Simulation Software Survey [19] also proved very useful, as did the suppliers' websites.

The simulation software packages analyzed in this study, all of which are available on the market, were Arena, Promodel, and Flexim. However, as the aim of this study was to present a methodology for evaluating and selecting simulation software rather than to select a particular product, the packages are referred to as S1, S2, and S3 in random order. In this way, no information, including weightings and classifications, is revealed about any of the software packages, reflecting the fact that it is not the purpose of this study to evaluate whether any of the packages is better or worse in terms of a particular feature. Furthermore, it should also be kept in mind that software packages are constantly evolving and being updated with new functionalities.

Table 7 shows the pairwise comparisons of the software packages S1, S2 and S3 for each criterion using the intensity scale proposed by Saaty [31]. This is the result of the evaluation by the group of specialists.
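Table 7 records a single Saaty-scale score per package for each criterion rather than a full pairwise matrix, and the paper does not state how these scores become the priorities in Tables 9–11. The sketch below is one reading, not the authors' stated procedure: a lower score is treated as better, and the reciprocals are normalized per criterion. Combined with reciprocal-based criterion weights (Section 4.2.3), it reproduces the General technical requirements row of Table 9 for company M1.

```python
import numpy as np

# Hypothetical reading of Table 7: each package gets one Saaty-scale score
# per criterion; a lower score is treated as better, and the reciprocals are
# normalized per criterion so that each criterion's priorities sum to 1.
def priorities(scores):
    inv = 1.0 / np.asarray(scores, dtype=float)
    return inv / inv.sum()

# Table 7, group "General technical requirements" (columns S1, S2, S3).
table7 = {"Portability":     [1, 1, 1],
          "Compatibility":   [1, 1, 1],
          "Network version": [3, 1, 3],
          "Safety devices":  [1, 1, 1]}

# Company M1's criterion weights inside the group (Table 6 scale:
# partially indispensable = 3, neutral = 5), as normalized reciprocals.
crit_scores = [3, 3, 5, 5]  # Portability, Compatibility, Network, Safety
w = 1.0 / np.asarray(crit_scores, dtype=float)
w /= w.sum()

SC = np.vstack([priorities(v) for v in table7.values()])
print(w @ SC)  # ~[0.308, 0.383, 0.308]: the first row of Table 9
```

Under the same two assumptions, the corresponding rows for companies M2 and M3 (0.3/0.4/0.3 and 0.24/0.521/0.24) are also reproduced, which lends some support to this reading.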
The model was verified and validated using a case study, an approach that has been adopted by Davis and Williams [22], Hlupic et al. [23], Tewoldeberhan et al. [25] and Rincon et al. [32], among others, to test software evaluation models. The case study is presented in the following section.

5. Application of the proposed model

The evaluation methodology proposed here was applied in four manufacturing companies in the following sectors: synthetics, aluminum cans, paper and scaffolding. For reasons of confidentiality, the companies' names will not be given; they will instead be referred to as M1, M2, M3 and M4. In each company, the questionnaires were completed by the manager of the unit, and the results were presented and discussed in a group consisting of the manager and the individuals responsible for the processes and for systems.

5.1. First stage

After the maturity questionnaire for the model had been applied, the ICML for each of the six capabilities and the FMML were calculated for each company, as shown in Table 8.

Company M1 has capability levels essentially between the "structured" and "managed" levels and a global score between these levels. The "Specialist knowledge" capability is at an embryonic level.

Company M2 has higher capability levels. Apart from "Knowledge of simulation", which is at an embryonic level, these are basically between "managed" and "optimized". The global evaluation is between "managed" and "optimized".

Company M3, like M1, has maturity levels essentially between "structured" and "managed", and the global evaluation is "managed".

Finally, company M4 clearly has lower maturity levels, with three capabilities classified as "basic": "Knowledge of simulation", "Process organization" and "Management programs". The global evaluation indicates an embryonic maturity level.

The simulation methodology should therefore be used in moderation in company M4 and applied, for example, in minor stages in the process.

5.2. Second stage

After the software evaluation and selection questionnaire for the model had been applied, all the priorities were calculated, as shown in summary form in Tables 9–11 and in detail in Appendices B–D.

In this section, the results for each company are analyzed. First, some considerations are presented regarding the criteria that led to the weights assigned for each company. The prioritization of the three software packages after the evaluation for each subcriterion is also shown.

5.2.1. Company M1

Company M1 did not consider any of the criteria in the group Technical requirements completely indispensable. Portability between different hardware platforms and Compatibility with different operating systems were considered partially indispensable. Network version and Safety devices that limit use were classified as neutral criteria in the questionnaire. The group was evaluated as neutral by the user.

The criteria Statistical information on input data, Statistical information on output data and Consistency checking were considered completely indispensable in the Data input and output group. The user considered the criteria Import and export data, Automatic collection from other systems, Batch input and Interactive input unnecessary for their application and classified this group as partially dispensable.

In the groups Specific technical requirements for development and execution, only the criteria Import icons, Import graphical background images, Editors for graphical background images, Incorporate-merge models, Access to source code, User interaction, Activate or deactivate animation, 3D animation, Visual quality and Options available for experimentation were considered neutral. The criteria Optimization, Object-oriented programming and Level of customization were considered completely dispensable. The other criteria in these groups were considered completely indispensable, as the user deemed them to be necessary for the simulation model in their context. Both these groups were classified as completely indispensable.

Table 7
Pairwise comparison of the software packages S1, S2 and S3 for each criterion.

Group of criteria Criterion S1 S2 S3


General technical requirements Portability 1 1 1
Compatibility 1 1 1
Network version 3 1 3
Safety devices 1 1 1

Data input and output Import and export data 1 5 3


Statistical information on input data 3 1 5
Statistical information on output data 1 1 1
Automatic collection 1 1 1
Batch input 1 1 1
Interactive input 1 1 1
Consistency checking 3 1 1

Specific technical requirements: development Graphical interface 1 1 1


Library of icons 1 1 5
Icon editor 3 3 1
Import icons 1 1 5
Import graphical background images 1 1 5
Editors for graphical background images 3 3 1
Incorporate-merge models 5 1 5
Coding assistant 1 1 3
Source code 9 9 1
Internal functions 1 3 3
Editors for internal functions 3 3 1
Variables and attributes 1 1 1
Library of programmable objects 1 1 1
Scheduling 1 1 1
Routing rules 1 3 3
Queuing rules 1 1 1
Specific cost variables 1 3 3
Level of learning 1 1 5

Specific technical requirements: execution Configure parameters 1 1 3


User interaction 1 3 5
View an animation 3 3 1
Activate or deactivate animation 1 1 1
View instantaneous values of variables 1 1 1
3D animation 9 5 1
Optimization 3 1 5
Flowchart-based programming 1 9 9
Object-oriented programming 7 1 1
Level of customization 7 5 1
Level of programming 1 3 7
Visual quality 3 3 1
Options available for experimentation 3 3 1

Evaluation of efficiency Error location and correction 1 1 3


Model validation 1 1 1
Logical consistency 5 5 3

Technical support Manual 1 3 5


Tutorial 1 1 1
Online support 3 3 1
Demo version 1 5 3
Specialized training 1 1 3
Updates 1 1 1

Costs Acquisition 1 1 3
Installation 1 1 1
Training 5 1 3
Customization 1 1 1
Technical support 1 1 1

Table 8
Values of the ICMLs and FMML for companies M1, M2, M3 and M4.

Company | KS | PS | SK | PO | MA | MP | FMML
M1 | 70 | 63 | 33 | 71 | 67 | 56 | 60
M2 | 35 | 96 | 92 | 88 | 96 | 75 | 80
M3 | 35 | 96 | 67 | 75 | 83 | 56 | 69
M4 | 10 | 42 | 42 | 13 | 42 | 19 | 28

All the criteria in the group Evaluation of efficiency were considered completely indispensable, as was the group itself. The user considered the following assistants essential: the Error location assistant, as it allows the model to be tracked while it is executing and the processing to be verified; the Model validation assistant, as it allows syntax errors and inconsistencies in parameters or values to be identified; and the Logical consistency assistant, as it identifies errors and suggests solutions.

Table 9
Priority of the alternatives for each group of criteria – company M1.

Group of criteria S1 S2 S3 Weight of the Group


General technical requirements 0.308 0.383 0.308 0.052
Data input and output 0.255 0.447 0.298 0.037
Specific technical requirements: development 0.255 0.447 0.298 0.258
Specific technical requirements: execution 0.441 0.259 0.3 0.258
Evaluation of efficiency 0.345 0.345 0.31 0.258
Technical support 0.452 0.286 0.262 0.086
Costs 0.312 0.416 0.272 0.052
Final Priority 0.384 0.319 0.297

Table 10
Priority of the alternatives for each group of criteria – company M2.

Group of criteria S1 S2 S3 Weight of the Group


General technical requirements 0.3 0.4 0.3 0.048
Data input and output 0.335 0.367 0.298 0.079
Specific technical requirements: Development 0.351 0.288 0.361 0.238
Specific technical requirements: Execution 0.222 0.256 0.521 0.238
Evaluation of efficiency 0.345 0.345 0.31 0.238
Technical support 0.362 0.305 0.333 0.079
Costs 0.312 0.416 0.272 0.079
Final Priority 0.313 0.317 0.37

Table 11
Priority of the alternatives for each group of criteria – company M3.

Group of criteria S1 S2 S3 Weight of the Group


General technical requirements 0.24 0.521 0.24 0.027
Data input and output 0.28 0.416 0.304 0.081
Specific technical requirements: Development 0.387 0.403 0.211 0.243
Specific technical requirements: Execution 0.282 0.361 0.357 0.243
Evaluation of efficiency 0.345 0.345 0.31 0.243
Technical support 0.465 0.303 0.231 0.081
Costs 0.308 0.429 0.263 0.081
Final Priority 0.338 0.377 0.284

For the Technical support group, the user considered the criteria Tutorial, Demo version and Specialized training completely indispensable, while Manual, Online support and Updates were judged partially indispensable. The group was classified as partially indispensable.

The criteria in the group Costs, as well as the group itself, were classified as neutral, as the manager of the unit did not want to place any emphasis on these criteria when the questionnaire was applied.

The best alternative was the S1 software, which had the best evaluation with a total priority of 0.384, followed by S2 with 0.319 and S3 with 0.297.

S1 was therefore recommended for simulation modeling in company M1. This was confirmed by the perception of the group evaluating the software packages, which showed greater interest in interaction between processes and ease of programming and attached less importance to 3D animation, visual quality and optimization.

5.2.2. Company M2

Company M2 also did not consider any of the criteria in the group Technical requirements completely indispensable. Portability, Compatibility, Network version and Safety devices were classified as neutral criteria. The Technical requirements group was also evaluated as neutral.

The criteria Import and export data, Statistical information on input data, Statistical information on output data, Batch input, Interactive input and Consistency checking were considered completely indispensable in the group Data input and output. The criterion Automatic collection was considered partially indispensable, as was the group itself.

In the group Specific technical requirements for development, the criteria Library of icons, Import icons and Editors for internal functions were considered partially indispensable, while the criteria Internal functions and Import graphical background images were judged to be neutral by the user. The criteria Incorporate-merge models and Level of learning were considered partially dispensable. The other criteria in this group were judged to be completely indispensable, and the user considered Access to source code to be important. The group Specific technical requirements for development was considered completely indispensable.

The user classified the group Specific technical requirements for execution as completely indispensable. The criteria Optimization, Flowchart-based programming and Level of programming were classified as completely dispensable. The criteria Configure parameters, User interaction and Activate or deactivate animation were considered partially indispensable, while the other criteria in this group were considered completely indispensable. The user indicated that the criteria 3D animation and Level of customization were very useful when running the simulation model.

All the criteria in the group Evaluation of efficiency were considered completely indispensable, as was the group itself. The user judged the Error location, Model validation and Logical consistency assistants to be fundamental for developing and running the model.

In the Technical support group, the user considered the criteria Tutorial, Online support, Specialized training and Updates completely indispensable, while the criteria Manual and Demo version were judged neutral. The group itself was classified as partially indispensable.

The criteria in the Costs group were considered completely indispensable, but the group was considered partially indispensable. The user was more concerned with technical aspects.

The best alternative was the S3 software, which had the best evaluation with a total priority of 0.370, followed by S2 with 0.317 and S1 with 0.313.

S3 was therefore recommended for simulation modeling in company M2. This was confirmed by the software evaluation group, which placed greater emphasis on open-source code, 3D animation, object-oriented programming, visual quality, the options available for experimentation and customization.

5.2.3. Company M3

For company M3, the criterion Network version was classified as completely indispensable, while the criterion Compatibility was judged neutral by the user. The other criteria, such as Portability and Safety devices, were considered completely dispensable, and the Technical requirements group itself was also considered completely dispensable.

The criteria Statistical information on input data, Statistical information on output data, Batch input and Consistency checking were considered completely indispensable in the Data input and output group. The user considered the criteria Import and export data and Automatic collection neutral, and Interactive input completely dispensable. The group was classified as partially indispensable.

In the Specific technical requirements for development group, the user considered the criteria Graphical interface, Library of icons, Import icons, Import graphical background images, Incorporate-merge models, Coding assistant, Variables and attributes and Level of learning completely indispensable. The criteria Internal functions, Library of programmable objects, Scheduling, Routing rules, Queuing rules and Specific cost variables were classified as partially indispensable. The criterion Editors for graphical background images was classified as neutral. The criteria Icon editor and Editors for internal functions were judged partially dispensable. Only the criterion Source code was classified as completely dispensable. The Specific technical requirements for development group was considered completely indispensable.

The user classified the Specific technical requirements for execution group, together with the criteria Configure parameters, View an animation, Optimization and Object-oriented programming, as completely indispensable. The criteria User interaction, Level of programming, Visual quality and Options available for experimentation were classified as partially indispensable. Only the criterion Level of customization was classified as neutral. Activate or deactivate animation and View instantaneous values of variables were considered partially dispensable, while the criteria 3D animation and Flowchart-based programming were considered completely dispensable.

All the criteria in the group Evaluation of efficiency were considered completely indispensable, as was the group itself. The user judged the Error location, Model validation and Logical consistency assistants to be fundamental to develop and run the model.

In the Technical support group, the user considered the criteria Manual and Specialized training completely indispensable. The criteria Tutorial, Online support and Updates were judged partially indispensable. Only the criterion Demo version was considered partially dispensable. The group itself was classified as partially indispensable.

The Costs group was judged partially indispensable, as was the criterion Technical support. The remaining criteria, such as Acquisition, Installation, Training and Customization, were considered completely indispensable.

The best alternative was the S2 software, which had the best evaluation with a total priority of 0.377, followed by S1 with 0.338 and S3 with 0.284.

The S2 software was therefore recommended for simulation modeling in company M3. This was confirmed by the perception of the software evaluation group, which showed greater interest in Optimization, Object-oriented programming and Level of learning and less interest in 3D animation and Open-source code.

6. Conclusion

The goal of this study was to develop a two-stage methodology for evaluating and selecting discrete-event simulation software for use in the manufacturing sector. In the first stage, an approach to the evaluation of operational processes based on maturity models was presented. The second stage described a structured set of criteria for analyzing and selecting simulation software using the AHP method, which is widely used to solve multicriteria problems. This section discusses the strengths and limitations of the work.

The proposed model proved easy to apply. However, certain precautions must be taken when managers are evaluating the maturity of the processes involved. The aims of the maturity evaluation must be very clear, as the results are important when identifying problems in processes.

The maturity model proposed here was of benefit to the companies studied, as it showed the current maturity level of their production processes and identified the main areas where improvement is needed. However, it should be stressed that the main aim in using the maturity model was to determine whether the company under study was ready to use computer simulation.

Regarding software selection, the proposed model proved to be of value and aligned with recent technological advances and software developments. It was able to capture, in a structured manner, the criteria related to simulation software features that are of interest to managers in manufacturing companies. The weights assigned to the criteria must be chosen carefully to ensure that all criteria are evaluated according to the characteristics of the process being studied and the desired objectives. It should be stressed that the selected criteria must be adapted to the continuous technological evolution of the discrete-event simulation software and hardware available in the market.

It should also be highlighted that the hierarchical structure can be updated at any time, as criteria and software packages can be added or removed.

Another important point is that the use of the AHP method and the assumption of completely consistent evaluations resulted in a simplified selection process, which preserved the efficiency of the method and ensured that the application was easy to use for the decision makers. To ensure the correctness of the simplified method, the alternative used as the basis of comparison must be the one that appears to be the most important.

We do not intend to generalize the validity of the maturity model to aspects of the manufacturing process beyond readiness for plain DESS use. This can be considered a limitation of our proposal. Extensions of the maturity model can be the subject of further studies.

This study has made a valuable contribution to the academic literature and industry by describing the conception, development, and application of a structured methodology for evaluating and selecting discrete-event simulation software.
Acknowledgments

The second author thanks the CNPq (National Research Council) for the research support.

Appendix A

Summary of the simplified AHP method.

The fundamental scale proposed by Saaty compares the relative importance of two alternatives. Verbalization of the intensity of importance of the base alternative in relation to the one being compared can be associated with a numerical scale from 1 to 9, as shown in Table A1. Even values between 1 and 9 (2, 4, 6 and 8) are used for intermediate situations [31].

Table A1
Scale of judgment.

Intensity of importance    Definition
1                          Equal importance
3                          Moderate importance
5                          Strong importance
7                          Very strong importance
9                          Extreme importance

Fig. A1 represents a matrix with alternatives A1, A2 and A3, where Imn is the intensity of importance corresponding to line m and column n.

Fig. A1. Judgment matrix for alternatives A1, A2 and A3.

In the Saaty method, the relative priorities of the alternatives are synthesized in a vector corresponding to the eigenvector associated with the largest eigenvalue of the matrix, normalized to add up to 1. By definition, the judgment of j against i is the inverse of the judgment of i against j, i.e., Iji = 1/Iij. When there is consistency between the judgments, the judgment of alternative j against k can be calculated from the judgments of i against the other alternatives using the relationship Ijk = Iik/Iij. Saaty shows that there can be inconsistencies in judgments; these may be acceptable within certain limits defined in the AHP method. As inconsistency increases when less important alternatives are compared, and as the judgment of more important alternatives is done more carefully, the method can be simplified while at the same time retaining its effectiveness by taking one of the apparently more important alternatives as the basis for comparison with the others. Instead of using the complete matrix, only one very important alternative is compared with all the others.

The calculations can then be simplified if the judgments are assumed to be consistent. Leal [44] showed that if the judgments are consistent, the priorities can be calculated as shown in Eq. (A1), which gives the elements of the priority vector corresponding to the eigenvector of the judgment matrix associated with the greatest eigenvalue, normalized to 1:

Prk = (1/I1k) / Σn (1/I1n)    (A1)

All that is needed, therefore, is to compare a very important alternative with all the others and calculate the priorities using Eq. (A1).
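To make the simplified method concrete, the short Python sketch below implements Eq. (A1) together with the consistency relationship Ijk = Iik/Iij. It is an illustrative reconstruction, not code from the original study; the function names and the sample judgments are hypothetical.

def simplified_priorities(base_row):
    """Eq. (A1): priority vector from the judgments of one base alternative.

    base_row[k] holds I1k, the Saaty-scale intensity of importance of the
    base alternative (index 0) relative to alternative k; base_row[0] is 1,
    since an alternative compared with itself has equal importance.
    Assuming consistent judgments, Prk = (1/I1k) / sum over n of (1/I1n).
    """
    reciprocals = [1.0 / value for value in base_row]
    total = sum(reciprocals)
    return [r / total for r in reciprocals]


def complete_matrix(base_row):
    """Rebuild the full judgment matrix from its first row, using the
    consistency relationship Ijk = Iik/Iij with alternative 1 as i:
    element (j, k) equals I1k / I1j."""
    return [[k_val / j_val for k_val in base_row] for j_val in base_row]


# Hypothetical judgments with S1 as the base alternative: S1 is strongly
# more important than S2 (5) and moderately more important than S3 (3).
row = [1, 5, 3]
print(simplified_priorities(row))  # [0.652..., 0.130..., 0.217...]
print(complete_matrix(row))        # consistent 3 x 3 matrix; I23 = 3/5

Priority vectors that recur throughout Appendices B-D, such as 0.652/0.130/0.217 or 0.429/0.429/0.143, are exactly the outputs of Eq. (A1) for simple base rows such as [1, 5, 3] and [1, 1, 3].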
Appendix B. Company M1

Priorities of the alternatives for each criterion and between criteria.

Criteria  S1  S2  S3  Pr. criteria  Pr. criteria group
Portability  0.333  0.333  0.333  0.313  0.052
Compatibility  0.333  0.333  0.333  0.313
Network version  0.200  0.600  0.200  0.188
Safety devices  0.333  0.333  0.333  0.188
PrPSiG1  0.308  0.383  0.308
Import and export data  0.652  0.130  0.217  0.032  0.037
Statistical information on input data  0.217  0.652  0.130  0.290
Statistical information on output data  0.333  0.333  0.333  0.290
Automatic collection  0.333  0.333  0.333  0.032
Batch input  0.333  0.333  0.333  0.032
Interactive input  0.333  0.333  0.333  0.032
Consistency checking  0.143  0.429  0.429  0.290
PrPSiG2  0.255  0.447  0.298
Graphical interface  0.333  0.333  0.333  0.071  0.258
Library of icons  0.455  0.455  0.091  0.071
Icon editor  0.200  0.200  0.600  0.071
Import icons  0.455  0.455  0.091  0.014
Import graphical background images  0.455  0.455  0.091  0.014
Editors for graphical background images  0.200  0.200  0.600  0.014
Incorporate-merge models  0.143  0.714  0.143  0.014
Coding assistant  0.429  0.429  0.143  0.071
Source code  0.091  0.091  0.818  0.014
Internal functions  0.600  0.200  0.200  0.071
Editors for internal functions  0.200  0.200  0.600  0.071
Variables and attributes  0.333  0.333  0.333  0.071
Library of programmable objects  0.333  0.333  0.333  0.071
Scheduling  0.333  0.333  0.333  0.071
Routing rules  0.600  0.200  0.200  0.071
Queuing rules  0.333  0.333  0.333  0.071
Specific cost variables  0.600  0.200  0.200  0.071
Level of learning  0.455  0.455  0.091  0.071
PrPSiG3  0.255  0.447  0.298
Configure parameters  0.429  0.429  0.143  0.158  0.258
User interaction  0.652  0.217  0.130  0.032
View an animation  0.200  0.200  0.600  0.158
Activate or deactivate animation  0.333  0.333  0.333  0.032
View instantaneous values of variables  0.333  0.333  0.333  0.158
3D animation  0.085  0.153  0.763  0.032
Optimization  0.217  0.652  0.130  0.018
Flowchart-based programming  0.818  0.091  0.091  0.158
Object-oriented programming  0.067  0.467  0.467  0.018
Level of customization  0.106  0.149  0.745  0.018
Level of programming  0.677  0.226  0.097  0.158
Visual quality  0.200  0.200  0.600  0.032
Options available for experimentation  0.200  0.200  0.600  0.032
PrPSiG4  0.441  0.259  0.300
Error location and correction  0.429  0.429  0.143  0.333  0.258
Model validation  0.333  0.333  0.333  0.333
Logical consistency  0.273  0.273  0.455  0.333
PrPSiG5  0.345  0.345  0.310
Manual  0.652  0.217  0.130  0.083  0.086
Tutorial  0.333  0.333  0.333  0.250
Online support  0.200  0.200  0.600  0.083
Demo version  0.652  0.130  0.217  0.250
Specialized training  0.429  0.429  0.143  0.250
Updates  0.333  0.333  0.333  0.083
PrPSiG6  0.452  0.286  0.262
Acquisition  0.429  0.429  0.143  0.429  0.052
Installation  0.333  0.333  0.333  0.333
Training  0.130  0.652  0.217  0.130
Customization  0.333  0.333  0.333  0.333
Technical support  0.333  0.333  0.333  0.333
PrPSiG7  0.312  0.416  0.272
PrSi  0.384  0.319  0.297

Appendix C. Company M2

Priorities of the alternatives for each criterion and between criteria.

Criteria  S1  S2  S3  Pr. criteria  Pr. criteria group
Portability  0.333  0.333  0.333  0.250  0.048
Compatibility  0.333  0.333  0.333  0.250
Network version  0.200  0.600  0.200  0.250
Safety devices  0.333  0.333  0.333  0.250
PrPSiG1  0.300  0.400  0.300
Import and export data  0.652  0.130  0.217  0.158  0.079
Statistical information on input data  0.217  0.652  0.130  0.158
Statistical information on output data  0.333  0.333  0.333  0.158
Automatic collection  0.333  0.333  0.333  0.053
Batch input  0.333  0.333  0.333  0.158
Interactive input  0.333  0.333  0.333  0.158
Consistency checking  0.143  0.429  0.429  0.158
PrPSiG2  0.335  0.367  0.298
Graphical interface  0.333  0.333  0.333  0.079  0.238
Library of icons  0.455  0.455  0.091  0.026
Icon editor  0.200  0.200  0.600  0.079
Import icons  0.455  0.455  0.091  0.026
Import graphical background images  0.455  0.455  0.091  0.016
Editors for graphical background images  0.200  0.200  0.600  0.079
Incorporate-merge models  0.143  0.714  0.143  0.011
Coding assistant  0.429  0.429  0.143  0.079
Source code  0.091  0.091  0.818  0.079
Internal functions  0.600  0.200  0.200  0.016
Editors for internal functions  0.200  0.200  0.600  0.026
Variables and attributes  0.333  0.333  0.333  0.079
Library of programmable objects  0.333  0.333  0.333  0.079
Scheduling  0.333  0.333  0.333  0.079
Routing rules  0.600  0.200  0.200  0.079
Queuing rules  0.333  0.333  0.333  0.079
Specific cost variables  0.600  0.200  0.200  0.079
Level of learning  0.455  0.455  0.091  0.011
PrPSiG3  0.351  0.288  0.361
Configure parameters  0.429  0.429  0.143  0.040  0.238
User interaction  0.652  0.217  0.130  0.040
View an animation  0.200  0.200  0.600  0.120
Activate or deactivate animation  0.333  0.333  0.333  0.040
View instantaneous values of variables  0.333  0.333  0.333  0.120
3D animation  0.085  0.153  0.763  0.120
Optimization  0.217  0.652  0.130  0.013
Flowchart-based programming  0.818  0.091  0.091  0.013
Object-oriented programming  0.067  0.467  0.467  0.120
Level of customization  0.106  0.149  0.745  0.120
Level of programming  0.677  0.226  0.097  0.013
Visual quality  0.200  0.200  0.600  0.120
Options available for experimentation  0.200  0.200  0.600  0.120
PrPSiG4  0.222  0.256  0.521
Error location and correction  0.429  0.429  0.143  0.333  0.238
Model validation  0.333  0.333  0.333  0.333
Logical consistency  0.273  0.273  0.455  0.333
PrPSiG5  0.345  0.345  0.310
Manual  0.652  0.217  0.130  0.044  0.079
Tutorial  0.333  0.333  0.333  0.221
Online support  0.200  0.200  0.600  0.221
Demo version  0.652  0.130  0.217  0.074
Specialized training  0.429  0.429  0.143  0.221
Updates  0.333  0.333  0.333  0.221
PrPSiG6  0.362  0.305  0.333
Acquisition  0.429  0.429  0.143  0.200  0.079
Installation  0.333  0.333  0.333  0.200
Training  0.130  0.652  0.217  0.200
Customization  0.333  0.333  0.333  0.200
Technical support  0.333  0.333  0.333  0.200
PrPSiG7  0.312  0.416  0.272
PrSi  0.313  0.317  0.370

Appendix D. Company M3

Priorities of the alternatives for each criterion and between criteria.

Criteria  S1  S2  S3  Pr. criteria  Pr. criteria group
Portability  0.333  0.333  0.333  0.078  0.027
Compatibility  0.333  0.333  0.333  0.141
Network version  0.200  0.600  0.200  0.703
Safety devices  0.333  0.333  0.333  0.078
PrPSiG1  0.240  0.521  0.240
Import and export data  0.652  0.130  0.217  0.044  0.081
Statistical information on input data  0.217  0.652  0.130  0.222
Statistical information on output data  0.333  0.333  0.333  0.222
Automatic collection  0.333  0.333  0.333  0.044
Batch input  0.333  0.333  0.333  0.222
Interactive input  0.333  0.333  0.333  0.025
Consistency checking  0.143  0.429  0.429  0.222
PrPSiG2  0.280  0.416  0.304
Graphical interface  0.333  0.333  0.333  0.094  0.243
Library of icons  0.455  0.455  0.091  0.094
Icon editor  0.200  0.200  0.600  0.013
Import icons  0.455  0.455  0.091  0.094
Import graphical background images  0.455  0.455  0.091  0.094
Editors for graphical background images  0.200  0.200  0.600  0.019
Incorporate-merge models  0.143  0.714  0.143  0.094
Coding assistant  0.429  0.429  0.143  0.094
Source code  0.091  0.091  0.818  0.010
Internal functions  0.600  0.200  0.200  0.031
Editors for internal functions  0.200  0.200  0.600  0.013
Variables and attributes  0.333  0.333  0.333  0.094
Library of programmable objects  0.333  0.333  0.333  0.031
Scheduling  0.333  0.333  0.333  0.031
Routing rules  0.600  0.200  0.200  0.031
Queuing rules  0.333  0.333  0.333  0.031
Specific cost variables  0.600  0.200  0.200  0.031
Level of learning  0.455  0.455  0.091  0.094
PrPSiG3  0.387  0.403  0.211
Configure parameters  0.429  0.429  0.143  0.166  0.243
User interaction  0.652  0.217  0.130  0.055
View an animation  0.200  0.200  0.600  0.166
Activate or deactivate animation  0.333  0.333  0.333  0.024
View instantaneous values of variables  0.333  0.333  0.333  0.024
3D animation  0.085  0.153  0.763  0.018
Optimization  0.217  0.652  0.130  0.166
Flowchart-based programming  0.818  0.091  0.091  0.018
Object-oriented programming  0.067  0.467  0.467  0.166
Level of customization  0.106  0.149  0.745  0.033
Level of programming  0.677  0.226  0.097  0.055
Visual quality  0.200  0.200  0.600  0.055
Options available for experimentation  0.200  0.200  0.600  0.055
PrPSiG4  0.282  0.361  0.357
Error location and correction  0.429  0.429  0.143  0.333  0.243
Model validation  0.333  0.333  0.333  0.333
Logical consistency  0.273  0.273  0.455  0.333
PrPSiG5  0.345  0.345  0.310
Manual  0.652  0.217  0.130  0.318  0.081
Tutorial  0.333  0.333  0.333  0.106
Online support  0.200  0.200  0.600  0.106
Demo version  0.652  0.130  0.217  0.045
Specialized training  0.429  0.429  0.143  0.318
Updates  0.333  0.333  0.333  0.106
PrPSiG6  0.465  0.303  0.231
Acquisition  0.429  0.429  0.143  0.231  0.081
Installation  0.333  0.333  0.333  0.231
Training  0.130  0.652  0.217  0.231
Customization  0.333  0.333  0.333  0.231
Technical support  0.333  0.333  0.333  0.077
PrPSiG7  0.308  0.429  0.263
PrSi  0.338  0.377  0.284
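Reading Appendices B-D is easier with the synthesis rule in mind: within each group, the alternative priorities of the individual criteria are weighted by the Pr. criteria column to produce the PrPSiG rows, and the PrPSiG rows are in turn weighted by the Pr. criteria group column to produce the overall PrSi row. The Python sketch below illustrates this standard weighted-sum AHP synthesis; the function names are our own, and the numbers are small illustrative values rather than entries taken from the tables above.

def group_priority(criteria):
    """Combine alternative priorities within one criteria group.

    criteria: list of (weight, priorities) pairs, where weight is the
    criterion's Pr. criteria value and priorities is its S1/S2/S3 vector.
    Returns the group's PrPSiG vector.
    """
    n_alt = len(criteria[0][1])
    return [sum(w * pr[a] for w, pr in criteria) for a in range(n_alt)]


def overall_priority(groups):
    """Combine the PrPSiG vectors using the Pr. criteria group weights,
    yielding the PrSi row of the appendix tables."""
    n_alt = len(groups[0][1])
    return [sum(w * g[a] for w, g in groups) for a in range(n_alt)]


# Hypothetical two-group hierarchy with three software alternatives.
g1 = group_priority([(0.6, [0.652, 0.130, 0.217]),
                     (0.4, [0.333, 0.333, 0.333])])
g2 = group_priority([(1.0, [0.143, 0.429, 0.429])])
print(overall_priority([(0.7, g1), (0.3, g2)]))  # sums to ~1.0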
References

[1] J. Koch, S. Maisenbacher, M. Maurer, G. Reinhart, M.F. Zäh, Structural modeling of extended manufacturing systems – an approach to support changeability by reconfiguration planning, Procedia CIRP 17 (2014) 142–147.
[2] Y.G. Sandanayake, C.F. Oduoza, D.G. Proverbs, A systematic modelling and simulation approach for JIT performance optimization, Robot. Comput. Manuf. 24 (2008) 735–743.
[3] Y.G. Sandanayake, C.F. Oduoza, Dynamic simulation for performance optimization in just-in-time-enabled manufacturing processes, Int. J. Adv. Manuf. Technol. 42 (2009) 372–380.
[4] A. Azadeh, A. Maghsoudi, Optimization of production systems through integration of computer simulation, design of experiment, and Tabu search: the case of a large steelmaking workshop, Int. J. Adv. Manuf. Technol. 48 (2010) 785–800.
[5] V.B. Sawant, S.S. Mohite, A decision-making framework for simulation software selection problem using a preference selection index method, Springer Communications in Computer and Information Science Series, (2011), pp. 176–181.
[6] M. Bosch-Mauchand, A. Siadat, N. Perry, A. Bernard, VCS: value chains simulator, a tool for value analysis of manufacturing enterprise processes (a value-based decision support tool), J. Intell. Manuf. 23 (2012) 1389–1402.
[7] U.S.B. Rakiman, A.T. Bon, Production line: effect of different inspection station allocation, Procedia Eng. 53 (2013) 509–515.
[8] A. Gupta, K. Singh, R. Verma, A critical study and comparison of manufacturing simulation softwares using analytic hierarchy process, J. Eng. Sci. Technol. 5 (1) (2010) 108–129.
[9] A. Azadeh, S.N. Shirkouhi, K. Rezaie, A robust decision-making methodology for evaluation and selection of simulation software package, Int. J. Adv. Manuf. Technol. 47 (2010) 381–393.
[10] R.K. Yin, Case Studies Research. Design and Methods, Sage Publications, 1994.
[11] K.M. Eisenhardt, Building theories from case study research, Acad. Manag. Rev. 14 (4) (1989) 532–550.
[12] W. Pullen, A public sector HPT maturity model, Perform. Improv. 46 (2007) 9–15.
[13] P.B. Crosby, Quality Is Free, McGraw-Hill, New York, NY, 1979.
[14] H. Kerzner, Strategic Planning for Project Management Using a Project Management Maturity Model, John Wiley & Sons, New York, 2001.
[15] D. Prado, Maturidade em Gerência de Projetos, INDG Tecnologia e Serviços Ltda, Nova Lima, 2008.
[16] M. Hammer, The audit process, Harvard Bus. Rev. 35 (4) (2007) 73–84.
[17] Z. Lianyinga, H. Jinga, Z. Xinxing, The project management maturity model and application based on PRINCE2, Procedia Eng. 29 (2012) 3691–3697.
[18] A. Raza, L.F. Capretz, F. Ahmed, An open source usability maturity model (OS-UMM), Comput. Hum. Behav. 28 (2012) 1109–1121.
[19] ORMS Today, Simulation Software Survey, 2015. http://www.orms-today.org/surveys/Simulation/Simulation1.html (accessed November 2015).
[20] J. Banks, Selecting simulation software, Proceedings of the Winter Simulation Conference, (1991), pp. 15–20.
[21] G. Mackulak, J. Cochran, P. Savory, Ascertaining important features for industrial simulation environments, Simulation 63 (4) (1994) 211–221.
[22] L. Davis, G. Williams, Evaluating and selecting simulation software using the analytic hierarchy process, Integr. Manuf. Syst. 5 (1) (1994) 23–32.
[23] V. Hlupic, Z. Irani, R.J. Paul, Evaluation framework for simulation software, Int. J. Adv. Manuf. Technol. 15 (1999) 366–382.
[24] J. Nikoukaran, V. Hlupic, R.J. Paul, A hierarchical framework for evaluating simulation software, Simul. Pract. Theory 7 (1999) 219–231.
[25] T.W. Tewoldeberhan, A. Verbraeck, E. Valentin, G. Bardonnet, An evaluation and selection methodology for discrete-event simulation software, Proceedings of the Winter Simulation Conference, (2002), pp. 67–75.
[26] J.K. Cochran, H.N. Chen, Fuzzy multi-criteria selection of object-oriented simulation software for production system analysis, Comput. Oper. Res. 32 (2005) 153–168.
[27] J. Chai, J.N.K. Liu, E.W.T. Ngai, Application of decision-making techniques in supplier selection: a systematic review of literature, Expert Syst. Appl. 40 (2013) 3872–3885.
[28] A.S. Jadhav, R.M. Sonar, Framework for evaluation and selection of the software packages: a hybrid knowledge based system approach, J. Syst. Softw. 84 (2011) 1394–1407.
[29] A. Weckenmann, G. Akkasoglu, Methodic design of a customized maturity model for geometrical tolerancing, Procedia CIRP 10 (2013) 119–124.
[30] Project Management Institute – PMI, Organizational Project Management Maturity Model (OPM3) Knowledge Foundation, Project Management Institute, Inc., Pennsylvania, USA, 2003.
[31] T.L. Saaty, The Analytic Hierarchy Process, McGraw Hill, New York, 1980.
[32] G. Rincon, M. Alvarez, M. Perez, S. Hernandez, A discrete-event simulation and continuous software evaluation on a systemic quality model: an oil industry case, Inf. Manag. 42 (2005) 1051–1066.
[36] P. Mendes, J.E. Leal, A.M.T. Thomé, A maturity model for demand-driven supply chains in the consumer product goods industry, Int. J. Prod. Econ. 179 (2016) 153–165.
[37] P. Mendes, A Framework for Assessing and Guiding Progress Towards a Demand Driven Supply Chain (DDSC), Doctoral Thesis, Industrial Engineering Department, Pontifícia Universidade Católica do Rio de Janeiro, 2010.
[44] J.E. Leal, Método AHP: Análise do Método Simplificado de Cálculo, Memorando Técnico do DEI, Departamento de Engenharia Industrial, PUC-RJ, 2008.
