
Information & Management 37 (2000) 135-151

Toward an empirical understanding of computer simulation
implementation success
Roger McHaney a,*, Timothy Paul Cronan b

a Department of Management, College of Business Administration, Kansas State University, Manhattan, KS 66506, USA
b Computer Information Systems and Quantitative Analysis, College of Business Administration, University of Arkansas, Fayetteville, AR 72701, USA

* Corresponding author. Tel.: +1-785-5327479; fax: +1-785-5327024.
E-mail addresses: mchaney@ksu.edu (R. McHaney), cronan@comp.uark.edu (T.P. Cronan).

Received 8 February 1999; accepted 30 July 1999

Abstract

This study details the empirical development of a seven-factor contingency model of simulation success. The seven factors are software characteristics, operational cost characteristics, software environment characteristics, simulation software output characteristics, organizational support characteristics, initial investment cost characteristics, and task characteristics. This exploratory model is derived from salient factors hypothesized by researchers and practitioners in the simulation and IS literature, based on the premise that computer simulation can be classified as a representational DSS. Additional analysis uses a regression model to rank the strength of these factors in their relationship to end-user computing satisfaction. The article concludes with a discussion of how the developed model can serve as a guideline for developers of simulation software and support those seeking to use computer simulation in organizational decision-making settings. © 2000 Elsevier Science B.V. All rights reserved.

Keywords: Computer simulation; Decision support systems; End-user computing satisfaction; Information systems success

1. Introduction

Computerized IS have been used successfully to manage the vast quantity of information available to business leaders and to help apply this information in ways that lead to competitive advantage and economic gain. To remain competitive, organizations can no longer afford to allow decision making to be conducted using nonscientific methods. This climate has fostered the search for better technologies, tools, and methodologies to aid in the decision-making process.
Computer simulation allows decision makers to pose 'what-if' questions and learn more about the dynamics of a system. Within this context, it becomes a decision support tool [33].
While implementations of computer simulation have been reported with varying levels of success [16,20,26,48,60] and failure [7,15,30], the underlying factors relating to these outcomes have not been investigated empirically.
Increased recognition of computer simulation's value has stimulated demand and encouraged a wide variety of software vendors to enter the market [50,56]. In general, this situation has been beneficial to the business decision maker, but the changing corporate environment has resulted in a need to better understand and improve the process of developing, selecting, and implementing computer simulation technology [32,37,55].
Some reported failures have led researchers to develop methodologies for the analysis of simulation software implementation [3,18,46,61]. During implementation, requirements must be paralleled with those of existing software and procedures. Like general-purpose software, computer simulation tools are designed to meet the perceived needs of the eventual decision maker; however, the many types of users complicate this task. The inability of users to develop appropriate requirements has been recognized in the literature as being "perhaps, the greatest problem in the military-analytical [simulation] community" [43].
This study examines computer simulation in its role as a decision support tool. The computer simulation literature is used to identify recurrent factors believed to influence success. These factors are organized into a contingency framework that extends a decision support systems (DSS) success model developed by Guimaraes, Igbaria and Lu [24]. Empirical data are collected and used to confirm this model with factor analysis. Finally, the model is tested via regression against success measures.
Computer simulation implementation literature is in its infancy at best. Most published recommendations can be classified as speculative [21,22,25,34,44,45,47] and are unsupported empirically. While researchers have no specific computer simulation research framework upon which to build, this study supports the view that computer simulation is a decision support tool [1,17,23,28,29,35,53].

2. The study

The objective of the first phase of this study was to examine the speculative literature to gather and organize characteristics that are believed to influence simulation implementation success. A contingency framework was adopted from the DSS literature and used as a general means of organizing the data. According to Tait and Vessey [57], "Contingency theory itself has no content, it is merely a framework for organizing knowledge in a given area." This, together with its use in studies of IS success [5,12,19,49,51] and DSS research [40], makes contingency theory suitable for an exploratory investigation.
2.1. Phase 1: contingency model of implementation success

Using Alter's classification of simulation as a representational decision support system, the success contingency themes could be expected to be very similar to the general DSS success factors identified by information systems researchers [24]: decision maker characteristics, task characteristics, decision support system characteristics, and implementation characteristics [36].
The implementation factors and environmental characteristics were categorized into five areas: simulation analyst characteristics, task characteristics, simulation product characteristics [4,39,44], organizational characteristics [38], and simulation software provider characteristics.
These recurrent themes were expanded through a breakdown of the simulation product category into eight sub-groupings: input, processing, statistical, output, software environment, animation capability, costs, and level of product development. This breakdown and the individual variables categorized within each grouping are shown in Fig. 1. Table 1 defines each variable.
These contingency themes very closely match the general DSS factors of Guimaraes, Igbaria, and Lu. Dubin [14] calls this approach invention by extension. Other than nomenclature, only a single real difference exists: the implementation characteristics in the integrated model of DSS success are replaced by organizational and software provider characteristics. While computer simulations are often developed in-house, the simulation language or product is generally acquired from a vendor. Fig. 2 illustrates both the general DSS success model and the proposed computer simulation implementation success model.
Fig. 1. Individual variables organized by recurrent theme.

3. Research methodology

3.1. Validity of dependent variable

The Doll and Torkzadeh measure of End-User Computing Satisfaction (EUCS) was administered as the primary dependent variable in this study.


One hundred and nineteen respondents completed this portion of the survey. The validity of this instrument (the extent to which it measures what it is intended to measure) is assessed in two ways [54].
3.1.1. Construct validity

Construct validity is the degree to which the measures chosen are true constructs describing the event of interest rather than artifacts of the methodology itself [9]. Correlation analysis and confirmatory factor analysis can be used to assess construct validity. First, a factor analysis was performed to confirm that the data reflected the psychometric properties of the EUCS instrument. Doll, Xia, and Torkzadeh [13] recommend using a second-order factor structure. This recommendation was verified for use with representational decision support system applications and previously validated [41]. The second-order structure is a single factor, called End-User Computing Satisfaction. The first-order structure consists of five factors: accuracy, content, ease of use, format, and timeliness. Five-position Likert-type scales were used to score the responses.
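For concreteness, the following sketch (not from the paper) shows how responses to the published 12-item EUCS instrument can be scored into the five first-order dimensions and a second-order total. The item-to-dimension grouping and column names are assumptions, since the paper does not reproduce the items themselves.

import pandas as pd

# Hypothetical item grouping following the published Doll and Torkzadeh instrument.
DIMENSIONS = {
    "content":     ["c1", "c2", "c3", "c4"],
    "accuracy":    ["a1", "a2"],
    "format":      ["f1", "f2"],
    "ease_of_use": ["e1", "e2"],
    "timeliness":  ["t1", "t2"],
}

def score_eucs(responses: pd.DataFrame) -> pd.DataFrame:
    """responses: one row per respondent, items scored 1-5 (Likert)."""
    scores = pd.DataFrame(index=responses.index)
    for dim, items in DIMENSIONS.items():
        scores[dim] = responses[items].mean(axis=1)        # first-order dimension score
    # Second-order EUCS score: sum of all 12 items (range 12-60, consistent with
    # the dependent-variable mean of 46.5 reported later in Table 9).
    scores["eucs_total"] = responses[sum(DIMENSIONS.values(), [])].sum(axis=1)
    return scores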
3.1.2. Convergent validity

Convergent validity is used to determine whether the measure of interest correlates with other related measures [31]. Computer simulation implementation success was measured with the EUCS instrument. This measure was correlated with three other measures. Two were single questions (Q1: Were you satisfied with the simulation system? and Q2: Was the simulation system successful?). The third corroborating measure was the survey selection process.


Table 1
Variable definitions for Fig. 1
I. Simulation analyst characteristics
an1: Background of developer
an2: Knowledge of simulation methodology
an3: Simulation education of developer
II. Task characteristics
t1: Intended use of simulation
t2: Project/system complexity
t3: Level of simulation detail
t4: Use of a structured approach in model development
III. Simulation software product characteristics
A. Input features
i1: Interface to other software
i2: Input data analysis capability
i3: Portability
i4: Syntax
i5: Modeling flexibility/problem capability
i6: Modeling conciseness/programming style
i7: Structural modularity/macros

i8: Specialty application modules
i9: Attributes for entities
i10: Global variables
B. Processing features
p1: Execution speed
p2: Maximum model size
p3: Hardware platform
C. Statistical features
s1: Random-number generators
s2: Random deviate generators
s3: Standard distributions
s4: Observed distributions
s5: Independent replications
s6: Warm-up period/reset
s7: Confidence intervals
D. Output features
o1: Standard reports
o2: Customized reports
o3: Business graphics
o4: File creation

o5: Trace capabilities
o6: Summarization of multiple model runs
o7: Output data analysis
o8: Individual model output observations
o9: High resolution graphics displays
E. Simulation software environment features
e1: User interface
e2: Ease of learning
e3: On-line help
e4: On-line tutorial
e5: Interactive debugging
e6: Degree of interaction

F. Animation capability
a1: Animation ease of development
a2: Quality of picture
a3: Smoothness of movement
a4: Portability for remote viewing
a5: User-defined icons
a6: CAD interface
G. Costs
c1: Hardware cost
c2: Software cost
c3: Acquisition cost
c4: Operation cost
c5: Model modification costs
c6: Interface costs
c7: Maintenance costs
c8: Training costs
c9: Computer run time costs
H. Level of simulation product development
d1: Degree of product validation and verification
d2: Acceptance by experts
d3: Number of active users
d4: Database sophistication
IV. Simulation software provider characteristics
sp1: Reputation
sp2: Reliability
sp3: History
sp4: Stability
sp5: General customer support
sp6: Training
sp7: Technical support
sp8: Frequency of updates and enhancements
sp9: Warranty
sp10: Support line
sp11: Quality documentation
V. Organizational characteristics
or1: Mentorship
or2: Teamwork
or3: Corporate goals
or4: Future frequency of use

Respondents were asked to fill out one of two survey forms based on their perception of the success of the reported simulation project. All converging-measure correlations were significant at the 0.0001 level (Q1 at 0.80, Q2 at 0.66, and the survey selection process at 0.43).
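A minimal sketch of this convergent-validity check, assuming hypothetical column names for the EUCS total, the two single-item questions, and a 0/1 indicator for which survey form was chosen (the correlation with a binary indicator is a point-biserial coefficient):

import pandas as pd
from scipy.stats import pearsonr

def convergent_validity(df: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for col in ["q1_satisfied", "q2_successful", "chose_success_form"]:
        r, p = pearsonr(df["eucs_total"], df[col])   # point-biserial when col is 0/1
        rows.append({"measure": col, "r": round(r, 2), "p": p})
    return pd.DataFrame(rows)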
3.2. Measurement of the dependent variable

In this research, an abstract concept, simulation implementation success, is measured as the dependent variable. A pilot test, specific to the area of computer simulation, was conducted in earlier research [41].


Fig. 2. The Guimaraes, Igbaria, and Lu integrated model of DSS success compared with the proposed model of computer simulation success.

In this test, EUCS was found to be a valid and reliable surrogate measure for computer simulation implementation success.
Additionally, McHaney, Hightower and White [42] conducted a test-retest study, which provided additional evidence that, when applied to users of computer simulation, the EUCS instrument remains internally consistent and stable.
3.3. Independent variables

Most of the independent variables were operationalized as simple questions with tangible answers. Since no instrument for measuring items associated with computer simulation implementation success exists, questions for measuring the independent variables were constructed and validated through pretesting and other conventional methods [2].

3.4. Sampling procedure

This study examined users of discrete event computer simulation. Five hundred and three potential simulation users were randomly selected from a pool consisting of the membership of the Society for Computer Simulation and recent contributors to the Winter Simulation Conference. Informational letters describing the study were mailed out, followed by a package with two questionnaire forms. The first form asked for a report on a successful simulation project. The second form asked for a report on a less-than-successful simulation project. Respondents were told that successful simulations are efforts that produce accurate and useful information within time, budget, and schedule constraints.
A total of 125 of the responses were usable. Although 184 questionnaires were returned, 59 were removed from the study. Reasons for removal included that the respondent did not use simulation, reported only an academic involvement with simulation, or could not be classified as a simulation user (i.e., acted solely as a programmer). To be included in the analysis, a respondent needed to report on a real-world simulation project. Fourteen additional packets were undeliverable.
Forty of the 125 usable responses were paired, meaning the respondent reported on both a successful and a less-than-successful simulation project. Thus 105 different individuals/companies reported on a total of 125 different simulation projects. The net response rate of usable surveys from unique sources was 21.5 percent (105 of the 489 delivered packets).

3.5. Representativeness of returns

Several demographic measures were taken to determine whether the respondents were unnaturally concentrated or whether effects confounding the measurement of success appear to exist. Among the areas investigated were occupation, years of experience, and software package (Table 2). Other concerns addressed animation use, animation/statistics importance, and use of external vendors. When these potential confounds were examined, their possible influences on the dependent variable, end-user computing satisfaction, were of primary interest. However, due to concerns related to methods variance, alternative measures of simulation implementation success were also examined: a single-line item for satisfaction, a single-line item for success [11,12], and whether a successful or less-than-successful simulation project survey was selected.
Table 3 contains a summary of the significance of potential confounds for this study. None of the general demographics, such as occupation [8,59], years of experience, or animation/statistics importance, directly correlated with success. Other measures did significantly correlate with simulation implementation success: use of animation, use of an external vendor, and simulation software type. While the animation and external vendor items were not included in further analysis (due to low response rates), this analysis indicates that these items may play an important role in simulation implementation success. The significant relationship between software product and simulation implementation success was not unexpected.

Table 2
Respondent demographics

Occupation                  Frequency    Percent
Professor/consultant        32           25.6
Engineer                    24           19.2
Manager/planner             20           16.0
Scientist/researcher        20           16.0
Analyst/programmer          17           13.6
Self-employed/consultant    12           9.6

Years of experience         Frequency    Percent
0-5                         27           21.6
6-10                        40           32.0
11-15                       25           20.0
15-20                       18           14.4
More than 20                13           10.4
Not reporting               2            1.6

Project software            Frequency    Percent
GPSS/H                      14           11.2
SIMAN                       12           9.6
FORTRAN                     12           9.6
AUTOMOD                     11           8.8
SLAM                        11           8.8
SIMSCRIPT                   11           8.8
C/C++                       8            6.4
PROMODEL                    5            4.0
MICROSAINT                  4            3.2
INSIGHT                     4            3.2
EXTEND                      3            2.4
AUTOMATION MASTER           3            2.4
MATLAB                      2            1.6
ADA                         2            1.6
SES Workbench               2            1.6
Others (a)                  14           11.2
Not reporting               2            1.6

(a) Each of the following was reported once: GPSS/PC/WORLD, NETWORK, ManSim, BASIC, MAISIE, ADAM, ROSS, PL/1, FILM, EXCEL, PASCAL, STELLA, FACTOR, SIMPROCESS, SIGMA.

The various items forming the simulation software product characteristics factor vary according to the software being used. It is interesting to note that simulation software specialty packages earned a significantly higher end-user computing satisfaction score than did traditional simulation languages.

4. Results

A confirmatory factor analysis procedure was used to determine whether the structure shown in Fig. 3 existed in the collected data.

Table 3
Summary of significance of potential confounding variables (a)

Variable name                          EUCS       Single-line success    Single-line satisfaction    Success/failure survey
Occupation                             0.456      0.886                  0.156                       0.992
Years of experience                    0.431      0.824                  0.345                       0.984
Product type                           0.082*     0.419                  0.068*                      0.930
Importance of statistics/animation     0.131      0.873                  0.931                       0.657
Animation use                          0.047**    0.064*                 0.054**                     0.271
External vendor use                    0.888      0.158                  0.068*                      0.216

(a) *, significant at the 0.10 level; **, significant at the 0.05 level.

Prior to conducting the confirmatory factor analysis required to test this hypothesis, variables relating to animation and vendor characteristics had to be removed; had they been retained, only fifty-four usable respondents would have remained in the data set. The six animation variables and the eleven software provider variables were therefore removed, and a four-factor model was tested.
4.1. Confirmatory analysis of independent variables

A confirmatory factor analysis was run using the SAS PROC CALIS program [52]. The fit of the data to the hypothesized model was assessed using several measures. The first was the χ² goodness-of-fit measure. Analysis indicated that the data collected did not fit the hypothesized factor structure (χ² = 4084.9, P > χ² = 0.0001). The goodness-of-fit and adjusted goodness-of-fit indexes were 0.42 and 0.38, respectively. Bentler and Bonett's non-normed index was 0.36, and Bollen's non-normed index was 0.39 [6]. Thus, the collected data did not confirm the hypothesized structure.
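For readers reproducing this kind of assessment outside SAS, the following Python sketch computes the maximum-likelihood χ² and the GFI/AGFI indexes from a sample covariance matrix and a model-implied covariance matrix. It assumes the implied matrix has already been produced by some CFA estimation step; it illustrates only the standard fit formulas, not the authors' PROC CALIS run.

import numpy as np

def ml_fit_statistics(S: np.ndarray, Sigma: np.ndarray, n: int, dof: int):
    """Return (chi_square, GFI, AGFI) for n observations and dof model degrees of freedom.
    S is the sample covariance matrix, Sigma the model-implied covariance matrix."""
    p = S.shape[0]
    inv_Sigma = np.linalg.inv(Sigma)
    # Maximum-likelihood discrepancy function.
    f_ml = (np.log(np.linalg.det(Sigma)) - np.log(np.linalg.det(S))
            + np.trace(S @ inv_Sigma) - p)
    chi_square = (n - 1) * f_ml
    ratio = inv_Sigma @ S
    gfi = 1 - np.trace((ratio - np.eye(p)) @ (ratio - np.eye(p))) / np.trace(ratio @ ratio)
    agfi = 1 - (p * (p + 1) / (2.0 * dof)) * (1 - gfi)
    return chi_square, gfi, agfi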
An exploratory factor analysis was then run to summarize the interrelationships among the variables and determine whether reasonable factors would emerge. The correlation matrix was found to be significantly different from zero: Bartlett's sphericity test indicated a χ² value above 54,000 with a significance level of 0.00. Thus, the intercorrelation matrix contains enough common variance to make factor analysis viable. Kaiser's MSA was 0.76, adequate for an exploratory study. The number of factors to retain was then selected; the initial starting point of seven factors was chosen using Horn's test [27], Velicer's MAP [58], and a scree plot.
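The checks named above can be illustrated with a short numpy sketch (the authors used SAS; this is not their code). It computes Bartlett's sphericity χ², the overall Kaiser MSA (KMO), and the number of factors suggested by Horn's parallel analysis.

import numpy as np

def bartlett_sphericity(X: np.ndarray):
    """X: n observations x p variables. Returns (chi_square, degrees of freedom)."""
    n, p = X.shape
    R = np.corrcoef(X, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    return chi2, p * (p - 1) // 2

def kaiser_msa(X: np.ndarray) -> float:
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    R = np.corrcoef(X, rowvar=False)
    inv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                       # anti-image (partial) correlations
    np.fill_diagonal(R, 0.0)
    np.fill_diagonal(partial, 0.0)
    return (R ** 2).sum() / ((R ** 2).sum() + (partial ** 2).sum())

def parallel_analysis(X: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
    """Horn's test: retain components whose eigenvalues exceed the mean
    eigenvalues obtained from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
    random_eigs = np.zeros((n_iter, p))
    for i in range(n_iter):
        Z = rng.standard_normal((n, p))
        random_eigs[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
    return int((observed > random_eigs.mean(axis=0)).sum())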
The initial latent factor structure was then investigated; Table 4 contains the result. As shown, many of the variables did not significantly contribute to the factor structure. Thus, an iterative process was followed in which the least significant variable was removed, using the factor loadings and a correlation analysis with Cronbach's alpha, and the analysis was re-run. All variables not contributing to the factor structure were removed one at a time in this fashion until a final exploratory model emerged, which is shown in Table 5. Table 6 provides information concerning the correlation analysis used in the removal of various questions during the development of the final exploratory factor analysis. Cronbach's alpha was 0.85 [10].
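A hedged sketch of the item-trimming idea described above: compute Cronbach's alpha and each item's correlation with the total of the remaining items, then drop the weakest item and repeat. The 0.20 cutoff and data layout are illustrative assumptions; the paper combined this correlation analysis with the factor loadings rather than using a fixed threshold.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one column per questionnaire item, one row per respondent."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def item_total_correlations(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the total of all *other* items."""
    total = items.sum(axis=1)
    return items.apply(lambda col: col.corr(total - col))

def trim_items(items: pd.DataFrame, min_corr: float = 0.20) -> pd.DataFrame:
    """Drop the least-consistent item, one at a time, until all item-total correlations pass."""
    items = items.copy()
    while items.shape[1] > 2:
        corrs = item_total_correlations(items)
        if corrs.min() >= min_corr:
            break
        items = items.drop(columns=corrs.idxmin())
    return items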
Fig. 3. Four-factor model tested with confirmatory factor analysis.

4.2. Discussion of the exploratory model

The exploratory factor analysis resulted in seven interpretable and consistent factors.


Table 4
Initial exploratory model (rotated factor pattern; a dash indicates no reported loading)

Variable  Factor 1  Factor 2  Factor 3  Factor 4  Factor 5  Factor 6  Factor 7
S3        0.77      -         -         -         -         -         -
S5        0.75      -         -         -         -         -         -
S2        0.73      -         -         -         -         -         -
S1        0.72      -         -         -         -         -         -
O1        0.68      -         -         -         -         -         -
O5        0.66      -         -         -         -         -         -
S4        0.66      -         -         -         -         -         -
O6        0.66      -         -         -         -         -         -
E5        0.61      -         -         -         -         -         -
S7        0.58      -         -         -         -         -         -
O7        0.58      -         -         0.55      -         -         -
S6        0.58      -         -         -         -         -         -
I9        0.55      -         -         -         -         0.47      -
D1        0.54      -         -         -         -         -         -
I5        0.47      -         -         -         -         -         -
I6        0.45      -         -         -         -         -         -
O8        0.41      -         -         -         -         -         -
P2        -         -         -         -         -         -         -
C8        -         0.74      -         -         -         -         -
C4        -         0.74      -         -         -         -         -
C3        -         0.74      -         -         -         -         -
C2        -         0.71      -         -         -         -         -
C7        -         0.71      -         -         -         -         -
C5        -         0.68      -         -         -         -         -
C6        -         0.67      -         -         -         -         -
C9        -         0.62      -         -         -         -         -
C1        -         0.59      -         -         -         -         -
I2        -         -         -         -         -         -         -
P1        -         -0.41     -         -         -         -         -
E3        -         -         0.71      -         -         -         -
E1        -         -         0.69      -         -         -         -
E2        -         -0.44     0.65      -         -         -         -
E4        -         -         0.64      -         -         -         -
I4        -         -         0.49      -         -         -         -
I1        -         -         0.46      -         -         -         -
I8        -         -         -         -         -         -         -
E6        -         -         -0.67     -         -         -         -
O9        -         -         -         0.73      -         -         -
O3        -         -         -         0.61      -         -         -
O4        -         -         -         0.48      0.41      -         -
AN3       -         -         -         0.44      -         -         -
I3        -         -         -         0.41      -         -         -
O2        -         -         -         -         -         -         -
OR1       -         -         -         -         0.73      -         -
OR2       -         -         -         -         0.65      -         -
OR4       -         -         -         -         0.61      -         -
D2        -         -         -         -         0.57      -         -
D3        -         -         -         -         0.56      -         -
P3        -         -         -         -         -         -         -
I10       -         -         -         -         -         0.60      -
I7        -         -         -         -         -         0.59      -
AN1       -         -         -         -         -         0.57      -
T4        -         -         -         -         -         0.44      -
AN2       -         -         -         -         -         -         -
T1        -         -         -         -         -         -         -
T2        -         -         -         -         -         -         0.76
T3        -         -         -         -         -         -         0.71
D4        -         -         -         -         -         -         0.60
OR3       -         -         -         -         -         -         0.42
Forty-three of the 59 variables initially factor analyzed were retained. The factors were:

Factor 1: software characteristics
Factor 2: operational cost characteristics
Factor 3: software environment characteristics
Factor 4: simulation software output characteristics
Factor 5: organizational support characteristics
Factor 6: initial investment cost characteristics
Factor 7: task characteristics

Most of these are closely related to the factors or subfactors originally hypothesized in the integrated model for computer simulation implementation success. The results of the factor analysis and the seven derived factors are displayed in Table 7, and Table 8 compares the derived factors to the hypothesized ones. When these seven factors were regressed against computer simulation implementation success (Table 8), a significant relationship was revealed. The regression model follows the form:
EUCS = b1 FACT1 + b2 FACT2 + ... + b7 FACT7
Table 9 contains the results of the regression analysis. (1)

(1) The regression model was validated by re-running the analysis using the data randomly split in half. The resulting models retained the same properties: validity model 1 reported an R² value of 0.68, and validity model 2 an R² value of 0.65, consistent with the full model's R² of 0.665. The two randomly split datasets were also investigated for reliability; Cronbach's alphas were 0.84 and 0.83, compared with a full-dataset alpha of 0.85. These validation models support the premise that the model observed in the data is more than an artifact of a few observations; rather, it reflects a true phenomenon taking place among users of computer simulation.


The calculated F-value is 28.6, with a probability > F of 0.0001. Therefore, the regression is significant, indicating that a relationship between the factors developed from the simulation data and EUCS is present. The R² statistic indicates that approximately 66.5 percent of the variance in EUCS can be accounted for by the factors developed in the exploratory factor analysis.
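As an illustration only (the original analysis was run in SAS, and the factor-score column names below are placeholders), the reported regression and the split-half validation described in the footnote could be reproduced along these lines:

import pandas as pd
import statsmodels.api as sm

FACTORS = ["software", "operational_cost", "software_environment",
           "software_output", "organizational_support",
           "initial_investment", "task"]

def fit_success_model(df: pd.DataFrame):
    X = sm.add_constant(df[FACTORS])          # intercept plus the seven factor scores
    model = sm.OLS(df["eucs_total"], X).fit()
    # model.fvalue, model.f_pvalue and model.rsquared correspond to the
    # F = 28.6, P > F = 0.0001 and R^2 = 0.665 reported in Table 9.
    return model

def split_half_validation(df: pd.DataFrame, seed: int = 0):
    """Rough analogue of the footnoted validation: refit the model on random halves."""
    half = df.sample(frac=0.5, random_state=seed)
    rest = df.drop(half.index)
    return fit_success_model(half).rsquared, fit_success_model(rest).rsquared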
The findings of the exploratory study indicate that the most important features related to computer simulation implementation success are those associated with operational costs. Most respondents considered operational costs to be important to success. The respondents' assessment of success involved not only model development itself, but also the cost of running and maintaining the model. In order for a simulation implementation to be successful, it had to be within cost expectations.
The second strongest factor, simulation software output characteristics, shows the impact on decision makers of the information and knowledge gained from the simulation. Included are concerns about output analysis, statistical summaries, and output reporting. Without the ability to move the knowledge gained from the simulation into some usable form, the simulation is merely an exercise. This emphasizes the goal-oriented nature of simulation projects: the result, rather than the precise method, appears to be very important to those using the tool.
The next significant factor, organizational support characteristics, groups the team aspects of simulation use and its acceptance within the corporation. Here are loadings for mentorship, teamwork, number of active users, and related items. Simulation is more than programming; it requires the participation of systems experts, management, and decision makers.

Table 5
Final exploratory model (rotated factor pattern; a dash indicates no reported loading)

Variable  Factor 1  Factor 2  Factor 3  Factor 4  Factor 5  Factor 6  Factor 7
S3        0.79      -         -         -         -         -         -
S5        0.78      -         -         -         -         -         -
I9        0.70      -         -         -         -         -         -
S2        0.70      -         -         -         -         -         -
S4        0.67      -         -         -         -         -         -
E5        0.66      -         -         -         -         -         -
O5        0.65      -         -         -         -         -         -
S1        0.65      -         -         -         -         -         -
O6        0.62      -         -         -         -         -         -
S6        0.61      -         -         -         -         -         -
I10       0.56      -         -         -         -         -         -
D1        0.55      -         -         -         -         -         -
C6        -         0.79      -         -         -         -         -
C5        -         0.79      -         -         -         -         -
C8        -         0.79      -         -         -         -         -
C4        -         0.76      -         -         -         -         -
C7        -         0.68      -         -         -         -         -
C9        -         0.54      -         -         -         -         -
E3        -         -         0.76      -         -         -         -
E1        -         -         0.72      -         -         -         -
E2        -         -         0.71      -         -         -         -
E4        -         -         0.68      -         -         -         -
I4        -         -         0.53      -         -         -         -
I1        -         -         0.49      -         -         -         -
E6        -         -         -0.70     -         -         -         -
O7        -         -         -         0.73      -         -         -
O9        -         -         -         0.65      -         -         -
O8        -         -         -         0.63      -         -         -
O3        -         -         -         0.62      -         -         -
O1        -         -         -         0.56      -         -         -
S7        -         -         -         0.51      -         -         -
D3        -         -         -         -         0.72      -         -
OR4       -         -         -         -         0.70      -         -
OR1       -         -         -         -         0.66      -         -
D2        -         -         -         -         0.62      -         -
OR2       -         -         -         -         0.56      -         -
O4        -         -         -         -         0.54      -         -
C3        -         -         -         -         -         0.84      -
C2        -         -         -         -         -         0.82      -
C1        -         -         -         -         -         0.65      -
T2        -         -         -         -         -         -         0.86
T3        -         -         -         -         -         -         0.78
D4        -         -         -         -         -         -         0.60

Without proper team structure and management, the modeling effort can be in serious danger. This interaction was recognized by the model users.
The next factor, software environment characteristics, relates to the modeling process: the ease of using the modeling tool, language syntax, how much of a modeler's time is required, and so on. The proliferation of easy-to-use software in all areas has apparently raised the expectations of simulation software users.


Table 6
Correlation analysis: Cronbach coefficient alpha (a)

                    Raw variables                       Standardized variables
Deleted variable    Correlation with total    Alpha     Correlation with total    Alpha
I1                  0.14                      0.85      0.12                      0.84
S6                  0.49                      0.84      0.48                      0.83
O5                  0.58                      0.84      0.58                      0.83
C1                  -0.15                     0.88      -0.15                     0.85
C2                  0.04                      0.86      0.05                      0.84
C3                  -0.05                     0.86      -0.04                     0.85
C4                  -0.16                     0.86      -0.15                     0.85
C5                  -0.19                     0.86      -0.18                     0.85
C6                  -0.19                     0.86      -0.18                     0.85
C7                  -0.17                     0.86      -0.17                     0.85
C8                  -0.09                     0.86      -0.08                     0.85
C9                  -0.17                     0.86      -0.17                     0.85
D1                  0.48                      0.85      0.47                      0.83
D2                  0.48                      0.85      0.46                      0.83
D3                  0.35                      0.85      0.37                      0.84
D4                  0.15                      0.85      0.16                      0.84
E1                  0.56                      0.84      0.53                      0.83
E2                  0.28                      0.85      0.25                      0.84
E3                  0.51                      0.84      0.49                      0.83
E4                  0.42                      0.85      0.41                      0.84
E5                  0.58                      0.84      0.58                      0.83
E6                  -0.09                     0.86      -0.07                     0.85
I4                  0.47                      0.85      0.45                      0.83
I9                  0.47                      0.85      0.47                      0.83
I10                 0.47                      0.85      0.47                      0.83
O1                  0.54                      0.84      0.53                      0.83
O3                  0.50                      0.84      0.48                      0.83
O4                  0.52                      0.84      0.51                      0.83
O6                  0.67                      0.84      0.66                      0.83
O7                  0.65                      0.84      0.64                      0.83
O8                  0.35                      0.85      0.34                      0.84
O9                  0.44                      0.85      0.43                      0.83
OR1                 0.24                      0.85      0.25                      0.84
OR2                 0.21                      0.85      0.22                      0.84
OR4                 0.36                      0.85      0.36                      0.84
S1                  0.57                      0.84      0.56                      0.83
S2                  0.48                      0.85      0.48                      0.83
S3                  0.66                      0.84      0.65                      0.83
S4                  0.53                      0.84      0.52                      0.83
S5                  0.57                      0.84      0.56                      0.83
S7                  0.51                      0.84      0.52                      0.83
T2                  0.11                      0.85      0.11                      0.84
T3                  0.25                      0.85      0.27                      0.84

(a) Overall alpha: for raw variables, 0.85; for standardized variables, 0.84.

It was reported that significantly higher success scores were given to simulators (specialty simulation packages) than to traditional languages used for simulation.
The fifth strongest relationship was with the software characteristics factor.


Table 7
Derived factor structure (a)

Loading  Name  Description

Factor one: software characteristics
0.79   s3    Standard distributions
0.78   s5    Independent replications
0.70   i9    Attributes for entities
0.70   s2    Random deviate generators
0.67   s4    Observed distributions
0.66   e5    Interactive debugging
0.65   o5    Trace capabilities
0.65   s1    Random-number generators
0.62   o6    Summarization of multiple model runs
0.61   s6    Warm-up period/reset
0.56   i10   Global variables
0.55   d1    Degree of product validation and verification

Factor two: operational cost characteristics
0.79   c6    Interface costs
0.79   c5    Model modification costs
0.79   c8    Training costs
0.76   c4    Operation cost
0.68   c7    Maintenance costs
0.54   c9    Computer run time costs

Factor three: software environment characteristics
0.76   e3    On-line help
0.72   e1    User interface
0.71   e2    Ease of learning
0.68   e4    On-line tutorial
0.53   i4    Syntax
0.49   i1    Interface to other software
-0.70  e6    Degree of interaction

Factor four: simulation software output characteristics
0.73   o7    Output data analysis
0.65   o9    High resolution graphics displays
0.63   o8    Individual model output observations
0.62   o3    Business graphics
0.56   o1    Standard reports
0.51   s7    Confidence intervals

Factor five: organizational support characteristics
0.72   d3    Number of active users
0.70   or4   Future frequency of use
0.66   or1   Mentorship
0.62   d2    Acceptance by experts
0.56   or2   Teamwork
0.54   o4    File creation

Factor six: initial investment cost characteristics
0.84   c3    Acquisition cost
0.82   c2    Software cost
0.65   c1    Hardware cost

Factor seven: task characteristics
0.86   t2    Project/system complexity
0.78   t3    Level of simulation detail
0.60   d4    Database sophistication

(a) Variables removed from the model: an1, an2, an3, i2, i3, i5, i6, i7, i8, p1, p2, p3, o2, or3, t1, t4.

Table 8
Comparison of exploratory factor composition with hypothesized factor structure (variable definitions are given in Table 1)

Simulation software characteristics
  Hypothesized (statistical features): s1, s2, s3, s4, s5, s6, s7
  Exploratory (software characteristics): s1, s2, s3, s4, s5, s6, o5, o6, i9, i10, e5, d1

Simulation cost characteristics
  Hypothesized (costs): c1, c2, c3, c4, c5, c6, c7, c8, c9
  Exploratory (initial investment cost characteristics): c1, c2, c3
  Exploratory (operational cost characteristics): c4, c5, c6, c7, c8, c9

Software environment characteristics
  Hypothesized (simulation software environment features): e1, e2, e3, e4, e5, e6
  Exploratory (software environment characteristics): e1, e2, e3, e4, e6, i1, i4

Simulation software output characteristics
  Hypothesized (output features): o1, o2, o3, o4, o5, o6, o7, o8, o9
  Exploratory (simulation software output characteristics): o1, o3, o7, o8, o9, s7

Organizational support characteristics
  Hypothesized (organizational characteristics): or1, or2, or3, or4
  Exploratory (organizational support characteristics): or1, or2, or4, d2, d3, o4

Task characteristics
  Hypothesized (task characteristics): t1, t2, t3, t4
  Exploratory (task characteristics): t2, t3, d4

It can be concluded that specific modeling software yields a higher degree of perceived success. The questions loading on this factor include the underlying statistical sophistication, debugging facilities, and other software features.
Initial investment cost is a factor that again emphasizes the cost aspect of simulation implementation success. If the software costs too much, the project may not be considered successful, even if the simulation is used to make a good decision. This factor may have been weakened by a memory effect: the initial investment may have been made some time ago, which could explain the relative strength of operational costs compared with investment costs.

The final and weakest factor (not significant) was task characteristics. Although enough common variance was present to form a task characteristics factor, the beta coefficient for this factor was not significant in the regression analysis. This indicates that although task characteristics may be related to simulation implementation success, they do not appear to contribute enough to be considered significant. Possibly, the questionnaire items were worded poorly or interpreted in several different ways. Alternatively, the construct may simply not be important to many of the respondents. Perhaps the difficulty of the task or the complexity of the decision to be made simply is not a factor relating to simulation implementation success. Fig. 4 illustrates all seven factors.

Fig. 4. New model for computer simulation success.

Table 9
Regression based on exploratory factor analysis

Analysis of variance (a)
Source     DF     Sum of squares    Mean square    F-value    P > F
Model      7      4746.0            678.0          28.6       0.0001
Error      101    2393.2            23.7
C total    108    7139.2

Parameter estimates
Variable                  DF    Parameter estimate    Standard error    T for H0: Parameter = 0    P > |T|
INTERCEP                  1     46.5                  0.47              99.7                       0.0001
Software                  1     1.7                   0.47              3.7                        0.0003
Operational cost          1     -3.7                  0.47              -7.9                       0.0001
Software environment      1     2.6                   0.47              5.6                        0.0001
Software output           1     3.2                   0.46              6.9                        0.0001
Organizational support    1     2.6                   0.46              5.7                        0.0001
Initial investment        1     -1.7                  0.47              -3.6                       0.0005
Task                      1     0.0                   0.46              0.1                        0.9501

(a) Root MSE, 4.9; Dep. mean, 46.5; C.V., 10.5; R², 0.67; Adj. R², 0.64.


5. Conclusion

5.1. Practical implications

Computer simulation is a primary decision-making aid in the areas of operations management, operations research, industrial engineering, and management science. Because of its promise, a high degree of commercialization of this technology is taking place.
The fact that animation has a significant relationship with simulation implementation success indicates that investment in an animation system might improve the value of simulation as a decision support tool.
The results of this study do not indicate that simulation implementation success is a random occurrence, nor is it manifested in the same way in every situation. Rather, it relates to a variety of factors ranging from software to cost to organizational support.

5.2. Limitations of the study

The exploratory nature of this study results in some obvious limitations. First, the questions used to develop the independent variable side of the contingency model need further refinement. As with any exploration into a previously unstudied area, this was a first attempt; some of the factors that did not load might have failed as a result of questionnaire construction rather than a lack of factor importance.
An important limitation to recognize is the nature of the developed framework. Since the research approach is based on contingency theory and is exploratory in nature, no causality can be assumed.
The assumption that computer simulation is a representational-model DSS is another potential limitation in terms of theoretical development. If this assumption were not made, it would become difficult to justify the use of the information systems literature. The sample size may also be a limitation.

6. Summary

This study identified a framework within which computer simulation implementation success can be studied empirically. This framework was used as a basis for exploratory empirical research into the factors correlated with computer simulation implementation success. A mail survey was conducted to measure recurrent factors associated with success or failure. An exploratory model based on a set of hypothesized factors was derived. This analysis indicated the presence of seven factors in the data. These factors, although not matching the hypothesized model exactly, did relate very closely to it, providing a first step toward an understanding of computer simulation implementation success.


References

[1] S. Alter, A taxonomy of decision support systems, Sloan Management Review, Fall, 1977, pp. 37-56.
[2] E. Babbie, Survey Research Methods, Wadsworth, Belmont, CA, 1990.
[3] O. Balci, Guidelines for successful simulation studies, in: Proceedings of the 1990 Winter Simulation Conference, Society for Computer Simulation, San Diego, CA, 1990, pp. 25-32.
[4] J. Banks, Selecting simulation software, in: Proceedings of the 1991 Winter Simulation Conference, Society for Computer Simulation, San Diego, CA, 1991, pp. 15-20.
[5] S. Blili, L. Raymond, S. Rivard, Impact of task uncertainty, end-user involvement, and competence on the success of end-user computing, Information & Management 33(3), 1998, pp. 137-153.
[6] K.A. Bollen, Structural Equations with Latent Variables, Wiley, New York, 1989.
[7] J.S. Carson, Convincing users of a model's validity is challenging aspect of modeler's job, Industrial Engineering, June, 1986, pp. 74-75.
[8] D. Christy, H. Watson, The application of simulation: a survey of industry practice, Interfaces 13(5), 1983, pp. 47-52.
[9] T.D. Cook, D.T. Campbell, Quasi-Experimentation: Design and Analysis Issues in Field Settings, Houghton Mifflin, Boston, MA, 1979.
[10] L.J. Cronbach, Coefficient alpha and the internal consistency of tests, Psychometrika 16, 1951, pp. 297-334.
[11] F.D. Davis, Perceived usefulness, perceived ease of use, and user acceptance of information technology, MIS Quarterly, September, 1989, pp. 319-340.
[12] W.J. Doll, G. Torkzadeh, The measurement of end-user computing satisfaction, MIS Quarterly, June, 1988, pp. 259-274.
[13] W.J. Doll, W. Xia, G. Torkzadeh, A confirmatory factor analysis of the end-user computing satisfaction instrument, MIS Quarterly, June, 1994, pp. 453-461.
[14] R. Dubin, Theory Building, Free Press, New York, 1978.
[15] T. Duff, Avoid the pitfalls of simulation, Automation, November, 1991, pp. 32-36.
[16] R. Farina, G.A. Kochenberger, T. Obremski, The computer runs the Bolder Boulder: a simulation of a major running race, Interfaces 19(2), 1989, pp. 48-55.
[17] S. Floyd, C. Turner, K. Davis, Model-based decision support systems: an effective implementation framework, Computers in Operations Research 15(5), 1989, pp. 481-491.
[18] C.A. Fossett, D. Harrison, H. Weintrob, S.I. Gass, An assessment procedure for simulation models: a case study, Operations Research 39(5), 1991, pp. 710-723.
[19] C.R. Franz, D. Robey, Organizational context, user involvement, and the usefulness of information systems, Decision Sciences 17(2), 1986, pp. 329-356.
[20] T.J. Gogg, C. Sands, Hughes Aircraft designs automated storeroom system through simulation application, Industrial Engineering, August, 1990, pp. 49-57.
[21] J. Gouskos, Three benefits every simulation buyer should understand, Industrial Engineering, July, 1992, p. 34.
[22] J.W. Grant, S.A. Weiner, Factors to consider in choosing a graphically animated simulation system, Industrial Engineering, August, 1986, pp. 37-40, 65-68.
[23] P. Gray, I. Borovits, The contrasting roles of Monte Carlo simulation and gaming in decision support systems, Simulation 47(6), 1986, pp. 233-239.
[24] T. Guimaraes, M. Igbaria, M. Lu, The determinants of DSS success: an integrated model, Decision Sciences 23(2), 1992, pp. 409-430.
[25] S.W. Haider, J. Banks, Simulation software products for analyzing manufacturing systems, Industrial Engineering, July, 1986, pp. 98-103.
[26] J. Higdon, Planning a material handling simulation, Industrial Engineering, November, 1988, pp. 55-59.
[27] J.L. Horn, A rationale and test for the number of factors in factor analysis, Psychometrika 30(2), 1965, pp. 179-185.
[28] W.C. House, Business Simulation for Decision Makers, PBI-Petrocelli, New York, 1977.
[29] C.H. Jones, At last real computer power for decision makers, Harvard Business Review, September-October, 1970, pp. 75-89.
[30] L. Keller, C. Harrell, J. Leavy, The three best reasons why simulation fails, Industrial Engineering, April, 1991, pp. 27-31.
[31] F.N. Kerlinger, Foundations of Behavioral Research, Harcourt, Brace and Jovanovich, Fort Worth, TX, 1986.
[32] A.M. Law, S.W. Haider, Selecting simulation software for manufacturing applications: practical guidelines and software survey, Industrial Engineering, May, 1989, pp. 33-46.
[33] A.M. Law, W.D. Kelton, Simulation Modeling and Analysis, 2nd ed., McGraw-Hill, New York, 1993.
[34] A.M. Law, M.G. McComas, How to select simulation software for manufacturing applications, Industrial Engineering, July, 1992, pp. 29-35.
[35] L. Lin, J. Cochran, J. Sarkis, A metamodel-based decision support system for shop floor production control, Computers in Industry 18, 1992, pp. 155-168.
[36] H.C. Lucas, Empirical evidence for a descriptive model of implementation, MIS Quarterly 2(2), 1978, pp. 27-41.
[37] R. Lynch, Implementing packaged application software: hidden costs and new challenges, Systems, Objectives, Solutions 4, 1984, pp. 227-234.
[38] K. Mabrouk, Mentorship: a stepping stone to simulation success, Industrial Engineering, February, 1994, pp. 41-43.
[39] G.T. Mackulak, J.K. Cochran, P.A. Savory, Ascertaining important features for industrial simulation environments, Simulation 63(4), 1994, pp. 211-221.
[40] R.I. Mann, H.J. Watson, A contingency model for user involvement in DSS development, MIS Quarterly, March, 1984, pp. 27-37.
[41] R. McHaney, T.P. Cronan, Computer simulation success: on the use of the end-user computing satisfaction instrument, Decision Sciences 29(2), 1998, pp. 525-534.
[42] R. McHaney, R. Hightower, D. White, EUCS test-retest reliability in representational model decision support systems, Information & Management 36, 1999, pp. 109-119.
[43] R.J. Might, Principles for the design and selection of combat simulations, Simulation & Gaming 24(2), 1993, pp. 190-212.
[44] H. Min, Selection of software: the analytic hierarchy process, International Journal of Physical Distribution & Logistics Management 22(1), 1992, pp. 42-52.
[45] J. Mott, K. Tumay, Developing a strategy for justifying simulation, Industrial Engineering, July, 1992, pp. 38-42.
[46] K. Musselman, Conducting a successful simulation project, in: Proceedings of the 1992 Winter Simulation Conference, Society for Computer Simulation, San Diego, CA, 1992, pp. 115-121.
[47] B.U. Nwoke, D.R. Nelson, An overview of simulation in manufacturing, Industrial Engineering, July, 1993, pp. 43-57.
[48] S. Randhawa, A. Mechling, R. Joerger, A simulation-based resource planning system for Oregon motor vehicles division, Interfaces 19(6), 1989, pp. 40-51.
[49] S. Rivard, S.L. Huff, User developed applications: evaluation of success from the DP perspective, MIS Quarterly 8(1), 1984, pp. 39-50.
[50] J. Rodrigues (Ed.), Directory of Simulation Software, vol. 4, The Society for Computer Simulation, San Diego, CA, 1993.
[51] L.G. Sanders, J.F. Courtney, A field study of organizational factors influencing DSS success, MIS Quarterly 9(1), 1985, pp. 77-93.
[52] SAS Institute, SAS User's Guide, vol. 1, ACECLUS-FREQ, version 6, 4th ed., SAS Institute, Inc., Cary, NC, 1994.
[53] R.L. Schultz, System simulation: the use of simulation for decision making, Behavioral Science 19, 1974, pp. 344-350.
[54] D.W. Straub, Validating instruments in MIS research, MIS Quarterly, June, 1989, pp. 147-166.
[55] P. Sussman, Evaluating decision support software, Datamation, 15 October 1984, pp. 171-172.
[56] J. Swain, Flexible tools for modeling, OR/MS Today, December, 1993, pp. 62-78.
[57] P. Tait, I. Vessey, The effect of user involvement on system success: a contingency approach, MIS Quarterly 12(1), 1988, pp. 91-108.
[58] W.F. Velicer, Determining the number of components from the matrix of partial correlations, Psychometrika 41(3), 1976, pp. 321-327.
[59] H.J. Watson, D. Christy, The evolving use of simulation, Simulation and Games, September, 1982, pp. 351-363.
[60] A. Wilt, D. Goddin, Health care case study: simulating staffing needs and work flow in an outpatient diagnostic center, Industrial Engineering, May, 1989, pp. 22-26.
[61] B.D. Withers, A.A.B. Pritsker, D.H. Withers, A structured definition of the modeling process, in: Proceedings of the 1993 Winter Simulation Conference, Society for Computer Simulation, San Diego, CA, 1993, pp. 1109-1117.
Roger McHaney. For eight years prior to his return to academia, Roger McHaney was employed by the Jervis B. Webb Company. While there, he simulated
numerous materials-handling systems
for customers, including General Motors, Goodyear, Ford, IBM, Chrysler,
Kodak, Caterpillar, the Los Angeles
Times, and the Boston Globe. His
current research interests include automated guided vehicle system simulation,
innovative uses for simulation languages, simulation's use in DSS
and simulation success. After completing a Ph.D. in Computer
Information Systems and Quantitative Analysis at the University of
Arkansas College of Business, Dr. McHaney became an Assistant
Professor at Kansas State University. He is the author of the 1991
Academic Press book, Computer Simulation: A Practical Perspective and has published in Decision Sciences, The International
Journal of Production Research, Decision Support Systems,
Simulation, and various other journals.
Timothy Paul Cronan is Professor of
Computer Information and Quantitative
Analysis at the University of Arkansas,
Fayetteville. Dr. Cronan received the
D.B.A. from Louisiana Tech University,
and is an active member of the Decision
Sciences Institute and The Association
for Computing Machinery. He has
served as regional vice president and
on the board of directors of the Decision
Sciences Institute and as president of the
Southwest Region of the Institute. In addition, he served as
associate editor for MIS Quarterly. His research interests include
local area networks, downsizing, expert systems, performance
analysis and effectiveness, and end-user computing. Publications
have appeared in Decision Sciences, MIS Quarterly, OMEGA, The
International Journal of Management Science, The Journal of
Management Information Systems, Communications of the ACM,
Journal of End User Computing, Database, Journal of Research on
Computing in Education, Journal of Financial Research, as well as
in other journals and proceedings of various conferences.