
JOURNAL OF EDUCATION FOR BUSINESS, 86: 36–43, 2011
Copyright © Taylor & Francis Group, LLC
ISSN: 0883-2323
DOI: 10.1080/08832321003774756

Business Students’ Perceptions, Attitudes, and
Satisfaction With Interactive Technology: An
Exploratory Study
Jacqueline Kilsheimer Eastman


Georgia Southern University, Statesboro, Georgia, USA

Rajesh Iyer
Bradley University, Peoria, Illinois, USA

Kevin L. Eastman
Georgia Southern University, Statesboro, Georgia, USA

The authors modeled the relationships between students’ perceptions of interactive technology
in terms of whether it helps them pay more attention and be better prepared in a Consumer
Behavior course and their attitude toward and satisfaction with it. The results suggest that
students who feel they pay more attention due to the use of Interactive Technology have
a more positive attitude toward it. Additionally, those students who have a more positive
attitude toward Interactive Technology are more satisfied with its use. There is not, however,
a significant relationship between students feeling more prepared for the course due to the use
of Interactive Technology and their attitude toward it.
Keywords: audience response systems, clicker satisfaction, interactive technology

Faculty members are always looking for ways to keep their
students’ attention, encourage better preparation for class,
and improve students’ attitudes and satisfaction. Although these are
not areas of concern for every professor, our experiences at conferences
and in informal discussions with other professors suggest that these
areas come up frequently. Lincoln (2008), in discussing the issues with large
classes, noted the challenges (a) facing faculty members to
get students to attend class, pay attention, and participate;
and (b) for faculty to understand where students are in
their learning process and to manage the course effectively.
Interactive Technology (also called audience-response systems or clickers) may be one technological tool that can help
faculty accomplish these goals (Lincoln; Terreri & Simons,
2005). Kurdziel (2005) noted five reasons for educators to
use an audience response system: (a) to address the limitations of traditional lectures, (b) to engage students, (c) to
Correspondence should be addressed to Jacqueline Kilsheimer Eastman,
Georgia Southern University, Department of Management, Marketing, and
Logistics, P. O. Box 8154, Statesboro, GA 30460–8154, USA. E-mail:
jeastman@georgiasouthern.edu

provide feedback to both students and instructors, (d) to effectuate learning gains, and (e) to realize improvements in
attitudes. An advantage of using Interactive Technology is
that it gives a professor an objective means to track participation. With this technology, all students, even those who do
not like to speak up in class, have the ability to get involved,
participate, provide their opinions, and answer questions.
“When properly used, clickers can make the classroom experience more engaging and enjoyable for students and therefore generate improved attendance levels” (Lincoln, p. 39).
Nelson and Hauck (2008) noted that with large classes it is
nearly impossible to take attendance, conduct in-class activities, and utilize in-class surveys without the use of Interactive
Technology.
As described in Eastman (2007), Interactive Technology
involves the classroom use of individual response pads by
students to answer questions posted via PowerPoint. These
response pads are similar in appearance to a television remote
control device and are numbered, with each student having
a response pad to be used for the entire semester. There is a
wireless response system, hooked up to a professor’s laptop
computer or to a desktop computer in the classroom, that
gives the professor immediate feedback from every student
in the class. The software creates an electronic code that it assigns to a keypad and a keypad user’s profile; this information
is merged and can be exported to a database or spreadsheet
(Krantz, 2004). The professor could post a question via a
PowerPoint slide and the students would individually click
their responses. On the PowerPoint screen it would note each
number pad and whether there was a response, but would not
show any individual responses onscreen. Software on the
computer could then be used to instantly track responses by
an individual student and post the aggregate results. With
this technology, professors get instant, specific feedback and
students get the chance to express their thoughts and see what
others in the class are thinking (Terreri & Simons, 2005).
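To make the data flow described above concrete, the following short Python sketch shows how numbered keypad responses might be merged with a keypad-to-student roster, tallied per question for the aggregate display, and exported to a spreadsheet-readable file. The roster, the sample responses, and the helper functions are hypothetical illustrations and do not represent any particular vendor's software.

import csv
from collections import Counter

# Hypothetical roster mapping each numbered response pad to a student profile.
roster = {1: "Student A", 2: "Student B", 3: "Student C"}

# Hypothetical raw response log: (keypad number, question id, answer chosen).
responses = [
    (1, "Q1", "B"), (2, "Q1", "B"), (3, "Q1", "C"),
    (1, "Q2", "A"), (3, "Q2", "A"),
]

def merge_responses(roster, responses):
    # Attach each keypad user's profile to the response before export.
    return [
        {"student": roster.get(pad, "unregistered"), "question": q, "answer": a}
        for pad, q, a in responses
    ]

def aggregate_by_question(responses):
    # Tally answers per question for the on-screen aggregate results.
    tallies = {}
    for _, q, a in responses:
        tallies.setdefault(q, Counter())[a] += 1
    return tallies

# Export the merged records to a spreadsheet-readable CSV file.
with open("clicker_responses.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["student", "question", "answer"])
    writer.writeheader()
    writer.writerows(merge_responses(roster, responses))

for question, counts in aggregate_by_question(responses).items():
    print(question, dict(counts))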
Although there have been numerous articles discussing
the use and benefits of Interactive Technology in the classroom and other venues (see Eastman, 2007), there have been
few business academic articles that have tested the impact
of Interactive Technology and none that have modeled the
educational constructs impacted by the use of Interactive
Technology. Lincoln (2008) described the use of clickers
with a large principles of marketing class and discussed
student feedback on the use of clickers and his recommendation for their use. Similarly, Nelson and Hauck (2008)
addressed the use of clickers in an introductory management
information systems course, as did Ghosh and Renna (2009)
in economics classes. Paladino (2008) noted that educational
technology can induce active learning, improve students’ understanding, and assist in forming competencies. Matulich,
Papp, and Haytko (2008) stressed the need for continuous
improvement and constant innovations to engage students.
Although innovation in the classroom is worthwhile and it is
important that professors discuss and share their experiences
with students, it is also vital that the usefulness and impact
of these activities be determined. Toral, Barrero, Martinez-Torres, Gallardo, and Duran (2009) noted the need for more
scientific analyses to assess learner satisfaction and the factors that impact it. This paper contributes to the literature by
addressing the usefulness and impact of the use of Interactive
Technology.
This article is one of the first to model the constructs of
attention, preparation, attitude, and satisfaction as they relate
to the use of Interactive Technology in a business course. We
chose to focus on these constructs because they relate to the
use of Interactive Technology as a means of improving the
impact of our teaching. We first provide some background on
Interactive Technology, focusing on what the literature has
said about attention, preparation, attitude, and satisfaction
with Interactive Technology. Next, we describe a year-long
(two-semester) study involving three sections of consumer
behavior students that measures their perceptions in terms
of attention, preparation, attitude, and satisfaction regarding
Interactive Technology. We explain how each construct is
measured and purified, and then test a model of the relationships between these constructs. Finally, we discuss the
implications of our findings.


LITERATURE REVIEW
With Interactive Technology, students know that their opinions are being heard equally, and professors can get immediate feedback on the performance of the class as a whole while
tracking individual students behind the scenes to pinpoint
specific concerns (Terreri & Simons, 2005) and to determine
if more time is needed on a specific topic (Cohen, 2005).
Carnaghan and Webb (2005) noted the benefit of increasing
interactivity for students regardless of class size, and that the
use of Interactive Technology allows professors to focus on
problems revealed by the students’ responses. Taylor (2007)
described the benefit of utilizing Interactive Technology in
large lectures to increase students’ active involvement. Finally, Hoffman and Goodwin (2006) suggested that
the use of Interactive Technology ensures interaction, keeps
students focused, increases participation, promotes discussion, and increases retention. We hope to better determine
the impact of Interactive Technology in this study.
Hatch (2003) suggested the following uses for audience
response technology for meetings and training sessions: (a)
two-tiered questions (initial questions with follow-up items),
(b) brainstorming (measuring reactions to ideas), (c) pre- and
posttests of material learned, (d) as an icebreaker to start a
lesson, and (e) as a feedback mechanism (especially when
anonymous feedback is useful). All individuals would have
an assigned individual response pad so that teachers could
track who was responding, who was attending, and what
their scores were, but their responses would be anonymous
to the other participants. Finally, Cohen (2005) noted that
Interactive Technology can be utilized to find out if students
understand the material, as well as to take attendance and
administer exams. Thus, there are a number of ways that
professors could benefit from using Interactive Technology
to enhance communication in their courses.
Eastman (2007) described using Interactive Technology in
the classroom: (a) for opinion questions (in which any answer
was correct) to introduce topics and highlight opinions, (b)
for open-ended questions with students selected at random to
participate, and (c) for multiple choice questions to measure
students’ knowledge of the material and determine if the
class is ready to move on to the next topic. These questions
were included on exams as an incentive for the students
to learn the material (Eastman). Approximately six to eight questions were created per chapter, and this participation was
worth 5% of the final course average (Eastman). Lincoln
(2008) presented sample clicker questions and suggestions
for the use of clickers in designing pedagogy.
Attention and Preparation
The literature consistently suggests that Interactive Technology can be very useful in gaining the attention and interest of
students because it gives them the opportunity to share their
ideas in an anonymous way and requires them to respond
frequently to the material being presented (Cohen, 2005;
Simpson, 2007; Taylor, 2007; Terreri & Simons, 2005;
Unmuth, 2004). Lincoln (2008) found that the majority of
students perceive that clickers help keep their attention.
Stone, Escoe, and Schenk (1999) suggested that seeing their
responses on screen in a class increases the students’ sense of
importance and thus their involvement. Attendance and participation have also improved with the use of this technology (Cohen;
Kurdziel, 2005; Lincoln). Nelson and Hauck (2008) noted
that the literature described the short attention span of students, how students’ providing feedback improves attention,
and how students’ attention relates to their motivation to perform in a classroom. Matulich et al. (2008) suggested that
Interactive Technology can be helpful in addressing the short
attention span of students. Cohen noted that students like
the instant feedback, as it makes them more aware of their
learning difficulties. Carnaghan and Webb (2005), however,
found that Interactive Technology does not result in overall
better test grades.
Thus, the literature suggests that Interactive Technology
can improve attention in a class, but there has been little discussion of whether it impacts the students’ preparation for the class. Furthermore, whereas Nelson and
Hauck (2008) suggest that with improved attention students
are more motivated, the literature has not tested the impact of perceived attention on attitude toward Interactive
Technology.
Attitude and Satisfaction
Toral et al. (2009) offered that satisfaction relates to perceptions of being able to achieve success and the feeling of
achieving desired outcomes. Furthermore they stressed the
idea that “learner satisfaction must be explored through a
multidimensional analysis that considers a wide variety of
critical dimensions so as to provide effective metrics that
guide improvements in instructional design” (Toral et al.,
p. 190). Their analysis of satisfaction in an electronic instrumentation course found satisfaction to be driven by the
user interface, ease of use, enthusiasm, and motivation. This
suggests that attitude does impact satisfaction and that both
cognitive and affective dimensions need to be considered
(Toral et al.). Chen and Williams (2008) offered that the ease
of using technology (i.e., how smoothly the technology runs) impacts students’ attitudes toward it. Savage (2009) suggested that the use of information technology, specifically
in terms of downloading lecture materials, did not appear
to have substantive impact on student performance. Khan
(2009) offered that those who use computers more often
feel more engaged in their learning and are more likely to feel that computers aided their learning and interaction with faculty and
students.
In terms of Interactive Technology, Carnaghan and Webb
(2005) compared student satisfaction and exam performance
in different sections of a management course (where Interactive Technology was used in different parts of the course).
Although there was evidence of student satisfaction, exam
performance improved only for items closely related to those
displayed in class (Carnaghan & Webb, 2005). However,
Ghosh and Renna (2009) found that students perceive that
Interactive Technology improved their class performance.
Nelson and Hauck (2008) offered that with greater levels of
clicker usage, students perceive greater levels of learning and
benefits of using clickers.
A pilot program by an Interactive Technology provider
measuring the impact of Interactive Technology on college
marketing students found that (a) 87% of students reported
they were more likely to attend class, (b) 72% of students
reported they were more likely to participate, (c) 61% of students reported they were more focused on the lecture, (d) 70%
of students reported they improved their understanding of
specific concepts, and (e) 63% of students reported that class
was more fun (Thomson Learning and Turning Technologies,
2006). Research suggests that students like using Interactive
Technology and that it makes class more enjoyable (Fitch,
2004; Hoffman & Goodwin, 2006; Lincoln, 2008). Simpson
(2007) reported that a survey of students at one university
found that three out of four students were satisfied with the
use of the Interactive Technology, whereas Taylor (2007,
p. 73) noted typical “clicker satisfaction rates of 80% or
more.” Similarly, Lincoln found that the majority of students
considered clickers easy and fun to use and not a hassle
to bring to class, and wanted to see more use of them in future
classes. Ghosh and Renna (2009) found Interactive Technology to be well received by students and that students want
it adopted at the institutional level. Thus, the literature does
suggest that students like using Interactive Technology and
are satisfied with it. What the literature has not measured,
though, is how attitudes toward Interactive Technology impact satisfaction with it.

HYPOTHESES
Building on the literature, we proposed three hypotheses to test the relationships between attention, preparation, attitude, and satisfaction with Interactive Technology.
The model illustrated by these relationships is shown in
Figure 1.
Hypothesis 1 (H1): Students who feel that the use of Interactive Technology will help them pay more attention in
a course would have a more positive attitude toward the
use of Interactive Technology.
H2: Students who feel that the use of Interactive Technology
helps them be better prepared in a course would have
a more positive attitude toward the use of Interactive
Technology.
H3: Students who have a positive attitude toward the Interactive Technology would be more satisfied with Interactive
Technology.

FIGURE 1 Hypothesized model. Attention and Preparation are proposed to influence Attitude Toward Interactive Technology, which in turn influences Satisfaction With Interactive Technology.

METHOD
Participants and Data Collection
The sample consisted of 97 Consumer Behavior students at a southeastern regional university, drawn from the three
sections of the course taught over two semesters. Consumer
Behavior is a required course for marketing majors and an
elective business course for other business majors. Although
the sample was a convenience sample, all students taking
the course over a year were included and the course was required for all marketing majors. Thus, we feel there was a
good representation of the marketing students in the course.
Enrollment in the course ranged from 32 to 48 students with
a mean enrollment of 37.667 students (SD = 8.962). The
demographics of the sample are provided in Table 1. Additionally, 64% of the respondents were marketing majors,
11% were management majors, and the remaining 25% were
double majors (marketing and management, marketing and
finance, or other). The survey was given during the last week of
class. Students received one point of extra credit toward their course average if they completed the survey, but
they did not put their names on the survey. No students
refused to participate in the survey, so nonresponse bias was
not an issue.
The Interactive Technology was utilized in a manner similar to that described by Eastman (2007). It was utilized
frequently in every class (approximately six to eight questions per chapter) in a variety of ways (open-ended questions,
opinion questions, and graded questions that were later utilized in the exams). Student participation represented 5% of
their final course average.
TABLE 1
Descriptive Information on Sample

Gender: Male 45%; Female 55%
Year in College: Junior 40%; Senior 60%
Is this course a required course? Yes 78%; No 22%
Mean GPA: 3.02
How often do you prepare for any course? Never 0%; Rarely 4%; Sometimes 37%; Often 50%; Very Often 9%
How many hours do you read for any course in a week? 0–2 hours 23%; 2–4 hours 45%; 4–6 hours 23%; 6–8 hours 7%; More than 8 hours 2%
Are you employed? No 32%; Yes (less than 10 hours per week) 6%; Yes (between 10 and 20 hours per week) 30%; Yes (more than 20 hours but less than 40 hours per week) 28%; Yes (40 hours or more per week) 4%

Construct Operationalization
To measure the constructs, the survey included items either
created by us or adapted from other pedagogical research.
Several of the items came from Carnaghan and Webb’s
(2005) working paper measuring the impact of Interactive
Technology. Several items from Kurdziel (2005) looking at
the impact of Interactive Technology in large biology lectures
were also utilized. Some of Massey, Brown, and Johnston’s
(2005) items measuring the impact of using games (such as
crossword puzzles and Jeopardy!) to review materials were
also adapted. Finally, the satisfaction items were adapted and
modified from Cole and Balasubramanian (1993), Arnould
and Price (1993), and Fisher and Price (1991). All of our survey items were on a 5-point Likert-type scale ranging from
1 (strongly disagree) to 5 (strongly agree). The survey items
were reviewed by several business faculty members for face
and content validity. The specific sources utilized for the various measures are noted in Table 2 along with the means and
standard deviations of the individual items.
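Because two of the attitude items (marked r in Table 2) are reverse coded, responses to them must be flipped before scale scores are computed. The short transformation below is an illustration of the standard approach for a 1–5 scale, not the authors’ analysis code.

def reverse_code(score, scale_max=5):
    # Map a response on a 1-to-scale_max Likert-type item to its reversed value.
    return scale_max + 1 - score

print([reverse_code(s) for s in [1, 2, 3, 4, 5]])  # [5, 4, 3, 2, 1]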
Measures and Purification
Toral et al. (2009) stressed the need to use structural equation modeling to establish a scientific model of learner satisfaction. Following a process recommended by Anderson
and Gerbing (1988), the measurement quality of the indicators was evaluated. Anderson and Gerbing recommended
that researchers first refine the measurement model before
testing the structural component of the model. This two-step procedure has been widely adopted in marketing. The goal was to identify a final set of items with acceptable discriminant and convergent validity, internal consistency, and reliability (Hair, Black, Babin, Anderson, & Tatham, 2006).

TABLE 2
Measurement Items

Scale/item [source] (standardized loading; M; SD)

Attention (CR = 0.79; VE = 0.55)
I pay more attention to what is going on in lecture when conceptual questions will be presented and I can respond with Interactive Technology. [a] (0.77; 4.14; 0.91)
The Interactive Technology was a fun way to review class materials. [c] (0.80; 4.15; 0.90)
The response pads made me feel more comfortable participating in the course. [b] (0.65; 3.99; 0.99)

Preparation (CR = 0.84; VE = 0.56)
Discussing the Interactive Technology questions helped me learn the material. [a] (0.82; 4.05; 0.85)
Discussing the Interactive Technology questions helped me realize which concepts I needed to spend more time on when I prepared for exams. [a] (0.71; 4.00; 0.91)
The Interactive Technology helped me prepare for exams. [a] (0.75; 4.01; 0.98)
Knowing the response pads were going to be used encouraged me to work harder to answer questions in class. [b] (0.72; 3.93; 0.91)

Attitude towards Interactive Technology (CR = 0.86; VE = 0.60)
I thought this course did focus too much on using the response pads. [b, r] (0.60; 3.75; 1.17)
Overall, I thought the advantages of using response pads outweighed the disadvantages in this course. [b, r] (0.71; 3.81; 1.15)
I think this course should continue to use Interactive Technology. [d] (0.85; 4.19; 1.16)
I think other professors should use Interactive Technology in their courses. [d] (0.91; 4.12; 1.10)

Satisfaction with the Interactive Technology (CR = 0.96; VE = 0.85)
My experience at using the Interactive Technology was good. [e] (0.93; 4.27; 0.86)
I am happy that this course used Interactive Technology. [e] (0.95; 4.22; 1.02)
My trial of the Interactive Technology worked out well. [e] (0.93; 4.25; 0.90)
I am sure it was the right thing to use Interactive Technology. [e] (0.91; 4.08; 1.03)

Note. Composite reliability (CR) and variance extracted (VE) are provided for each scale. Sources used to measure the scales: a = Kurdziel (2005); b = Carnaghan and Webb (2005); c = Massey, Brown, and Johnston (2005); d = new item created by the authors; e = Cole and Balasubramanian (1993), Arnould and Price (1993), and Fisher and Price (1991). r = reverse-coded item.

Each item was subjected to a confirmatory factor
analysis. All factor loadings were significant at the 0.01 level
and all individual reliabilities were far above the required
value of 0.4 (Bagozzi & Baumgartner, 1994). According to
the recommendations of Bagozzi and Yi (1991) and Bagozzi
and Baumgartner, a composite reliability of at least 0.7 is
desirable. This requirement was met. After having assessed
the individual factors, the reduced set of items was subjected
to a confirmatory factor analysis using maximum likelihood
estimation. The results of the analysis are summarized in
Tables 2 and 3. Also listed in Table 3 are the means, standard deviations, and coefficient alphas for each of the four
constructs we examined.
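For reference, the composite reliability (CR) and variance extracted (VE) values reported in Table 2 can be reproduced from the standardized loadings. The short Python sketch below assumes the conventional formulas (it is not the authors’ analysis code); with the three Attention loadings from Table 2 it returns the reported values of 0.79 and 0.55.

def composite_reliability(loadings):
    # CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances),
    # where each error variance is 1 minus the squared standardized loading.
    squared_sum = sum(loadings) ** 2
    error_variance = sum(1 - l ** 2 for l in loadings)
    return squared_sum / (squared_sum + error_variance)

def variance_extracted(loadings):
    # VE = average of the squared standardized loadings.
    return sum(l ** 2 for l in loadings) / len(loadings)

attention_loadings = [0.77, 0.80, 0.65]
print(round(composite_reliability(attention_loadings), 2))  # 0.79, matching Table 2
print(round(variance_extracted(attention_loadings), 2))     # 0.55, matching Table 2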
Although the chi-square value was significant, χ²(84, N = 97) = 128.96, p