Journal of Education for Business
ISSN: 0883-2323 (Print) 1940-3356 (Online) Journal homepage: http://www.tandfonline.com/loi/vjeb20
LearnSmart, adaptive teaching, and student
learning effectiveness: An empirical investigation
Qin Sun, Yann Abdourazakou & Thomas J. Norman
To cite this article: Qin Sun, Yann Abdourazakou & Thomas J. Norman (2017): LearnSmart,
adaptive teaching, and student learning effectiveness: An empirical investigation, Journal of
Education for Business, DOI: 10.1080/08832323.2016.1274711
To link to this article: http://dx.doi.org/10.1080/08832323.2016.1274711
Published online: 19 Jan 2017.
LearnSmart, adaptive teaching, and student learning effectiveness: An empirical
investigation
Qin Sun^a, Yann Abdourazakou^b, and Thomas J. Norman^b
^a Trident University International, Cypress, California, USA; ^b California State University, Dominguez Hills, Carson, California, USA
ABSTRACT
Facing the growing number of digital natives entering the classroom, business professors look for
innovative ways to enhance the student learning experience. The authors focus on the online
interactive learning tool LearnSmart (McGraw-Hill, New York, NY), and examine its impact on
student learning effectiveness by testing the direct and indirect relationships among perceived
competence, perceived challenge, instructors, perceived value, and satisfaction with LearnSmart.
Constructivism served as the theoretical foundation for this study. About 215 students at a public
university in the United States took the survey and 197 valid responses were received. Regression
analysis results showed that the use of LearnSmart improved students’ perceived competency, thus
increasing their perceived value of using LearnSmart, as well as their satisfaction with LearnSmart.
Perceived value was also found to mediate the impact of perceived competency on satisfaction
with LearnSmart, and the instructor played a significant role in facilitating and improving student
learning. Perceived challenge impacted student’s perceived value of using LearnSmart, but it did
not influence satisfaction with LearnSmart.
KEYWORDS
Adaptive teaching; interactive learning tool; perceived challenge; perceived competence; perceived value; student learning effectiveness
Introduction
Student learning styles have been found to influence stu-
dent academic performance and thus it is important for
educators to adapt teaching methods to student learning
styles (Fleming, 2001; Sandman, 2014). As digital millennial college students are more comfortable with online learning (Comer, Lenaghan, & Sengupta, 2015; Ganesh & Sun, 2009), online teaching is increasingly a part of course delivery. One major advantage of online education is flexibility, especially for students with time, distance, or language constraints (Liu, Gomez, Khan, & Yen, 2007; Steiner & Hyman, 2010). Therefore, it is not
surprising that more business professors intend to incor-
porate online learning into their curriculum.
Universities are also offering hybrid classes that inte-
grate online education into the traditional classroom set-
ting. In this way, professors combine the advantages of online education with traditional face-to-face teaching, and thus enrich student learning. The effectiveness of face-to-face teaching has been found to depend on appropriate instructional design and on instructors' subject knowledge and their ability to communicate clearly, build rapport with students, and deliver the course enthusiastically (Sweeney, Morrison, Jarratt, & Heffernan, 2009). Conversely, online practices can facilitate in-class teaching (Berger & Topol, 2001).
With the rapid advancement of information technology, students have shown increasing interest in technol-
ogy-enhanced pedagogies (Jackson, Helms, Jackson, &
Gum, 2011), and business professors look for innovative
ways to teach business topics and enhance student learn-
ing effectiveness. Various high-tech devices and
technologies are employed in business courses to
enhance student learning effectiveness such as personal
digital assistants and video cameras (McGorry, 2006;
Smith & Fisher, 2006), multimedia (Sheppard & Vibert,
2016), online video (Lancellotti, Thomas, & Kohli, 2016;
Winch & Cahn, 2015), virtual communities (Luethge,
Raska, Greer, & O’Connor, 2016), class blogs (Ferguson,
Makarem, & Jones, 2016), and electronic conferencing
(Wood, Solomon, & Allan, 2008).
Complemented by these technologies, experiential learning can be promoted through interactive or adaptive online teaching, which conforms to the requirement of the Association to Advance Collegiate Schools of Business (AACSB) to increase flexibility and responsiveness to student needs (AACSB, 2003; Hatfield & Taylor,
1998). For example, professors can use discussion
forums, messaging and emails to create an interactive
community to facilitate and encourage experiential
learning among students (Wood et al., 2008), or design
interactive spreadsheets to enhance students’ problem
solving skills (Bertheussen & Myrland, 2016).
Online teaching and learning can be customized when
the assignments are interactive. Previous researchers
have identified different types of interactivity that can
occur online: learner-content interaction, learner-
instructor interaction, and learner-learner interaction
(Moore, 1989) as well as student-learning management
system interaction (Davidson-Shivers, 2009). Previous studies have revealed that online courses with greater levels of interactivity are associated with higher levels of student motivation, enhanced learning outcomes, and greater satisfaction with interactive learning environments (Espasa & Meneses, 2010). Perceived overall interactivity also positively influences learner satisfaction (Fulford & Zhang, 1993). In the broader context of interactive teaching, however, it is less clear how adaptive learning systems fit into existing online or on-campus courses.
In this study, we intend to examine how the integra-
tion of an online interactive learning tool (i.e., Learn-
Smart) could enhance student learning and thus learning
effectiveness. To the best of our knowledge, there is little
peer-reviewed work examining this new learning tool in
the business fields. In this study, we describe the imple-
mentation of LearnSmart in introductory marketing and
management courses at a West Coast public university.
Literature review
Student learning is the central focus of higher education.
However, student learning is context dependent, and
various factors influence student learning effectiveness
such as students’ own motivation, classroom climate,
teaching methods, and course level (Comer et al., 2015).
In particular, interaction is considered a determining factor in promoting student learning effectiveness, not only in traditional classroom settings but also in online education (Rovai & Barnum, 2003; Swan, 2003). The
importance of interactions that learners engage in has
been highlighted in the adaptive teaching and learning
pedagogy. According to Oxman and Wong (2014), adap-
tive learning refers to a learning process where the con-
tent taught adapts based on the responses of the
individual. Adaptive learning systems can help personal-
ize the instruction based on an individual learning model
(Kinshukan, 2003; VanLehn, 2006). Individualized pacing in adaptive learning technology has demonstrated more positive impacts than class-based or mixed forms of pacing, and courses with adaptive learning technologies showed better learning outcomes than nonadaptive ones (Means, Peters, & Zheng, 2014).
LearnSmart is one adaptive learning system that can be
used on top of fully online or on-campus courses at any
university. It gives a new geometry to the interactivity
between students, the instructor and the course content.
LearnSmart is an adaptive learning tool that evaluates students' knowledge levels by tracking the topics students have mastered and thus identifies the areas that need further instruction and practice. Depending on student progress, LearnSmart automatically adapts the learning content based on each student's knowledge strengths and weaknesses and his or her confidence level around that knowledge (Norman, 2011). LearnSmart is thus tailored to the specific needs of each student through continuous evaluation of the student's knowledge of the concepts covered in each chapter (McGraw-Hill Higher Education, 2012).
LearnSmart’s adaptive technology also identifies the
concepts that students are most likely to forget over the
course of the semester—by considering those that they
had been weakest on or least confident with—and encour-
ages periodic review by the students to ensure that con-
cepts are truly learned and retained. As a result, it goes
beyond systems that simply help students study for a test
or exam, and helps students with true concept retention
and learning. LearnSmart also generates dynamic reports
to document progress and suggests areas for additional
reinforcement, offering students real-time feedback on
their content mastery. By monitoring student progress, professors can instantly assess the level of understanding and mastery for an entire class or an individual student at any given time, and adapt their in-class teaching accordingly (McGraw-Hill Education, 2014; Norman, 2011).
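To illustrate the kind of adaptation described above, the following toy sketch prioritizes concepts for review from tracked mastery and self-reported confidence. It is only an illustrative sketch with assumed names, scores, and weights; it is not LearnSmart's actual algorithm.

```python
from dataclasses import dataclass

@dataclass
class ConceptState:
    name: str
    mastery: float     # estimated probability the student has mastered the concept (0-1)
    confidence: float  # student's self-reported confidence in the concept (0-1)

def review_priority(state: ConceptState) -> float:
    """Higher scores mean the concept should be resurfaced sooner.

    Weakly mastered concepts come first, with an extra boost for concepts
    the student is overconfident about (high confidence, low mastery).
    """
    overconfidence = max(0.0, state.confidence - state.mastery)
    return (1.0 - state.mastery) + 0.5 * overconfidence

# Hypothetical chapter snapshot for one student.
concepts = [
    ConceptState("market segmentation", mastery=0.9, confidence=0.8),
    ConceptState("price elasticity", mastery=0.4, confidence=0.9),
    ConceptState("SWOT analysis", mastery=0.6, confidence=0.5),
]

for concept in sorted(concepts, key=review_priority, reverse=True):
    print(f"{concept.name}: review priority = {review_priority(concept):.2f}")
```

In this sketch, "price elasticity" is surfaced first because its low mastery and high confidence mark it as the concept most likely to be forgotten.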
Statement/significance of the problem
Akin (1981) argued that marketing students should learn not only marketing but also skills that will help them in their future careers in the marketing field, such as knowing how to learn. Kember, Charlesworth, Davies,
McKay, and Stott (1997) also emphasized the impor-
tance of enhancing meaningful independent learning
among college students. On one hand, college students
need to digest and comprehend the materials; on the
other hand, students should be able to apply those course
materials for academic assessment and future profes-
sional development (Richardson, 1994). The adaptive
teaching method could help meet individual student
needs and thus enhance student learning effectiveness
(Zhang, Zhou, Briggs, & Nunamaker, 2006). Student
learning effectiveness refers to the learning value as per-
ceived by students (Ganesh, Paswan, & Sun, 2015).
Constructivism posits that students who are engaged
with interactive activities would have more effective
learning than those who are not (Leidner & Jarvenpaa,
1995). Therefore, it is important for business professors
to emphasize the engaging learning experience among
college students. LearnSmart offers an interactive and
adaptive way for students to read digital textbook chap-
ters while engaging with online practice problems and
quizzes. In comparison, students who only read hard-copy textbooks do not have the same opportunity to apply these concepts; thus, their learning might be less effective.
Student learning effectiveness can be assessed using objective performance measures such as course grades, as well as subjective evaluations such as students' perceptions of LearnSmart and their satisfaction with its use (Swan, 2003). Griff and Matter (2013) examined the test scores of students at six schools but did not find a significant impact of LearnSmart on student grade performance. However, no study has explored the potential influence of this adaptive online learning tool on students' perceived learning effectiveness. This study fills this gap in the literature and empirically tests the impact of LearnSmart on several subjective factors of learning effectiveness, namely satisfaction with LearnSmart and perceived value, while considering relevant factors from the literature such as perceived competence, perceived challenge, and instructors (Ganesh et al., 2015).
Perceived competence
Perceived competence is defined as the students’ percep-
tion of mastering their school work (Pintrich & De
Groot, 1990). Students with higher perceived competence have been found to be more intrinsically motivated and to have higher test performance than those with lower perceived competence (Bicen & Laverie, 2009). In
addition, students with higher perceived competence
tend to give better course evaluations, thus indicating
more effective student learning (Clayson, 2009; Ganesh
et al., 2015). LearnSmart aims to improve student learning by automatically adapting practice questions and tests to students' understanding of course content, thus enhancing their perceived competence. Students have the opportunity to complete multiple practice exercises on the same concepts if they did not fully understand them at the outset. Since perceived learning value
can be used to measure student learning effectiveness
(Ganesh et al., 2015), we would expect students to perceive high value from LearnSmart. In this way,
students would be more satisfied with LearnSmart as the
learning effectiveness improves. Therefore, we formu-
lated the following hypotheses:
Hypothesis 1a (H1a): Perceived competence would be
positively associated with perceived value.
H1b: Perceived competence would be positively associ-
ated with satisfaction with LearnSmart.
Perceived challenge
Perceived challenge refers to the student perception of
the workload in the class and the extent of difficulty of
the class or assignment. Extant literature shows a direct association between perceived challenge and the overall evaluation of a class (Ganesh et al., 2015; Parayitam, Desai, & Phelps, 2007). LearnSmart incorporates
various practices and quizzes into each chapter, which
requires much more time for students to finish the
chapter than just reading a textbook. Students may get
frustrated at the beginning due to the extra work and
thus perceive some challenge. With the extra learning
and improved understanding of course content, the stu-
dents would feel more competent and less challenged
over time. Taking these findings into account, we would expect students to perceive more value from LearnSmart even though they find it more challenging, because they may feel that they have become more competent. By the
same token, we expected that the increased value per-
ception of this adaptive learning tool would lead to
more satisfaction with LearnSmart (Ledden, Kalafatis, &
Samouel, 2007). As a result, it is reasonable to assume
the following:
H2a: Perceived challenge would be positively associ-
ated with perceived value.
H2b: Perceived challenge would be positively associated
with satisfaction with LearnSmart.
Mediation role of perceived value
Perceived value refers to students' overall evaluation of the utility of a service or learning tool, and higher perceived value leads to greater satisfaction with the education received (Ledden et al., 2007; Dlacic, Arslanagic, Kadic-Maglajlic, Mrkovic, & Raspor, 2014). As students perceive value from the use of LearnSmart, they may become more satisfied with the tool. In addition, because perceived competency and perceived challenge are expected to impact the perceived value of using LearnSmart, it is logical to propose that perceived value would mediate the connection between perceived competency and satisfaction with LearnSmart, as well as the relation between perceived challenge and satisfaction with LearnSmart. Consequently, we propose the following:
H3a: Perceived value would mediate the relation
between perceived competency and satisfaction
with LearnSmart.
H3b: Perceived value would mediate the relation
between perceived challenge and satisfaction with
LearnSmart.
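In generic notation (X for the predictor, M for perceived value as the mediator, Y for satisfaction with LearnSmart), the mediation hypothesized in H3a and H3b can be written as the standard Baron and Kenny (1986) path decomposition; the symbols below are illustrative rather than drawn from the article's own equations.

```latex
\begin{aligned}
M &= i_1 + a\,X + e_1 \\
Y &= i_2 + c\,X + e_2 \\
Y &= i_3 + c'\,X + b\,M + e_3, \qquad c = c' + a\,b
\end{aligned}
```

H3a and H3b then correspond to a significant indirect effect $a \times b$ for perceived competency and perceived challenge, respectively.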
Moderating role of instructor
Adaptive teaching tries to match teaching methods with student learning styles, and the instructor plays a significant role in this adaptive learning process (Fleming, 2001; Sandman, 2014). Accordingly, we also explored the role of the instructor in student learning effectiveness, specifically in students' perceived value of LearnSmart. We assume that more experienced instructors provide better guidance regarding students' use of LearnSmart and better in-class instruction, which alleviates the perceived challenge of using LearnSmart and thus increases students' perceived competency in learning. As a result, students would perceive higher value from LearnSmart when taught by experienced instructors than by less experienced ones, which leads to the following hypotheses:
H4a: The instructor would moderate the relation
between perceived competency and perceived
value.
H4b: The instructor would moderate the relation
between perceived challenge and perceived value.
Research methodology
The research context for this study was four undergradu-
ate marketing and management courses, offered at a pub-
lic university in the western United States. LearnSmart is part of the McGraw-Hill Connect platform, and students were required to use it to study each chapter of the course. It is designed to improve students' understanding of course content through its online interactive platform. Over the 15-week semester, students were asked to complete the LearnSmart assignment either before or after the instructor finished the corresponding lecture in class. They were given one week to finish the LearnSmart assignment and then take a quiz for that chapter. Instructors can check the assignment statistics for details on student performance and the time each student spent on LearnSmart assignments, and can then adapt their in-class teaching in light of students' LearnSmart performance.
Survey instrument
We also borrowed and adapted existing scales to measure
each construct, based on the qualitative feedback from stu-
dents in a marketing discussion forum (Table 1). Perceived
competency, perceived challenge, and satisfaction with
LearnSmart were measured with a 7-point Likert-type
scale with responses ranging from 1 (strongly disagree) to
7 (strongly agree; Ganesh et al., 2015). Ganesh et al. devel-
oped these scales based on faculty evaluation instruments
commonly used in higher education and validated these
scales with acceptable reliability, as well as convergent and
discriminant validity. Perceived value captures the comparative value of LearnSmart relative to reading a textbook; its 7-point Likert-type scale was adapted from Dlacic et al. (2014), who showed acceptable reliability, as well as convergent and discriminant validity, for this perceived value scale. Demographic questions
such as age, gender, ethnicity, and employment status
were included at the end of questionnaire.
A pretest of the survey instrument was conducted to ensure its face validity. Several experienced researchers reviewed the questionnaire for clarity of wording, coherence, logical order, and possible ambiguity. The questionnaire was refined as a result.
Data collection
Data for this research were collected using an online survey.
Table 1. Constructs scales.

Construct / Scale item                                                        M      SD
Perceived competency
  LearnSmart made me more confident in learning course concepts.             5.55   1.402
  LearnSmart made me more confident in applying course concepts.             5.41   1.422
  LearnSmart improved my critical thinking ability.                          5.18   1.502
  LearnSmart taught me tools for decision making.                            5.02   1.532
  LearnSmart taught me skills useful for life.                               4.79   1.561
Perceived value
  LearnSmart provides greater learning value than just reading a textbook.   5.64   1.48
  LearnSmart provides greater learning value than the class lectures.        5.01   1.578
  LearnSmart pushes me to peak performance compared to just reading a textbook.  5.29   1.529
  LearnSmart helps me to earn a higher grade than just reading a textbook.   5.49   1.494
Perceived challenge
  LearnSmart requires more work than just reading a textbook.                5.48   1.636
  LearnSmart takes too much time than just reading a textbook.               4.74   1.773
  LearnSmart is more challenging than just reading a textbook.               4.62   1.805
  It is more frustrating to do LearnSmart than just reading a textbook.      4.03   1.865
Satisfaction with LearnSmart
  I would like to continue using LearnSmart for other courses.               5.31   1.59
  I have no regrets about using LearnSmart.                                  5.30   1.591
  I am satisfied with the learning effectiveness of LearnSmart.              5.38   1.458
  I would recommend the use of LearnSmart in other courses.                  5.4    1.553
Students were offered a course grade incentive for their voluntary participation; this made a negligible difference to grade outcomes yet was highly effective in encouraging responses (Ganesh et al., 2015). About 215 students were invited to participate in
this study and we received 197 valid responses. Table 2
shows the demographic characteristics of sample; 52.8%
of the participants (107) were women and 46.2% (92)
were men. The largest age group of the respondents
was 21–25 years old (50.8%), followed by 26–30 years
old (21.1%), 36 years old or older (12.1%), and 18–
20 years old (5.5%). About half of the respondents
(47.2%) were Hispanic, while African American, Asian,
and Caucasian participants each were about 15% of the
sample. The majority of the respondents were employed: 40.7% held full-time employment, 35.7% held part-time employment, and only 22.6% were not employed.
Findings
An exploratory factor analysis (EFA) was conducted to evaluate the dimensionality of each construct. Only one factor was extracted for each construct, supporting its unidimensionality. Based on the Cronbach's alpha values for the constructs in this study (Table 3), ranging from .818 to .943, all latent constructs used in the hypothesized model have acceptable reliability (Churchill, 1979). The average variance extracted (AVE) values were all above 0.50, with perceived competency at 0.797, perceived value at 0.821, perceived challenge at 0.649, and satisfaction with LearnSmart at 0.857 (Table 3), indicating acceptable convergent validity (McDonald & Ho, 2002). In addition, because the square root of the AVE for each factor (from 0.806 to 0.926) exceeds the inter-factor correlations (from 0.018 to 0.794), the constructs are considered to have adequate discriminant validity (Table 4; Fornell & Larcker, 1981).
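For reference, the reliability and validity criteria applied above can be stated explicitly in generic notation, where $\lambda_i$ is the standardized loading of item $i$ on its construct, $k$ is the number of items, $\sigma_i^2$ and $\sigma_T^2$ are the item and total-score variances, and $r_{jl}$ is the correlation between constructs $j$ and $l$; these are standard formulas rather than expressions reported in the article.

```latex
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_T^{2}}\right),
\qquad
\mathrm{AVE}_j = \frac{1}{k}\sum_{i=1}^{k}\lambda_i^{2},
\qquad
\text{Fornell--Larcker criterion: } \sqrt{\mathrm{AVE}_j} > |r_{jl}| \;\; \text{for all } l \neq j .
```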
The hypothesized mediation relationships were tested using the three-step multiple regression approach proposed by Baron and Kenny (1986), supplemented by the Sobel (1982) test. Three multiple regressions were run to test the direct and indirect relationships among perceived competency, perceived value, and satisfaction with LearnSmart. The first regression (Table 5) showed that perceived competency is significantly related to satisfaction with LearnSmart (β = .794, p < .001), and the second regression indicated a significant impact of perceived competency on perceived value (β = .772, p < .001). Therefore, H1a and H1b are supported. The third regression found that both perceived competency (β = .491, p < .001) and perceived value (β = .392, p < .001) are positively related to satisfaction with LearnSmart. In addition, the beta value of perceived competency in the third regression is smaller than that in the first regression, indicating partial mediation by perceived value. The Sobel test supported the significance of this partial mediation (Z = 5.891, p < .001). As a result, H3a is confirmed.
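As an illustration of this three-step procedure, the sketch below runs the Baron and Kenny regressions and a Sobel test with statsmodels; the data file and column names (competency, value, satisfaction) are hypothetical placeholders, not the study's actual analysis script or data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

# Hypothetical data: one row per respondent, standardized composite scores
# for the three constructs (column names are placeholders).
df = pd.read_csv("learnsmart_survey.csv")  # columns: competency, value, satisfaction

# Step 1: predictor -> outcome (total effect c).
m1 = smf.ols("satisfaction ~ competency", data=df).fit()
# Step 2: predictor -> mediator (path a).
m2 = smf.ols("value ~ competency", data=df).fit()
# Step 3: predictor + mediator -> outcome (direct effect c' and path b).
m3 = smf.ols("satisfaction ~ competency + value", data=df).fit()

# Sobel test for the indirect effect a*b.
a, se_a = m2.params["competency"], m2.bse["competency"]
b, se_b = m3.params["value"], m3.bse["value"]
z = (a * b) / np.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
p = 2 * (1 - stats.norm.cdf(abs(z)))
print(f"Sobel Z = {z:.3f}, p = {p:.4f}")
```

Partial mediation appears in this setup as a coefficient on competency in step 3 that is smaller than in step 1 but still significant, together with a significant Sobel Z.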
Table 2. Sample demographic characteristics.

Variable       Category            n     %
Gender         Male                92    46.2
               Female              105   52.8
Age (years)    18–20               11    5.5
               21–25               101   50.8
               26–30               42    21.1
               31–35               19    9.5
               36 or older         24    12.1
Ethnicity      African American    27    13.6
               Asian               29    14.6
               Caucasian           30    15.1
               Hispanics           94    47.2
               Others              17    8.5
Employment     Full time           81    40.7
               Part time           71    35.7
               No employment       45    22.6
Table 3. Factor loadings and AVE.

Construct   Item    Factor loading   AVE     Square root of AVE
PC          PC1     0.885
            PC2     0.891
            PC3     0.911
            PC4     0.908
            PC5     0.868            0.797   0.893
PV          PV1     0.92
            PV2     0.85
            PV3     0.933
            PV4     0.92             0.821   0.906
PCh         PCh1    0.749
            PCh2    0.827
            PCh3    0.877
            PCh4    0.764            0.649   0.806
SwLS        SwLS1   0.947
            SwLS2   0.872
            SwLS3   0.933
            SwLS4   0.949            0.857   0.926

Note. AVE = average variance extracted; PC = perceived competency; PCh = perceived challenge; PV = perceived value; SwLS = satisfaction with LearnSmart.
Table 4. Convergent and discriminant validity.

        PC        PV        PCh     SwLS
PC      .936
PV      .772**    .926
PCh     .175*     .134      .818
SwLS    .794**    .771**    .018    .943

Note. Cronbach's alpha is on the diagonal and the correlations are off the diagonal. PC = perceived competency; PCh = perceived challenge; PV = perceived value; SwLS = satisfaction with LearnSmart.
* p < .05 (two-tailed). ** p < .01 (two-tailed).
By the same token, three multiple regressions were used to test the direct and indirect relationships among perceived challenge, perceived value, and satisfaction with LearnSmart. The first regression (Table 6) showed that perceived challenge is not significantly related to satisfaction with LearnSmart (β = .018, p = .799), but the second regression indicated a marginally significant impact of perceived challenge on perceived value (β = .134, p = .059). Therefore, H2a is supported whereas H2b is rejected. Although the third regression found that both perceived challenge (β = -.087, p = .057) and perceived value (β = .783, p < .001) are related to satisfaction with LearnSmart, there is no mediation effect of perceived value (Z = 0.803, p = .250). As a result, H3b is not supported.
Table 7 shows the results of hierarchical regressions with instructor entered as an independent variable in model 1 and as a moderator in model 2. The results show a marginally significant moderating effect of instructor on the relation between perceived competency and perceived value (β = .383, p = .066), and a significant moderating effect on the relation between perceived challenge and perceived value (β = -.455, p = .015). Three instructors taught the four courses. One had five years of experience with LearnSmart and the other two were using LearnSmart for the first time. A post hoc Scheffe test shows that students in the class taught by the instructor with several years of LearnSmart experience perceived more value than those in the classes taught by the instructors who had just started to use LearnSmart. Therefore, H4a and H4b are supported.
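The hierarchical moderation test reported in Table 7 corresponds to a regression of the general form below, where Ins is an indicator for the instructor and the product terms carry the moderation; this is a generic specification, not the authors' exact estimation code.

```latex
\mathrm{PV} = \beta_0 + \beta_1\,\mathrm{PC} + \beta_2\,\mathrm{PCh} + \beta_3\,\mathrm{Ins}
            + \beta_4\,(\mathrm{PC}\times\mathrm{Ins}) + \beta_5\,(\mathrm{PCh}\times\mathrm{Ins}) + \varepsilon
```

H4a and H4b are supported to the extent that $\beta_4$ and $\beta_5$ are significant, as in model 2 of Table 7.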
Conclusion
In this study, we examined the influence of the online interactive learning tool LearnSmart on students' evaluation of the tool, the course, and their learning effectiveness. Perceived competence and perceived challenge were also investigated to test their potential impacts on student learning effectiveness when using LearnSmart. The findings show that the use of LearnSmart improves students' perceived competency, thus increasing their perceived value of using LearnSmart as well as their satisfaction with LearnSmart. Perceived value was also found to mediate the impact of perceived competency on satisfaction with LearnSmart. At the same time, the instructor was found to play a significant role in facilitating student learning and improving student learning effectiveness.
Perceived challenge was found to impact students' perceived value of using LearnSmart, but it did not influence satisfaction with LearnSmart. Experienced instructors, however, could help students increase their perceived value of using LearnSmart by adapting their teaching to student learning styles. The results of this study help evaluate the effectiveness of using LearnSmart to enhance student learning and support recommendations on its future use. The results also add new knowledge to the marketing education literature regarding the use of a new information technology in course design.
Recommendations for additional research
Table 5. Mediation test with perceived competency as IV.

              DV                     IV                      β      t        p      R²
Regression 1  Satisfaction with LS   Perceived competency    .794   18.325   .000   .630
Regression 2  Perceived value        Perceived competency    .772   17.038   .000   .596
Regression 3  Satisfaction with LS   Perceived competency    .491   7.882    .000   .693
                                     Perceived value         .392   6.300    .000
Sobel test    Z = 5.891, p = .000
Table 6. Mediation test with perceived challenge as IV.

              DV                     IV                     β       t        p      R²
Regression 1  Satisfaction with LS   Perceived challenge    .018    0.256    .799   .000
Regression 2  Perceived value        Perceived challenge    .134    1.901    .059   .018
Regression 3  Satisfaction with LS   Perceived challenge    -.087   -1.912   .057   .602
                                     Perceived value        .783    17.230   .000
Sobel test    Z = 1.900, p = .057
Table 7. Hierarchical regression with instructor as moderator.

Variable      Model 1    Model 2
PC            .790***    .578***
PCh           .002       .315**
Ins           -.124**    -.106
PC × Ins                 .383†
PCh × Ins                -.455**
R²            .611       .626

Note. Ins = instructor; PC = perceived competency; PCh = perceived challenge; dependent variable (DV) = perceived value.
† p < .10 (two-tailed). ** p < .01 (two-tailed). *** p < .001 (two-tailed).
There are some limitations to this study. First, objective measures of student performance, such as quiz grades and final grades, are not included. Although a previous study found an insignificant impact of LearnSmart on students' test performance, there is a need to consider both objective and subjective measures in the same study. Second, cross-sectional data were used in this study, and longitudinal analysis is needed in future research to provide richer insights. In addition, other factors, such as students' experience with technology and their attitudes toward technology, could affect their use of an online learning tool such as LearnSmart, and future researchers could explore the impact of these factors on student learning performance. More robust mediation analysis using structural equation modeling could also be conducted to further test the hypotheses of this study.
Of the three types of interaction, learner-learner interaction can be limited within LearnSmart, which can make it challenging to implement this type of interaction in an online course. The ability of business instructors to combine LearnSmart with other tools available in the online environment, such as Blackboard, is therefore critical. It would also be valuable to examine whether integrating learning needs and preferences for different types of interactivity to increase satisfaction is correlated with teaching effectiveness. To generalize the findings of this first study, a new survey will be undertaken to increase the sample size with a second set of data.
References

Akin, G. (1981). Viewpoint: Marketing educators should model competencies, assign group projects. Marketing News, 15(2), 2.
Association to Advance Collegiate Schools of Business (AACSB). (2003). Eligibility procedures and standards for business accreditation. St. Louis, MO: Author.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173–1182.
Berger, K. A., & Topol, M. T. (2001). Technology to enhance learning: Use of a web site platform in traditional classes and distance learning. Marketing Education Review, 11(3), 15–26.
Bertheussen, B. A., & Myrland, Ø. (2016). Relation between academic performance and students' engagement in digital learning activities. Journal of Education for Business, 91, 125–131. doi:10.1080/08832323.2016.1140113
Bicen, P., & Laverie, D. A. (2009). Group-based assessment as a dynamic approach to marketing education. Journal of Marketing Education, 31, 96–108.
Churchill, G. A., Jr. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16, 64–73.
Clayson, D. E. (2009). Student evaluations of teaching: Are they related to what students learn? A meta-analysis and review of the literature. Journal of Marketing Education, 31, 16–30.
Comer, D. R., Lenaghan, J. A., & Sengupta, K. (2015). Factors that affect students' capacity to fulfill the role of online learner. Journal of Education for Business, 90, 145–155. doi:10.1080/08832323.2015.1007906
Davidson-Shivers, G. V. (2009). Frequency and types of instructor interactions in online instruction. Journal of Interactive Online Learning, 8(1), 23–40.
Dlacic, J., Arslanagic, M., Kadic-Maglajlic, S., Mrkovic, S., & Raspor, S. (2014). Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Quality Management & Business Excellence, 25, 141–157.
Espasa, A., & Meneses, J. (2010). Analyzing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59, 277–292.
Ferguson, J. L., Makarem, S. C., & Jones, R. E. (2016). Using a class blog for student experiential learning reflection in business courses. Journal of Education for Business, 91, 1–10. doi:10.1080/08832323.2015.1108279
Fleming, N. D. (2001). Teaching and learning styles: VARK strategies. Christchurch, New Zealand: Author.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50.
Fulford, C. P., & Zhang, S. (1993). Perceptions of interaction: The critical predictor in distance education. American Journal of Distance Education, 7(3), 8–21.
Ganesh, G. K., Paswan, A. K., & Sun, Q. (2015). Are face-to-face classes more effective than online classes? An empirical examination. Marketing Education Review, 25, 67–81.
Ganesh, G. K., & Sun, Q. (2009). Using simulations in the undergraduate marketing capstone case course. Marketing Education Review, 19, 7–16.
Griff, E. R., & Matter, S. F. (2013). Evaluation of an adaptive online learning system. British Journal of Educational Technology, 44, 170–176.
Hatfield, L., & Taylor, R. K. (1998). Making business schools responsive to customers: Lessons learned and actions. Marketing Education Review, 8(2), 1–8.
Jackson, M. J., Helms, M. M., Jackson, W. T., & Gum, J. R. (2011). Student expectations of technology-enhanced pedagogy: A ten-year comparison. Journal of Education for Business, 86, 294–301. doi:10.1080/08832323.2010.518648
Kember, D., Charlesworth, M., Davies, H., McKay, J., & Stott, V. (1997). Evaluating the effectiveness of educational innovations: Using the study process questionnaire to show that meaningful learning occurs. Studies in Educational Evaluation, 23, 141–157.
Kinshukan, T. L. (2003). User exploration based adaptation in adaptive learning systems. International Journal of Information Systems in Education, 1, 22–31.
Lancellotti, M., Thomas, S., & Kohli, C. (2016). Online video modules for improvement in student learning. Journal of Education for Business, 91, 19–22. doi:10.1080/08832323.2015.1108281
Ledden, L., Kalafatis, S. P., & Samouel, P. (2007). The relationship between personal values and perceived value of education. Journal of Business Research, 60, 965–974.
Leidner, D. E., & Jarvenpaa, S. (1995). The use of information technology to enhance management school education: A theoretical view. MIS Quarterly, 19, 265–291.
Liu, S., Gomez, J., Khan, B., & Yen, C.-J. (2007). Toward a learner-oriented community college online course dropout framework. International Journal on E-Learning, 6(December), 519–542.
Luethge, D. J., Raska, D., Greer, B. M., & O'Connor, C. (2016). Crossing the Atlantic: Integrating cross-cultural experiences into undergraduate business courses using virtual communities technology. Journal of Education for Business, 91, 219–226. doi:10.1080/08832323.2016.1160022
McDonald, R. P., & Ho, M. (2002). Principles and practice in reporting structural equation analyses. Psychological Methods, 7, 64–82.
McGorry, S. Y. (2006). Data in the palm of your hand. Marketing Education Review, 16, 83–90.
McGraw-Hill Education. (2014). Walters State and McGraw-Hill Education team to provide all-digital learning experience to undergraduate biology students. Retrieved from http://www.mheducation.com/news-media/press-releases/walters-state-and-mcgraw-hill-education-team-provide-all-digital-learning-experience.html
McGraw-Hill Higher Education. (2012). McGraw-Hill awards college scholarships to winners of 'LearnSmart and win' video contest [Press release]. Retrieved from http://www.prnewswire.com/news-releases/mcgraw-hill-awards-college-scholarships-to-winners-of-learnsmart-and-win-video-contest-136664358.html
Means, B., Peters, V., & Zheng, Y. (2014). Lessons from five years of funding digital courseware by the Gates Foundation and SRI Research. SRI. Retrieved from https://www.sri.com/sites/default/files/publications/psexecsummary_1.pdf
Moore, M. G. (1989). Three types of interaction [Editorial]. American Journal of Distance Education, 3(2), 1–7.
Norman, T. (2011). McGraw-Hill LearnSmart effectiveness study. New York, NY: McGraw-Hill.
Oxman, S., & Wong, W. (2014). White paper: Adaptive learning systems. DeVry Education Group. Retrieved from https://pdfs.semanticscholar.org/4e57/2108fa1591d21d980a6efe78673edc48c652.pdf
Parayitam, S., Desai, K., & Phelps, L. (2007). The effect of teacher communication and course content on student satisfaction and effectiveness. Academy of Educational Leadership Journal, 11, 91–105.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33–40.
Richardson, J. T. E. (1994). Cultural specificity of approaches to studying in higher education: A literature survey. Higher Education, 27, 449–468.
Rovai, A. P., & Barnum, K. T. (2003). On-line course effectiveness: An analysis of student interactions and perceptions of learning. Journal of Distance Education, 18, 57–73.
Sandman, T. E. (2014). A preliminary investigation into the adaptive learning styles of business students. Decision Sciences Journal of Innovative Education, 12, 33–54.
Sheppard, M., & Vibert, C. (2016). Cases for the net generation: An empirical examination of students' attitude toward multimedia case studies. Journal of Education for Business, 91, 101–107. doi:10.1080/08832323.2015.1128382
Smith, S., & Fisher, D. (2006). You can observe a lot by just watching: Using videography in a retail setting to teach observational research methods. Marketing Education Review, 16, 75–78.
Sobel, M. E. (1982). Asymptotic intervals for indirect effects in structural equations models. In S. Leinhart (Ed.), Sociological methodology 1982 (pp. 290–312). San Francisco, CA: Jossey-Bass.
Steiner, S. D., & Hyman, M. R. (2010). Improving the student experience: Allowing students enrolled in a required course to select online or face-to-face instruction. Marketing Education Review, 20, 29–33.
Swan, K. (2003). Learning effectiveness online: What the research tells us. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education, practice and direction (pp. 13–45). Needham, MA: Sloan Center for Online Education.
Sweeney, A. D. P., Morrison, M. D., Jarratt, D., & Heffernan, T. (2009). Modeling the constructs contributing to the effectiveness of marketing lecturers. Journal of Marketing Education, 31, 190–202.
VanLehn, K. (2006). The behavior of tutoring systems. International Journal of Artificial Intelligence in Education, 16, 227–265.
Winch, J. K., & Cahn, E. S. (2015). Improving student performance in a management science course with supplemental tutorial videos. Journal of Education for Business, 90, 402–409. doi:10.1080/08832323.2015.1081865
Wood, N. T., Solomon, M. R., & Allan, D. (2008). Welcome to the matrix: E-learning gets a second life. Marketing Education Review, 18, 47–53.
Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15–27.
8 Q. SUN ET AL.

More Related Content

What's hot

12 masnida-upm-u7-004-ready kpt6043
12 masnida-upm-u7-004-ready kpt604312 masnida-upm-u7-004-ready kpt6043
12 masnida-upm-u7-004-ready kpt6043sendirian berhad
 
Attitudes of nursing and midwifery school's student toward blended learning a...
Attitudes of nursing and midwifery school's student toward blended learning a...Attitudes of nursing and midwifery school's student toward blended learning a...
Attitudes of nursing and midwifery school's student toward blended learning a...Alexander Decker
 
The Effect of Mobile Learning on the Development of the Students' Learning Be...
The Effect of Mobile Learning on the Development of the Students' Learning Be...The Effect of Mobile Learning on the Development of the Students' Learning Be...
The Effect of Mobile Learning on the Development of the Students' Learning Be...inventionjournals
 
Impact of lecture method on students learning in islamic study at secondary l...
Impact of lecture method on students learning in islamic study at secondary l...Impact of lecture method on students learning in islamic study at secondary l...
Impact of lecture method on students learning in islamic study at secondary l...Zaffar Ali
 
A way for blending vle and face to-face instruction by Gulden ILIN
A way for blending vle and face to-face instruction by Gulden ILINA way for blending vle and face to-face instruction by Gulden ILIN
A way for blending vle and face to-face instruction by Gulden ILINsuhailaabdulaziz
 
Gamification Strategies in a Hybrid Exemplary College Course
Gamification Strategies in a Hybrid Exemplary College CourseGamification Strategies in a Hybrid Exemplary College Course
Gamification Strategies in a Hybrid Exemplary College CourseSzymon Machajewski
 
The seven principles of online learning: Feedback from faculty and alumni on ...
The seven principles of online learning: Feedback from faculty and alumni on ...The seven principles of online learning: Feedback from faculty and alumni on ...
The seven principles of online learning: Feedback from faculty and alumni on ...eraser Juan José Calderón
 
Reconstructing classroom routines through on line instructional delivery tech...
Reconstructing classroom routines through on line instructional delivery tech...Reconstructing classroom routines through on line instructional delivery tech...
Reconstructing classroom routines through on line instructional delivery tech...Alexander Decker
 
Felege, christopher online education perceptions and recommendations focus ...
Felege, christopher online education   perceptions and recommendations focus ...Felege, christopher online education   perceptions and recommendations focus ...
Felege, christopher online education perceptions and recommendations focus ...William Kritsonis
 
Open Education Bridging the Gap Inequality of Higher Education opportunity
Open Education Bridging the Gap Inequality of Higher Education opportunityOpen Education Bridging the Gap Inequality of Higher Education opportunity
Open Education Bridging the Gap Inequality of Higher Education opportunityIJRESJOURNAL
 
Effectiveness of video based cooperative learning strategy on high, medium an...
Effectiveness of video based cooperative learning strategy on high, medium an...Effectiveness of video based cooperative learning strategy on high, medium an...
Effectiveness of video based cooperative learning strategy on high, medium an...Gambari Isiaka
 
Distance Education and Online Learning Design Options by Frankie A. Fran
Distance Education and Online Learning Design Options by Frankie A. FranDistance Education and Online Learning Design Options by Frankie A. Fran
Distance Education and Online Learning Design Options by Frankie A. FranFrankie Fran
 
Using socrative and smartphones for the support of collaborative learning
Using socrative and smartphones for the support of collaborative learningUsing socrative and smartphones for the support of collaborative learning
Using socrative and smartphones for the support of collaborative learningIJITE
 
Virtualizing the school during the covid 19
Virtualizing the school during the covid 19Virtualizing the school during the covid 19
Virtualizing the school during the covid 19Joshua Owolabi
 
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011William Kritsonis
 
Effects of web based learning tools on student achievement
Effects of web based learning tools on student achievementEffects of web based learning tools on student achievement
Effects of web based learning tools on student achievementann-crosby
 

What's hot (16)

12 masnida-upm-u7-004-ready kpt6043
12 masnida-upm-u7-004-ready kpt604312 masnida-upm-u7-004-ready kpt6043
12 masnida-upm-u7-004-ready kpt6043
 
Attitudes of nursing and midwifery school's student toward blended learning a...
Attitudes of nursing and midwifery school's student toward blended learning a...Attitudes of nursing and midwifery school's student toward blended learning a...
Attitudes of nursing and midwifery school's student toward blended learning a...
 
The Effect of Mobile Learning on the Development of the Students' Learning Be...
The Effect of Mobile Learning on the Development of the Students' Learning Be...The Effect of Mobile Learning on the Development of the Students' Learning Be...
The Effect of Mobile Learning on the Development of the Students' Learning Be...
 
Impact of lecture method on students learning in islamic study at secondary l...
Impact of lecture method on students learning in islamic study at secondary l...Impact of lecture method on students learning in islamic study at secondary l...
Impact of lecture method on students learning in islamic study at secondary l...
 
A way for blending vle and face to-face instruction by Gulden ILIN
A way for blending vle and face to-face instruction by Gulden ILINA way for blending vle and face to-face instruction by Gulden ILIN
A way for blending vle and face to-face instruction by Gulden ILIN
 
Gamification Strategies in a Hybrid Exemplary College Course
Gamification Strategies in a Hybrid Exemplary College CourseGamification Strategies in a Hybrid Exemplary College Course
Gamification Strategies in a Hybrid Exemplary College Course
 
The seven principles of online learning: Feedback from faculty and alumni on ...
The seven principles of online learning: Feedback from faculty and alumni on ...The seven principles of online learning: Feedback from faculty and alumni on ...
The seven principles of online learning: Feedback from faculty and alumni on ...
 
Reconstructing classroom routines through on line instructional delivery tech...
Reconstructing classroom routines through on line instructional delivery tech...Reconstructing classroom routines through on line instructional delivery tech...
Reconstructing classroom routines through on line instructional delivery tech...
 
Felege, christopher online education perceptions and recommendations focus ...
Felege, christopher online education   perceptions and recommendations focus ...Felege, christopher online education   perceptions and recommendations focus ...
Felege, christopher online education perceptions and recommendations focus ...
 
Open Education Bridging the Gap Inequality of Higher Education opportunity
Open Education Bridging the Gap Inequality of Higher Education opportunityOpen Education Bridging the Gap Inequality of Higher Education opportunity
Open Education Bridging the Gap Inequality of Higher Education opportunity
 
Effectiveness of video based cooperative learning strategy on high, medium an...
Effectiveness of video based cooperative learning strategy on high, medium an...Effectiveness of video based cooperative learning strategy on high, medium an...
Effectiveness of video based cooperative learning strategy on high, medium an...
 
Distance Education and Online Learning Design Options by Frankie A. Fran
Distance Education and Online Learning Design Options by Frankie A. FranDistance Education and Online Learning Design Options by Frankie A. Fran
Distance Education and Online Learning Design Options by Frankie A. Fran
 
Using socrative and smartphones for the support of collaborative learning
Using socrative and smartphones for the support of collaborative learningUsing socrative and smartphones for the support of collaborative learning
Using socrative and smartphones for the support of collaborative learning
 
Virtualizing the school during the covid 19
Virtualizing the school during the covid 19Virtualizing the school during the covid 19
Virtualizing the school during the covid 19
 
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011
Chung, miri k obstacles preservice teachers encountered focus v6 n1, 2011
 
Effects of web based learning tools on student achievement
Effects of web based learning tools on student achievementEffects of web based learning tools on student achievement
Effects of web based learning tools on student achievement
 

Similar to LearnSmart adaptive teaching and student learning effectiveness An empirical investigation

Blended Learning - Reading Horizons White Paper
Blended Learning - Reading Horizons White PaperBlended Learning - Reading Horizons White Paper
Blended Learning - Reading Horizons White PaperReading Horizons
 
GROUP 5_RESEARCH PROPOSAL.docx
GROUP 5_RESEARCH PROPOSAL.docxGROUP 5_RESEARCH PROPOSAL.docx
GROUP 5_RESEARCH PROPOSAL.docxAimRoneRubia
 
Administrative Support of Faculty Preparation and Interactivity in Online Tea...
Administrative Support of Faculty Preparation and Interactivity in Online Tea...Administrative Support of Faculty Preparation and Interactivity in Online Tea...
Administrative Support of Faculty Preparation and Interactivity in Online Tea...William Kritsonis
 
Adult Learning Theories ChartPart 1 Theories o.docx
Adult Learning Theories ChartPart 1  Theories o.docxAdult Learning Theories ChartPart 1  Theories o.docx
Adult Learning Theories ChartPart 1 Theories o.docxdaniahendric
 
Morpheus UNIMAS: Strengthening Student Engagement in Blended Learning Environ...
Morpheus UNIMAS: Strengthening Student Engagement in Blended Learning Environ...Morpheus UNIMAS: Strengthening Student Engagement in Blended Learning Environ...
Morpheus UNIMAS: Strengthening Student Engagement in Blended Learning Environ...Kee-Man Chuah
 
USING BLENDED LEARNING IN DEVELOPING STUDENT TEACHERS TEACHI…
USING BLENDED LEARNING IN DEVELOPING STUDENT TEACHERS TEACHI…USING BLENDED LEARNING IN DEVELOPING STUDENT TEACHERS TEACHI…
USING BLENDED LEARNING IN DEVELOPING STUDENT TEACHERS TEACHI…Hisham Hussein
 
The Mobile Classroom at Cross Purposes with Higher Education: Pros and Cons: ...
The Mobile Classroom at Cross Purposes with Higher Education: Pros and Cons: ...The Mobile Classroom at Cross Purposes with Higher Education: Pros and Cons: ...
The Mobile Classroom at Cross Purposes with Higher Education: Pros and Cons: ...William Kritsonis
 
Lumadue, rick the mobile classroom at cross purposes with higher education fo...
Lumadue, rick the mobile classroom at cross purposes with higher education fo...Lumadue, rick the mobile classroom at cross purposes with higher education fo...
Lumadue, rick the mobile classroom at cross purposes with higher education fo...William Kritsonis
 
Factors affecting the quality of e-learning during the COVID-19 pandemic from...
Factors affecting the quality of e-learning during the COVID-19 pandemic from...Factors affecting the quality of e-learning during the COVID-19 pandemic from...
Factors affecting the quality of e-learning during the COVID-19 pandemic from...eraser Juan José Calderón
 
ULTO Teaching & Learning Conference
ULTO Teaching & Learning ConferenceULTO Teaching & Learning Conference
ULTO Teaching & Learning ConferenceAnthony Rippon
 
WEB-CONFERENCING AND STUDENTS’ ENGAGEMENT IN SCIENCE(title defense).pptx
WEB-CONFERENCING AND STUDENTS’ ENGAGEMENT IN SCIENCE(title defense).pptxWEB-CONFERENCING AND STUDENTS’ ENGAGEMENT IN SCIENCE(title defense).pptx
WEB-CONFERENCING AND STUDENTS’ ENGAGEMENT IN SCIENCE(title defense).pptxsharon veloso
 
The effectiveness of using blended learning on learning outcomes and motivation
The effectiveness of using blended learning on  learning outcomes and motivationThe effectiveness of using blended learning on  learning outcomes and motivation
The effectiveness of using blended learning on learning outcomes and motivationResearch Publish Journals (Publisher)
 
Against The Odds Teaching Writing In An Online Environment
and thus enrich student learning. The effectiveness of face-to-face teaching has been found to be related to appropriate instructional design and to instructors' subject knowledge and their capability to communicate clearly with students, create rapport with them, and deliver the course enthusiastically (Sweeney, Morrison, Jarratt, & Heffernan, 2009). Conversely, online practices can facilitate teaching in the classroom (Berger & Topol, 2001).
With the rapid advancement of information technology, students have shown increasing interest in technology-enhanced pedagogies (Jackson, Helms, Jackson, & Gum, 2011), and business professors look for innovative ways to teach business topics and enhance student learning effectiveness.
Various high-tech devices and technologies are employed in business courses to enhance student learning effectiveness, such as personal digital assistants and video cameras (McGorry, 2006; Smith & Fisher, 2006), multimedia (Sheppard & Vibert, 2016), online video (Lancellotti, Thomas, & Kohli, 2016; Winch & Cahn, 2015), virtual communities (Luethge, Raska, Greer, & O'Connor, 2016), class blogs (Ferguson, Makarem, & Jones, 2016), and electronic conferencing (Wood, Solomon, & Allan, 2008).
Complemented by these technologies, experiential learning can be promoted by interactive or adaptive online teaching, which conforms to the requirements of the Association to Advance Collegiate Schools of Business (AACSB) to increase flexibility and responsiveness to student needs (AACSB, 2003; Hatfield & Taylor, 1998).
For example, professors can use discussion forums, messaging, and e-mail to create an interactive community that facilitates and encourages experiential learning among students (Wood et al., 2008), or design interactive spreadsheets to enhance students' problem-solving skills (Bertheussen & Myrland, 2016).
Online teaching and learning can be customized when the assignments are interactive. Previous researchers have identified different types of interactivity that can occur online: learner-content interaction, learner-instructor interaction, and learner-learner interaction (Moore, 1989), as well as student-learning management system interaction (Davidson-Shivers, 2009). Previous studies have shown that online courses with greater levels of interactivity are associated with higher student motivation, enhanced learning outcomes, and greater satisfaction with the interactive learning environment (Espasa & Meneses, 2010). Perceived overall interactivity also positively influences learner satisfaction (Fulford & Zhang, 1993). In the broader context of interactive teaching, it is less clear how adaptive learning systems should fit into existing online or on-campus courses.
In this study, we intend to examine how the integration of an online interactive learning tool (i.e., LearnSmart) could enhance student learning and thus learning effectiveness. To the best of our knowledge, there is little peer-reviewed work examining this new learning tool in the business field. In this study, we describe the implementation of LearnSmart in introductory marketing and management courses at a West Coast public university.
Literature review
Student learning is the central focus of higher education. However, student learning is context dependent, and various factors influence student learning effectiveness, such as students' own motivation, classroom climate, teaching methods, and course level (Comer et al., 2015). In particular, interaction is considered a determining factor in promoting student learning effectiveness, not only in the traditional classroom setting but also in online education (Rovai & Barnum, 2003; Swan, 2003). The importance of the interactions that learners engage in has been highlighted in the adaptive teaching and learning pedagogy. According to Oxman and Wong (2014), adaptive learning refers to a learning process where the content taught adapts based on the responses of the individual. Adaptive learning systems can help personalize instruction based on an individual learning model (Kinshukan, 2003; VanLehn, 2006). Individualized pacing used in adaptive learning technology demonstrated more positive impacts than class-based or mixed forms of pacing, and courses with adaptive learning technologies showed better learning outcomes than nonadaptive ones (Means, Peters, & Zheng, 2014).
LearnSmart is one adaptive learning system that can be used on top of fully online or on-campus courses at any university. It reshapes the interactivity among students, the instructor, and the course content. LearnSmart is an adaptive learning tool that evaluates students' knowledge levels by tracking the topics students have mastered and thus identifies the areas that need further instruction and practice. Depending on student progress, LearnSmart automatically adapts the learning content based on students' knowledge strengths and weaknesses and their confidence level around that knowledge (Norman, 2011).
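To make this adaptive mechanism concrete, the following Python fragment shows one minimal way a mastery-and-confidence loop of this kind could work. It is purely our own simplified illustration, not McGraw-Hill's actual algorithm; the concept names, update steps, and mastery scale are hypothetical.

from dataclasses import dataclass
import random

@dataclass
class Concept:
    name: str
    mastery: float = 0.5  # 0 = not yet learned, 1 = mastered (hypothetical scale)

def update_mastery(concept, correct, confident, step=0.15):
    """Raise mastery for correct answers; penalize confident errors the most."""
    if correct:
        concept.mastery = min(1.0, concept.mastery + (step if confident else step / 2))
    else:
        concept.mastery = max(0.0, concept.mastery - (2 * step if confident else step))

def next_concept(concepts):
    """Adapt the content: always practice the currently weakest concept."""
    return min(concepts, key=lambda c: c.mastery)

chapter = [Concept("segmentation"), Concept("targeting"), Concept("positioning")]
for _ in range(10):  # one short practice session
    concept = next_concept(chapter)
    correct = random.random() < concept.mastery + 0.3   # stand-in for the student's answer
    update_mastery(concept, correct, confident=random.random() < 0.5)

print({c.name: round(c.mastery, 2) for c in chapter})

In this toy version, an item answered incorrectly with high confidence lowers mastery the most, which mirrors the idea of weighting self-reported confidence alongside correctness when deciding what the student should practice next.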
In this way, LearnSmart is tailored to the specific needs of each student through continuous evaluation of the student's knowledge of the concepts covered in each chapter (McGraw-Hill Higher Education, 2012). LearnSmart's adaptive technology also identifies the concepts that students are most likely to forget over the course of the semester, considering those they had been weakest on or least confident about, and encourages periodic review to ensure that concepts are truly learned and retained. As a result, it goes beyond systems that simply help students study for a test or exam and helps students with true concept retention and learning. LearnSmart also generates dynamic reports to document progress and suggests areas for additional reinforcement, offering students real-time feedback on their content mastery. By monitoring student progress, professors can instantly assess the level of understanding and mastery for an entire class or an individual student at any given time and adapt their teaching in the classroom (McGraw-Hill Education, 2014; Norman, 2011).
Statement/significance of the problem
Akin (1981) argued that marketing students should learn not only marketing but also skills that will help them in the marketing field in the future, such as the skill of knowing how to learn. Kember, Charlesworth, Davies, McKay, and Stott (1997) also emphasized the importance of enhancing meaningful independent learning among college students. On one hand, college students need to digest and comprehend the materials; on the other hand, they should be able to apply those course materials for academic assessment and future professional development (Richardson, 1994). The adaptive teaching method could help meet individual student needs and thus enhance student learning effectiveness (Zhang, Zhou, Briggs, & Nunamaker, 2006). Student learning effectiveness refers to the learning value as perceived by students (Ganesh, Paswan, & Sun, 2015).
Constructivism posits that students who are engaged with interactive activities learn more effectively than those who are not (Leidner & Jarvenpaa, 1995). Therefore, it is important for business professors to emphasize an engaging learning experience for college students. LearnSmart offers an interactive and adaptive way for students to read digital textbook chapters while engaging with online practice problems and quizzes. In comparison, students who read hard-copy textbooks do not have the same opportunity to apply these concepts; thus, their learning might be less effective.
Student learning effectiveness can be measured with objective performance indicators such as course grades, as well as with subjective evaluations such as students' perceptions of LearnSmart and their satisfaction with its use (Swan, 2003). Griff and Matter (2013) looked at test scores of students in six schools but did not find a significant impact of LearnSmart on student grade performance. However, no study has explored the potential influence of this adaptive online learning tool on students' perceived learning effectiveness. This study fills this gap in the literature and empirically tests the impact of LearnSmart on several subjective evaluation factors of learning effectiveness, namely satisfaction with LearnSmart and perceived value, while considering several relevant factors from the literature such as perceived competence, perceived challenge, and instructors (Ganesh et al., 2015).
Perceived competence
Perceived competence is defined as students' perception of mastering their school work (Pintrich & De Groot, 1990). Students with higher perceived competence have been found to be more intrinsically motivated and to perform better on tests than those with lower perceived competence (Bicen & Laverie, 2009). In addition, students with higher perceived competence tend to give better course evaluations, indicating more effective student learning (Clayson, 2009; Ganesh et al., 2015). LearnSmart intends to improve student learning by automatically adapting practice and tests to students' understanding of course content, thus enhancing their perceived competence. Students have the opportunity to complete multiple rounds of practice on the same concepts if they did not fully understand them at the beginning. Since perceived learning value can be used to measure student learning effectiveness (Ganesh et al., 2015), we would expect students to perceive high value from LearnSmart. In this way, students would also be more satisfied with LearnSmart as learning effectiveness improves. Therefore, we formulated the following hypotheses:
Hypothesis 1a (H1a): Perceived competence would be positively associated with perceived value.
H1b: Perceived competence would be positively associated with satisfaction with LearnSmart.
Perceived challenge
Perceived challenge refers to the student's perception of the workload in the class and the extent of difficulty of the class or assignment. Extant literature shows a direct association between perceived challenge and the overall evaluation of a class (Ganesh et al., 2015; Parayitam, Desai, & Phelps, 2007). LearnSmart incorporates various practice exercises and quizzes into each chapter, which requires much more time to finish the chapter than just reading a textbook. Students may get frustrated at the beginning because of the extra work and thus perceive some challenge.
With the extra learning and improved understanding of course content, students would feel more competent and less challenged over time. Taking these findings into account, we would expect students to perceive more value even though LearnSmart is perceived to be more challenging, because they may feel that they have become more competent. By the same token, we expected that the increased value perception of this adaptive learning tool would lead to more satisfaction with LearnSmart (Ledden, Kalafatis, & Samouel, 2007). As a result, it is reasonable to assume the following:
H2a: Perceived challenge would be positively associated with perceived value.
H2b: Perceived challenge would be positively associated with satisfaction with LearnSmart.
Mediation role of perceived value
Perceived value refers to students' overall evaluation of the utility of the service or learning tool, and higher perceived value leads to satisfaction with the education (Ledden et al., 2007; Dlacic, Arslanagic, Kadic-Maglajlic, Mrkovic, & Raspor, 2014). As students perceive value from the use of LearnSmart, they may become more satisfied with the tool. In addition, because perceived competency and perceived challenge are expected to impact the perceived value of using LearnSmart, it is logical to propose that perceived value would mediate the connection between perceived competency and satisfaction with LearnSmart, as well as the relation between perceived challenge and satisfaction with LearnSmart. Consequently, we propose the following:
H3a: Perceived value would mediate the relation between perceived competency and satisfaction with LearnSmart.
H3b: Perceived value would mediate the relation between perceived challenge and satisfaction with LearnSmart.
Moderating role of instructor
Adaptive teaching tries to match teaching methods with student learning styles, and the instructor plays a significant role in this adaptive learning process (Fleming, 2001; Sandman, 2014). As a result, we also explored the role of the instructor in student learning effectiveness, specifically in students' perceived value of LearnSmart. We assume that more experienced instructors would provide better guidance regarding students' use of LearnSmart and better in-class instruction, which would alleviate the perceived challenge of using LearnSmart and thus increase students' perceived competency in learning. As a result, students would perceive higher value from LearnSmart with experienced instructors than with less experienced ones, which leads to the following hypotheses:
H4a: The instructor would moderate the relation between perceived competency and perceived value.
H4b: The instructor would moderate the relation between perceived challenge and perceived value.
Research methodology
The research context for this study was four undergraduate marketing and management courses offered at a public university in the western United States. LearnSmart is part of the McGraw-Hill Connect course and was required for studying each chapter of the course. It is designed to improve students' understanding of course content through its online interactive platform. Over the 15-week semester, students were asked to complete the LearnSmart assignment either before or after the instructor finished the corresponding lecture in class. They were given one week to finish LearnSmart and then take a quiz for that chapter. The instructors can check the assignment statistics for details on student performance and the time each student spent finishing the LearnSmart assignments. The instructors can then adapt their in-class teaching with respect to students' LearnSmart performance.
Survey instrument
We borrowed and adapted existing scales to measure each construct, based on qualitative feedback from students in a marketing discussion forum (Table 1). Perceived competency, perceived challenge, and satisfaction with LearnSmart were measured with 7-point Likert-type scales with responses ranging from 1 (strongly disagree) to 7 (strongly agree; Ganesh et al., 2015). Ganesh et al. developed these scales based on faculty evaluation instruments commonly used in higher education and validated them with acceptable reliability, as well as convergent and discriminant validity. Perceived value captures the comparative value of LearnSmart with respect to reading a textbook; its 7-point Likert-type scale was adapted from Dlacic et al. (2014), who reported acceptable reliability, as well as convergent and discriminant validity, for this scale. Demographic questions such as age, gender, ethnicity, and employment status were included at the end of the questionnaire.
A pretest of the survey instrument was conducted to ensure the face validity of the instrument. Several experienced researchers consented to review the clarity of the wording, the coherence and logical order, and possible ambiguity of the questionnaire. The questionnaire was refined as a result.
Data collection
Data for this research were collected using an online survey.
Table 1. Constructs scales.

Construct                      Scale item                                                                      M      SD
Perceived competency           LearnSmart made me more confident in learning course concepts.                 5.55   1.402
                               LearnSmart made me more confident in applying course concepts.                 5.41   1.422
                               LearnSmart improved my critical thinking ability.                              5.18   1.502
                               LearnSmart taught me tools for decision making.                                5.02   1.532
                               LearnSmart taught me skills useful for life.                                   4.79   1.561
Perceived value                LearnSmart provides greater learning value than just reading a textbook.       5.64   1.480
                               LearnSmart provides greater learning value than the class lectures.            5.01   1.578
                               LearnSmart pushes me to peak performance compared to just reading a textbook.  5.29   1.529
                               LearnSmart helps me to earn a higher grade than just reading a textbook.       5.49   1.494
Perceived challenge            LearnSmart requires more work than just reading a textbook.                    5.48   1.636
                               LearnSmart takes too much time than just reading a textbook.                   4.74   1.773
                               LearnSmart is more challenging than just reading a textbook.                   4.62   1.805
                               It is more frustrating to do LearnSmart than just reading a textbook.          4.03   1.865
Satisfaction with LearnSmart   I would like to continue using LearnSmart for other courses.                   5.31   1.590
                               I have no regrets about using LearnSmart.                                      5.30   1.591
                               I am satisfied with the learning effectiveness of LearnSmart.                  5.38   1.458
                               I would recommend the use of LearnSmart in other courses.                      5.40   1.553
Students were offered a course grade incentive for their voluntary participation. This incentive made hardly any difference to the grade outcome, yet it was highly effective in encouraging responses (Ganesh et al., 2015). About 215 students were invited to participate in this study, and we received 197 valid responses.
Table 2 shows the demographic characteristics of the sample: 52.8% of the participants (105) were women and 46.2% (92) were men. The largest age group of respondents was 21-25 years old (50.8%), followed by 26-30 years old (21.1%), 36 years old or older (12.1%), and 18-20 years old (5.5%). About half of the respondents (47.2%) were Hispanic, while African American, Asian, and Caucasian participants each made up about 15% of the sample. The majority of the respondents were employed: 40.7% with full-time employment and 35.7% with part-time employment, with only 22.6% not employed.

Table 2. Sample demographic characteristics.

Variable      Category           n     %
Gender        Male               92    46.2
              Female             105   52.8
Age (years)   18-20              11    5.5
              21-25              101   50.8
              26-30              42    21.1
              31-35              19    9.5
              36 or older        24    12.1
Ethnicity     African American   27    13.6
              Asian              29    14.6
              Caucasian          30    15.1
              Hispanic           94    47.2
              Other              17    8.5
Employment    Full time          81    40.7
              Part time          71    35.7
              No employment      45    22.6

Findings
An exploratory factor analysis (EFA) was conducted for each construct to evaluate its dimensionality. Only one factor was extracted for each construct, supporting unidimensionality. The Cronbach's alpha values for the constructs in this study, shown on the diagonal of Table 4, range from .818 to .943, so all latent constructs used in the hypothesized model have acceptable reliability (Churchill, 1979). The average variance extracted (AVE) values were all above 0.50, with perceived competency at 0.797, perceived value at 0.821, perceived challenge at 0.649, and satisfaction with LearnSmart at 0.857 (Table 3), indicating acceptable convergent validity (McDonald & Ho, 2002). In addition, because the square root of the AVE for each factor (from 0.806 to 0.926) exceeds the inter-factor correlations (from 0.018 to 0.794), the constructs are considered to have adequate discriminant validity (Table 4; Fornell & Larcker, 1981).
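For readers who wish to reproduce these reliability and convergent-validity statistics, the short Python fragment below shows how Cronbach's alpha and the AVE can be computed. It is our own illustration rather than the authors' analysis code, and the data frame and column names are placeholders.

import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one column per scale item, one row per respondent (1-7 Likert)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

def average_variance_extracted(loadings) -> float:
    """AVE = mean of the squared standardized factor loadings."""
    loadings = np.asarray(loadings)
    return float(np.mean(loadings ** 2))

# Perceived-competency loadings as reported in Table 3
pc_loadings = [0.885, 0.891, 0.911, 0.908, 0.868]
ave_pc = average_variance_extracted(pc_loadings)
print(round(ave_pc, 3), round(np.sqrt(ave_pc), 3))  # ~0.797 and ~0.893, matching Table 3

# With raw item responses, e.g. responses[["PC1", "PC2", "PC3", "PC4", "PC5"]],
# cronbach_alpha(...) would return the reliability shown on the diagonal of Table 4.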
Table 3. Factor loadings and AVE.

Construct   Item     Factor loading   AVE     Square root of AVE
PC          PC1      0.885
            PC2      0.891
            PC3      0.911
            PC4      0.908
            PC5      0.868            0.797   0.893
PV          PV1      0.920
            PV2      0.850
            PV3      0.933
            PV4      0.920            0.821   0.906
PCh         PCh1     0.749
            PCh2     0.827
            PCh3     0.877
            PCh4     0.764            0.649   0.806
SwLS        SwLS1    0.947
            SwLS2    0.872
            SwLS3    0.933
            SwLS4    0.949            0.857   0.926

Note. AVE = average variance extracted; PC = perceived competency; PCh = perceived challenge; PV = perceived value; SwLS = satisfaction with LearnSmart.

Table 4. Convergent and discriminant validity.

        PC       PV       PCh     SwLS
PC      .936
PV      .772**   .926
PCh     .175*    .134     .818
SwLS    .794**   .771**   .018    .943

Note. Cronbach's alpha is shown on the diagonal; correlations are off the diagonal. PC = perceived competency; PCh = perceived challenge; PV = perceived value; SwLS = satisfaction with LearnSmart.
* p < .05 (two tailed). ** p < .01 (two tailed).

The hypothesized mediation relationships were tested using the three-step multiple regression approach proposed by Baron and Kenny (1986) and the Sobel (1982) test. Three multiple regressions were run to test the direct and indirect relationships between perceived competency, perceived value, and satisfaction with LearnSmart. The first regression (Table 5) showed that perceived competency is significantly related to satisfaction with LearnSmart (β = .794, p = .000), and the second regression indicated a significant impact of perceived competency on perceived value (β = .772, p = .000). Therefore, H1a and H1b are supported. The third regression found that both perceived competency (β = .491, p = .000) and perceived value (β = .392, p = .000) are positively related to satisfaction with LearnSmart. In addition, the beta value of perceived competency in the third regression is smaller than that in the first regression, indicating partial mediation by perceived value. The Sobel test supported the significance of this partial mediation (Z = 5.891, p = .000). As a result, H3a is confirmed.
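The three-step procedure and the Sobel test just described can be expressed compactly in code. The following Python fragment uses statsmodels; the column names (pc, pv, swls) and the simulated data are our own stand-ins for the survey variables, so it illustrates the procedure rather than reproducing the reported coefficients.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def baron_kenny_sobel(data, x, m, y):
    """Three-step mediation test (Baron & Kenny, 1986) plus a Sobel (1982) z."""
    step1 = smf.ols(f"{y} ~ {x}", data).fit()        # X -> Y (total effect)
    step2 = smf.ols(f"{m} ~ {x}", data).fit()        # X -> M
    step3 = smf.ols(f"{y} ~ {x} + {m}", data).fit()  # X and M -> Y
    a, se_a = step2.params[x], step2.bse[x]
    b, se_b = step3.params[m], step3.bse[m]
    sobel_z = (a * b) / np.sqrt(b ** 2 * se_a ** 2 + a ** 2 * se_b ** 2)
    return {"total_effect": step1.params[x], "direct_effect": step3.params[x],
            "a": a, "b": b, "sobel_z": sobel_z}

# Simulated stand-in for the 197 responses; pc = perceived competency,
# pv = perceived value, swls = satisfaction with LearnSmart.
rng = np.random.default_rng(0)
pc = rng.normal(5.2, 1.4, 197)
pv = 0.7 * pc + rng.normal(0, 1, 197)
swls = 0.5 * pc + 0.4 * pv + rng.normal(0, 1, 197)
df = pd.DataFrame({"pc": pc, "pv": pv, "swls": swls})
print(baron_kenny_sobel(df, x="pc", m="pv", y="swls"))

A shrinking direct effect from the first to the third regression, together with a significant Sobel z, is the pattern read as partial mediation in the analysis above.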
By the same token, three multiple regressions were used to test the direct and indirect relationships between perceived challenge, perceived value, and satisfaction with LearnSmart. The first regression (Table 6) showed that perceived challenge is not significantly related to satisfaction with LearnSmart (β = .018, p = .799), but the second regression indicated a significant impact of perceived challenge on perceived value (β = .134, p = .059). Therefore, H2a is supported, whereas H2b is rejected. Although the third regression found that both perceived challenge (β = −.087, p = .057) and perceived value (β = .783, p = .000) are significantly related to satisfaction with LearnSmart, there is no mediation effect of perceived value (Z = 0.803, p = .250). As a result, H3b is not supported.

Table 5. Mediation test with perceived competency as IV.

              DV                    IV                     β      t        p      R²     Z
Regression 1  Satisfaction with LS  Perceived competency   .794   18.325   .000   .630
Regression 2  Perceived value       Perceived competency   .772   17.038   .000   .596
Regression 3  Satisfaction with LS  Perceived competency   .491    7.882   .000   .693
                                    Perceived value        .392    6.300   .000
Sobel test                                                                 .000           5.891
Table 6. Mediation test with perceived challenge as IV.

              DV                    IV                     β       t        p      R²     Z
Regression 1  Satisfaction with LS  Perceived challenge    .018    0.256    .799   .000
Regression 2  Perceived value       Perceived challenge    .134    1.901    .059   .018
Regression 3  Satisfaction with LS  Perceived challenge   −.087   −1.912    .057   .602
                                    Perceived value        .783   17.230    .000
Sobel test                                                                  .057          1.900

Table 7 shows the results of hierarchical regressions with the instructor entered as an independent variable in Model 1 and as a moderator in Model 2. The results show a significant moderating effect of the instructor on the relation between perceived competency and perceived value (β = .383, p = .066) and on the relation between perceived challenge and perceived value (β = −.455, p = .015). Three instructors taught the four courses: one had 5 years of experience with LearnSmart, and the other two were using LearnSmart for the first time. A post hoc Scheffé test shows that students in the class taught by the instructor with several years of LearnSmart experience perceived more value than those in the classes taught by the instructors who had just started to use LearnSmart. Therefore, H4a and H4b are supported.

Table 7. Hierarchical regression with instructor as moderator.

Variable                 Model 1     Model 2
PC                       .790***     .578***
PCh                      .002        .315**
Ins                     −.124**     −.106
Interaction PC × Ins                 .383†
Interaction PCh × Ins               −.455**
R²                       .611        .626

Note. Ins = instructor; PC = perceived competency; PCh = perceived challenge; dependent variable (DV) = perceived value.
† p < .10 (two tailed). ** p < .01 (two tailed). *** p < .001 (two tailed).
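The hierarchical moderation test summarized in Table 7 follows the same regression logic, with interaction terms added in a second step. Below is an illustrative Python fragment of that two-model comparison; the variable names, the binary instructor coding, and the simulated data are our assumptions for demonstration only, not the study data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: pc = perceived competency, pch = perceived challenge,
# instructor = 1 for the instructor experienced with LearnSmart, 0 otherwise.
rng = np.random.default_rng(1)
n = 197
df = pd.DataFrame({
    "pc": rng.normal(5.2, 1.4, n),
    "pch": rng.normal(4.7, 1.7, n),
    "instructor": rng.integers(0, 2, n),
})
df["pv"] = 0.7 * df["pc"] + 0.1 * df["pch"] + 0.3 * df["instructor"] * df["pc"] + rng.normal(0, 1, n)

# Model 1: main effects only. Model 2: adds the instructor interaction terms.
model1 = smf.ols("pv ~ pc + pch + instructor", df).fit()
model2 = smf.ols("pv ~ pc + pch + instructor + pc:instructor + pch:instructor", df).fit()

print(round(model1.rsquared, 3), round(model2.rsquared, 3))  # R-squared change across steps
print(model2.params.filter(like=":"))  # the two interaction coefficients

A significant interaction coefficient in Model 2, together with the increase in R², is what is interpreted above as the instructor moderating the competency-to-value and challenge-to-value paths.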
Conclusion
In this study we examined the influence of an online interactive learning tool, LearnSmart, on students' evaluation of the tool, the course, and their learning effectiveness. Perceived competence and perceived challenge were also investigated to test their potential impacts on student learning effectiveness when using LearnSmart. The findings show that the use of LearnSmart improves students' perceived competency, thus increasing their perceived value of using LearnSmart as well as their satisfaction with it. Perceived value was also found to mediate the impact of perceived competency on satisfaction with LearnSmart. At the same time, the instructor was found to play a significant role in facilitating student learning and improving student learning effectiveness.
Perceived challenge was found to impact students' perceived value of using LearnSmart, but it did not influence satisfaction with LearnSmart. Experienced instructors, however, could help students improve their perceived value of using LearnSmart by adapting their teaching to student learning styles. The results of this study help evaluate the effectiveness of using LearnSmart to enhance student learning effectiveness and support recommendations on its future use. The results also add new knowledge to the marketing education literature regarding the use of a new information technology in course design.
Recommendations for additional research
This study has several limitations. First, objective measures of student performance, such as student quiz grades and final grades, are not included in the study. Although a previous study showed an insignificant impact of using LearnSmart on students' test performance, there is a need to consider both objective and subjective measures in the same study. Second, cross-sectional data are used in this study; longitudinal analysis is needed in future studies to provide richer insights. In addition, other factors such as students' experience with technology and their attitudes toward technology could affect their use of online learning tools such as LearnSmart, and future researchers could explore the impact of these factors on student learning performance. More robust mediation analysis using structural equation modeling could be conducted to further test the hypotheses of this study.
Of the three types of interaction, learner-learner interaction is limited within LearnSmart, which makes this type of interaction challenging to implement in an online course. The ability of business instructors to combine LearnSmart with other tools available in the online environment, such as Blackboard, is therefore critical. It would also be valuable to examine whether integrating students' learning needs and preferences for different types of interactivity increases satisfaction and correlates with teaching effectiveness. To generalize the findings of this first study, a new survey will be undertaken to increase the sample size with a second set of data.
References
Akin, G. (1981). Viewpoint: Marketing educators should model competencies, assign group projects. Marketing News, 15(2), 2.
Association to Advance Collegiate Schools of Business (AACSB). (2003). Eligibility procedures and standards for business accreditation. St. Louis, MO: Author.
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51, 1173–1182.
Berger, K. A., & Topol, M. T. (2001). Technology to enhance learning: Use of a web site platform in traditional classes and distance learning. Marketing Education Review, 11(3), 15–26.
Bertheussen, B. A., & Myrland, Ø. (2016). Relation between academic performance and students' engagement in digital learning activities. Journal of Education for Business, 91, 125–131. doi:10.1080/08832323.2016.1140113
Bicen, P., & Laverie, D. A. (2009). Group-based assessment as a dynamic approach to marketing education. Journal of Marketing Education, 31, 96–108.
Churchill, G. A., Jr. (1979). A paradigm for developing better measures of marketing constructs. Journal of Marketing Research, 16, 64–73.
Clayson, D. E. (2009). Student evaluations of teaching: Are they related to what students learn? A meta-analysis and review of the literature. Journal of Marketing Education, 31, 16–30.
Comer, D. R., Lenaghan, J. A., & Sengupta, K. (2015). Factors that affect students' capacity to fulfill the role of online learner. Journal of Education for Business, 90, 145–155. doi:10.1080/08832323.2015.1007906
Davidson-Shivers, G. V. (2009). Frequency and types of instructor interactions in online instruction. Journal of Interactive Online Learning, 8(1), 23–40.
Dlacic, J., Arslanagic, M., Kadic-Maglajlic, S., Mrkovic, S., & Raspor, S. (2014). Exploring perceived service quality, perceived value, and repurchase intention in higher education using structural equation modelling. Total Quality Management & Business Excellence, 25, 141–157.
Espasa, A., & Meneses, J. (2010). Analyzing feedback processes in an online teaching and learning environment: An exploratory study. Higher Education, 59, 277–292.
Ferguson, J. L., Makarem, S. C., & Jones, R. E. (2016). Using a class blog for student experiential learning reflection in business courses. Journal of Education for Business, 91, 1–10. doi:10.1080/08832323.2015.1108279
Fleming, N. D. (2001). Teaching and learning styles: VARK strategies. Christchurch, New Zealand: Author.
Fornell, C., & Larcker, D. F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18, 39–50.
Fulford, C. P., & Zhang, S. (1993). Perceptions of interaction: The critical predictor in distance education. American Journal of Distance Education, 7(3), 8–21.
Ganesh, G. K., Paswan, A. K., & Sun, Q. (2015). Are face-to-face classes more effective than online classes? An empirical examination. Marketing Education Review, 25, 67–81.
Ganesh, G. K., & Sun, Q. (2009). Using simulations in the undergraduate marketing capstone case course. Marketing Education Review, 19, 7–16.
Griff, E. R., & Matter, S. F. (2013). Evaluation of an adaptive online learning system. British Journal of Educational Technology, 44, 170–176.
Hatfield, L., & Taylor, R. K. (1998). Making business schools responsive to customers: Lessons learned and actions. Marketing Education Review, 8(2), 1–8.
Jackson, M. J., Helms, M. M., Jackson, W. T., & Gum, J. R. (2011). Student expectations of technology-enhanced pedagogy: A ten-year comparison. Journal of Education for Business, 86, 294–301. doi:10.1080/08832323.2010.518648
Kember, D., Charlesworth, M., Davies, H., McKay, J., & Stott, V. (1997). Evaluating the effectiveness of educational innovations: Using the study process questionnaire to show that meaningful learning occurs. Studies in Educational Evaluation, 23, 141–157.
Kinshukan, T. L. (2003). User exploration based adaptation in adaptive learning systems. International Journal of Information Systems in Education, 1, 22–31.
Lancellotti, M., Thomas, S., & Kohli, C. (2016). Online video modules for improvement in student learning. Journal of Education for Business, 91, 19–22. doi:10.1080/08832323.2015.1108281
Ledden, L., Kalafatis, S. P., & Samouel, P. (2007). The relationship between personal values and perceived value of education. Journal of Business Research, 60, 965–974.
Leidner, D. E., & Jarvenpaa, S. (1995). The use of information technology to enhance management school education: A theoretical view. MIS Quarterly, 19, 265–291.
Liu, S., Gomez, J., Khan, B., & Yen, C.-J. (2007). Toward a learner-oriented community college online course dropout framework. International Journal on E-Learning, 6(December), 519–542.
Luethge, D. J., Raska, D., Greer, B. M., & O'Connor, C. (2016). Crossing the Atlantic: Integrating cross-cultural experiences into undergraduate business courses using virtual communities technology. Journal of Education for Business, 91, 219–226. doi:10.1080/08832323.2016.1160022
McDonald, R. P., & Ho, M. (2002). Principles and practice in reporting structural equation analyses. Psychological Methods, 7, 64–82.
McGorry, S. Y. (2006). Data in the palm of your hand. Marketing Education Review, 16, 83–90.
McGraw-Hill Education. (2014). Walters State and McGraw-Hill Education team to provide all-digital learning experience to undergraduate biology students. Retrieved from http://www.mheducation.com/news-media/press-releases/walters-state-and-mcgraw-hill-education-team-provide-all-digital-learning-experience.html
McGraw-Hill Higher Education. (2012). McGraw-Hill awards college scholarships to winners of 'LearnSmart and win' video contest [Press release]. Retrieved from http://www.prnewswire.com/news-releases/mcgraw-hill-awards-college-scholarships-to-winners-of-learnsmart-and-win-video-contest-136664358.html
Means, B., Peters, V., & Zheng, Y. (2014). Lessons from five years of funding digital courseware by the Gates Foundation and SRI Research. SRI. Retrieved from https://www.sri.com/sites/default/files/publications/psexecsummary_1.pdf
Moore, M. G. (1989). Three types of interaction [Editorial]. American Journal of Distance Education, 3(2), 1–7.
Norman, T. (2011). McGraw-Hill LearnSmart effectiveness study. New York, NY: McGraw-Hill.
Oxman, S., & Wong, W. (2014). White paper: Adaptive learning systems. DeVry Education Group. Retrieved from https://pdfs.semanticscholar.org/4e57/2108fa1591d21d980a6efe78673edc48c652.pdf
Parayitam, S., Desai, K., & Phelps, L. (2007). The effect of teacher communication and course content on student satisfaction and effectiveness. Academy of Educational Leadership Journal, 11, 91–105.
Pintrich, P. R., & De Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82, 33–40.
Richardson, J. T. E. (1994). Cultural specificity of approaches to studying in higher education: A literature survey. Higher Education, 27, 449–468.
Rovai, A. P., & Barnum, K. T. (2003). On-line course effectiveness: An analysis of student interactions and perceptions of learning. Journal of Distance Education, 18, 57–73.
Sandman, T. E. (2014). A preliminary investigation into the adaptive learning styles of business students. Decision Sciences Journal of Innovative Education, 12, 33–54.
Sheppard, M., & Vibert, C. (2016). Cases for the net generation: An empirical examination of students' attitude toward multimedia case studies. Journal of Education for Business, 91, 101–107. doi:10.1080/08832323.2015.1128382
Smith, S., & Fisher, D. (2006). You can observe a lot by just watching: Using videography in a retail setting to teach observational research methods. Marketing Education Review, 16, 75–78.
Sobel, M. E. (1982). Asymptotic intervals for indirect effects in structural equations models. In S. Leinhart (Ed.), Sociological methodology 1982 (pp. 290–312). San Francisco, CA: Jossey-Bass.
Steiner, S. D., & Hyman, M. R. (2010). Improving the student experience: Allowing students enrolled in a required course to select online or face-to-face instruction. Marketing Education Review, 20, 29–33.
Swan, K. (2003). Learning effectiveness online: What the research tells us. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education, practice and direction (pp. 13–45). Needham, MA: Sloan Center for Online Education.
Sweeney, A. D. P., Morrison, M. D., Jarratt, D., & Heffernan, T. (2009). Modeling the constructs contributing to the effectiveness of marketing lecturers. Journal of Marketing Education, 31, 190–202.
VanLehn, K. (2006). The behavior of tutoring systems. International Journal of Artificial Intelligence in Education, 16, 227–265.
Winch, J. K., & Cahn, E. S. (2015). Improving student performance in a management science course with supplemental tutorial videos. Journal of Education for Business, 90, 402–409. doi:10.1080/08832323.2015.1081865
Wood, N. T., Solomon, M. R., & Allan, D. (2008). Welcome to the matrix: E-learning gets a second life. Marketing Education Review, 18, 47–53.
Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15–27.