Computers & Education 54 (2010) 1222–1232
Contents lists available at ScienceDirect
Computers & Education
journal homepage: www.elsevier.com/locate/compedu
Engaging online learners: The impact of Web-based learning technology on college student engagement
Pu-Shih Daniel Chen a,*, Amber D. Lambert b, Kevin R. Guidry b
a Department of Counseling and Higher Education, University of North Texas, 1155 Union Circle #310829, Denton, TX 76203-5017, USA
b Center for Postsecondary Research, Indiana University Bloomington, USA
Article info
Article history:
Received 31 July 2009
Received in revised form 30 October 2009
Accepted 16 November 2009
Keywords:
Online learning
Engagement
College
University
NSSE
Web-based
Deep learning
0360-1315/$ - see front matter © 2009 Elsevier Ltd. All rights reserved.
doi:10.1016/j.compedu.2009.11.008
* Corresponding author. Tel.: +1 940 369 8062; fax: +1 940 369 7177.
Abstract
Widespread use of the Web and other Internet technologies in postsecondary education has exploded in the last 15 years. Using a set of items developed by the National Survey of Student Engagement (NSSE), the researchers utilized hierarchical linear modeling (HLM) and multiple regression to investigate the impact of Web-based learning technology on student engagement and self-reported learning outcomes in face-to-face and online learning environments. The results show a general positive relationship between the use of learning technology and student engagement and learning outcomes. We also discuss the possible impact on minority and part-time students, as they are more likely to enroll in online courses.
© 2009 Elsevier Ltd. All rights reserved.
1. Introduction
The Internet and other digital technologies have become thoroughly integrated in the lives of today's college students. A recent study by EDUCAUSE (Hawkins & Rudy, 2008) found that the vast majority of US students at baccalaureate degree-granting institutions own and use their own computers. Online learning management systems (LMS) such as Blackboard, D2L, or Sakai are nearly ubiquitous at American colleges and universities, and wireless Internet access permeates most college classrooms (Green, 2007; Hawkins & Rudy, 2008). Outside the classroom, Internet connections are available in virtually all on-campus residence halls (Hawkins & Rudy, 2008), and an estimated 79–95% of all American college students use Facebook and MySpace (Ellison, 2007).
Most first-year college students now arrive on campus with their own personal computer, digital music player, cell phone, and other digital devices (Salaway & Caruso, 2008). As technology becomes a part of modern life and fuel prices remain high, more and more college students opt to take online or hybrid courses using readily available computers and information technologies (Allen & Seaman, 2008). Moreover, many students expect instructors to integrate Internet technologies, such as online learning management systems and collaborative Internet technologies, into traditional face-to-face classes to enhance the learning experience, believing those tools make the educational experience more convenient and educationally effective (Salaway & Caruso, 2008).
Since the early 2000s, Web-based applications have become the de facto standard platform for distance education courses and learning management systems (Parsad & Lewis, 2008). The widespread adoption of digital technologies and online courses has caused many researchers (Bråten & Strømsø, 2006; Kuh & Hu, 2001; Robinson & Hullinger, 2008; Zhou & Zhang, 2008) to question the impact of the Internet and Web-based learning technology on students' educational engagement and learning outcomes. The concept of student engagement is not new to educators. Years of research have shown that what students do during college counts more in terms of learning outcomes than who they are or even where they go to college (Astin, 1993; Kuh, 2004; Pace, 1980; Pascarella & Terenzini, 2005). In the Seven principles for good practice in undergraduate education, Chickering and Gamson (1987) argued that a good college education should promote student-faculty interaction, cooperation among students, active learning, prompt feedback, time on task, high expectations, and respect for diverse talents and ways of learning. In a follow-up article published in 1996, Chickering and Ehrmann (1996) stated that new
communication and information technology alone will not lead to student success. Instead, educators must utilize technology as a lever to promote student engagement in order to maximize the power of computers and information technology as a catalyst for student success in college (Ehrmann, 2004).
Most studies on the topic of technology and student engagement have affirmed the utility of computers and information technology in promoting student engagement (Hu & Kuh, 2001; Nelson Laird & Kuh, 2005; Robinson & Hullinger, 2008). For example, Robinson and Hullinger found that asynchronous instructional technology allows learners more time to think critically and reflectively, which in turn stimulates higher order thinking such as analysis, synthesis, judgment, and application of knowledge. Duderstadt, Atkins, and Houweling (2002) stated, "When implemented through active, inquiry based learning pedagogies, online learning can stimulate students to use higher order skills such as problem solving, collaboration, and stimulation" (p. 75). Furthermore, students taking online courses are expected to work collaboratively, which is an important component of student engagement, and collaborative components have been integrated into most Web-based course designs (Thurmond & Wambach, 2004).
In contrast to the work on student engagement, research focused on the connection between technology and learning outcomes has been mixed. George Kuh and his associates have published several articles related to this issue using National Survey of Student Engagement (NSSE) data. In Kuh and Hu (2001), the authors suggested a positive relationship between a student's use of computers and other information technologies and self-reported gains in science and technology, vocational preparation, and intellectual development. Hu and Kuh (2001) also found that students attending more "wired" institutions reported more frequent use of computing and information technology and higher levels of engagement in good educational practices than their counterparts at less wired institutions. A similar study conducted by Kuh and Vesper (2001) concluded that increased familiarity with computers was positively related to developing other important skills and competencies, including social skills.
Studies conducted by other researchers, however, have produced mixed outcomes that have often not been as positive as those reported by George Kuh and his associates. A meta-analysis commissioned by the US Department of Education examined empirical evidence of the impact of online and hybrid courses on learning outcomes. The authors found that both online and hybrid courses have a significant positive impact on learning outcomes, with hybrid courses having a greater impact. However, the authors caution that the "positive effects associated with blended learning should not be attributed to the media, per se" (Means, Toyama, Murphy, Bakia, & Jones, 2009, p. ix). This reflects long-standing findings that, contrary to many naïve beliefs, media do not have a significant impact on learning outcomes (Clark, 2009). Other meta-analyses of distance education impacts on learning outcomes have supported these mixed findings (Bernard et al., 2004; Sitzmann, Kraiger, Stewart, & Wisher, 2006).
While it is unclear whether students learn more in online courses, it does seem clear that there is an increase in students' information literacy. For example, Robinson and Hullinger (2008) found a correlation between taking online courses and the improvement of students' computer skills. Though most online courses do not require students to have high-level computer skills in order to complete the courses, they nevertheless require students to become familiar with essential information technology skills such as using e-mail, participating in online chat, posting to a Web-based discussion board, and using word processing, presentation, and spreadsheet software.
Even though there are many educational benefits associated with using computer technologies, there are also downsides. Critics have argued that online learning and the use of information technology may put certain student populations at a disadvantage. Echoing Jenkins' "participation gap" idea (Jenkins, 2006), some researchers have suggested that characteristics such as socioeconomic status (Gladieux & Swail, 1999) and institutional resources (Hu & Kuh, 2001) play a significant role in students' use of, and the impact of, computers and the Internet. In addition, some researchers have asserted that the lack of face-to-face interaction in online learning may reduce instructional effectiveness for students with certain learning styles (Bullen, 1998; Terrell & Dringus, 2000; Ward & Newlands, 1998). Sanders (2006) argued that no communication technology can replace the physical presence and the serendipitous moments of learning, such as the spontaneous discussion or the overheard remark during a class break, that so often occur in a face-to-face environment.
1.1. Purpose of study and research questions
Although studies have found positive connections between the use of computers and information technology and student engagement and learning outcomes, most of them studied the general use of information technology rather than the specific use of instructional and learning management systems. This study investigates the nature of student engagement in the online learning environment to find out whether student and institutional characteristics affect the use of learning technologies and their impact on student engagement. Specifically, the following research questions were addressed:
1. How often do college students in different types of courses use the Web and Internet technologies for course-related tasks?
2. Do individual and institutional characteristics affect the likelihood of taking online courses?
3. Does the relative amount of technology employed in a course have a relationship with student engagement, learning approaches, and student self-reported learning outcomes?
2. Methods
2.1. Instrument and data source
The data for this study come from the 2008 administration of the National Survey of Student Engagement (NSSE). NSSE is an annual survey created and administered by the Indiana University Center for Postsecondary Research. Since the inception of the NSSE in 2000, more than a million first-year students and seniors at more than 1300 baccalaureate degree-granting colleges and universities in the United States and Canada have reported the time and energy that they devote to the educationally purposeful activities measured by this annual survey (Indiana University Center for Postsecondary Research, 2008b). Participating institutions use their student engagement results to identify areas where teaching and learning can be improved. NSSE results have been found to positively correlate with desired learning outcomes, such as critical thinking ability and grades (Carini, Kuh, & Klein, 2006; Kuh, 2004; Ouimet, Bunnage, Carini, Kuh, & Kennedy, 2004; Pike, 2006).
The conceptual framework and psychometric properties of the NSSE and the development of NSSE scales have been amply documented (Kuh, 2004; Nelson Laird, Shoup, & Kuh, 2005).
In 2007, researchers at NSSE developed a set of questions to investigate the nature of student engagement in the online learning environment. The original set included 22 questions. After pilot testing and expert review, the items were revised and the number was reduced to 13 (see Appendix for the list of items). The final set of 13 items asks respondents to identify the number of classes in which they were enrolled in the last academic year and how many of those courses were conducted entirely online or face-to-face with a significant online component. Survey respondents also reported on specific behaviors related to their collegiate experiences, including in- and out-of-class behaviors, time usage, and learning approaches that are known to contribute to desirable learning outcomes.
2.2. Sample
The NSSE online learning questions were attached to the end of the NSSE online survey and sent to participating students at 45 US baccalaureate degree-granting institutions. The 45 institutions were randomly selected from the pool of 763 institutions that participated in the 2008 NSSE administration. The institutions include 14 (31%) public and 31 (69%) private institutions; 8 (19%) of them were classified by the Carnegie Foundation for the Advancement of Teaching (2009) as doctoral institutions, 16 (38%) were master's institutions, and 18 (43%) were baccalaureate institutions. Detailed institutional characteristics of the 45 participating institutions and their comparison with all 2008 NSSE participating institutions can be found in Table 1.
The survey was sent to 77,714 first-year and senior college students, and approximately 23,706 students responded to this set of questions, yielding a response rate of 30.5%. However, about 4500 students who were purposely sampled by the institutions were excluded from analysis, leaving only students who were randomly sampled. Additionally, one institution that offers online courses only was removed from the dataset because no comparison among different course delivery methods could be made at this online institution. Removing this online institution did not greatly affect the general characteristics of the sample. Finally, 1825 students, who accounted for 7.7% of the total respondents, were excluded because their responses indicated that they may not have understood these questions in the manner intended by the researchers (when summed, their responses indicated that over 100% of their classes were online or hybrid classes); this indicates a likely data reliability issue with these new questions that will be addressed when discussing this study's limitations.
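The exclusion rule described above amounts to a simple consistency check: a respondent is dropped whenever the reported online and hybrid course counts sum to more than the total number of courses reported. The sketch below illustrates that check in Python with pandas; the data frame and column names (total_courses, online_courses, hybrid_courses) are hypothetical stand-ins for the NSSE variables, not the survey's actual variable names.

```python
import pandas as pd

# Hypothetical respondent records; column names are illustrative, not NSSE's.
df = pd.DataFrame({
    "student_id":     [1, 2, 3, 4],
    "total_courses":  [8, 10, 6, 9],
    "online_courses": [2, 10, 0, 3],
    "hybrid_courses": [3,  4, 2, 3],
})

# Keep only respondents whose online + hybrid counts do not exceed the total
# number of courses they reported (the consistency check described above).
consistent = (df["online_courses"] + df["hybrid_courses"]) <= df["total_courses"]
analytic_sample = df[consistent].copy()

print(f"Excluded {len(df) - len(analytic_sample)} inconsistent respondents")
```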
The final data set for this study has 17,819 respondents, of which 8065 (45%) were first-year students and the remaining 9754 (55%) were seniors. Nearly 7000 respondents (35%) were male and 13,000 (65%) female. The majority (97% of first-year students and 87% of senior students) of the surveyed students were enrolled full-time at their institution. Detailed student characteristics including gender, enrollment status, and race and ethnicity can be found in Table 2.
Table 1
Institutional characteristics.

| Characteristic | Category | Study institutions (n = 45): Count | % | All NSSE 2008 institutions (n = 763)^a: Count | % | All US institutions^b: % |
| --- | --- | --- | --- | --- | --- | --- |
| Control | Public | 14 | 31 | 320 | 42 | 35 |
| | Private | 31 | 69 | 443 | 58 | 65 |
| Carnegie classification | Doctoral | 8 | 19 | 103 | 16 | 18 |
| | Master's | 16 | 38 | 303 | 47 | 41 |
| | Baccalaureate | 18 | 43 | 244 | 38 | 41 |
| Urbanicity | City | 27 | 60 | 333 | 47 | 46 |
| | Suburban | 6 | 13 | 154 | 22 | 22 |
| | Town | 7 | 16 | 173 | 24 | 21 |
| | Rural | 5 | 11 | 53 | 7 | 9 |

a Not all NSSE participating institutions are classified by the Carnegie Foundation for the Advancement of Teaching.
b US percentages are based on data from the 2007 IPEDS institutional characteristics file as reported in Indiana University Center for Postsecondary Research (2008a).
Table 2
Respondent demographics.

| Characteristic | Category | First-year: Count | % | Senior: Count | % |
| --- | --- | --- | --- | --- | --- |
| Gender | Male | 2771 | 34 | 3351 | 35 |
| | Female | 5274 | 66 | 6375 | 65 |
| Enrollment status | Part-time | 259 | 3 | 1175 | 13 |
| | Full-time | 7789 | 97 | 8562 | 87 |
| Race or ethnicity | African American or Black | 676 | 8 | 881 | 9 |
| | American Indian or other Native American | 40 | 1 | 60 | 1 |
| | Asian, Asian American, or Pacific Islander | 483 | 6 | 437 | 5 |
| | White (non-Hispanic) | 5753 | 71 | 7132 | 73 |
| | Hispanic, Mexican or Mexican American, Puerto Rican | 279 | 4 | 273 | 3 |
| | Other | 124 | 2 | 111 | 1 |
| | Multiracial | 208 | 3 | 194 | 2 |
| | No response | 502 | 6 | 666 | 7 |
2.3. Variables and data analysis
For the purposes of this study, a Web or online course is defined as a course that is conducted entirely through the Internet without any face-to-face contact between instructor(s) and students. In contrast, a face-to-face course is a course conducted entirely in a physical classroom without using any Internet technology for course management or instructional purposes. Although there are many definitions of hybrid learning, or so-called blended learning (Bersin, 2004; Driscoll, 2002; Reay, 2001; Rossett, 2001; Sands, 2002; Ward & LaBranche, 2003), Graham (2006) indicated that blended learning can be sorted into three categories: enabling blends, enhancing blends, and transforming blends. Enabling blends focus primarily on improving student access and convenience. Enhancing blends allow for incremental changes to the pedagogy, while transforming blends carry out a radical transformation of the pedagogy. Learning management systems and technology-equipped classrooms are two examples of enhancing blends. For the purpose of this study, the researchers adopted enhancing blends as the definition of hybrid courses. Therefore, a hybrid course is defined as one that blends both Web and face-to-face components in the same course. A hybrid course must include both face-to-face contact between instructor(s) and students and the use of the Internet or Web technology for course management or instructional purposes. If the only utilization of the Internet or Web technology in a face-to-face course is for non-instructional or routine communication, the course is considered a face-to-face course rather than a hybrid course.
To answer the first research question, descriptive statistics including means and standard deviations were reported for all of the survey items. The Kruskal–Wallis test (Siegel & Castellan, 1988), a nonparametric equivalent of the analysis of variance (ANOVA), was conducted to examine whether statistically significant differences exist in students' technology use among different course delivery methods. Hierarchical linear modeling (HLM) was utilized to answer the second research question (Raudenbush & Bryk, 2002). The assumption underlying the HLM analysis is that institutions have a differential impact on students' course taking behaviors and technology usage. The benefit of using HLM is that it allowed the researchers to partition the variance attributable to the individual and the variance attributable to the institution. The dependent variable for the HLM analysis is the ratio of classes taken online. The independent variables include individual (level 1) variables such as the student's gender, enrollment status (part-/full-time), ethnicity, major, and parental education. The institutional level variables (level 2 variables) are the dummy-coded 2005 Carnegie basic classification, control (public/private), and urbanicity or locale.
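As an illustration of the Kruskal–Wallis procedure described above, the following sketch compares one technology-use item across the five delivery-method groups using scipy; the group sizes and item scores are synthetic stand-ins for the NSSE responses, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic responses (1 = never ... 4 = very often) to one technology item for
# the five delivery-method groups; sizes roughly mirror Table 3 but are illustrative.
group_sizes = {"Web-only": 371, "Hybrid-only": 1417, "Some Web": 8487,
               "Face-to-face and hybrid": 3838, "Face-to-face-only": 3706}
groups = [rng.integers(1, 5, size=n).astype(float) for n in group_sizes.values()]

# Kruskal-Wallis H test: a rank-based, nonparametric analogue of one-way ANOVA
# for comparing the item's distribution across the delivery-method groups.
h_stat, p_value = stats.kruskal(*groups)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```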
The third research question, which addresses the impact of learning technologies on student engagement and outcomes, was answered using Ordinary Least Squares (OLS) multiple regression analysis. A regression analysis is a statistical technique that allows the researcher to investigate the relationship between one dependent variable and several independent variables (Tabachnick & Fidell, 2007). The dependent variables for this analysis include four of the five NSSE Benchmarks of Effective Educational Practice (Kuh, 2004; LaNasa, Cabrera, & Trangsrud, 2009; Pascarella & Seifert, 2008) – level of academic challenge (LAC), active and collaborative learning (ACL), student-faculty interaction (SFI), and supportive campus environment (SCE); the three student self-reported Gain Scales (Chen, Ted, & Davis, 2007; Pike, 2006) – gains in general education, gains in personal and social development, and gains in practical competence; and the three deep learning scales (Nelson Laird et al., 2005) – higher order thinking, reflective learning, and integrative learning. One of the NSSE Benchmarks – enriching educational experiences (EEE) – is excluded from the analysis because technology use is part of that benchmark. The independent variables include the percentage of classes taken online, the percentage of classes that were hybrid classes, a composite score of course-related technology use, and controls for student and institutional characteristics.
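The results of this analysis are later reported (Tables 9 and 10) as a partition of explained variance across three blocks of predictors: controls, course delivery, and technology use. One common way to produce such a partition is to enter the blocks sequentially and record the increase in R² at each step. The sketch below shows that blockwise approach with statsmodels OLS; it is not the authors' code, and the simulated data and variable names (e.g., sfi for the student-faculty interaction benchmark, pct_online, pct_hybrid, tech_use) are illustrative assumptions.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 5000

# Illustrative data; variable names are hypothetical stand-ins for NSSE measures.
df = pd.DataFrame({
    "female":     rng.integers(0, 2, n),
    "part_time":  rng.integers(0, 2, n),
    "private":    rng.integers(0, 2, n),
    "pct_online": rng.uniform(0, 1, n),   # share of courses taken fully online
    "pct_hybrid": rng.uniform(0, 1, n),   # share of hybrid courses
    "tech_use":   rng.normal(0, 1, n),    # composite course-related technology use
})
df["sfi"] = 0.5 * df["tech_use"] + 0.1 * df["pct_online"] + rng.normal(0, 1, n)

# Enter the predictors in three blocks, mirroring the variance partition in the
# results: controls, then course delivery, then use of learning technology.
# The increase in R^2 at each step is the variance attributed to that block.
blocks = [
    ("controls",   "female + part_time + private"),
    ("delivery",   "pct_online + pct_hybrid"),
    ("technology", "tech_use"),
]
r2_prev, rhs = 0.0, []
for label, block in blocks:
    rhs.append(block)
    model = smf.ols("sfi ~ " + " + ".join(rhs), data=df).fit()
    print(f"{label:<10s} delta R^2 = {model.rsquared - r2_prev:.3f}")
    r2_prev = model.rsquared
print(f"total R^2 = {r2_prev:.3f}")
```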
3. Results
3.1. Descriptive statistics
The first three questions of the survey asked students how many courses they took in the current academic year, how many of those courses used the Web or Internet as the primary method to deliver course content, and how many of those courses were hybrid courses. Using those responses, we were able to classify course delivery methods into three categories: Web or Internet-only, face-to-face, and hybrid. As a result of this classification, students can take courses in seven different patterns: Web-only, face-to-face-only, hybrid-only, some Web and hybrid, Web and face-to-face, some face-to-face and hybrid, and all three delivery methods. As shown in Table 3, very few (2.1%) of the 17,819 students who adequately completed the survey took all their courses in Web-only mode. A larger percentage of students took some Web courses and some hybrid courses (5.2%), while a similar percentage enrolled in both Web and face-to-face courses (7.6%). The majority (84.8%) took classes with at least some face-to-face component. Although some of those students were also enrolled in Web (7.6%), hybrid (21.5%), or both Web and hybrid (34.9%) courses, one-fifth (20.8%) of the respondents were enrolled only in face-to-face classes with no significant Web or Internet component. These seven groups were collapsed into five groups for later analyses: Web-only, hybrid-only, some Web, face-to-face and hybrid, and face-to-face-only.
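The seven enrollment patterns follow mechanically from the three course counts each respondent reported. A minimal sketch of that classification is shown below, assuming hypothetical column names (total_courses, online_courses, hybrid_courses) rather than the actual NSSE variable names.

```python
import pandas as pd

# Hypothetical respondent course counts (total, fully online, hybrid); the
# remaining courses are treated as face-to-face.
df = pd.DataFrame({
    "total_courses":  [8, 8, 8, 8],
    "online_courses": [8, 0, 2, 3],
    "hybrid_courses": [0, 8, 3, 0],
})
df["f2f_courses"] = df["total_courses"] - df["online_courses"] - df["hybrid_courses"]

def delivery_pattern(row):
    """Assign one of the seven enrollment patterns described above."""
    kinds = []
    if row["online_courses"] > 0:
        kinds.append("Web")
    if row["hybrid_courses"] > 0:
        kinds.append("hybrid")
    if row["f2f_courses"] > 0:
        kinds.append("face-to-face")
    if len(kinds) == 3:
        return "All three delivery methods"
    if len(kinds) == 1:
        return f"{kinds[0]}-only"
    return " and ".join(kinds)

df["pattern"] = df.apply(delivery_pattern, axis=1)
print(df["pattern"].value_counts())
```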
As shown in Tables 4 and 5, students whom one would expect to use technology more often – those enrolled in Web and hybrid classes – indeed used online learning tools and technologies more frequently than students who only took face-to-face courses.
Table 3
Distribution of course options.

| Course delivery method | First-year students: Frequency | % | Senior students: Frequency | % | Combined: Frequency | % |
| --- | --- | --- | --- | --- | --- | --- |
| Web-only | 90 | 1.1 | 281 | 2.9 | 371 | 2.1 |
| Hybrid-only | 628 | 7.8 | 789 | 8.1 | 1417 | 8 |
| Face-to-face-only | 1718 | 21.3 | 1988 | 20.4 | 3706 | 20.8 |
| Web and hybrid | 362 | 4.5 | 561 | 5.8 | 923 | 5.2 |
| Web and face-to-face | 573 | 7.1 | 776 | 8 | 1349 | 7.6 |
| Face-to-face and hybrid | 1699 | 21.1 | 2139 | 21.9 | 3838 | 21.5 |
| All three delivery methods | 2995 | 37.1 | 3220 | 33 | 6215 | 34.9 |
| Total | 8065 | 100.00 | 9754 | 100.00 | 17,819 | 100.00 |
Table 4
First-year student engagement in online learning activities. Values are Mean (SD).

| Item | Web-only | Hybrid-only | Some Web | Hybrid and face-to-face | Face-to-face-only |
| --- | --- | --- | --- | --- | --- |
| How often: discussed or completed an assignment using a synchronous tool like instant messaging, online chat room, video conference, etc. | 1.91 (1.174) | 1.72 (.961) | 1.62 (.886) | 1.50 (.810) | 1.45 (.824) |
| How often: discussed or completed an assignment using an asynchronous tool like e-mail, discussion board, listserv, etc. | 3.12 (1.091) | 2.62 (.974) | 2.46 (.931) | 2.39 (.893) | 2.00 (.928) |
| How often: used your institution's Web-based library resources in completing class assignments | 2.40 (.997) | 2.60 (.910) | 2.45 (.900) | 2.44 (.861) | 2.29 (.919) |
| How often: used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom | 1.70 (.993) | 1.87 (.989) | 1.78 (.940) | 1.69 (.874) | 1.62 (.882) |
| How often: used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment | 3.07 (1.095) | 2.66 (1.044) | 2.66 (1.037) | 2.61 (1.001) | 2.33 (1.047) |
| How often: used e-mail to communicate with an instructor | 3.40 (.761) | 3.25 (.790) | 3.25 (.781) | 3.17 (.778) | 3.04 (.824) |
| To what extent does your institution emphasize using computers in academic work? | 3.56 (.781) | 3.42 (.744) | 3.33 (.780) | 3.30 (.753) | 3.15 (.821) |
Table 5
Senior student engagement in online learning activities. Values are Mean (SD).

| Item | Web-only | Hybrid-only | Some Web | Hybrid and face-to-face | Face-to-face-only |
| --- | --- | --- | --- | --- | --- |
| How often: discussed or completed an assignment using a synchronous tool like instant messaging, online chat room, video conference, etc. | 2.05 (1.160) | 1.62 (.921) | 1.64 (.889) | 1.51 (.812) | 1.34 (.734) |
| How often: discussed or completed an assignment using an asynchronous tool like e-mail, discussion board, listserv, etc. | 3.29 (1.032) | 2.82 (.986) | 2.69 (.942) | 2.58 (.915) | 2.07 (.979) |
| How often: used your institution's Web-based library resources in completing class assignments | 2.72 (1.042) | 2.81 (.964) | 2.75 (.933) | 2.77 (.939) | 2.52 (1.020) |
| How often: used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom | 1.77 (1.086) | 1.82 (.990) | 1.74 (.933) | 1.61 (.850) | 1.48 (.819) |
| How often: used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment | 3.25 (1.018) | 2.99 (1.009) | 2.91 (.991) | 2.81 (.979) | 2.47 (1.067) |
| How often: used e-mail to communicate with an instructor | 3.67 (.604) | 3.53 (.687) | 3.47 (.691) | 3.43 (.707) | 3.28 (.788) |
| To what extent does your institution emphasize using computers in academic work? | 3.72 (.594) | 3.64 (.613) | 3.49 (.716) | 3.48 (.711) | 3.37 (.799) |
More specifically, respondents who were enrolled in online courses more frequently used both synchronous and asynchronous communication tools for instructional or learning purposes. Compared with students in traditional face-to-face settings, online students also more frequently used electronic media to discuss or complete assignments, and these differences were consistent for both first-year and senior students. One interesting finding is that students who took hybrid courses utilized the institution's Web-based library resources in completing class assignments more frequently than students who had only online courses or only face-to-face courses. A probable explanation is that students who took hybrid courses are more familiar with doing research online than students who took only face-to-face courses. On the other hand, students who only took online courses may feel comfortable with Internet technologies but may not receive sufficient instruction on how to conduct research using Web-based library resources.
We attempted to perform an analysis of variance (ANOVA) on the mean scores for these seven questions for both first-year and senior students to determine which, if any, of the apparent differences are statistically significant. These tests were abandoned because the assumptions of ANOVA, particularly homoscedasticity, were met in only two of the 14 tests. A nonparametric test, the Kruskal–Wallis test, indicated that there are significant differences in the mean scores for each question among at least some of the groups of students. However, the very large number of respondents makes it difficult to attach much meaning to the significant results of those tests, given their sensitivity to the high number of respondents.
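For readers who wish to reproduce this kind of assumption check, the sketch below runs Levene's test for homogeneity of variance on synthetic item scores for the five delivery-method groups; a significant result, as was found for most of the 14 tests here, indicates that the homoscedasticity assumption behind ANOVA does not hold.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic scores for one item across the five delivery-method groups, with
# deliberately unequal spread to illustrate the assumption check.
groups = [rng.normal(loc=m, scale=s, size=500)
          for m, s in [(3.1, 1.1), (2.7, 1.0), (2.5, 0.9), (2.4, 0.9), (2.0, 0.8)]]

# Levene's test for equality of variances; a significant result means the
# homoscedasticity assumption of ANOVA is violated for these groups.
w_stat, p_value = stats.levene(*groups)
print(f"Levene W = {w_stat:.2f}, p = {p_value:.4f}")
```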
3.1.1. HLM one-way ANOVA model
To answer the second research question, a hierarchical linear model (HLM) was built to investigate the impact of individual and institutional variables on students' course taking behaviors. Before estimating the full, two-level HLM to examine the effects of individual and institutional variables on a student's likelihood of taking online courses, we used the one-way ANOVA model, or so-called "null model," to estimate the proportion of variance that exists between and within colleges. The proportion of variance between institutions ranges from 0.033 for first-year students to 0.157 for seniors (Table 6). This result indicates that institutional variables have more influence on seniors than on first-year students in their decision to take online courses. It also warrants further investigation into which individual and institutional variables may affect a student's decision to take online courses.
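The "proportion between institutions" reported in Table 6 is the intraclass correlation from the null model: the between-institution variance divided by the total variance. The sketch below estimates it with a random-intercept mixed model in statsmodels, used here as a stand-in for dedicated HLM software; the simulated data and the column names ratio_online and inst_id are illustrative assumptions, not the study's variables.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Illustrative data: students nested within institutions; 'ratio_online' is the
# proportion of a student's courses taken fully online (the level-1 outcome).
n_inst, n_per = 45, 200
inst_effect = rng.normal(0, 0.05, size=n_inst)
df = pd.DataFrame({
    "inst_id": np.repeat(np.arange(n_inst), n_per),
    "ratio_online": np.clip(
        np.repeat(inst_effect, n_per) + rng.normal(0.12, 0.24, n_inst * n_per),
        0, 1),
})

# Unconditional (null) model: intercept only, with a random intercept per institution.
null_model = smf.mixedlm("ratio_online ~ 1", data=df, groups=df["inst_id"]).fit()

# Intraclass correlation = between-institution variance / total variance,
# i.e. the "proportion between institutions" in Table 6.
between_var = null_model.cov_re.iloc[0, 0]
within_var = null_model.scale
icc = between_var / (between_var + within_var)
print(f"ICC (proportion of variance between institutions) = {icc:.3f}")
```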
3.1.2. HLM random coefficient regression and intercept- and slopes-as-outcomes models
The second step of the modeling procedure is the creation of the random coefficient regression model, also known as the level 1 model or the individual-level model. This procedure tests and establishes the individual-level independent variables before estimating the full, intercept- and slopes-as-outcomes model. Table 7 presents the descriptive statistics of the independent variables included in the analysis. The level 1 independent variables include the student's gender (0 = male, 1 = female), enrollment status (0 = full-time, 1 = part-time), ethnicity
Table 6
Variance components of dependent variable.

| Ratio of online courses taken by the student | First-year students | Seniors |
| --- | --- | --- |
| Total variance | .05929 | .08028 |
| Variance within institutions | .05731 | .06767 |
| Variance between institutions | .00198 | .01261 |
| Proportion between institutions | .033 | .157 |
Table 7
Descriptive statistics for independent variables included in models.

| Variable | First-year: Mean | SD | Min. | Max. | Seniors: Mean | SD | Min. | Max. | Description |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Individual characteristics | | | | | | | | | |
| First generation college student | .38 | .49 | 0 | 1 | .42 | .49 | 0 | 1 | First generation college student is defined as neither parent having a baccalaureate degree from a college. 1 = first generation college student, 0 = all other |
| Female | .64 | .48 | 0 | 1 | .65 | .48 | 0 | 1 | Gender: 1 = female, 0 = male |
| Part-time enrollment | .03 | .18 | 0 | 1 | .13 | .33 | 0 | 1 | Enrollment status: 1 = enrolled part-time, 0 = enrolled full-time |
| Ethnic minority | .28 | .45 | 0 | 1 | .26 | .44 | 0 | 1 | Ethnicity: 0 = White/Caucasian, 1 = all other |
| STEM | .18 | .39 | 0 | 1 | .17 | .37 | 0 | 1 | Major: 1 = Science, Technology, Engineering, and Mathematics, 0 = all other |
| Arts, Humanities, and Social Sciences (reference) | .26 | .44 | 0 | 1 | .28 | .45 | 0 | 1 | Major: 1 = Arts, Humanities, and Social Sciences, 0 = all other |
| Business | .17 | .37 | 0 | 1 | .18 | .39 | 0 | 1 | Major: 1 = Business, 0 = all other |
| Professional | .12 | .32 | 0 | 1 | .13 | .34 | 0 | 1 | Major: 1 = professional, 0 = all other |
| Other and undecided | .16 | .37 | 0 | 1 | .15 | .36 | 0 | 1 | Major: 1 = other majors and undecided, 0 = all other |
| Institutional characteristics | | | | | | | | | |
| Carnegie: doctoral institution | .18 | .39 | 0 | 1 | .18 | .39 | 0 | 1 | Carnegie classification: 1 = doctorate-granting universities, 0 = all other |
| Carnegie: master's institution | .36 | .48 | 0 | 1 | .36 | .48 | 0 | 1 | Carnegie classification: 1 = master's colleges and universities, 0 = all other |
| Carnegie: baccalaureate institution | .4 | .5 | 0 | 1 | .4 | .5 | 0 | 1 | Carnegie classification: 1 = baccalaureate colleges, 0 = all other |
| Carnegie: other | .07 | .25 | 0 | 1 | .07 | .25 | 0 | 1 | Carnegie classification: 1 = special focus institutions, tribal colleges, and non-classified institutions; 0 = all other |
| Private | .69 | .47 | 0 | 1 | .69 | .47 | 0 | 1 | Control: 1 = private, 0 = public |
| City | .6 | .5 | 0 | 1 | .6 | .5 | 0 | 1 | Urbanicity: 1 = city, 0 = all other |
| Suburban | .13 | .34 | 0 | 1 | .13 | .34 | 0 | 1 | Urbanicity: 1 = suburban, 0 = all other |
| Town | .16 | .37 | 0 | 1 | .16 | .37 | 0 | 1 | Urbanicity: 1 = town, 0 = all other |
| Rural | .11 | .32 | 0 | 1 | .11 | .32 | 0 | 1 | Urbanicity: 1 = rural, 0 = all other |
(0 = White/Caucasian, 1 = minority), first generation college student status (0 = at least one parent has a baccalaureate degree, 1 = neither parent has a baccalaureate degree), and a series of dummy-coded variables for major (with Arts, Humanities, and Social Sciences being the reference category). The outcomes of the random coefficient regression model will be reported jointly with the final model.
In the third and final step in the modeling process, we built the between-institution model by allowing the intercept to vary by institution. We then modeled the intercept with institutional characteristics. Included in the level 2 models are the 2005 basic Carnegie classifications (doctorate-granting universities, master's colleges and universities, baccalaureate colleges, and others), with doctorate-granting universities serving as the reference category. We also included institutional control (public or private) and locale or urbanicity (city, suburban, town, and rural, of which city serves as the reference category). To avoid multicollinearity, we did not include the size of the institution as a control because the size of the institution is highly correlated with the Carnegie classification within our sample (r = .71, p < .001).
Table 8 illustrates the summary effects of individual and institutional variables on students' decisions to take online courses. It is clear that the factors that affect online course taking for first-year students and seniors are quite different. For first-year students, enrollment at a private institution slightly increases the likelihood (p < .05) of enrollment in online courses, while enrollment at a baccalaureate college slightly reduces (p < .05) the chance of enrollment in online courses compared with counterparts enrolled at doctorate-granting institutions. Contrary to their effect on first-year students, institutional variables have no statistically significant effect on senior students' decision to take online courses.
Although individual variables affect both first-year and senior students' decisions to take online courses, they tend to affect seniors more than first-year students. For first-year students, racial and ethnic minorities (p < .001) and part-time students (p < .05) are more likely to enroll in online courses. The same effects can also be found for senior students (both at p < .001). Additionally, seniors who major in professional fields (e.g., education, nursing, occupational therapy, etc.) are also more likely to enroll in online courses (p < .001). The student's major has no effect on first-year students' likelihood of taking online courses except for students in business, who are slightly more likely than students in other majors to enroll in online courses (p < .05).
3.2. Multiple regression models
To answer the third research question, which addresses the impact of learning technologies on student engagement and outcomes, Ordinary Least Squares (OLS) multiple regression analysis was used. As can be seen in Tables 9 and 10, the total variance explained by the
Table 8
Coefficients from HLM for the ratio of courses taken online by the student.

| Variable | First-year students: Coefficient | p-value | Seniors: Coefficient | p-value |
| --- | --- | --- | --- | --- |
| Institution-level variables | | | | |
| Intercept | .118 | .001 | .141 | .001 |
| Carnegie: master's | −.01 | .435 | .004 | .816 |
| Carnegie: baccalaureate | −.038 | .016 | −.03 | .188 |
| Carnegie: other | −.039 | .282 | .27 | .628 |
| Private | .025 | .043 | .014 | .408 |
| Locale: suburban | .016 | .282 | −.001 | .992 |
| Locale: town | .003 | .859 | −.027 | .27 |
| Locale: rural | .039 | .075 | .001 | .995 |
| Individual-level variables | | | | |
| First generation college student | .013 | .056 | .013 | .096 |
| Female | −.01 | .113 | −.005 | .421 |
| Part-time | .093 | .016 | .086 | .001 |
| Minority | .035 | .001 | .047 | .001 |
| Major: STEM | −.02 | .056 | −.03 | .041 |
| Major: business | .02 | .032 | .004 | .778 |
| Major: professional | −.009 | .307 | −.046 | .001 |
| Major: other and undecided | .001 | .952 | .008 | .518 |
| Variance components | | | | |
| Variance between institutions | .0006 | | .00539 | |
| Variance between explained | 69.70% | | 57% | |
| Variance within institutions | .05407 | | .06368 | |
| Variance within explained | 5.65% | | 5.90% | |
Table 9
First-year students' partitioning of variance for the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models.

| Scale | Variance due to student^a and institutional^b characteristics | Delivery of courses^c | Use of learning technology^d | Total variance explained |
| --- | --- | --- | --- | --- |
| Deep learning scales | | | | |
| Higher order thinking | .046*** | .005*** | .116*** | .167*** |
| Integrative learning | .050*** | .008*** | .199*** | .257*** |
| Reflective learning | .032*** | .001*** | .090*** | .123*** |
| Gains scales | | | | |
| Personal and social development | .070*** | .007*** | .129*** | .206*** |
| Practical competence | .075*** | .009*** | .164*** | .248*** |
| General education | .059*** | .010*** | .126*** | .195*** |
| NSSE Benchmarks | | | | |
| Academic challenge | .085*** | .008*** | .144*** | .237*** |
| Active and collaborative learning | .096*** | .004** | .185*** | .285*** |
| Supportive campus environment | .076*** | .013*** | .102*** | .191*** |
| Student-faculty interaction | .106*** | .001*** | .214*** | .321*** |

a Student characteristics include: gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, and US citizenship.
b Institutional characteristics include: Carnegie classification and control.
c Delivery of courses included: the percentage of courses a student was taking online and the percentage of courses a student was taking face-to-face with Web components.
d Use of learning technology included: a single scale combining the seven questions asking students about how often they used certain course-related technology.
** p < .01.
*** p < .001.
multiple regression models employed in this study is statistically significant in all cases and quite substantial in many of these models. For first-year students (Table 9), the variance explained by the models ranges from 12.3% to 32.1%, while for seniors it ranges from 11.1% to 26.2% (Table 10). Of the variance explained, the largest portion by far is attributable to students' use of learning technology. In contrast, the delivery method of the courses in which students are enrolled seems to have a statistically significant, but in most cases unsubstantial, impact on the variance explained by the model.
In all of these models, the relationship between the use of learning technology and the NSSE Benchmarks of Effective Educational Practice, deep approaches to learning, and student self-reported educational gains is positive and relatively strong. Table 11 displays the relative influence of learning technology on other forms of engagement and student learning. Multicollinearity is not a concern for this study, as the only moderate correlation occurs between enrollment status and age (r = .47); all other independent variables have a Pearson's r of less than .1.
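The multicollinearity screen described here can be reproduced by inspecting the pairwise Pearson correlations among the independent variables. A brief sketch follows, using a hypothetical predictor matrix with illustrative column names rather than the study's variables.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 5000

# Hypothetical independent variables; in the study this matrix would hold the
# student and institutional controls plus the delivery and technology-use measures.
predictors = pd.DataFrame({
    "age":       rng.normal(21, 3, n),
    "part_time": rng.integers(0, 2, n).astype(float),
    "female":    rng.integers(0, 2, n).astype(float),
    "tech_use":  rng.normal(0, 1, n),
})

# Pairwise Pearson correlations among predictors; values approaching .5 or more
# would flag potential multicollinearity (the authors report only enrollment
# status vs. age reaching r = .47).
corr = predictors.corr(method="pearson")
print(corr.round(2))
```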
4. Discussion
The first research question asked: How often do college students in different types of courses use the Web and Internet technologies for course-related tasks? First, it is important to note that the majority of students in this study had classes that were entirely or partially in the
Table 10
Senior students' partitioning of variance for the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models.

| Scale | Variance due to student^a and institutional^b characteristics | Delivery of courses^c | Use of learning technology^d | Total variance explained |
| --- | --- | --- | --- | --- |
| Deep learning scales | | | | |
| Higher order thinking | .032*** | .005*** | .106*** | .143*** |
| Integrative learning | .069*** | .012*** | .170*** | .251*** |
| Reflective learning | .038*** | .007*** | .066*** | .111*** |
| Gains scales | | | | |
| Personal and social development | .091*** | .004*** | .119*** | .214*** |
| Practical competence | .069*** | .013*** | .138*** | .220*** |
| General education | .078*** | .009*** | .089*** | .176*** |
| NSSE Benchmarks | | | | |
| Academic challenge | .045*** | .013*** | .132*** | .190*** |
| Active and collaborative learning | .082*** | .015*** | .165*** | .262*** |
| Supportive campus environment | .065*** | .008*** | .085*** | .158*** |
| Student-faculty interaction | .074*** | .010*** | .161*** | .245*** |

a Student characteristics include: gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, and US citizenship.
b Institutional characteristics include: Carnegie classification and control.
c Delivery of courses included: the percentage of courses a student was taking online and the percentage of courses a student was taking face-to-face with Web components.
d Use of learning technology included: a single scale combining the seven questions asking students about how often they used certain course-related technology.
*** p < .001.
Table 11
Net effects^a of use of learning technology on the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models.

| Scale | First-year students | Seniors |
| --- | --- | --- |
| Deep learning scales | | |
| Higher order thinking | ++ | ++ |
| Integrative learning | ++ | ++ |
| Reflective learning | ++ | + |
| Gains scales | | |
| Personal and social development | +++ | +++ |
| Practical competence | +++ | ++ |
| General education | ++ | + |
| NSSE Benchmarks | | |
| Academic challenge | + | + |
| Active and collaborative learning | ++ | ++ |
| Supportive campus environment | + | + |
| Student-faculty interaction | +++ | +++ |

+, p < .001 and unstandardized B > .3; ++, p < .001 and unstandardized B > .4; +++, p < .001 and unstandardized B > .5.
a Table reports results from ten multiple regression models (one per row). Student-level controls include gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, US citizenship, the percentage of courses a student was taking online, and the percentage of courses a student was taking face-to-face with Web components. Institutional controls include Carnegie classification and control.
classroom. Very few were enrolled in all online courses, and few were enrolled in hybrid-only or hybrid and online classes. Our finding is consistent with the perception that students who took online courses are more likely to use Web or Internet technologies to enhance their learning and communication with faculty and other students. Our results also indicate that students who took hybrid courses utilized Web-based library resources in completing assignments more frequently than students who took only online or face-to-face courses. Although the cause of this result is unknown, it is possible that not all students who took online courses are aware of the learning resources that are available to them. Instructors must ensure that students who enroll in online courses are provided instruction on how to access the learning resources that are available to them online and offline. Institutions may also want to provide personal assistance in dealing with academic difficulties and technical problems to online students, who do not have the benefit of the personal contact with faculty and fellow classmates found in face-to-face classrooms (LaPadula, 2003).
Our second research question asked: Do individual and institutional characteristics affect the likelihood of taking online courses? The results of our analyses indicate that individual and institutional characteristics have small but statistically significant effects on students' likelihood of taking online courses. We understand that there are many personal and institutional factors that can affect a student's course taking behavior, and we are not trying to imply a causal relationship in our study. Factors like employment, child care, and financial support can and should have a significant impact on a student's decision about which types of courses to take. Nevertheless, we find that certain types of students, including racial and ethnic minorities and part-time students, are more likely to take online courses. We also found that senior college students majoring in professional fields and first-year business students take online courses more frequently than students in other fields. The question that deserves further investigation in the future is whether minority and part-time students take online courses more often because online courses offer a better quality of education or because they are more convenient. If the reason is mere convenience – and our guess is it probably is – then institutions must ensure that online students receive the high quality instruction, support services, and other fringe benefits enjoyed by traditional face-to-face students. Things like social and informal interaction with faculty and other students and opportunities to receive personal assistance from faculty and staff are also important for both online and face-to-face
students. If online students do not receive the same quality of education and support as their traditional classroom counterparts, another form of unintended educational segregation may develop as increasing numbers of minority, part-time, and working students disproportionately elect to take online courses.
In our third research question we asked: Does the relative amount of technology employed in a course have a relationship with student engagement, learning approaches, and student self-reported learning outcomes? While one should hesitate to suggest a causal relationship between the use of information technology and learning approaches, educational gains, and other forms of engagement, the results suggest that even after controlling for individual and institutional characteristics, a relationship exists between students' engagement in course-related technology and their engagement in other ways, as well as their learning approaches and gains while in college. It would seem that the use of course-related learning technology is another important concept under the umbrella of student engagement. Comparing results from the models for first-year students to those for seniors also suggests that use of technology has a stronger impact earlier in the college experience. Perhaps integrating technology into lower-division courses could be more beneficial in encouraging engagement in other ways of learning in college.
The positive correlation between the use of technology and measures of engagement found in this study is not surprising because it replicates previous studies (Hu & Kuh, 2001; Kuh & Hu, 2001; Nelson Laird & Kuh, 2005). This study demonstrates that this positive correlation persists even as new technologies are introduced and students enter college with increasingly sophisticated uses for and expectations of technology in their lives and on campus. While this study does not explain the precise nature of the relationship between technology and engagement, it does highlight the need for future research exploring the nature of this persisting positive correlation.
4.1. Limitations
The most significant limitation of this study is that the results are largely based on responses to an experimental set of questions that are relatively untested for their psychometric properties, including validity and reliability. While the questions have face and content validity, the researchers have not yet performed extensive investigations of the psychometric properties of these questions. Additionally, institutions participating in this study were not randomly selected from the pool of 4-year colleges and universities in the United States – the nature of NSSE allows institutions to self-select into the pool. Although the sample covers a wide range of American higher education institutions in terms of Carnegie classification, size, control, and urbanicity, one must be cautious when generalizing the results of this study beyond these students. On a related note, because of the limitations of our data, including the non-random institutional sample and the nature of the NSSE survey, it is not possible to make conclusions about the direction of causality in this study. For instance, while our findings suggest that students who use online learning technology are more engaged, it is possible that more engaged students tend to use learning technology more. Future studies are needed in order to establish the direction of causality between the use of learning technology and student engagement. Lastly, a large sample size like the one in this study (17,819 first-year and senior students) can be both a blessing and a curse. A large randomly selected student sample improves the external validity of this study, but it also has the potential of making all statistical tests significant. For that reason, we reported effect sizes for all our statistical findings. From our point of view, we believe the benefits of a large sample outweigh the associated disadvantages.
5. Conclusion
Overall, the results of this study point to a positive relationship between Web-based learning technology use and student engagement and desirable learning outcomes. Not only do students who utilize the Web and Internet technologies in their learning tend to score higher on the traditional student engagement measures (e.g., level of academic challenge, active and collaborative learning, student-faculty interaction, and supportive campus environment), they also are more likely to make use of deep approaches to learning such as higher order thinking, reflective learning, and integrative learning in their study, and they reported higher gains in general education, practical competence, and personal and social development. These results are encouraging signs that Internet and Web-based learning technologies continue to have a positive impact on student learning and engagement. New technology also brings new challenges to higher education institutions. As more ethnic minority and part-time students elect to take online courses instead of traditional classroom courses, ensuring the quality of online education and providing good online student support services becomes a mandate for social equity. It is also the responsibility of institutional administrators and faculty to make certain that all online students receive adequate academic and technological support and are made aware of all the online and offline resources available to them. No one would deny that computers and Internet technology have offered educational opportunities to many people who would otherwise be excluded from the traditional higher education system. Now the goal should be to provide not just educational opportunities but the highest educational quality for all students.
Appendix A
A.1. NSSE 2008 online learning survey items
1. During the current school year, how many courses have you completed in total? (Use a drop down menu for students to select from 0 to 20 or more)
2. During the current school year, about how many of these courses used the Web or Internet as the primary method to deliver course content? (Use a drop down menu for students to select from 0 to 20 or more)
3. During the current school year, about how many of your courses were conducted face-to-face but had a Web component designed to promote interaction among students and instructors? (Use a drop down menu for students to select from 0 to 20 or more)
4. In your experience at your institution during the current school year, about how often have you done each of the following? (Very often, often, sometimes, never)
a. Discussed or completed an assignment using a "synchronous" tool like instant messenger, online chat room, video conference, etc.
b. Discussed or completed an assignment using an "asynchronous" tool like e-mail, discussion board, listserv, etc.
c. Asked for help from a tutor or other students outside of required class activities.
d. Participated in discussions about important topics related to your major field or discipline.
e. Participated in course activities that challenged you intellectually.
f. Participated in a study group outside of those required as a class activity.
g. Participated in discussions that enhance your understanding of social responsibility.
h. Used your institution's Web-based library resources in completing class assignments.
i. Participated in discussions that enhance your understanding of different cultures.
j. Used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom.
References
Allen, I. E., & Seaman, J. (2008). Staying the course: Online
education in the United States. Needham, MA: Sloan
Consortium. <http://www.sloan-c.org/publications/survey/pdf/
staying_the_course.pdf>.
Austin, A. W. (1993). What matters in colleges: Four critical
years revisited. San Francisco: Jossey-Bass.
Bersin, J. (2004). The blended learning book: Best practices,
proven methodologies, and lessons learned. San Francisco, CA:
Pfeiffer.
Bråten, I., & Streømsø, H. I. (2006). Epistemological beliefs,
interest, and gender as predictors of Internet-based learning
activities. Computers in Human Behavior, 22(6),
1027–1042.
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade,
A., Wozney, L., et al. (2004). How does distance education
compare with classroom instruction? A meta-analysis of
the empirical literature. Review of Educational Research, 74,
379–439.
Bullen, M. (1998). Participation and critical thinking in online
university distance education. Journal of Distance Education,
13(2), 1–32.
Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student
engagement and student learning: Testing the linkages.
Research in Higher Education, 47(1), 1–32. <http://
www.springerlink.com/content/b8m6t51v83732308/fulltext.pdf>
.
Carnegie Foundation for the Advancement of Teaching (2009).
Basic classification.
<http://www.carnegiefoundation.org/classifications/index.asp?k
ey=791>.
Chen, P. D., Ted, I., & Davis, L. K. (2007). Engaging African
American students: Compare student engagement and student
satisfaction at HBCUs and their self-identified PWIs
using National Survey of Student Engagement (NSSE) data. In
Paper presented at the 32nd annual conference of the
Association for the Study of Higher Education, Louisville,
KY. Indiana University Bloomington, Center for Postsecondary
Research.
<http://cpr.iub.edu/uploads/Engaging%20African%20American
%20Students%20ASHE.pdf>.
Chickering, A. W., & Gamson, Z. F. (1987). Seven principles
for good practice in undergraduate education. AAHE Bulletin,
39(7), 3–7.
Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the
seven principles: Technology as lever. AAHE
Bulletin(October), 3–6.
Clark, R. E. (2009). Past and future research in online
education. In Paper presented at the 2009 annual meeting of the
American Education Research Association, San Diego,
California. <http://www.cogtech.usc.edu/aera_09.php>.
Driscoll, M. (2002). Blended learning: Let’s get beyond the
hype. Learning & training innovations newsline.
<http://www.ltinewsline.com/ltimagazine/article/
articleDetail.jsp?id=11755>.
Duderstadt, J., Atkins, D., & Houweling, D. (2002). Higher
education in the digital age: Technology issues and strategies
for American colleges and universities. Westport, CT: Praeger.
Ehrmann, S. C. (2004). Beyond computer literacy: Implications
of technology for the content of a college education. Liberal
Education, 90(4), 6–13.
Ellison, N. (2007). Facebook use on campus: A social capital
perspective on social network sites. In Program presented at the
sixth annual ECAR symposium, Boca Raton, FL.
Gladieux, L. E., & Swail, W. S. (1999). The virtual university
and educational opportunity: Issues of equity and access for the
next generation. Washington, DC: College Board.
Graham, C. R. (2006). Blended learning systems: Definition, current trends, and future directions. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning (pp. 3–21). San Francisco, CA: Pfeiffer.
Green, K. (2007). The 2007 campus computing survey. <http://www.campuscomputing.net/sites/www.campuscomputing.net/files/2007-CCP_0.pdf>.
Hawkins, B. L., & Rudy, J. A. (2008). EDUCAUSE core data
service fiscal year 2007 summary report.
<http://net.educause.edu/ir/library/pdf/PUB8005.pdf>.
Hu, S., & Kuh, G. D. (2001). Computing experience and good practices in undergraduate education: Does the degree of campus “wiredness” matter? Education Policy Analysis Archives, 9(49). <http://epaa.asu.edu/epaa/v9n49.html>.
Indiana University Center for Postsecondary Research (2008a).
Institutional report 2008. Indiana University Center for
Postsecondary Research. <http://nsse.iub.edu/
2008_Institutional_Report/>.
Indiana University Center for Postsecondary Research. (2008b). Promoting engagement for all students: The imperative to look within 2008 results. Bloomington, IN. <http://nsse.iub.edu/NSSE_2008_Results/docs/withhold/NSSE2008_Results_revised_11-14-2008.pdf>.
Jenkins, H. (2006). Convergence culture: Where old and new
media collide. New York: NYU Press.
Kuh, G. D. (2004). The National Survey of Student Engagement: Conceptual framework and overview of psychometric properties. <http://nsse.iub.edu/2004_annual_report/pdf/2004_Conceptual_Framework.pdf>.
Kuh, G. D., & Hu, S. (2001). The relationships between
computer and information technology use, student learning, and
other college experiences. Journal of College Student
Development, 42, 217–232.
Kuh, G. D., & Vesper, N. (2001). Do computers enhance or
detract from student learning? Research in Higher Education,
42, 87–102.
LaNasa, S. M., Cabrera, A. F., & Trangsrud, H. (2009). The
construct validity of student engagement: A confirmatory factor
analysis approach. Research in Higher Education, 50(4),
315–332.
LaPadula, M. (2003). A comprehensive look at online student
support services for distance learners. The American Journal of
Distance Education, 17(2), 119–128.
Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K.
(2009). Evaluation of evidence-based practices in online
learning: A meta-analysis and review of online learning studies.
Washington, DC: US Department of Education.
Nelson Laird, T. F., & Kuh, G. D. (2005). Student experiences
with information technology and their relationship to other
aspects of student engagement. Research in Higher
Education, 46(2), 211–233.
Nelson Laird, T. F., Shoup, R., & Kuh, G. D. (2005). Measuring deep approaches to learning using the National Survey of Student Engagement. In Paper presented at the annual meeting of the Association for Institutional Research, Chicago, IL. <http://nsse.iub.edu/pdf/conference_presentations/2006/AIR2006DeepLearningFINAL.pdf>.
Ouimet, J. A., Bunnage, J. C., Carini, R. M., Kuh, G. D., &
Kennedy, J. (2004). Using focus groups, expert advice, and
cognitive interviews to establish the validity of a college
student survey. Research in Higher Education, 45(3), 233–250.
Pace, C. R. (1980). Measuring the quality of student effort.
Current Issues in Higher Education, 2, 10–16.
Parsad, B., & Lewis, L. (2008). Distance education at degree-granting postsecondary institutions: 2006–2007 (NCES 2009-044). National Center for Education Statistics, Institute of Education Sciences. Washington, DC: US Department of Education. <http://nces.ed.gov/pubs2009/2009044.pdf>.
Pascarella, E. T., & Seifert, T. A. (2008). Validation of the NSSE benchmarks and deep approaches to learning against liberal arts outcomes. Wabash College, Center for Inquiry in the Liberal Arts. <http://www.wabashnationalstudy.org/files/2008ashevalidationofnssebenchmarkslinked.pdf>.
Pascarella, E. T., & Terenzini, P. T. (2005). How college affects
students (volume 2): A third decade of research. San Francisco:
Jossey-Bass.
Pike, G. R. (2006). The convergent and discriminant validity of
NSSE scalelet scores. Journal of College Student Development,
47(5), 550–563.
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear
models: Applications and data analysis methods (2nd ed.).
Thousand Oaks, CA: Sage.
Reay, J. (2001). Blended learning: A fusion for the future.
Knowledge Management Review, 4(3), 6.
Robinson, C. C., & Hullinger, H. (2008). New benchmarks in
higher education: Student engagement in online learning.
Journal of Education for Business, 84(2), 101–108.
Rossett, A. (2001). The ASTD e-learning handbook: Best
practices, strategies, and case studies for an emerging field.
New York, NY: McGraw-Hill.
Salaway, G., & Caruso, J. B. (2008). The ECAR study of undergraduate students and information technology. <http://www.educause.edu/ECAR/TheECARStudyofUndergraduateStu/163283>.
Sanders, R. (2006). The “imponderable bloom”: Reconsidering the role of technology in education. Innovate Journal of Online Education, 2(6). <http://innovateonline.info/index.php?view=article&id=232&action=article>.
Sands, P. (2002). Inside outside, upside downside: Strategies
for connecting online and face-to-face instruction in hybrid
courses. Teaching with Technology Today, 8(6). <http://
www.uwsa.edu/ttt/articles/sands2.htm>.
Siegel, S., & Castellan, N. J. Jr., (1988). Nonparametric
statistics for the behavioral sciences (2nd ed.). Boston:
McGraw-Hill.
Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006).
The comparative effectiveness of web-based and classroom
instruction: A meta-analysis. Personnel Psychology, 59,
623–664.
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate
statistics (5th ed.). Boston: Pearson.
Terrell, S. R., & Dringus, L. (2000). An investigation of the
effect of learning style on student success in an online learning
environment. Journal of Educational Technology
Systems, 28(3), 231–238.
Thurmond, V., & Wambach, K. (2004). Understanding
interactions in distance education: A review of the literature.
International Journal of Instructional Technology & Distance
Learning, 1, 9–33.
<http://www.itdl.org/journal/Jan_04/article02.htm>.
Ward, J., & LaBranche, G. A. (2003). Blended learning: The
convergence of e-learning and meetings. Franchising World,
35(4), 22–23.
Ward, M., & Newlands, D. (1998). Use of the Web in
undergraduate teaching. Computers and Education, 31(2), 171–
184.
Zhou, L., & Zhang, D. (2008). Web 2.0 impact on student
learning process. In K. McFerrin et al. (Eds.), Proceedings of
society for information technology and teacher education
international conference (pp. 2880–2882). Chesapeake, VA:
AACE.
NOTE: The 4-page paper should include an abstract, an introduction, a discussion, and a conclusion, with no grammatical errors, good sentence formation, APA format, in-text citations, and references related to operational excellence areas only.
Below is the topic:
Practical Connection Assignment
The structure and scope of operations
Consider the music business as a supply network. How have music downloads and streaming affected artists’ sales? What implications has online music transmission had for traditional music retailers?
Hints:
a) Research the music industry structure before downloads and draw flow diagrams.
b) Research the current music industry structure and draw flow diagrams.
c) Compare and contrast the two structures.
Remember terms such as intermediation, outsourcing, etc.; a starter sketch of the two chains follows below.
Provide a cover page, an introduction, and a conclusion. Three pages minimum; do not forget to use APA format.
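As a rough starting point for the flow diagrams asked for above, here is a minimal, hypothetical sketch (Python; the stage names are illustrative simplifications, not a definitive industry model). It lays a stylized pre-download supply chain next to a stylized download/streaming chain so the disintermediation of physical manufacturing, distribution, and retail is easy to see.

# Stylized music-industry supply chains, before and after online
# distribution. Stage names are illustrative only, not industry data.
pre_download_chain = [
    "Artist", "Record label", "CD manufacturer",
    "Physical distributor", "Music retailer", "Consumer",
]
streaming_chain = [
    "Artist", "Record label", "Digital aggregator",
    "Download or streaming platform", "Consumer",
]

def show(name, chain):
    # Print the chain as a left-to-right flow, e.g. Artist -> ... -> Consumer.
    print(f"{name}: " + " -> ".join(chain))

show("Before downloads", pre_download_chain)
show("Downloads/streaming", streaming_chain)

# Intermediaries squeezed out by online transmission (disintermediation)
# and the new intermediaries that replaced them (re-intermediation).
removed = [s for s in pre_download_chain if s not in streaming_chain]
added = [s for s in streaming_chain if s not in pre_download_chain]
print("Stages removed:", ", ".join(removed))
print("Stages added:  ", ", ".join(added))

Comparing the two printed lists is, in miniature, the compare-and-contrast step in hint (c): the chain loses its physical manufacturing, distribution, and retail stages and gains new digital intermediaries, which is where the discussion of intermediation and outsourcing fits.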
Discussion 9: Parametric or Non-Parametric Test?
Read the following article:
Chen, P. S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54(4), 1222–1232. Retrieved from https://www-sciencedirect-com.nl.idm.oclc.org/science/article/pii/S0360131509003285?via%3Dihub (ATTACHED)
Chen, Lambert, & Guidry (2010) found they needed to use
nonparametric tests in their work.
1. Given a choice between performing a parametric or a non-parametric test, which would you choose and why? (Assume you had both a parametric and a non-parametric test available for your dependent variable and that it did not matter which one you chose.)
· Your initial post (approximately 200–250 words) should address each question in the discussion directions. A brief illustration of the two kinds of test is sketched below.
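To make the choice concrete, here is a minimal, hypothetical sketch (Python with NumPy and SciPy; the engagement scores and group sizes are simulated for illustration, not taken from the article). It runs a parametric one-way ANOVA and a nonparametric counterpart, the Kruskal-Wallis test that Chen, Lambert, and Guidry report using, on the same Likert-style ordinal data.

# Compare a parametric test (one-way ANOVA) with its nonparametric
# counterpart (Kruskal-Wallis) on simulated 1-4 Likert-style responses.
# The data are made up for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

face_to_face = rng.integers(1, 5, size=200)   # ordinal scores on a 1-4 scale
online = rng.integers(1, 5, size=200)
# Hybrid group shifted slightly upward, then clipped back to the 1-4 scale.
hybrid = np.clip(rng.integers(1, 5, size=200) + rng.binomial(1, 0.3, size=200), 1, 4)

# Parametric: assumes roughly normal, interval-scaled data in each group.
f_stat, p_anova = stats.f_oneway(face_to_face, hybrid, online)

# Nonparametric: compares rank distributions, so skewed ordinal survey
# responses do not violate its assumptions.
h_stat, p_kruskal = stats.kruskal(face_to_face, hybrid, online)

print(f"One-way ANOVA:  F = {f_stat:.2f}, p = {p_anova:.4f}")
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kruskal:.4f}")

With well-behaved interval data the two tests usually agree; with skewed ordinal responses the rank-based test is the safer default at the cost of some statistical power, which is the trade-off the discussion question asks you to weigh.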

Impact of Web-Based Learning on Student Engagement

  • 1. Computers & Education 54 (2010) 1222–1232 Contents lists available at ScienceDirect Computers & Education journal homepage: www.elsevier .com/ locate/compedu Engaging online learners: The impact of Web-based learning technology on college student engagement Pu-Shih Daniel Chen a,*, Amber D. Lambert b, Kevin R. Guidry b a Department of Counseling and Higher Education, University of North Texas, 1155 Union Circle #310829, Denton, TX 76203-5017, USA b Center for Postsecondary Research, Indiana University Bloomington, USA a r t i c l e i n f o Article history: Received 31 July 2009 Received in revised form 30 October 2009 Accepted 16 November 2009 Keywords: Online learning Engagement College University NSSE Web-based
  • 2. Deep learning 0360-1315/$ - see front matter � 2009 Elsevier Ltd. A doi:10.1016/j.compedu.2009.11.008 * Corresponding author. Tel.: +1 940 369 8062; fax E-mail addresses: [email protected] (Pu-Shih D a b s t r a c t Widespread use of the Web and other Internet technologies in postsecondary education has exploded in the last 15 years. Using a set of items developed by the National Survey of Student Engagement (NSSE), the researchers utilized the hierarchical linear model (HLM) and multiple regressions to investigate the impact of Web-based learning technology on student engagement and self-reported learning outcomes in face-to-face and online learning environments. The results show a general positive relationship between the use the learning technology and student engagement and learning outcomes. We also discuss the possible impact on minority and part-time students as they are more likely to enroll in online courses. � 2009 Elsevier Ltd. All rights reserved. 1. Introduction The Internet and other digital technologies have become thoroughly integrated in the lives of today’s college student. A recent study by EDUCAUSE (Hawkins & Rudy, 2008) found that the vast majority of US students at baccalaureate degree-granting institutions own and use their own computers. Online learning management systems (LMS) such as Blackboard, D2L, or Sakai are nearly ubiquitous on American colleges and universities, and wireless Internet access
  • 3. permeates most college classrooms (Green, 2007; Hawkins & Rudy, 2008). Outside the classroom, Internet connections are available in virtually all on-campus residence halls (Hawkins & Rudy, 2008) and an estimated 79– 95% of all American College students use Facebook and MySpace (Ellison, 2007). Most first-year college students now arrive on campus with their own personal computer, digital music player, cell phone, and other digital devices (Salaway & Caruso, 2008). As technology becomes a part of modern life and fuel price remains high, more and more college students opt to take online or hybrid courses using readily- available computers and information technologies (Allen & Seaman, 2008). Moreover, many students expect instructors to integrate Internet technologies, such as online learning management systems and collab- orative Internet technologies, into traditional face-to-face classes to enhance learning experience, believing those tools make the educa- tional experience more convenient and educationally effective (Salaway & Caruso, 2008). Since the early 2000s, Web-based applications have become the de facto standard platform for distance education courses and learning management systems (Parsad & Lewis, 2008). The widespread adaptation of digital technologies and online courses has caused many researchers (Bråten & Streømsø, 2006; Kuh & Hu, 2001; Robinson & Hullinger, 2008; Zhou & Zhang, 2008) to question the impact of the Internet and Web-based learning technology on student’s
  • 4. educational engagement and learning outcomes. The concept of student engage- ment is not new to educators. Years of research has shown that what students do during college counts more in terms of learning outcomes than who they are or even where they go to college (Austin, 1993; Kuh, 2004; Pace, 1980; Pascarella & Terenzini, 2005). In the Seven prin- ciples for good practice in undergraduate education, Chickering and Gamson (1987) argued that good college education should promote student-faculty interaction, cooperation among students, active learning, prompt feedback, time on task, high expectations, and respect for diverse talents and ways of learning. In a follow-up article published in 1996, Chickering and Ehrmann (1996) stated that new ll rights reserved. : +1 940 369 7177. aniel Chen), [email protected] (A.D. Lambert), [email protected] (K.R. Guidry). http://dx.doi.org/10.1016/j.compedu.2009.11.008 mailto:[email protected] mailto:[email protected] mailto:[email protected] http://www.sciencedirect.com/science/journal/03601315 http://www.elsevier.com/locate/compedu Pu-Shih Daniel Chen et al. / Computers & Education 54 (2010) 1222–1232 1223 communication and information technology alone will not lead to student success. Instead, educators must utilize technology as a lever to
  • 5. promote student engagement in order to maximize the power of computers and information technology as a catalyst for student success in college (Ehrmann, 2004). Most studies on the topic of technology and student engagement have affirmed the utility of computers and information technology on promoting student engagement (Hu & Kuh, 2001; Nelson Laird & Kuh, 2005; Robinson & Hullinger, 2008). For example, Robinson and Hul- linger found that asynchronous instructional technology allows learners more time to think critically and reflectively, which in turns stim- ulates higher order thinking such as analysis, synthesis, judgment, and application of knowledge. Duderstadt, Atkins, and Houweling (2002) stated, ‘‘When implemented through active, inquiry based learning pedagogies, online learning can stimulate students to use higher order skills such as problem solving, collaboration, and stimulation” (p. 75). Furthermore, students taking online courses are expected to work collaboratively, which is an important component of student engagement, plus that collaborative components have been integrated into most Web-based course designs (Thurmond & Wambach, 2004). Other than promoting student engagement, research focused on the connection between technology and learning outcomes has been mixed. George Kuh and his associates have published several articles related to this issue using the National Survey of Student Engagement (NSSE) data. In Kuh and Hu (2001), the authors suggested a
  • 6. positive relationship between a student’s use of computers and other informa- tion technologies and self-reported gains in science and technology, vocational preparation, and intellectual development. Hu and Kuh (2001) also found that students attending more ‘‘wired” institutions reported more frequently use computing and information technology and higher levels of engagement in good educational practices than their counterparts at less wired institutions. A similar study conducted by Kuh and Vesper (2001) concluded that increased familiarity with computers was positively related to developing other important skills and competencies, including social skills. Studies conducted by other researchers, however, have mixed outcomes that have often not been as positive as those reported by George Kuh and his associates. A meta-analysis commissioned by the US Department of Education examined empirical evidence of the im- pact of online and hybrid courses on learning outcomes. The authors found that both online and hybrid courses have a significant positive impact on learning outcomes, with hybrid courses having a greater impact. However, the authors caution that the ‘‘positive effects asso- ciated with blended learning should not be attributed to the media, per se” (p. ix) (Means, Toyama, Murphy, Bakia, & Jones, 2009). This reflects long-standing findings that, contrary to many naïve beliefs, media do not have a significant impact on learning outcomes (Clark, 2009). Other meta-analyses of distance education impacts on learning outcomes have supported these mixed findings
  • 7. (Bernard et al., 2004; Sitzmann, Kraiger, Stewart, & Wisher, 2006). While it is unclear if students learn more in online courses, it does seem clear that there is an increase in students’ information literacy. For example, Robinson and Hullinger (2008) found a correlation between taking online courses and the improvement of students’ com- puter skills. Though most online courses do not require students to have high level computer skills in order to complete the courses, they nevertheless require students to become familiar with essential information technological skills such as using e-mail, participating in on- line chatting, posting to a Web-based discussion board, and using word processing, presentation, and spreadsheet software. Even though there are many educational benefits associated with using computer technologies, there are also downsides. Critics have argued that online learning and the use of information technology may put certain student populations in disadvantage. Echoing Jenkins’ ‘‘participation gap” idea (Jenkins, 2006), some researchers have suggested that characteristics such as socioeconomic status (Gladieux & Swail, 1999) and institutional resources (Hu & Kuh, 2001) play a significant role in students’ use of and the impact of computers and the Internet. In addition, some researchers asserted that the lack of face-to-face interactions in online learning may reduce instructional effectiveness for students of certain learning styles (Bullen, 1998; Terrell & Dringus, 2000; Ward & Newlands, 1998). Sanders (2006) argued
  • 8. that no communication technology can replace the physical presence and the serendipitous moments of learning such as the spontaneous discussion or the overheard remarks during class break that so often occurred in a face-to-face environment. 1.1. Purpose of study and research questions Although studies have found positive connections between the use of computers and information technology and student engagement and learning outcomes, most of them studied the general use of information technology instead of the specific use of instructional and learning management systems. This study investigates the nature of student engagement in the online learning environment to find out if student and institutional characteristics affect the use of the learning technologies and their impact on student engagement. Specifically, the following research questions were addressed: 1. How often do college students in different types of courses use the Web and Internet technologies for course-related tasks? 2. Do individual and institutional characteristics affect the likelihood of taking online courses? 3. Does the relative amount of technology employed in a course have a relationship with student engagement, learning approaches, and student self-reported learning outcomes? 2. Methods 2.1. Instrument and data source The data for this study come from the 2008 administration of the National Survey of Student Engagement (NSSE). NSSE is an
  • 9. annual survey created and administered by the Indiana University Center for Postsecondary Research. Since the inception of the NSSE in 2000, more than a million first-year students and seniors at more than 1300 baccalaureate degree-granting colleges and universities in the Uni- ted States and Canada have reported the time and energy that they devote to the educationally purposeful activities measured by this an- nual survey (Indiana University Center for Postsecondary Research, 2008b). Participating institutions use their student engagement results to identify areas where teaching and learning can be improved. NSSE results have been found to positively correlate with desired learning outcomes, such as critical thinking ability and grades (Carini, Kuh, & Klein, 2006; Kuh, 2004; Ouimet, Bunnage, Carini, Kuh, & Kennedy, https://www.researchgate.net/publication/247116209_Participati on_and_Critical_Thinking_in_Online_University_Distance_Edu cation?el=1_x_8&enrichId=rgreq-5b2569bc-7830-4af3-80d3- 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== https://www.researchgate.net/publication/245347186_An_Invest igation_of_the_Effect_of_Learning_Style_on_Student_Success_ in_Online_Learning_Environment?el=1_x_8&enrichId=rgreq- 5b2569bc-7830-4af3-80d3- 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== https://www.researchgate.net/publication/259823448_Converge nce_Culture_Where_Old_Media_and_New_Media_Collide?el=1 _x_8&enrichId=rgreq-5b2569bc-7830-4af3-80d3-
  • 10. 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== https://www.researchgate.net/publication/237587031_The_Impo nderable_Bloom_Reconsidering_the_Role_of_Technology_in_E ducation?el=1_x_8&enrichId=rgreq-5b2569bc-7830-4af3-80d3- 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== https://www.researchgate.net/publication/237279421_The_Natio nal_Survey_of_Student_Engagement_Conceptual_Framework_O verview_of_Psychometric_Properties?el=1_x_8&enrichId=rgreq -5b2569bc-7830-4af3-80d3- 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== 1224 Pu-Shih Daniel Chen et al. / Computers & Education 54 (2010) 1222–1232 2004; Pike, 2006). The conceptual framework and psychometric properties of the NSSE and the development of NSSE scales have been am- ply documented (Kuh, 2004; Nelson Laird, Shoup, & Kuh, 2005). In 2007, researchers at NSSE developed a set of questions to investigate the nature of student engagement in the online learning envi- ronment. The original set of questions includes 22 questions. After pilot testing and expert review, the items were revised and the numbers were reduced to 13 (see Appendix for the list of items). The final set of 13 items asks respondents to identify the number of classes in which they were enrolled in the last academic year and how many of
  • 11. those courses were conducted entirely online or face-to-face with a signif- icant online component. Survey respondents also reported on specific behaviors related to their collegiate experiences, including in- and out-of-class behaviors, time usage, and learning approaches that are known to contribute to desirable learning outcomes. 2.2. Sample The NSSE online learning questions were attached to the end of the NSSE online survey and sent to participating students at 45 US bac- calaureate degree-granting institution. The 45 institutions were randomly selected from the pool of 763 institutions participated in the 2008 NSSE administration. The institutions include 14 (31%) public and 31 (69%) private institutions; 8 (19%) of them were classified by the Carnegie Foundation for the Advancement of Teaching (2009) as doctoral institutions, 16 (38%) were master’s institutions, and 18 (43%) were baccalaureate institutions. Detailed institutional characteristics of the 45 participating institutions and their comparison with all 2008 NSSE participating institutions can be found in Table 1. The survey was sent to 77,714 first-year and senior college students and approximately 23,706 students responded to this set of ques- tions, yielding a response rate of 30.5%. However, about 4500 students who were purposely sampled by the institutions were excluded from analysis, which leaves only students who were randomly sampled. Additionally, one institution that offers online courses
  • 12. only was removed from the dataset because no comparison among different course delivery methods can be made at this online institution. Removing this online institution did not greatly affect the general characteristics of the sample. Finally, 1825 students, who accounted for 7.7% of the total respondents, were excluded as their responses indicated that they may not understand these questions in the manner intended by the researchers (when summed, their responses indicated that over 100% of their classes were online or hybrid classes); this indicates a likely data reliability issue with these new questions that will be addressed when discussing this study’s limitations. The final data set for this study has 17,819 respondents, in which 8065 (45%) were first-year students and the remaining 9754 (55%) seniors. Nearly 7000 respondents (35%) were male and 13,000 (65%) female. The majority (97% for first-year students and 87% for senior students) of the surveyed students were enrolled full-time at their institution. Detailed student characteristics including gender, enroll- ment status, and race and ethnicity can be found in Table 2. Table 1 Institutional characteristics. Institutions participated in this study (n = 45) All NSSE 2008 institutions (n = 763)a All US institutionsb Count Percentage (%) Count Percentage (%) Percentage (%) Control Public 14 31 320 42 35 Private 31 69 443 58 65
  • 13. Carnegie classifications Doctoral 8 19 103 16 18 Master’s 16 38 303 47 41 Baccalaureate 18 43 244 38 41 Urbanicity City 27 60 333 47 46 Suburban 6 13 154 22 22 Town 7 16 173 24 21 Rural 5 11 53 7 9 a Not all NSSE participating institutions are classified by the Carnegie Foundation for the Advancement of Teaching. b US percentages are based on data from the 2007 IPEDS institutional characteristics file as reported in Indiana University Center for Postsecondary Research (2008a). Table 2 Respondent demographics. First-year Senior Count Percentage (%) Count Percentage (%) Gender Male 2771 34 3351 35 Female 5274 66 6375 65 Enrollment status Part-time 259 3 1175 13 Full-time 7789 97 8562 87 Race or ethnicity African American or Black 676 8 881 9 American Indian or other Native American 40 1 60 1 Asian, Asian American, or Pacific Islander 483 6 437 5 White (non-Hispanic) 5753 71 7132 73 Hispanic, Mexican or Mexican American, Puerto Rican 279 4 273 3 Other 124 2 111 1
  • 14. Multiracial 208 3 194 2 No response 502 6 666 7 https://www.researchgate.net/publication/237279421_The_Natio nal_Survey_of_Student_Engagement_Conceptual_Framework_O verview_of_Psychometric_Properties?el=1_x_8&enrichId=rgreq -5b2569bc-7830-4af3-80d3- 9b27c325e5f5&enrichSource=Y292ZXJQYWdlOzIyMzIzNTA4 MTtBUzoyNjAwNjI5ODc5NDM5MzdAMTQzOTAxNTI1Njk1N w== Pu-Shih Daniel Chen et al. / Computers & Education 54 (2010) 1222–1232 1225 2.3. Variables and data analysis For the purposes of this study, a Web or online course is defined as a course that is conducted entirely through the Internet without any face-to-face contact among instructor(s) and students. In contrast, a face-to-face course is a course that conducted entirely in a physical classroom without using any Internet technology for course management or instructional purpose. Although there are many definitions for hybrid learning, or so-called blended learning (Bersin, 2004; Driscoll, 2002; Reay, 2001; Rossett, 2001; Sands, 2002; Ward & LaBranche, 2003), Graham (2006) indicated that blended learning can be sorted into three categories: enabling blends, enhancing blends, and trans- forming blends. Enabling blends focus primarily on improving student access and convenience. Enhancing blends allow for incremental changes to the pedagogy while transforming blends carry radical transformation of the pedagogy. Learning management systems
Learning management systems and technology-equipped classrooms are two examples of enhancing blends. For the purpose of this study, the researchers adopted enhancing blends as the definition of hybrid courses. Therefore, a hybrid course is defined as one that blends both Web and face-to-face components in the same course. A hybrid course must include both face-to-face contact between instructor(s) and students and the use of the Internet or Web technology for course management or instructional purposes. If the only use of the Internet or Web technology in a face-to-face course is for non-instructional or routine communication, the course is considered a face-to-face course rather than a hybrid course.

To answer the first research question, descriptive statistics, including means and standard deviations, were reported for all of the survey items. The Kruskal-Wallis test (Siegel & Castellan, 1988), a nonparametric equivalent of the analysis of variance (ANOVA), was conducted to examine whether statistically significant differences exist in students' technology use among the different course delivery methods. Hierarchical linear modeling (HLM) was utilized to answer the second research question (Raudenbush & Bryk, 2002). The assumption underlying the HLM analysis is that institutions have a differential impact on students' course taking behaviors and technology usage. The benefit of using HLM is that it allowed the researchers to partition the variance attributable to the individual from the variance attributable to the institution. The dependent variable for the HLM analysis is the ratio of classes taken online. The independent variables include individual (level 1) variables such as the student's gender, enrollment status (part-/full-time), ethnicity, major, and parental education. The institutional-level variables (level 2 variables) are dummy-coded 2005 Carnegie basic classification, control (public/private), and urbanicity or locale.

The third research question, which addresses the impact of learning technologies on student engagement and outcomes, was answered using Ordinary Least Squares (OLS) multiple regression analysis. A regression analysis is a statistical technique that allows the researcher to investigate the relationship between one dependent variable and several independent variables (Tabachnick & Fidell, 2007). The dependent variables for this analysis include four of the five NSSE Benchmarks of Effective Educational Practice (Kuh, 2004; LaNasa, Cabrera, & Trangsrud, 2009; Pascarella & Seifert, 2008) – level of academic challenge (LAC), active and collaborative learning (ACL), student-faculty interaction (SFI), and supportive campus environment (SCE) – the three student self-reported Gain Scales (Chen, Ted, & Davis, 2007; Pike, 2006) – gain in general education, gain in personal and social development, and gain in practical competence – and the three deep learning scales (Nelson Laird et al., 2005) – higher order thinking, reflective learning, and integrative learning. One of the NSSE Benchmarks – enriching educational experiences (EEE) – is excluded from the analysis because technology use is part of that benchmark. The independent variables include the percentage of classes taken online, the percentage of classes that were hybrid classes, a composite score of course-related technology use, and controls for student and institutional characteristics.
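For illustration only, the following is a minimal sketch of how a Kruskal-Wallis comparison of technology-use scores across delivery-method groups could be run in Python; the file name and column names (tech_use, delivery_group) are assumptions for the example, not the authors' actual code or data.

    import pandas as pd
    from scipy import stats

    # Hypothetical respondent-level data: one row per student with a composite
    # technology-use score and a course delivery-method group label.
    df = pd.read_csv("nsse_tech_items.csv")  # assumed file name

    groups = [g["tech_use"].dropna().values
              for _, g in df.groupby("delivery_group")]

    # Kruskal-Wallis H test: the nonparametric analogue of a one-way ANOVA.
    h_stat, p_value = stats.kruskal(*groups)
    print(f"H = {h_stat:.2f}, p = {p_value:.4f}")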
3. Results

3.1. Descriptive statistics

The first three questions of the survey asked students how many courses they took in the current academic year, how many of those courses used the Web or Internet as the primary method to deliver course content, and how many of those courses were hybrid courses. Using those responses, we were able to classify course delivery methods into three categories: Web or Internet-only, face-to-face, and hybrid. As a result of this classification, students can take courses in seven different patterns: Web-only, face-to-face-only, hybrid-only, some Web and hybrid, Web and face-to-face, some face-to-face and hybrid, and all three delivery methods. As shown in Table 3, very few (2.1%) of the 17,819 students who adequately completed the survey took all their courses in Web-only mode. A larger percentage of students took some Web courses and some hybrid courses (5.2%), while a similar percentage enrolled in both Web and face-to-face courses (7.6%). The majority (84.8%) took classes with at least some face-to-face component. Although some of those students were also enrolled in Web (7.6%), hybrid (21.5%), or both Web and hybrid (34.9%) courses, one-fifth (20.8%) of the respondents were enrolled only in face-to-face classes with no significant Web or Internet component.

Table 3. Distribution of course options. Values in each row are the frequency and percentage for first-year students, for senior students, and combined.

  Web-only: 90 (1.1%); 281 (2.9%); 371 (2.1%)
  Hybrid-only: 628 (7.8%); 789 (8.1%); 1417 (8%)
  Face-to-face-only: 1718 (21.3%); 1988 (20.4%); 3706 (20.8%)
  Web and hybrid: 362 (4.5%); 561 (5.8%); 923 (5.2%)
  Web and face-to-face: 573 (7.1%); 776 (8%); 1349 (7.6%)
  Face-to-face and hybrid: 1699 (21.1%); 2139 (21.9%); 3838 (21.5%)
  All three delivery methods: 2995 (37.1%); 3220 (33%); 6215 (34.9%)
  Total: 8065 (100%); 9754 (100%); 17,819 (100%)
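As an illustration of the classification step described above, the sketch below maps a respondent's counts of online, hybrid, and total courses onto one of the seven delivery patterns. The function and argument names are hypothetical, and the handling of inconsistent responses only loosely mirrors the exclusion rule described in the methods.

    def classify_pattern(n_web, n_hybrid, n_total):
        """Label a respondent's course-taking pattern (illustrative only)."""
        n_f2f = n_total - n_web - n_hybrid
        if n_f2f < 0:
            return None  # inconsistent response, analogous to the excluded cases
        has_web, has_hybrid, has_f2f = n_web > 0, n_hybrid > 0, n_f2f > 0
        if has_web and not has_hybrid and not has_f2f:
            return "Web-only"
        if has_hybrid and not has_web and not has_f2f:
            return "Hybrid-only"
        if has_f2f and not has_web and not has_hybrid:
            return "Face-to-face-only"
        if has_web and has_hybrid and not has_f2f:
            return "Web and hybrid"
        if has_web and has_f2f and not has_hybrid:
            return "Web and face-to-face"
        if has_f2f and has_hybrid and not has_web:
            return "Face-to-face and hybrid"
        return "All three delivery methods"

    print(classify_pattern(2, 1, 5))  # -> "All three delivery methods"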
These seven groups were collapsed into five groups for later analyses: Web-only, hybrid-only, some Web, face-to-face and hybrid, and face-to-face-only.

As shown in Tables 4 and 5, students whom one would expect to use technology more often – those enrolled in Web and hybrid classes – indeed used online learning tools and technologies more frequently than students who only took face-to-face courses. More specifically, respondents who were enrolled in online courses more frequently used both synchronous and asynchronous communication tools for instructional or learning purposes. Compared with students in traditional face-to-face settings, online students also more frequently used electronic media to discuss or complete assignments, and these differences were consistent for both first-year and senior students.

Table 4. First-year student engagement in online learning activities. Values are Mean (SD) for, in order: Web-only; Hybrid-only; Some Web; Hybrid and face-to-face; Face-to-face-only.

How often: discussed or completed an assignment using a synchronous tool like instant messaging, online chat room, video conference, etc.
  1.91 (1.174); 1.72 (.961); 1.62 (.886); 1.50 (.810); 1.45 (.824)
How often: discussed or completed an assignment using an asynchronous tool like e-mail, discussion board, listserv, etc.
  3.12 (1.091); 2.62 (.974); 2.46 (.931); 2.39 (.893); 2.00 (.928)
How often: used your institution's Web-based library resources in completing class assignments
  2.40 (.997); 2.60 (.910); 2.45 (.900); 2.44 (.861); 2.29 (.919)
How often: used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom
  1.70 (.993); 1.87 (.989); 1.78 (.940); 1.69 (.874); 1.62 (.882)
How often: used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment
  3.07 (1.095); 2.66 (1.044); 2.66 (1.037); 2.61 (1.001); 2.33 (1.047)
How often: used e-mail to communicate with an instructor
  3.40 (.761); 3.25 (.790); 3.25 (.781); 3.17 (.778); 3.04 (.824)
To what extent does your institution emphasize using computers in academic work?
  3.56 (.781); 3.42 (.744); 3.33 (.780); 3.30 (.753); 3.15 (.821)

Table 5. Senior student engagement in online learning activities. Values are Mean (SD) for, in order: Web-only; Hybrid-only; Some Web; Hybrid and face-to-face; Face-to-face-only.

How often: discussed or completed an assignment using a synchronous tool like instant messaging, online chat room, video conference, etc.
  2.05 (1.160); 1.62 (.921); 1.64 (.889); 1.51 (.812); 1.34 (.734)
How often: discussed or completed an assignment using an asynchronous tool like e-mail, discussion board, listserv, etc.
  3.29 (1.032); 2.82 (.986); 2.69 (.942); 2.58 (.915); 2.07 (.979)
How often: used your institution's Web-based library resources in completing class assignments
  2.72 (1.042); 2.81 (.964); 2.75 (.933); 2.77 (.939); 2.52 (1.020)
How often: used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom
  1.77 (1.086); 1.82 (.990); 1.74 (.933); 1.61 (.850); 1.48 (.819)
How often: used an electronic medium (listserv, chat group, Internet, instant messaging, etc.) to discuss or complete an assignment
  3.25 (1.018); 2.99 (1.009); 2.91 (.991); 2.81 (.979); 2.47 (1.067)
How often: used e-mail to communicate with an instructor
  3.67 (.604); 3.53 (.687); 3.47 (.691); 3.43 (.707); 3.28 (.788)
To what extent does your institution emphasize using computers in academic work?
  3.72 (.594); 3.64 (.613); 3.49 (.716); 3.48 (.711); 3.37 (.799)
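Group summaries like those in Tables 4 and 5 can, in principle, be reproduced with a simple grouped aggregation. The sketch below assumes a hypothetical respondent-level data frame with one column per survey item and a delivery-group label; it is not the authors' actual data file or procedure.

    import pandas as pd

    # Hypothetical respondent-level data: one row per student, one column per
    # technology item (coded 1 = never ... 4 = very often) plus a group label.
    df = pd.read_csv("nsse_respondents.csv")  # assumed file name

    items = ["synchronous_tool", "asynchronous_tool", "library_resources",
             "discuss_online", "electronic_medium", "email_instructor",
             "emphasize_computers"]

    # Mean and standard deviation of each item within each delivery-method
    # group, mirroring the layout of Tables 4 and 5.
    summary = df.groupby("delivery_group")[items].agg(["mean", "std"]).round(2)
    print(summary)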
One interesting finding is that students who took hybrid courses more frequently utilized the institutional Web-based library resources in completing class assignments than students who only had online courses or those who only had face-to-face courses. A probable explanation is that students who took hybrid courses are more familiar with doing research online than students who took only face-to-face courses. On the other hand, students who only took online courses may feel comfortable with Internet technologies but may not receive sufficient instruction on how to conduct research using Web-based library resources.

We attempted to perform an analysis of variance (ANOVA) on the mean scores for these seven questions for both first-year and senior students to determine which, if any, of the apparent differences are statistically significant. These tests were abandoned because the assumptions of ANOVA, particularly homoscedasticity, were met in only two of the 14 tests. A nonparametric test, the Kruskal-Wallis test, indicated that there are significant differences in the mean scores for each question among at least some of the groups of students. However, the very large number of respondents makes it difficult to draw much meaning from the significant results of those tests, given their sensitivity to the high number of respondents.

3.1.1. HLM one-way ANOVA model

To answer the second research question, a hierarchical linear model (HLM) was built to investigate the impacts of individual and institutional variables on students' course taking behaviors. Before estimating the full, two-level HLM to examine the effects of individual and institutional variables on the student's likelihood of taking online courses, we used the one-way ANOVA model, or so-called "null model," to estimate the proportion of variance that exists between and within colleges. The proportion of variance between institutions ranges from 0.033 for first-year students to 0.157 for seniors (Table 6). This result indicates that institutional variables have more influence on seniors than on first-year students in their decision to take online courses. It also warrants further investigation into what individual and institutional variables may affect a student's decision to take online courses.

Table 6. Variance components of the dependent variable (ratio of online courses taken by the student). Values are for first-year students, then seniors.

  Total variance: .05929; .08028
  Variance within institutions: .05731; .06767
  Variance between institutions: .00198; .01261
  Proportion between institutions: .033; .157
3.1.2. HLM random coefficient regression and intercept- and slopes-as-outcomes models

The second step of the modeling procedure is the creation of the random coefficient regression model, also known as the level 1 model or the individual-level model. This procedure tests and establishes the individual-level independent variables before estimating the full, intercept- and slopes-as-outcomes model. Table 7 presents the descriptive statistics of the independent variables included in the analysis. The level 1 independent variables include the student's gender (0 = male, 1 = female), enrollment status (0 = full-time, 1 = part-time), ethnicity (0 = White/Caucasian, 1 = minority), first generation college student status (0 = at least one parent has a baccalaureate degree, 1 = neither parent has a baccalaureate degree), and a series of dummy-coded variables for major (with Arts, Humanities, and Social Sciences being the reference category). The outcomes of the random coefficient regression model will be reported jointly with the final model.

In the third and final step in the modeling process, we built the between-institution model by allowing the intercept to vary by institution. We then modeled the intercept with institutional characteristics. Included in the level 2 models are the 2005 basic Carnegie classifications (doctorate granting universities, master's colleges and universities, baccalaureate colleges, and others), with the doctorate granting universities serving as the reference category. We also included institutional control (public or private) and locale or urbanicity (city, suburban, town, and rural, of which city serves as the reference category). To avoid multicollinearity, we did not include the size of the institution as a control because institutional size is highly correlated with the Carnegie classification within our sample (r = .71, p < .001).

Table 7. Descriptive statistics for independent variables included in the models. Values in each row are the mean, SD, minimum, and maximum for first-year students, then for seniors, followed by a description of the variable.

Individual characteristics
  First generation college student: .38, .49, 0, 1; .42, .49, 0, 1. Defined as neither parent having a baccalaureate degree from a college; 1 = first generation college student, 0 = all other
  Female: .64, .48, 0, 1; .65, .48, 0, 1. Gender: 1 = female, 0 = male
  Part-time enrollment: .03, .18, 0, 1; .13, .33, 0, 1. Enrollment status: 1 = enrolled part-time, 0 = enrolled full-time
  Ethnic minority: .28, .45, 0, 1; .26, .44, 0, 1. Ethnicity: 0 = White/Caucasian, 1 = all other
  STEM: .18, .39, 0, 1; .17, .37, 0, 1. Major: 1 = Science, Technology, Engineering, and Mathematics, 0 = all other
  Arts, Humanities, and Social Sciences (reference): .26, .44, 0, 1; .28, .45, 0, 1. Major: 1 = Arts, Humanities, and Social Sciences, 0 = all other
  Business: .17, .37, 0, 1; .18, .39, 0, 1. Major: 1 = Business, 0 = all other
  Professional: .12, .32, 0, 1; .13, .34, 0, 1. Major: 1 = professional, 0 = all other
  Other and undecided: .16, .37, 0, 1; .15, .36, 0, 1. Major: 1 = other majors and undecided, 0 = all other
Institutional characteristics
  Carnegie: doctoral institution: .18, .39, 0, 1; .18, .39, 0, 1. Carnegie classification: 1 = doctorate granting universities, 0 = all other
  Carnegie: master's institution: .36, .48, 0, 1; .36, .48, 0, 1. Carnegie classification: 1 = master's colleges and universities, 0 = all other
  Carnegie: baccalaureate institution: .4, .5, 0, 1; .4, .5, 0, 1. Carnegie classification: 1 = baccalaureate colleges, 0 = all other
  Carnegie: other: .07, .25, 0, 1; .07, .25, 0, 1. Carnegie classification: 1 = special focus institutions, tribal colleges, non-classified institutions
  Private: .69, .47, 0, 1; .69, .47, 0, 1. Control: 1 = private, 0 = public
  City: .6, .5, 0, 1; .6, .5, 0, 1. Urbanicity: 1 = city, 0 = all other
  Suburban: .13, .34, 0, 1; .13, .34, 0, 1. Urbanicity: 1 = suburban, 0 = all other
  Town: .16, .37, 0, 1; .16, .37, 0, 1. Urbanicity: 1 = town, 0 = all other
  Rural: .11, .32, 0, 1; .11, .32, 0, 1. Urbanicity: 1 = rural, 0 = all other
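The paragraph above relies on dummy-coded predictors with a named reference category at each level. A hedged sketch of that coding step, using pandas with hypothetical column names rather than the authors' actual variables, is shown below.

    import pandas as pd

    # Hypothetical student-level frame with categorical predictors.
    students = pd.DataFrame({
        "major": ["STEM", "Business", "Arts/Hum/SocSci", "Professional"],
        "gender": ["male", "female", "female", "male"],
    })

    # Dummy-code major, dropping the reference category explicitly so the
    # remaining columns are interpreted relative to Arts/Hum/SocSci.
    major_dummies = pd.get_dummies(students["major"], prefix="major")
    major_dummies = major_dummies.drop(columns=["major_Arts/Hum/SocSci"])

    # Binary indicators can be coded directly.
    students["female"] = (students["gender"] == "female").astype(int)

    X = pd.concat([students[["female"]], major_dummies], axis=1)
    print(X)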
Table 8 illustrates the summary effects of individual and institutional variables on a student's decision to take online courses. It is clear that the factors that affect online course taking for first-year students and seniors are quite different. For first-year students, enrollment at a private institution slightly increases the likelihood (p < .05) of enrollment in online courses, while enrollment at a baccalaureate college slightly reduces (p < .05) the chance of enrollment in online courses compared with counterparts enrolled at doctorate granting institutions. Contrary to their effect on first-year students, institutional variables have no statistically significant effect on senior students' decision to take online courses. Although individual variables affect both first-year and senior students' decision to take online courses, they tend to affect seniors more than first-year students. For first-year students, racial and ethnic minorities (p < .001) and part-time students (p < .05) are more likely to enroll in online courses. The same effects can also be found with senior students (both at p < .001). Additionally, seniors who major in the professional fields (e.g. education, nursing, occupational therapy, etc.) are also more likely to enroll in online courses (p < .001). The student's major has no effect on first-year students' likelihood of taking online courses except for students in business, who are slightly more likely than students in other majors to enroll in online courses (p < .05).

Table 8. Coefficients from HLM for the ratio of courses taken online by the student. Values in each row are the coefficient and p-value for first-year students, then the coefficient and p-value for seniors.

Institution-level variables
  Intercept: .118, .001; .141, .001
  Carnegie: master's: -.01, .435; .004, .816
  Carnegie: baccalaureate: -.038, .016; -.03, .188
  Carnegie: other: -.039, .282; .27, .628
  Private: .025, .043; .014, .408
  Locale: suburban: .016, .282; -.001, .992
  Locale: town: .003, .859; -.027, .27
  Locale: rural: .039, .075; .001, .995
Individual-level variables
  First generation college student: .013, .056; .013, .096
  Female: -.01, .113; -.005, .421
  Part-time: .093, .016; .086, .001
  Minority: .035, .001; .047, .001
  Major: STEM: -.02, .056; -.03, .041
  Major: business: .02, .032; .004, .778
  Major: professional: -.009, .307; -.046, .001
  Major: other and undecided: .001, .952; .008, .518
Variance components
  Variance between institutions: .0006; .00539
  Variance between explained: 69.70%; 57%
  Variance within institutions: .05407; .06368
  Variance within explained: 5.65%; 5.90%

3.2. Multiple regression models

To answer the third research question, which addresses the impact of learning technologies on student engagement and outcomes, Ordinary Least Squares (OLS) multiple regression analysis was used.
As can be seen in Tables 9 and 10, the total variance explained by the multiple regression models employed in this study is statistically significant in all cases and quite substantial in many of these models. For first-year students (Table 9), the variance explained by the models ranges from 12.3% to 32.1%, while for seniors it ranges from 11.1% to 26.2% (Table 10). Of the variance explained, the largest portion by far is attributable to students' use of learning technology. In contrast, the delivery method of the courses in which students are enrolled seems to have a statistically significant but, in most cases, unsubstantial impact on the variance explained by the model. In all of these models, the relationships between the use of learning technology and the NSSE Benchmarks of Effective Educational Practice, deep approaches to learning, and student self-reported educational gains are positive and relatively strong.

Table 9. First-year students' partitioning of variance for the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models. Values in each row are the variance due to student(a) and institutional(b) characteristics, delivery of courses(c), use of learning technology(d), and the total variance explained.

Deep learning scales
  Higher order thinking: .046***; .005***; .116***; .167***
  Integrative learning: .050***; .008***; .199***; .257***
  Reflective learning: .032***; .001***; .090***; .123***
Gains scales
  Personal and social development: .070***; .007***; .129***; .206***
  Practical competence: .075***; .009***; .164***; .248***
  General education: .059***; .010***; .126***; .195***
NSSE Benchmarks
  Academic challenge: .085***; .008***; .144***; .237***
  Active and collaborative learning: .096***; .004**; .185***; .285***
  Supportive campus environment: .076***; .013***; .102***; .191***
  Student-faculty interaction: .106***; .001***; .214***; .321***

(a) Student characteristics include: gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, and US citizenship.
(b) Institutional characteristics include: Carnegie classification and control.
(c) Delivery of courses includes: the percentage of courses a student was taking online and the percentage of courses a student was taking face-to-face with Web components.
(d) Use of learning technology includes: a single scale combining the seven questions asking students how often they used certain course-related technology.
** p < .01. *** p < .001.

Table 10. Senior students' partitioning of variance for the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models. Values in each row are the variance due to student(a) and institutional(b) characteristics, delivery of courses(c), use of learning technology(d), and the total variance explained.

Deep learning scales
  Higher order thinking: .032***; .005***; .106***; .143***
  Integrative learning: .069***; .012***; .170***; .251***
  Reflective learning: .038***; .007***; .066***; .111***
Gains scales
  Personal and social development: .091***; .004***; .119***; .214***
  Practical competence: .069***; .013***; .138***; .220***
  General education: .078***; .009***; .089***; .176***
NSSE Benchmarks
  Academic challenge: .045***; .013***; .132***; .190***
  Active and collaborative learning: .082***; .015***; .165***; .262***
  Supportive campus environment: .065***; .008***; .085***; .158***
  Student-faculty interaction: .074***; .010***; .161***; .245***

(a) Student characteristics include: gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, and US citizenship.
(b) Institutional characteristics include: Carnegie classification and control.
(c) Delivery of courses includes: the percentage of courses a student was taking online and the percentage of courses a student was taking face-to-face with Web components.
(d) Use of learning technology includes: a single scale combining the seven questions asking students how often they used certain course-related technology.
** p < .01. *** p < .001.

Table 11 displays the relative influence of learning technology alongside other forms of engagement and student learning. Multicollinearity is not a concern for this study, as the only moderate correlation occurs between enrollment status and age (r = .47). All the other independent variables have a Pearson's r of less than .1.

Table 11. Net effects(a) of use of learning technology on the deep learning scales, gains scales, and NSSE Benchmarks in multiple regression models. Values in each row are the net effect for first-year students, then for seniors.

Deep learning scales
  Higher order thinking: ++; ++
  Integrative learning: ++; ++
  Reflective learning: ++; +
Gains scales
  Personal and social development: +++; +++
  Practical competence: +++; ++
  General education: ++; +
NSSE Benchmarks
  Academic challenge: +; +
  Active and collaborative learning: ++; ++
  Supportive campus environment: +; +
  Student-faculty interaction: +++; +++

+, p < .001 and unstandardized B > .3; ++, p < .001 and unstandardized B > .4; +++, p < .001 and unstandardized B > .5.
(a) The table reports results from ten multiple regression models (one per row). Student-level controls include gender, enrollment status, parents' education, grades, SAT scores, transfer status, age, membership in a fraternity/sorority, whether or not a student is in a STEM field, race-ethnicity, US citizenship, the percentage of courses a student was taking online, and the percentage of courses a student was taking face-to-face with Web components. Institutional controls include Carnegie classification and control.
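The variance partitioning in Tables 9-11 amounts to entering blocks of predictors (student and institutional controls, course delivery variables, then the composite technology-use scale) and tracking the change in R-squared. The sketch below shows one plausible way to do this with statsmodels; the data file, column names, abbreviated control set, and the simple mean-based composite are assumptions for illustration, not the authors' exact procedure.

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("nsse_analysis.csv")  # hypothetical analysis file

    # Composite technology-use score: mean of the seven course-related items.
    tech_items = ["synchronous_tool", "asynchronous_tool", "library_resources",
                  "discuss_online", "electronic_medium", "email_instructor",
                  "emphasize_computers"]
    df["tech_use"] = df[tech_items].mean(axis=1)

    controls = ["female", "part_time", "first_generation", "minority"]  # abbreviated
    delivery = ["pct_online", "pct_hybrid"]
    blocks = [controls, controls + delivery, controls + delivery + ["tech_use"]]

    outcome = "higher_order_thinking"
    r2_prev = 0.0
    for cols in blocks:
        # Note: for comparable delta R2 values the same cases should be used
        # in every block; listwise deletion is applied here for simplicity.
        X = sm.add_constant(df[cols])
        model = sm.OLS(df[outcome], X, missing="drop").fit()
        print(cols[-1], "block: R2 =", round(model.rsquared, 3),
              "delta R2 =", round(model.rsquared - r2_prev, 3))
        r2_prev = model.rsquared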
4. Discussion

The first research question asked: How often do college students in different types of courses use the Web and Internet technologies for course-related tasks? First, it is important to note that the majority of students in this study had classes that were entirely or partially in the classroom. Very few were enrolled in all online courses, and few were enrolled in hybrid-only or hybrid and online classes. Our finding is consistent with the perception that students who took online courses are more likely to use Web or Internet technologies to enhance their learning and communication with faculty and other students. Our results also indicate that students who took hybrid courses more frequently utilize Web-based library resources in completing assignments than students who took only online or face-to-face courses. Although the cause of this result is unknown, it is possible that not all students who took online courses are aware of the learning resources that are available to them. Instructors must ensure that students who enroll in online courses are provided instruction on how to access the learning resources that are available to them online and offline. Institutions may also want to provide personal assistance in dealing with academic difficulties and technical problems to online students, who do not have the benefit of personal contact with faculty and fellow classmates as in face-to-face classrooms (LaPadula, 2003).

Our second research question asked: Do individual and institutional characteristics affect the likelihood of taking online courses? The results of our analyses indicate that individual and institutional characteristics have small but statistically significant effects on students' likelihood of taking online courses. We understand that there are many personal and institutional factors that can affect a student's course taking behavior, and we are not trying to imply a causal relationship in our study. Factors like employment, child care, and financial support can and should have a significant impact on a student's decision about which type of courses he or she will take. Nevertheless, we find that certain types of students, including racial and ethnic minorities and part-time students, are more likely to take online courses. We also found that senior college students majoring in professional fields and first-year business students more frequently take online courses than students in other fields. In the future, the question that deserves further investigation is whether minority and part-time students take online courses more often because online courses offer a better quality of education or because they are more convenient. If the reason is mere convenience – and our guess is that it probably is – then institutions must ensure that online students receive high quality instruction, support services, and other fringe benefits enjoyed by traditional face-to-face students. Things like social and informal interaction with faculty and other students and opportunities to receive personal assistance from faculty and staff are also important for both online and face-to-face students. If online students do not receive the same quality of education and support as their traditional classroom counterparts, another form of unintended educational segregation may develop as increasing numbers of minority, part-time, and working students disproportionately elect to take online courses.

In our third research question we asked: Does the relative amount of technology employed in a course have a relationship with student engagement, learning approaches, and student self-reported learning outcomes? While one should hesitate to suggest a causal relationship between the use of information technology and learning approaches, educational gains, and other forms of engagement, the results suggest that, even after controlling for individual and institutional characteristics, there is a relationship between students who engage with course-related technology and those who engage in other ways, as well as with the learning approaches and gains students report while in college. It would seem that the use of course-related learning technology is another important concept under the umbrella of student engagement.
Comparing results from the models for first-year students to those for seniors also suggests that use of technology has a stronger impact earlier in the college experience. Perhaps integrating technology into lower-division courses could be more beneficial in encouraging engagement in other ways of learning in college.

The positive correlation between the use of technology and measures of engagement found in this study is not surprising because it replicates previous studies (Hu & Kuh, 2001; Kuh & Hu, 2001; Nelson Laird & Kuh, 2005). This study demonstrates that this positive correlation persists even as new technologies are being introduced and students are entering college with increasingly sophisticated uses for and expectations of technology in their lives and on campus. While this study does not explain the precise nature of the relationship between technology and engagement, it does highlight the need for future research exploring the nature of this persisting positive correlation.

4.1. Limitations

The most significant limitation of this study is that the results are largely based on responses to an experimental set of questions that are relatively untested for their psychometric properties, including validity and reliability. While the questions have face and content validity, the researchers have not yet performed extensive investigations of the psychometric properties of these questions. Additionally, institutions participating in this study were not randomly selected from the pool of 4-year colleges and universities in the United States – the nature of NSSE allows institutions to self-select into the pool. Although the sample covers a wide range of American higher education institutions in terms of the Carnegie classifications, size, control, and urbanicity, one must be cautious when generalizing the results of this study beyond these students. On a related note, because of the limitations of our data, including the non-random institutional sample and the nature of the NSSE survey, it is not possible to draw conclusions about the direction of causality in this study. For instance, while our findings suggest that students who use online learning technology are more engaged, it is possible that more engaged students tend to use learning technology more. Future studies are needed in order to establish the direction of causality between the use of learning technology and student engagement. Lastly, a large sample size like the one in this study (17,819 first-year and senior students) can be both a blessing and a curse. A large randomly selected student sample improves the external validity of this study, but it also has the potential of making all statistical tests significant. For that reason, we reported effect sizes for all of our statistical findings. From our point of view, we believe the benefits of a large sample outweigh the associated disadvantages.

5. Conclusion

Overall, the results of this study point to a positive relationship between Web-based learning technology use and student engagement
and desirable learning outcomes. Not only do students who utilize the Web and Internet technologies in their learning tend to score higher on the traditional student engagement measures (e.g. level of academic challenge, active and collaborative learning, student-faculty interaction, and supportive campus environment), they also are more likely to make use of deep approaches to learning such as higher order thinking, reflective learning, and integrative learning in their study, and they reported higher gains in general education, practical competence, and personal and social development. These results are encouraging signs that Internet and Web-based learning technologies continue to have a positive impact on student learning and engagement.

New technology also brings new challenges to higher education institutions. As more ethnic minority and part-time students elect to take online courses instead of traditional classroom courses, ensuring the quality of online education and providing good online student support services becomes a mandate for social equity. It is also the responsibility of institutional administrators and faculty to make certain that all online students receive adequate academic and technological support and that they are made aware of all the online and offline resources available to them. No one would deny that computers and Internet technology have offered educational opportunities to many people who would otherwise be excluded from the traditional higher education system. Now the goal should be to provide not just educational opportunities but the highest educational quality for all students.
Appendix A

A.1. NSSE 2008 online learning survey items

1. During the current school year, how many courses have you completed in total? (Use a drop down menu for student to select from 0 to 20 or more)
2. During the current school year, about how many of these courses used the Web or Internet as the primary method to deliver course content? (Use a drop down menu for student to select from 0 to 20 or more)
3. During the current school year, about how many of your courses were conducted face-to-face but had a Web component designed to promote interaction among students and instructors? (Use a drop down menu for student to select from 0 to 20 or more)
4. In your experience at your institution during the current school year, about how often have you done each of the following? (Very often, often, sometimes, never)
   a. Discussed or completed an assignment using a "synchronous" tool like instant messenger, online chat room, video conference, etc.
  • 41. c. Asked for help from a tutor or other students outside of required class activities. d. Participated in discussions about important topics related to your major field or discipline. e. Participated in course activities that challenged you intellectually. f. Participated in a study group outside of those required as a class activity. g. Participated in discussions that enhance your understanding of social responsibility. h. Used your institution’s Web-based library resources in completing class assignments. i. Participated in discussions that enhance your understanding of different cultures. j. Used the Internet to discuss with an instructor topics you would not feel comfortable discussing face-to-face or in a classroom. References Allen, I. E., & Seaman, J. (2008). Staying the course: Online education in the United States. Needham, MA: Sloan Consortium. <http://www.sloan-c.org/publications/survey/pdf/ staying_the_course.pdf>. Austin, A. W. (1993). What matters in colleges: Four critical years revisited. San Francisco: Jossey-Bass. Bersin, J. (2004). The blended learning book: Best practices, proven methodologies, and lessons learned. San Francisco, CA: Pfeiffer. Bråten, I., & Streømsø, H. I. (2006). Epistemological beliefs, interest, and gender as predictors of Internet-based learning activities. Computers in Human Behavior, 22(6), 1027–1042. Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (2004). How does distance education
  • 42. compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74, 379–439. Bullen, M. (1998). Participation and critical thinking in online university distance education. Journal of Distance Education, 13(2), 1–32. Carini, R. M., Kuh, G. D., & Klein, S. P. (2006). Student engagement and student learning: Testing the linkages. Research in Higher Education, 47(1), 1–32. <http:// www.springerlink.com/content/b8m6t51v83732308/fulltext.pdf> . Carnegie Foundation for the Advancement of Teaching (2009). Basic classification. <http://www.carnegiefoundation.org/classifications/index.asp?k ey=791>. Chen, P. D., Ted, I., & Davis, L. K. (2007). Engaging African American students: Compare student engagement and student satisfaction at HBCUs and their self-identified PWIs using National Survey of Student Engagement (NSSE) data. In Paper presented at the 32nd annual conference of the Association for the Study of Higher Education, Louisville, KY. Indiana University Bloomington, Center for Postsecondary Research. <http://cpr.iub.edu/uploads/Engaging%20African%20American %20Students%20ASHE.pdf>. Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3–7. Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the seven principles: Technology as lever. AAHE Bulletin(October), 3–6. Clark, R. E. (2009). Past and future research in online
  • 43. education. In Paper presented at the 2009 annual meeting of the American Education Research Association, San Diego, California. <http://www.cogtech.usc.edu/aera_09.php>. Driscoll, M. (2002). Blended learning: Let’s get beyond the hype. Learning & training innovations newsline. <http://www.ltinewsline.com/ltimagazine/article/ articleDetail.jsp?id=11755>. Duderstadt, J., Atkins, D., & Houweling, D. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Westport, CT: Praeger. Ehrmann, S. C. (2004). Beyond computer literacy: Implications of technology for the content of a college education. Liberal Education, 90(4), 6–13. Ellison, N. (2007). Facebook use on campus: A social capital perspective on social network sites. In Program presented at the sixth annual ECAR symposium, Boca Raton, FL. Gladieux, L. E., & Swail, W. S. (1999). The virtual university and educational opportunity: Issues of equity and access for the next generation. Washington, DC: College Board. Graham, C. R. (2006). Blended learning systems: Definition, current trends, and future directions. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning (pp. 3–21). San Francisco, CA: Preiffer. Green, K. (2007). The 2007 campus computing survey. <http://www.campuscomputing.net/sites/www.campuscomputing .net/files/2007-CCP_0.pdf>. Hawkins, B. L., & Rudy, J. A. (2008). EDUCAUSE core data service fiscal year 2007 summary report. <http://net.educause.edu/ir/library/pdf/PUB8005.pdf>. Hu, S., & Kuh, G. D. (2001). Computing experience and good practices in undergraduate education: Does the degree of campus ‘‘wiredness” matter? Education Policy Analysis
  • 44. Archives, 9(49). <http://epaa.asu.edu/epaa/v9n49.html>. Indiana University Center for Postsecondary Research (2008a). Institutional report 2008. Indiana University Center for Postsecondary Research. <http://nsse.iub.edu/ 2008_Institutional_Report/>. Indiana University Center for Postsecondary Research. (2008b). Promoting engagement for all students: The imperative to look within 2008 results. Bloomington, IN. <http:// nsse.iub.edu/NSSE_2008_Results/docs/withhold/NSSE2008_Res ults_revised_11-14-2008.pdf>. Jenkins, H. (2006). Convergence culture: Where old and new media collide. New York: NYU Press. Kuh, G. D. (2004). The National Survey Of Student Engagement: Conceptual framework and overview of psychometric properties. <http://nsse.iub.edu/2004_annual_report/pdf/ 2004_Conceptual_Framework.pdf>. Kuh, G. D., & Hu, S. (2001). The relationships between computer and information technology use, student learning, and other college experiences. Journal of College Student Development, 42, 217–232. Kuh, G. D., & Vesper, N. (2001). Do computers enhance or detract from student learning? Research in Higher Education, 42, 87–102. LaNasa, S. M., Cabrera, A. F., & Trangsrud, H. (2009). The construct validity of student engagement: A confirmatory factor analysis approach. Research in Higher Education, 50(4), 315–332. LaPadula, M. (2003). A comprehensive look at online student support services for distance learners. The American Journal of Distance Education, 17(2), 119–128.
  • 45. Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies. Washington, DC: US Department of Education. Nelson Laird, T. F., & Kuh, G. D. (2005). Student experiences with information technology and their relationship to other aspects of student engagement. Research in Higher Education, 46(2), 211–233. Nelson Laird, T. F., Shoup, R., & Kuh, G. D. (2005). Measuring deep approaches to learning using the National Survey of Student Engagement. In Paper presented at the annual meeting of the Association for Institutional Research, Chicago, IL. <http://nsse.iub.edu/pdf/conference_presentations/2006/AIR200 6DeepLearningFINAL.pdf>. Ouimet, J. A., Bunnage, J. C., Carini, R. M., Kuh, G. D., & Kennedy, J. (2004). Using focus groups, expert advice, and cognitive interviews to establish the validity of a college student survey. Research in Higher Education, 45(3), 233–250. Pace, C. R. (1980). Measuring the quality of student effort. Current Issues in Higher Education, 2, 10–16. Parsad, B., & Lewis, L. (2008). Distance education at degree- granting Postsecondary Institutions: 2006–2007 (NCES 2009– 044). National Center for Education Statistics, Institute of Education Sciences. Washington, DC: US Department of Education. <http://nces.ed.gov/pubs2009/2009044.pdf>. Pascarella, E. T., & Seifert, T. A. (2008). Validation of the NSSE benchmarks and deep approaches to learning against liberal arts outcomes. Wabash College, Center for Inquiry in the Liberal Arts <http://www.wabashnationalstudy.org/files/2008ashevalidationo
  • 46. fnssebenchmarkslinked.pdf. Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students (volume 2): A third decade of research. San Francisco: Jossey-Bass. Pike, G. R. (2006). The convergent and discriminant validity of NSSE scalelet scores. Journal of College Student Development, 47(5), 550–563. Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage. Reay, J. (2001). Blended learning: A fusion for the future. Knowledge Management Review, 4(3), 6. Robinson, C. C., & Hullinger, H. (2008). New benchmarks in higher education: Student engagement in online learning. Journal of Education for Business, 84(2), 101–108. Rossett, A. (2001). The ASTD e-learning handbook: Best practices, strategies, and case studies for an emerging field. New York, NY: McGraw-Hill. Salaway, G., & Caruso, J. B. (2008). The ECAR study of undergraduate students and information technology. <http://www.educause.edu/ECAR/TheECARStudyofUndergradu ateStu/ 163283>. Sanders, R. (2006). The ‘‘imponderable bloom”: Reconsidering the role of technology in education. Innovate Journal of Online Education, 2(6). <http://innovateonline.info/ index.php?view=article&id=232&action=article>. Sands, P. (2002). Inside outside, upside downside: Strategies for connecting online and face-to-face instruction in hybrid courses. Teaching with Technology Today, 8(6). <http:// www.uwsa.edu/ttt/articles/sands2.htm>. http://www.sloan-
Siegel, S., & Castellan, N. J., Jr. (1988). Nonparametric statistics for the behavioral sciences (2nd ed.). Boston: McGraw-Hill.
Sitzmann, T., Kraiger, K., Stewart, D., & Wisher, R. (2006). The comparative effectiveness of web-based and classroom instruction: A meta-analysis. Personnel Psychology, 59, 623–664.
Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Boston: Pearson.
Terrell, S. R., & Dringus, L. (2000). An investigation of the effect of learning style on student success in an online learning environment. Journal of Educational Technology Systems, 28(3), 231–238.
Thurmond, V., & Wambach, K. (2004). Understanding interactions in distance education: A review of the literature. International Journal of Instructional Technology & Distance Learning, 1, 9–33. <http://www.itdl.org/journal/Jan_04/article02.htm>.
Ward, J., & LaBranche, G. A. (2003). Blended learning: The convergence of e-learning and meetings. Franchising World, 35(4), 22–23.
Ward, M., & Newlands, D. (1998). Use of the Web in undergraduate teaching. Computers and Education, 31(2), 171–184.
Zhou, L., & Zhang, D. (2008). Web 2.0 impact on student learning process. In K. McFerrin et al. (Eds.), Proceedings of Society for Information Technology and Teacher Education international conference (pp. 2880–2882). Chesapeake, VA: AACE.

NOTE: The 4-page paper should have an abstract, introduction, discussion, and conclusion, with no grammatical errors, good sentence formation, APA format, in-text citations, and references related to Operational Excellence areas only.

Below is the topic:

Practical Connection Assignment: The structure and scope of operations

Consider the music business as a supply network. How have music downloads and streaming affected artists' sales? What implications has online music transmission had for traditional music retailers?

Hints:
a) Research the music industry structure before downloads. Draw flow diagrams.
b) Research the current music industry structure. Draw flow diagrams.
c) Compare and contrast. Remember terms such as intermediation, outsourcing, etc.

Provide a cover page, an introduction, and a conclusion. Three pages minimum. Do not forget to use APA.

Discussion 9: Parametric or Non-Parametric Test?

Read the following article:

Chen, P. S. D., Lambert, A. D., & Guidry, K. R. (2010). Engaging online learners: The impact of Web-based learning technology on college student engagement. Computers & Education, 54(4), 1222-1232. Retrieved from https://www-sciencedirect-com.nl.idm.oclc.org/science/article/pii/S0360131509003285?via%3Dihub (ATTACHED)

Chen, Lambert, & Guidry (2010) found they needed to use nonparametric tests in their work.

1. Given a choice between performing a parametric or non-parametric test, which would you choose and why? (Assume you had a parametric and non-parametric version of a dependent variable and that it did not matter which one you chose.)

· Your initial post (approximately 200-250 words) should address each question in the discussion directions.
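For anyone sketching an answer, the contrast the prompt asks about can be made concrete by running a parametric test and its nonparametric counterpart on the same dependent variable. The example below is illustrative only, using simulated data rather than the NSSE data analyzed in the article.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Simulated engagement scores for two groups (e.g., online vs. face-to-face).
    online = rng.normal(loc=3.1, scale=0.8, size=200)
    face_to_face = rng.normal(loc=2.9, scale=0.8, size=200)

    # Parametric: independent-samples t test (assumes roughly normal data).
    t_stat, t_p = stats.ttest_ind(online, face_to_face)

    # Nonparametric counterpart: Mann-Whitney U test (rank-based, fewer assumptions).
    u_stat, u_p = stats.mannwhitneyu(online, face_to_face, alternative="two-sided")

    print(f"t test:       t = {t_stat:.2f}, p = {t_p:.4f}")
    print(f"Mann-Whitney: U = {u_stat:.1f}, p = {u_p:.4f}")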