Working Capital Management
Alternative Working Capital Policies
Cash Management
Inventory and A/R Management
Trade Credit
Bank Loans
Chapter 16
INTRO
INV & A/R MGMT
ALT WC POLICIES
BANK LOANS
CASH MGMT
TRADE CREDIT
16-1
© 2016 Cengage Learning. All Rights Reserved. May not be
scanned, copied, or duplicated, or posted to a publicly
accessible website, in whole or in part.
1
Working Capital Terminology
Working capital: current assets.
Net working capital: current assets minus current liabilities.
Net operating working capital: current assets minus (current
liabilities less notes payable).
Current assets investment policy: deciding the level of each
type of current asset to hold, and how to finance current assets.
Working capital management: controlling cash, inventories,
and A/R, plus short-term liability management.
Selected Ratios for SKI Inc.

Ratio                            SKI       Ind. Avg
Current ratio                    1.75x     2.25x
Debt/Assets                      58.76%    50.00%
Turnover of cash & securities    16.67x    22.22x
Days sales outstanding           45.63     32.00
Inventory turnover               4.82x     7.00x
Fixed assets turnover            11.35x    12.00x
Total assets turnover            2.08x     3.00x
Profit margin                    2.07%     3.50%
Return on equity                 10.45%    21.00%
How does SKI’s current assets investment policy compare with
its industry?
Current assets investment policy is reflected in the current ratio,
turnover of cash and securities, inventory turnover, and days
sales outstanding.
These ratios indicate SKI has large amounts of working capital
relative to its level of sales.
SKI is either very conservative or inefficient.
Is SKI inefficient or conservative?
A conservative (relaxed) policy may be appropriate if it leads to
greater profitability.
However, SKI is not as profitable as the average firm in the
industry.
This suggests the company has excessive current assets.
Working Capital Financing Policies
Moderate: Match the maturity of the assets with the maturity of
the financing.
Aggressive: Use short-term financing to finance permanent
assets.
Conservative: Use permanent capital for permanent assets and
temporary assets.
Moderate Financing Policy
[Figure: dollars vs. years. Temporary current assets are financed with short-term loans; permanent current assets and fixed assets are financed with long-term financing (stock, bonds, spontaneous current liabilities). A lower dashed line would represent a more aggressive policy.]
Conservative Financing Policy
[Figure: dollars vs. years. Permanent current assets and fixed assets are financed entirely with long-term financing (stock, bonds, spontaneous current liabilities); seasonal surpluses are held as marketable securities, and zero short-term debt is used.]
Cash Conversion Cycle
The cash conversion cycle focuses on the length of time
between when a company makes payments to its creditors and
when a company receives payments from its customers.
Cash Conversion Cycle
Minimizing Cash Holdings
Use a lockbox
Insist on wire transfers and debit/credit cards from customers
Synchronize inflows and outflows
Reduce need for “safety stock” of cash
Increase forecast accuracy
Hold marketable securities
Negotiate a line of credit
Cash Budget
Forecasts cash inflows, outflows, and ending cash balances.
Used to plan loans needed or funds available to invest.
Can be daily, weekly, or monthly forecasts.
Monthly for annual planning and daily for actual cash management.
SKI’s Cash Budget for January and February

                  January       February
Collections       $67,651.95    $62,755.40
Purchases          44,603.75     36,472.65
Wages               6,690.56      5,470.90
Rent                2,500.00      2,500.00
Total payments    $53,794.31    $44,443.55
Net cash flows    $13,857.64    $18,311.85
SKI’s Cash Budget

                                January       February
Cash at start if no borrowing   $ 3,000.00    $16,857.64
Net cash flows                   13,857.64     18,311.85
Cumulative cash                 $16,857.64    $35,169.49
Less: Target cash                 1,500.00      1,500.00
Surplus                         $15,357.64    $33,669.49
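The budget arithmetic can be sketched in Python. This is a minimal illustration using only the figures from the slides, with the $1,500 target balance held constant:

```python
# Minimal sketch of SKI's two-month cash budget (figures from the slides).
months = [
    # (name, collections, purchases, wages, rent)
    ("January",  67_651.95, 44_603.75, 6_690.56, 2_500.00),
    ("February", 62_755.40, 36_472.65, 5_470.90, 2_500.00),
]

cumulative, target = 3_000.00, 1_500.00   # starting cash and target balance
for name, collections, purchases, wages, rent in months:
    payments = purchases + wages + rent    # total payments for the month
    net = collections - payments           # net cash flow
    cumulative += net                      # cumulative cash if no borrowing
    surplus = cumulative - target          # surplus above the target balance
    print(f"{name}: net {net:,.2f}, cumulative {cumulative:,.2f}, surplus {surplus:,.2f}")
```

Run as-is, this reproduces the slide's surplus figures of $15,357.64 for January and $33,669.49 for February.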
How could bad debts be worked into the cash budget?
Collections would be reduced by the amount of the bad debt
losses.
For example, if the firm had 3% bad debt losses, collections
would total only 97% of sales.
Lower collections would lead to higher borrowing requirements.
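The adjustment can be expressed in one line of Python; the sales figure below is hypothetical, while the 3% loss rate comes from the slide:

```python
# Sketch: reducing forecasted collections by a bad-debt rate.
sales = 100_000          # hypothetical monthly sales forecast
bad_debt_rate = 0.03     # 3% bad-debt losses, as in the slide
collections = sales * (1 - bad_debt_rate)   # collect only 97% of sales
print(round(collections, 2))   # 97000.0
```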
Analyze SKI’s Forecasted Cash Budget
Cash holdings will exceed the target balance for each month,
except for October and November.
Cash budget indicates the company is holding too much cash.
SKI could improve its EVA by either investing cash in more
productive assets, or by returning cash to its shareholders.
Why might SKI want to maintain a relatively high amount of
cash?
If sales turn out to be considerably less than expected, SKI
could face a cash shortfall.
A company may choose to hold large amounts of cash if it does
not have much faith in its sales forecast, or if it is very
conservative.
The cash may be used, in part, to fund future investments.
Inventory Costs
Types of inventory costs
Carrying costs: storage and handling costs, insurance, property
taxes, depreciation, and obsolescence.
Ordering costs: cost of placing orders, shipping, and handling
costs.
Costs of running short: loss of sales or customer goodwill, and
the disruption of production schedules.
Reducing inventory levels generally reduces carrying costs,
increases ordering costs, and may increase the costs of running
short.
Is SKI holding too much inventory?
SKI’s inventory turnover (4.82x) is considerably lower than the
industry average (7.00x).
The firm is carrying a large amount of inventory per dollar of
sales.
By holding excessive inventory, the firm is increasing its costs,
which reduces its ROE.
Moreover, this additional working capital must be financed, so
EVA is also lowered.
If SKI reduces its inventory without adversely affecting sales,
what effect will this have on the cash position?
Short run: Cash will increase as inventory purchases decline, reducing the financing or target cash balance needed.
Long run: The company is likely to take steps to reduce its cash holdings and increase its EVA.
The “excess” cash can be invested in more productive assets, such as plant and equipment, increasing operating income and therefore EVA.
Alternatively, the “excess” cash can be distributed to shareholders through higher dividends or share repurchases, lowering the cost of capital and increasing EVA.
Do SKI’s customers pay more or less promptly than those of its
competitors?
SKI’s DSO (45.6 days) is well above the industry average (32
days).
SKI’s customers are paying less promptly.
SKI should consider tightening its credit policy in order to
reduce its DSO.
Elements of Credit Policy
Credit Period: How long to pay? Shorter period reduces DSO
and average A/R, but it may discourage sales.
Cash Discounts: Lowers price. Attracts new customers and
reduces DSO.
Credit Standards: Restrictive standards tend to reduce sales,
but reduce bad debt expense. Fewer bad debts reduce DSO.
Collection Policy: How tough? Restrictive policy will reduce
DSO but may damage customer relationships.
Does SKI face any risk if it restricts its credit policy?
Yes, a restrictive credit policy may discourage sales.
Some customers may choose to go elsewhere if they are
pressured to pay their bills sooner.
SKI must balance the benefits of fewer bad debts with the cost
of possible lost sales.
If SKI reduces its DSO without adversely affecting sales, how
would this affect its cash position?
Short run: If customers pay sooner, cash holdings increase, reducing the financing or target cash balance needed.
Long run: Over time, the company would hopefully invest the
cash in more productive assets, or pay it out to shareholders.
Both of these actions would increase EVA.
What is trade credit?
Trade credit is credit furnished by a firm’s suppliers.
Trade credit is often the largest source of short-term credit,
especially for small firms.
Spontaneous, easy to get, but cost can be high.
Terms of Trade Credit
A firm buys $3,000,000 net ($3,030,303 gross) on terms of
1/10, net 40.
The firm can forgo discounts and pay on Day 40 without penalty.
Breaking Down Trade Credit
Payables level, if the firm takes discounts
Payables = $8,219.18(10) = $82,192
Payables level, if the firm takes no discounts
Payables = $8,219.18(40) = $328,767
Credit breakdown
Total trade credit      $328,767
Free trade credit       -  82,192
Costly trade credit     $246,575
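The payables arithmetic above can be sketched in Python, with all figures taken from the slides:

```python
# Sketch: free vs. costly trade credit on terms of 1/10, net 40,
# given $3,000,000 of annual net purchases (slide figures).
net_purchases = 3_000_000
daily = net_purchases / 365                 # net daily purchases ~= $8,219.18

free_credit   = daily * 10                  # payables if discounts are taken
total_credit  = daily * 40                  # payables if discounts are forgone
costly_credit = total_credit - free_credit  # extra credit bought by forgoing discounts
print(round(free_credit), round(total_credit), round(costly_credit))
# -> 82192 328767 246575
```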
Nominal Cost of Trade Credit
The firm loses 0.01($3,030,303) = $30,303 of discounts to
obtain $246,575 in extra trade credit:
rNOM = $30,303/$246,575
= 0.1229 = 12.29%
The $30,303 is paid throughout the year, so the effective cost of
costly trade credit is higher.
Nominal Cost of Trade Credit Formula
Effective Cost of Trade Credit
Periodic rate = 0.01/0.99 = 1.01%
Periods/year = 365/(40 – 10) = 12.1667
Effective cost of trade credit
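Both rates can be sketched in a few lines of Python, using the credit terms and values from the slides:

```python
# Sketch: nominal vs. effective annual cost of forgoing the discount
# on terms of 1/10, net 40 (slide values).
discount = 1.0                         # discount percent
disc_period, credit_period = 10, 40    # days

periodic = discount / (100 - discount)      # 0.01/0.99 ~= 1.01% per period
n = 365 / (credit_period - disc_period)     # ~= 12.1667 periods per year

r_nom = periodic * n                   # nominal annual rate ~= 12.29%
ear = (1 + periodic) ** n - 1          # effective annual rate ~= 13.01%
print(f"rNOM = {r_nom:.2%}, EAR = {ear:.2%}")
```

The effective rate exceeds the nominal rate because the discount is forgone repeatedly through the year, compounding the cost.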
Bank Loans
The firm can borrow $100,000 for 1 year at an 8% nominal rate.
Interest may be set under one of the following scenarios:
Simple annual interest
Installment loan, add-on, 12 months
Simple Annual Interest
Simple interest means no discount or add-on.
Interest = 0.08($100,000) = $8,000
rNOM = EAR = $8,000/$100,000 = 8.0%
For a 1-year simple interest loan, rNOM = EAR.
Add-on Interest
Interest = 0.08($100,000) = $8,000
Face amount = $100,000 + $8,000 = $108,000
Monthly payment = $108,000/12 = $9,000
Avg. loan outstanding = $100,000/2 = $50,000
Approximate cost = $8,000/$50,000 = 16.0%
To find the exact effective rate, recognize that the firm receives
$100,000 and must make monthly payments of $9,000 (like an
annuity).
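The "exact effective rate" step can be sketched numerically. A minimal Python version solves for the monthly rate that makes the present value of twelve $9,000 payments equal the $100,000 received, using bisection rather than a financial calculator:

```python
# Sketch: effective rate on a 12-month add-on loan (borrow $100,000,
# repay $9,000/month), via bisection on the monthly discount rate.
principal, pmt, n = 100_000.0, 9_000.0, 12

def pv(rate):
    # Present value of an n-payment ordinary annuity at the given rate.
    return pmt * (1 - (1 + rate) ** -n) / rate

lo, hi = 1e-9, 1.0
for _ in range(100):                  # bisect: pv() falls as the rate rises
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if pv(mid) > principal else (lo, mid)

monthly = (lo + hi) / 2               # ~= 0.012043 (1.2043% per month)
print(f"rNOM = {12 * monthly:.2%}, EAR = {(1 + monthly) ** 12 - 1:.2%}")
```

This matches the slide's results of 14.45% nominal and 15.45% effective.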
Add-on Interest
From the calculator output below, we have:
rNOM = 12 (0.012043)
= 0.1445 = 14.45%
EAR = (1.012043)12 – 1 = 15.45%
Calculator inputs: N = 12, PV = 100, PMT = -9, FV = 0
Calculator output: I/YR = 1.2043
Cash conversion cycle (slide 16-10):

CCC = Inventory conversion period + Average collection period - Payables deferral period
    = (Days per year / Inventory turnover) + Days sales outstanding - Payables deferral period
    = 365/4.82 + 46 - 30
    = 76 + 46 - 30
    = 92 days

Net daily purchases (slide 16-27):

Net daily purchases = $3,000,000/365 = $8,219.18

Nominal cost of trade credit formula (slide 16-29):

rNOM = [Discount % / (100 - Discount %)] x [365 / (Days credit outstanding - Discount period)]
     = (1/99) x (365/30)
     = 0.1229 = 12.29%

Effective cost of trade credit (slide 16-30):

EAR = (1 + Periodic rate)^N - 1
    = (1.0101)^12.1667 - 1
    = 13.01%
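The cash conversion cycle computation can be sketched in Python, with the inputs taken from the slides:

```python
# Sketch: SKI's cash conversion cycle from the slide inputs.
inventory_turnover = 4.82    # times per year
dso = 46                     # average collection period, days
payables_deferral = 30       # payables deferral period, days

inventory_conversion = 365 / inventory_turnover     # ~= 76 days
ccc = inventory_conversion + dso - payables_deferral
print(f"CCC ~= {ccc:.0f} days")   # CCC ~= 92 days
```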
Student Satisfaction with Online Learning: Is it a
Psychological Contract?
Charles Dziuban, Patsy Moskal, Jessica Thompson, Lauren
Kramer, Genevieve DeCantis and Andrea Hermsdorfer
Research Initiative for Teaching Effectiveness
University of Central Florida
Abstract
The authors explore the possible relationship between student
satisfaction with online learning and the
theory of psychological contracts. The study incorporates latent
trait models using the image analysis
procedure and computation of Anderson and Rubin factors
scores with contrasts for students who are
satisfied, ambivalent, or dissatisfied with their online learning
experiences. The findings identify three
underlying satisfaction components: engaged learning, agency,
and assessment. The factor score
comparisons indicate that students in the general satisfaction
categories characterize important differences
in engaged learning and agency, but not assessment. These
results lead the authors to hypothesize that
predetermined, but unspecified expectations (i.e., psychological
contracts) for online courses by both
students and faculty members are important advance organizers
for clarifying student satisfaction.
Introduction
From its inception, online learning has been confronted by
concerns about quality from the
established educational community and society at large
(Carnaghan & Webb, 2007; Akdemir & Koszalka,
2008). Often, in addressing these concerns students’ perceptions
of their course experience becomes a
surrogate for learning engagement in the context of satisfaction
(Swan, 2001; Arbaugh, 2001; Richardson
& Swan, 2003; Bolliger, 2004). Because contemporary students
view information as a commodity which
can be traded openly among a community of learners,
collaboration becomes fundamental to a variety of
educational outcomes (Shirky, 2010; Dziuban et al., 2013).
Online Learning Vol. 19 Issue 2 (2015) 122
Modern technologies are contributing to the dissolution of
traditional classroom boundaries
(Shirky, 2008). Students connect with their instructors and each
other through modalities of almost every
variety, greatly expanding avenues of communication. Norberg,
Dziuban and Moskal’s (2011)
development of a time-based blended learning model, for
instance, modifies the instructor’s role (Liu &
Hwang, 2010) in learning environments based on students’
synchronous and asynchronous learning
preferences. The need for new and more authentic assessment
techniques in addition to challenges to
traditional educational structures (e.g. semester length time
boundaries) raises issues about what
moderates students’ academic expectations and satisfaction.
Studies suggest that online students wish to decrease their
ambivalence toward formal education
by gaining some sense of a carefully delineated path to success
(Dziuban & Dziuban, 1998; Dziuban,
Moskal & Dziuban, 2000; Long, 2011; Young & Dziuban,
2000). Students prefer active, rather than
passive learning environments, and, because they participate in
a highly interactive world, they expect the
same in their classes (Dziuban et al., 2003). Today’s learners
require more outlets for creativity and
collaboration which online learning environments can
accommodate through a variety of instructional
models that are provided anytime, anyplace.
Researchers should not be surprised that identifying the
defining elements for satisfaction has
become much more dynamic and complex. The construct has
multiple facets that tend to be stochastic as
a particular course progresses. In this study, we attempt to
clarify the underlying (latent) elements of
student satisfaction in the context of overall course evaluation
for students who respond positively to
online experiences on end-of-course evaluation protocols.
Feldman (1993) describes the assessment
challenges we encounter as distributions of considerations when
he argues that responses to survey
questions provide only an estimate of the central tendency of an
individual’s attitude or belief about a
subject or object. Craig and Martinez (2005) summarize the
issue: “in retrospect, it seems rather
simplistic to think of attitudes as always being unidimensional.
After all, who hasn’t experienced mixed
feelings about people, places and things that we have
encountered or visited in our lives?” (p. 1)
Recent Studies on Student Satisfaction with Online Courses
Multiple approaches define and assess student satisfaction.
Rubin, Fernandes & Avgerinou
(2013) extended research on the Community of Inquiry
(Garrison, Anderson & Archer, 2000) which
defines social, cognitive, and teaching presence as being
essential to the student learning experience and,
thus, student satisfaction. They determined that learning
management system (LMS) features greatly
impact perceptions of community according to the inquiry
framework. In a related study, Mahmood,
Mahmood and Malik (2012) argued that teaching presence plays
the most critical role in how students
evaluate online learning.
The interaction construct plays an important role in both face-
to-face and online learning
modalities (Kuo, Walker, Belland & Schroder, 2013). In fact,
many studies have found that both quantity
and quality of student interactions are highly correlated with
student satisfaction in almost any learning
environment. However, investigators have noted that
demographic and cultural considerations also impact
the design of appropriate interaction techniques in online
learning (González-Gómez, Guardiola, Martín
Rodríguez & Montaro Alonso, 2012).
Ke and Kwak (2013) identified five elements of student
satisfaction: learner relevance, active
learning, authentic learning, learner autonomy, and technology
competence. Kuo et al. (2013) determined
that learner-instructor interaction and learner-content
interaction combined with technology efficacy are
valid indicators of students’ positive perceptions. However,
Battalio (2007), using a criterion approach,
argued that a positive course rating requires effective learner-
instructor interaction.
Keengwe, Diteeyont and Lawson-Body (2012) argued that
students’ expectations influence the
instructor’s design of effective technology tools in online
courses and are the key to understanding the
satisfaction construct. The authors concluded that satisfaction
was most impacted by learning
convenience combined with the effectiveness of e-learning
tools. Dziuban, Moskal, Brophy-Ellison and
Shea (2007) found six key elements that contribute to students’
satisfaction: an enriched learning
environment, well-defined rules of engagements, instructor
commitment, reduced ambiguity, an engaging
environment, and reduced ambivalence about the value of the
course.
Because colleges and universities have to be much more
responsive to their student client base
(Long, 2011; Bordelon, 2012; Allen & Seaman, 2013),
ambivalence becomes particularly important. This
implies satisfaction is an underlying indicator of success in
various learning environments, especially
online modalities. Satisfied students appear to be engaged,
motivated and responsive; contribute to an
effective learning climate; and achieve at higher levels.
Dissatisfied or ambivalent students contribute to
environments where instructors appear to have much more
difficulty facilitating effective learning
situations. Faculty members in such circumstances have trouble
relating to their students and may
incorrectly assume that such difficulties are related primarily to
student dissatisfaction with online
learning (Dziuban et al., 2007).
A precise configuration of student satisfaction with online
learning is proving to be elusive
because it might be context dependent (e.g., college, discipline,
course level, institution, and, of course,
instructor). Bowker and Star (1999) use the term “boundary
object” to suggest that these items or ideas
adapt to specific needs and constraints while maintaining a
common identity. While bringing a
community of practice together for communication and inquiry
purposes, they are generally weak in
the large cohort. According to these researchers, however, the
object (student satisfaction, in this case) is
much more well-defined within individual constituencies. These
definitional issues appear to reflect what
Watts (2011) calls confirmation bias—that is, accepting
information that confirms our existing beliefs
much more readily than information that does not. To express
their degree of satisfaction, students react
only to things that they expect but that are never expressly stated
(i.e., their predetermined psychological
contract) or to what they have already assumed about the
course. However, should dissonance with these
expectations develop, students may encounter ambivalence
characterized by simultaneous positive and
negative feelings. These are the mixed emotions described by
Weigert (1991) and Long (2011).
Factor Studies of Student Satisfaction with Online Learning
A small number of studies conducted by investigators seeking to
identify the dimensionality of
student satisfaction with online learning have emerged in the
past few years. This work has been a natural
extension of inquiry into student satisfaction in higher
education (Abrami & d’Apollonia, 1991; Feldman,
1976; Feldman, 1993; Greenwald & Gilmore, 1997; Kim,
Damewood & Hodge, 2000; Marsh & Roche,
1997; McKeachie, 1997). While prior studies have focused
primarily on face-to-face teaching
environments, online learning has provided a new dynamic and
has re-energized interest in the topic.
Arbaugh (2007) adopted factoring methods to validate the
Community of Inquiry framework (Garrison et
al., 2000) incorporating social, cognitive, and teaching
presences. He retrieved these primary constructs
and demonstrated that they exhibited excellent reliability. His
work extended the original Community of
Inquiry framework to a fourth dimension: course design and
organization. Stewart, Hong, and Strudler
(2004), using principal components analysis, found a fairly
complex underlying dimensionality that
defines the pattern of student satisfaction in online learning: the
evaluative construct for student involved
issues such as web page appearance, hyperlinks and navigation
facility, technical constraints, online
applications, instructional techniques and expectations, content
delivery, and the interaction environment.
Bangert (2006) found four underlying elements related to the
evaluation of online and blended courses:
interaction, active learning, time on task, and student
cooperation. In a later study, he validated his
previous findings using both exploratory and confirmatory
factor methods (Bangert, 2008).
In a somewhat different approach to the identification of
underlying dimensionality, Wang,
Dziuban, Cook, and Moskal (2009) used classification and
regression trees to predict student assessment
of online courses and identified a series of validated if-then
decision rules for predicting students’
perceptions of excellent teaching based on three constructs:
facilitation of learning, ability of the
instructor to communicate information and concepts, and the
instructor’s respect and concern for students.
Dziuban and Moskal (2011) conducted a study of the factor
invariance in student satisfaction
across online, blended, and face-to-face courses. Using
Guttman’s (1954) image analysis, they found a
single general component that remained constant across all
modalities. The authors concluded that
students do not use the modality of a course to differentiate
elements of excellent instruction and course
satisfaction.
In a later study, Dziuban, Moskal, Kramer and Thompson
(2013) used the alpha factoring
procedure (Kaiser & Caffery, 1965) to identify the underlying
dimensionality of satisfaction under
varying conditions of student ambivalence toward their online
courses. Using overall satisfaction with the
course, they classified students into five categories: negative
non-ambivalent, negative ambivalent,
ambivalent, positive ambivalent and positive non-ambivalent,
corresponding with the 5-item Likert scale.
By factoring the remaining items of the instrument stratified by
those categories, they found the highest
dimensionality for students in the ambivalent categories and the
lowest dimensionality in the non-
ambivalent classifications. The factors in the extreme categories
(either positive or negative) were
identical as were the factors in the positive ambivalent and
negative ambivalent categories. The authors
hypothesized that the students who appear to be least engaged
in their courses (i.e., ambivalent) may be
the most reflective and thoughtful about evaluating their
educational experience.
Psychological Contracts as a Basis for Understanding
Satisfaction
By definition, factor analysis studies imply latent dimensions:
constructs that cannot be directly
observed. Therefore, the underlying components identified in
these kinds of studies relate closely to
Argyris’s (1960) notion of a psychological contract. These
contracts are formed by implicit understanding
and are not bound by written or legal agreements of two parties
within a reciprocal relationship. They
consist of perceived obligations and expectations, and thus, are
subjective and vary from person to person
(Bordia, Hobman, Restubog, & Bordia, 2010). When broken, or
breached, (due to perceived unfairness,
inequality, or mistrust), satisfaction and performance decline
and workforce turnover increases,
consequently impacting attitudes and behaviors (Bordia et al.,
2010).
All workplace psychological contracts contain six features:
voluntary choice, mutual agreement,
incompleteness, presence of numerous contract makers, plan for
managing unsuccessful contract losses,
and a relational model between employer and employee
(Rousseau, 1990). These six features are contained within three contract types outlined by Rousseau (1990): relational, transactional, and balanced.
Relational agreements are based on loyalty and stability and,
thus, foster satisfaction (Raja, Johns &
Ntalianis, 2004). Transactional agreements include fewer duties,
are usually temporary or short in
duration, and usually result in behaviors that align with the contributions for which one is rewarded. These contributions are economic in nature and viewed as less significant. The balanced agreement is a hybrid contract that is generally open-ended and includes mutual concern between both parties, along with clear expectations (Raja et al., 2004).
When analyzing and assessing psychological contracts, the three forms of measurement are content-oriented, feature-oriented, and evaluation-oriented assessment. The content-oriented assessment examines the obligations of the agreement; the feature-oriented assessment compares one agreement to another based
upon attributes of the contracts; and the evaluation-oriented
assessment assesses the degree of fulfillment
and the amount of change that results (Rousseau & Tijoriwala,
1998).
Psychological Contracts in Education
Although Argyris (1960) developed the theory of a
psychological contract for the workplace, the
idea has important implications for educational environments.
Wade-Benzoni, Rousseau, and Li (2006),
for instance, contend that students view psychological contracts
as a form of mutual exchange in the
education process. The interactions between student and
instructor are crucial and telling about ongoing
perceptions of obligations (Wade-Benzoni et al., 2006). Often
there is little to no explicit communication
about these arrangements because they are informal and
temporary. The power in the relationship within
these contracts is predominately asymmetric, favoring faculty
members who hold expectations about
student performance and control resources such as grades,
letters of recommendation, advice on careers,
and, in some cases, dissertation research approval (Wade-
Benzoni et al., 2006).
Prior to viewing a syllabus, students begin to form expectations
as they assess course offerings
for academic development, decision-making input, challenges,
feedback, and support (Spies et al., 2010).
Online Learning Vol. 19 Issue 2 (2015) 125
According to Spies et al. (2010), students pay close attention to
the following categories: faculty,
futuristic factors, student development, course and curricular
content, learning opportunities,
involvement, and facilities. These agreements tend to change
and become more elaborate as the course
progresses.
Within a particular class, both students and faculty form a large
number of contracts that present
satisfaction challenges if, in the participants’ judgment, their
implicit expectations are not met. This
suggests that student satisfaction with online learning is, as
Lakoff (1987) termed, a function of an
idealized cognitive model—a construct fabricated to bring
clarity and structure to a situation. Kahneman
(2011) describes this thinking as “what you see is all there is.”
Because of the complex interaction of
these many constructs, however, student satisfaction with online
learning appears to be an example of
“there is much more than you can see directly.”
The Survey and Sample
The Research Initiative for Teaching Effectiveness (RITE) at
the University of Central Florida
(UCF) has been surveying UCF’s online students as part of an
ongoing impact evaluation since 1996,
when the university began offering Web courses. The
longitudinal nature of the university’s examination
of student attitudes has allowed for refinement and validation
from the original survey. Ongoing
evaluation allows researchers to accommodate rapid change in
the online course environments and
provide baseline data on items that may contribute to student
satisfaction with these courses (Roberts,
2007).
Response rates for online surveys are always a concern (Sax,
Gilmartin & Bryant, 2003). The art
of a student survey is the development of an instrument that
addresses the target, yet requires minimal time to complete. The current RITE instrument focuses
specifically on the dynamics of student
satisfaction with online learning and is presented in a 5-point
Likert scale format, ranging from strongly
agree to strongly disagree. Items related to the learning
management system or to technology itself have
been excluded in an effort to minimize survey fatigue. Survey
items were validated by examining their
psychometric properties from previous surveys in terms of
central tendency, variability, skewness and
kurtosis, looking for anomalies, and for their relevance to the
current state of the online initiative.
Once the original item pool was selected the survey was
developed using Google Forms
(https://drive.google.com). Students were sent an announcement
through the UCF Knights student email
platform. Student directions included the purpose of the study,
their rights as survey participants, and
contact information for both the survey administrators and the
University’s Institutional Review Board.
Students were advised that the survey was open only to
undergraduates 18 years of age and older, and
were reminded that the survey was voluntary. Students were
free to omit any question they were not
comfortable answering or stop at any time with no penalty.
Students received no rewards for participation
and there were no risks for non-participation. All student data
remained anonymous when aggregated.
Overall 1,217 surveys were returned.
An examination of student responses indicated that 84% of
students represented the millennial
generation, 72% were female, and 76% were unmarried. Almost
half of the responding students worked at
least 10 hours per week and data reflected the ethnic
demography of the university, with 70% of students
having a grade point average of at least 3.0. Respondents were
experienced with online learning—97% of
students indicated taking at least one online course, with a
median of five online courses. Students were
predominately upperclassmen, with 83% of respondents being
juniors or seniors. The university has
targeted the majority of its online offerings to the upper
undergraduate level, thereby allowing for the
transition of freshmen and sophomores to university life prior to
extensive online learning. Our
respondent sample of predominately upper undergraduates
reflects this philosophy. Students who
indicated they had not taken an online course were excluded
from analyses, reducing the usable sample to
1,197 responses.
Methodology
Reliability and Domain Sampling
Prior to any analysis of the item responses collected in this
sample, the psychometric quality of
the information yielded by the instrument was assessed with
validated techniques. Next, coefficient alpha
(Cronbach, 1951) was used to determine the survey reliability.
The psychometric sampling issue of how
well the items comprise a reasonable sample from the domain of
interest is an important aspect of
analyzing constructs such as student satisfaction. Addressing
this issue, Guttman (1953) developed a
theoretical solution illustrating that the domain sampling
properties of items improve when the inverse of
the correlation matrix approaches a diagonal. Kaiser and Rice
(1974) used this property to develop their
measure of sample adequacy. The index has an upper bound of one, with Kaiser offering decision rules for interpreting the value of the measure of sampling adequacy (MSA). If the value of the index is in
the range .80 to .99, the investigator has evidence of an
excellent domain sample. Values in the .70s
signal an acceptable result, and those in the .60s indicate data
that are unacceptable. MSA has been used
for data assessment prior to the application of any factoring
procedures. Computation of the MSA index
gives the investigators a benchmark for the construct validity of
the items. This procedure was
recommended by Dziuban and Shirkey (1974) prior to any latent
dimension analysis. An individual MSA
for each variable gives the investigators an indication of
whether or not a particular item belongs in the
particular domain.
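The Kaiser–Rice index described above can be illustrated in a few lines. The following is a minimal sketch (hypothetical function name, not the authors' code) that computes the overall and per-item MSA from a correlation matrix, using the standard relationship between the inverse correlation matrix and the partial correlations:

```python
import numpy as np

def kmo_msa(R):
    """Kaiser-Rice measure of sampling adequacy (illustrative sketch).

    R: p x p correlation matrix.
    Returns (overall MSA, per-item MSA array).
    """
    Rinv = np.linalg.inv(R)
    # Anti-image (partial) correlations: -Rinv_ij / sqrt(Rinv_ii * Rinv_jj)
    d = np.sqrt(np.outer(np.diag(Rinv), np.diag(Rinv)))
    P = -Rinv / d
    np.fill_diagonal(P, 0.0)
    Roff = R - np.diag(np.diag(R))      # correlations with diagonal zeroed
    r2 = (Roff ** 2).sum(axis=0)        # sums of squared correlations
    p2 = (P ** 2).sum(axis=0)           # sums of squared partial correlations
    per_item = r2 / (r2 + p2)           # individual MSA for each variable
    overall = r2.sum() / (r2.sum() + p2.sum())
    return overall, per_item
```

When the variables share strong common variance, the partial correlations shrink relative to the zero-order correlations and the index approaches one, which is the domain-sampling intuition behind Guttman's result.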
Dimensionality of Student Responses
The investigators sought to determine whether multiple
dimensions underlie students’ satisfaction
with their online learning environments. This is normally
accomplished by the application of some version
of the generalized factor analysis procedure. In this study the
data were analyzed with Guttman’s (1954)
image analysis. The procedure assumes that the data sets divide
into two components. The first
component is the portion of data that can be predicted from the
remaining variables in the set (the image).
The second component is the data that is not predictable from
the remaining variables (the anti-image).
The method is operationalized by predicting a standardized
score on a variable for each individual from
the remaining variables in the set. The image procedure derives
underlying components found in the
covariance matrix (the image matrix) of the standardized
variables.
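The image decomposition described above has a closed form. The following is a minimal numerical sketch (an illustration of Guttman's formulas as we understand them, not the authors' implementation), where S² is the diagonal matrix of reciprocals of the diagonal of R⁻¹:

```python
import numpy as np

def image_covariance(R):
    """Image and anti-image covariance matrices for correlation matrix R,
    following Guttman (1954):
        S^2 = [diag(R^-1)]^-1
        anti-image covariance Q = S^2 R^-1 S^2
        image covariance      G = R + Q - 2 S^2
    """
    Rinv = np.linalg.inv(R)
    S2 = np.diag(1.0 / np.diag(Rinv))
    Q = S2 @ Rinv @ S2        # covariance of the unpredictable parts
    G = R + Q - 2 * S2        # covariance of the predicted (image) parts
    return G, Q
```

A useful check on the sketch: the diagonal of G equals the squared multiple correlations of each variable with the rest, which is exactly the "predictable portion" the image procedure is meant to isolate.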
The number of factors (components) retained in the final
solution was determined by a procedure
originally proposed by Dziuban and Shirkey (1993) and later
validated by Hill (2011). The method
involves the initial assessment of the dataset with the MSA
followed by subsequent MSA computation on
the matrix of partial correlations once the impact of the first, second, third, etc. factors has been removed from the system. Once a value in the .60s has
been reached, the majority of information
from the system has been attained. The initial pattern matrix
was transformed (rotated) according to the
promax (Hendrickson & White, 1964) procedure. Pattern coefficients with absolute values larger than .40 were used for interpretation purposes (Stevens, 2002).
Once the final dimensionality of the data set was determined,
factor scores for each subject in the
sample were derived using the Anderson and Rubin (1956)
method. These scores have a mean of zero and
a standard deviation of one and are uncorrelated with each
other. They also have a reasonably good
relationship to the estimated factor validity. The final step in
the handling of the data involved deriving a
linear transformation of the standardized factor scores with T =
(Z x 10) + 50 giving the scores a mean of
50 and standard deviation of 10 for ease of interpretation.
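The final rescaling is a one-line computation; a minimal sketch (hypothetical helper name, assuming standardized Anderson-Rubin scores as input):

```python
import numpy as np

def t_scores(z):
    """Apply the linear transformation T = (Z x 10) + 50 to standardized
    factor scores, yielding scores with mean 50 and standard deviation 10."""
    return np.asarray(z, dtype=float) * 10 + 50
```

Because the Anderson-Rubin scores already have mean zero and unit standard deviation, this transformation only shifts and stretches the scale for ease of interpretation.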
The scores for each factor were used as dependent measures for
a rescaled comparison variable
related to overall online course satisfaction. Because the
number of dissatisfied students was small, the
comparison variable was declassified into satisfied, ambivalent,
and dissatisfied and used as a factor in
the hypothesis test. The investigators were concerned with
trends and effect size differences among the
dissatisfied (4%), ambivalent (5%), and satisfied (91%) groups, followed by Bonferroni post hoc comparisons (Hochberg, 1988).
Results
The promax transformed pattern matrix may be found in Table
1. The overall MSA for the
variable set was .94 with an overall alpha reliability coefficient
of .96. These values indicate excellent
domain sampling and reliability. The individual MSAs indicate
each item belongs to the family
psychometrically. Upon extraction of three dimensions from the
system using the Dziuban-Shirkey
procedures, the MSA on the residual correlation matrix was .58
indicating that what remained in the
system was essentially noise.
Table 1
Pattern Matrix for the Promax Transformed Image Analysis
(loadings listed as: Engaged Learning, Agency, Assessment; MSA)

Generally, I am more engaged in my online courses (.84, .04, -.07; MSA .94)
I have more opportunities to reflect on what I have learned in online courses (.79, -.05, .04; MSA .94)
Online learning helps me understand course material (.76, .03, .05; MSA .95)
There are more opportunities to collaborate with other students in an online course (.67, -.14, -.03; MSA .93)
My online experience has increased my opportunity to access and use information (.66, .11, .06; MSA .95)
I am more likely to ask questions in an online course (.65, -.11, .01; MSA .94)
Generally, I understand course requirements better in an online course (.64, -.09, .19; MSA .96)
Because of online courses, I am more likely to get a degree (.56, .09, -.03; MSA .94)
I can manage my own learning better in online courses (.54, .18, .17; MSA .95)
Take more online courses? (.47, .22, .04; MSA .96)
I am motivated to succeed (-.12, .56, -.03; MSA .81)
I have strong time management skills (.05, .53, -.07; MSA .85)
I am a multitasker (-.05, .57, .05; MSA .87)
Assessment of my academic progress is more accurate in online courses (-.19, -.04, .56; MSA .92)
I can more easily monitor my academic progress in online courses (.14, .11, .51; MSA .92)
Response time from teachers and assistants is quicker in online courses (.24, -.12, .43; MSA .94)

Overall MSA = .94; Residual MSA = .58; Average Item Correlation = .70; α = .96
From Table 1, the reader may observe that the first factor
appears very general, covering a
number of issues associated with online courses ranging from
engagement through willingness to take
another online course. However, upon closer examination, it is
clear what appears to be very general is
quite specific in relation to what students evaluate in online
courses. These elements include students’
abilities to engage, reflect, understand material, collaborate,
find information, question, understand course
requirements, manage their own learning, and increase
opportunities for degree completion. This finding
suggests students simultaneously evaluate multiple aspects of
online courses to make decisions about
their class experience. Furthermore, students may evaluate each
element separately, especially when they
are unsure of their satisfaction levels. We name this factor
engaged learning (74% factor variance) and in
many respects, it conforms to Tolstoy’s (1878/2004) opening
argument and Diamond’s (1999) contention
that many elements must be accommodated if conditions are to
be resolved satisfactorily. Conversely, any
one or more of these elements might cause students to be less
than satisfied with their educational
experience.
The second factor (17% factor variance) in the pattern matrix
involves motivation, time
management skills, and multitasking ability. This dimension
suggests that students’ sense of agency—that
is, students’ ability to initiate and control their own actions in
the learning environment—plays a role in
their satisfaction with their online learning experience. Students
with a strong sense of agency assume
responsibility for their learning and bring a sense of
empowerment to their classes. Since the majority of
students in this study indicated higher levels of satisfaction
with online learning, we might reasonably
assume they bring a higher sense of agency as well. Agency,
however, may not be specifically related to
course modality.
The final factor (9% factor variance) depicts the manner in
which the assessment process evolves
in the online environment. Satisfied students are characterized
by an ability to assess and monitor their
progress, and indicate that a timely response by the instructor
plays an important role in their satisfaction.
Therefore, we find online students incorporate three dimensions
into their evaluation process of online
learning experiences: 1) engaged learning with various course
elements, 2) a sense of agency, and 3) an
efficient assessment of academic progress.
The factor correlation matrix in Table 2 indicates that these
student dimensions are highly and
positively related in a generally satisfied population. This
suggests that engaged learning, agency, and
assessment factors form a highly interrelated lexicon for student
satisfaction, with engaged learning most
highly related to agency (r =.86) and agency most highly related
to assessment (r =.77).
Table 2
Factor Correlation Matrix

              Engaged Learning   Agency
Agency             .86
Assessment         .59             .77
The average correlation among the factors in Table 2 is .74 when computed by the method developed by Kaiser (1958). Table 3 contains the means,
standard deviations, and significance levels
for the three sets of factor scores for the declassified overall
satisfaction variable. In addition, the table
contains the pairwise combinations that proved significant on
the Bonferroni comparison and the
associated effect size calculated by the method outlined by
Hedges and Olkin (1985). The factor scores
for engaged learning and agency led to rejection of the null hypothesis; however, assessment did not. For
engaged learning, dissatisfied versus ambivalent ratings
produced an effect size of .53, dissatisfied versus
satisfied ratings yielded values of 2.01, and ambivalent versus
satisfied ratings equaled 1.43. Bonferroni
comparisons for the agency factors showed two significant
differences with dissatisfied versus satisfied
ratings producing an effect size value of 1.03, while ambivalent
versus satisfied ratings yielded .77. Each of the above effect sizes is, by most standards, considered substantial.
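The effect size used here is Hedges' g, the standardized mean difference with a small-sample bias correction (Hedges & Olkin, 1985). A sketch under the usual pooled-variance formula (hypothetical helper, not the authors' code):

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Hedges' g: standardized mean difference between two groups with
    the small-sample bias correction (Hedges & Olkin, 1985)."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                  # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample correction factor
    return d * j
```

Applied to the engaged learning means and standard deviations for the satisfied and dissatisfied groups in Table 3, this formula yields a value close to the reported 2.01.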
Table 3
Factor Score Difference by Overall Satisfaction

Factor Scores (mean, S.D.)
                  Dissatisfied (D)   Ambivalent (A)   Satisfied (S)
                  (n = 46)           (n = 56)         (n = 1016)      Sig.
Engaged Learning  33.63, 10.86       38.92, 8.92      51.59, 8.86     .00
Agency            41.46, 11.27       43.74, 13.84     50.98, 9.15     .00
Assessment        48.93, 9.24        49.21, 8.56      50.13, 10.19    .63

Significant Bonferroni Pairwise Differences
Categories                    Engaged Learning   Agency   Assessment
Dissatisfied vs. Ambivalent   .00                ns       ns
Dissatisfied vs. Satisfied    .00                .00      ns
Ambivalent vs. Satisfied      .00                .00      ns

Effect Sizes (Hedges' g)
Categories                    Engaged Learning   Agency   Assessment
Dissatisfied vs. Ambivalent   0.53               0.18     0.03
Dissatisfied vs. Satisfied    2.01               1.03     0.12
Ambivalent vs. Satisfied      1.43               0.77     0.09
Limitations
There are a number of limitations in this study. Initially it
should be noted that the derived factors resulted from a one-time administration of the survey
instrument during the semester. Therefore, the
stability of the satisfaction factors over an entire semester has
not been validated. Second, the study was
conducted on individual item responses rather than scales. Although this has precedent in the literature, single
Although this has precedent in literature, single
items with presumed low reliability can be problematic in factor
studies such as this because of their
instability. Third, many aspects of exploratory factor analysis
involve arbitrary decisions, for instance,
number of factors to extract, values for salience in the pattern
matrix, rotational strategy, and naming the
final dimensions. Fourth, online survey research using mass e-
mailings to students has the possibility of
introducing response bias into the data. This makes replication
of studies much more difficult. Finally,
although the investigators collected extensive demographic data
on the responding students, there was no
possibility for controlling for many of the student
characteristics that might have influenced the results.
This raises a more general limitation resulting from the ease
with which survey instruments can be
distributed in the electronic environment. This causes many
students to suffer “survey fatigue” that can
adversely impact response rates.
Conclusion
Student Satisfaction in the Online Environment
From its inception, the Sloan-Consortium (now the Online
Learning Consortium) established
student satisfaction with online learning as one of its founding
metaphoric pillars. In doing so, the
organization demonstrated a commitment to the student voice as
a component for understanding effective
teaching and learning. This commitment by the Online Learning
Consortium resulted in two decades of
research devoted to understanding how students define
excellence in their learning space. Satisfaction
with online learning is becoming increasingly important in
higher education for a number of reasons. The
most important is the rapid adoption of this teaching and
learning modality in colleges, universities, and
community colleges across the country. However, another
mediating issue is the growing sense of student
agency in the educational process. Students are able and do
express their opinions about their educational
experiences in formats ranging from end of course evaluation
protocols to social networks of all varieties, making their voices more important than ever before.
Factor Studies
Online learning has redefined student satisfaction research. It
has caused the education research
community to reexamine traditionally held assumptions that
learning primarily takes place within a
metaphoric container called a “course.” In reviewing the studies
that address student satisfaction, from a
factor analytic perspective, one point becomes obvious: this is a
complex system with very little
agreement. Even the most recent factor analytic studies have
done little to resolve the lack of consensus
about the dimensions that underlie satisfaction with online
learning. This appears to be the factor
invariance problem in full bloom, where differing contexts
mediate how students relate to their learning
experiences because a common prototype for online courses has
been elusive at best. There exists the
possibility that each course incorporates several unique
characteristics that make it difficult to identify
common factors that are robust across context. Although the
results of these studies differ in how many
and what dimensions constitute satisfaction, their unifying
objective was the same: identify the
underlying theoretical perspective of student perception of
online learning. In addition, all of them
subscribed to latent trait theory, recognizing that the important
dimensions that students differentiate
when they express their opinions about online learning are
formed by the combination of the original
items that cannot be directly observed—that which underlies
student satisfaction.
Psychological Contracts as a Lens for Student Satisfaction
Very often, theories developed in one discipline inform work in another area. We contend that this is the case with psychological contracts and the factors that
define student satisfaction with online
learning. The theory of psychological contracts explains
employee satisfaction through the perspectives of
expectations for the workplace and employee interactions.
These contracts may be common across
employees, for instance safety on the job, or they may be unique
to individual employees such as
promotion. The elements of the contract are implicit in that they
are never formally stated, but they are
assumed by the individual holding them to be mutually agreed
upon between the employee and the
employer. Of course, this may or may not be so. Most
importantly, a violation of the psychological
contract, either real or perceived, by either party, leads to
workplace dissatisfaction.
In factor analytic studies, items about student satisfaction with
online learning correspond to the
formation of a psychological contract. The survey responses are
reconfigured into a smaller number of
latent (non-observable) dimensions that are never really
articulated by the students, but are, nonetheless,
fully expected to be satisfied. Of course, instructors have
contracts for students as well. Studies such as
this identify the student psychological contract after the fact, not prior to the class; however, nothing prevents both from happening, or a comparison of the two.
The prior contract might be identified
before the class and the satisfaction factors after the class.
Table 4 depicts the relationship between online student
satisfaction factors and the features of a
psychological contract specified in the literature. Each factor
translates into how it might be expressed in
the student voice followed by a corresponding contract feature
and an assessment strategy. Engaged
learning, the largest contributor to the factor pattern, indicates
that students expect instructors to adopt a
facilitative role in their teaching. This dimension corresponds to
the relational contract where the learning
environment is stable and organized with a clearly delineated
path to success. Assessment in this situation
is evaluation oriented, indexing fulfillment and change (i.e.,
students achieving learning outcomes).
The second factor, agency, characterizes satisfied students who
recognize their abilities and
accomplishments in a balanced contract arrangement that is assessed by the degree of agreement
between them and the instructor (feature oriented). The final
factor, assessment, corresponds to the
transactional contract with its evaluation determined by the
degree to which the obligations of the course
have been met (content oriented).
Although they have been developed in different contexts,
workplace contracts and student
satisfaction factors are similar. Both attempt to explain the
underlying cause of satisfaction or lack
thereof. Both are general and nonspecific, becoming more
complex as the job, class, or classes
evolve over time. They are limited in their scope and at best
index a kind of average perception of the
workplace or educational environment. Rarely are employees
fully satisfied with their jobs or students
completely satisfied with their classes. However, both contracts
and factors frame blueprints for
developing instructional design strategies that can improve
learning climates. Online learning has
unbundled the classroom and the same technologies have done
precisely the same to the workplace: no
longer are either bound by physical space.
Perhaps in a more traditional time, psychological contracts
(predispositions) and student
satisfaction elements (post dispositions) were somewhat
separate in their context and orientation.
However, it seems clear that information and instructional
technologies are migrating into the same
orientation space. This makes the questions “What did you
expect on your way in?” and “Now that you
are finished, how was your experience?” part of the same
climate assessment paradigm. By coalescing
factors and psychological contracts, we might gain insights into
more effective learning environments that
are not possible when each theory is considered separately.
Blending the two takes the best features of both and results in something entirely new: something more than you can see in either theory.
Table 4
Correspondence Between Satisfaction and Psychological Contracts

Student Satisfaction Factor   Student Voice                                  Contract Feature   Contract Assessment
Engaged Learning              “Facilitate my learning”                       Relational         Evaluation Oriented
Agency                        “Recognize my abilities and accomplishments”   Balanced           Feature Oriented
Assessment                    “Let me know where I stand”                    Transactional      Content Oriented
References
Abrami, P.C., & d’Apollonia, S. (1991). Multidimensional
students’ evaluations of teaching
effectiveness—generalizability of “N=1” research: Comment on Marsh. Journal of Educational Psychology, 83(3), 411-415. doi: 10.1037/0022-0663.83.3.411
Akdemir, O., & Koszalka, T. A. (2008). Investigating the
relationships among instructional strategies and
learning styles in online environments. Computers & Education,
50(4), 1451-1461. doi:
10.1016/j.compedu.2007.01.004
Allen, I. E., & Seaman, J. (2013). Changing course: Ten years
of tracking online education in the United
States. Newburyport, MA: Sloan Consortium.
Anderson, T. W., & Rubin, H. (1956). Statistical inference in factor analysis. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 5, 111-150.
Arbaugh, J.B. (2007). An empirical verification of the
community of inquiry framework. Journal of
Asynchronous Learning Network, 11(1), 73-85.
Arbaugh, J. B. (2001). How instructor immediacy behaviors
affect student satisfaction and learning in
web-based courses. Business Communication Quarterly, 64(4),
42-54. doi:
10.1177/108056990106400405
Argyris, C. (1960). Understanding organizational behavior.
Homewood, IL: Dorsey.
Bangert, A. W. (2006). Identifying factors underlying the
quality of online teaching effectiveness: An
exploratory study. Journal of Computing in Higher Education,
17(2), 79-99. doi:
10.1007/BF03032699
Bangert, A. W. (2008). The development and validation of the
student evaluation of online teaching
effectiveness. Computers in the Schools, 25(1), 35-47. doi:
10.1080/07380560802157717
Battalio, J. (2007). Interaction online: A reevaluation. Quarterly
Review of Distance Education, 8(4), 339-
352.
Bolliger, D. U. (2004). Key factors for determining student
satisfaction in online courses. International
Journal on E-Learning, 3(1), 61-67.
Bordelon, D. E. (2012). Where have we been? Where are we
going? The evolution of American higher
education. Procedia-Social and Behavioral Sciences, 55(5), 100-
105. doi:
10.1016/j.sbspro.2012.09.483
Bordia, S., Hobman, E. V., Restubog, S. L. D., & Bordia, P.
(2010). Advisor-student relationship in
business education project collaborations: A psychological
contract perspective. Journal of
Applied Social Psychology, 40(9), 2360-2386. doi:
10.1111/j.1559-1816.2010.00662.x
Bowker, G. C., & Star, S. L. (1999). Sorting things out:
Classification and its consequences. Cambridge,
MA: The MIT Press.
Carnaghan, C., & Webb, A. (2007). Investigating the effects of
group response systems on student
satisfaction, learning, and engagement in accounting education.
Issues in Accounting
Education, 22(3), 391-409. doi:
http://dx.doi.org/10.2308/iace.2007.22.3.391
Craig, S. C., & Martinez, M. D. (2005). Ambivalence and the structure of political opinion. New York: Palgrave Macmillan.
Cronbach, L. J. (1951). Coefficient alpha and the internal
structure of tests. Psychometrika, 16(3), 297-
334. doi:10.1007/BF02310555
Diamond, J. (1999). Guns, germs, and steel: The fates of human societies. New York: W. W. Norton & Company, Inc.
Dziuban, C., & Dziuban, J. (1998). Reactive behavior patterns
in the classroom. Journal of Staff,
Program & Organization Development, 15(2), 85-31.
Dziuban, C., McMartin, F., Morgan, G., Morrill, J., Moskal, P.,
& Wolf, A. (2013). Examining student
information seeking behaviors in higher education. Journal of
Information Fluency, 2(1), 36-54.
Dziuban, C., & Moskal, P. (2011). A course is a course is a
course: Factor invariance in student
evaluation of online, blended and face-to-face learning
environments. The Internet and Higher
Education, 14(4), 236-241. doi: 10.1016/j.iheduc.2011.05.003
Dziuban, C., Moskal, P., Brophy-Ellison, J., & Shea, P. (2007).
Student satisfaction with asynchronous
learning. Journal of Asynchronous Learning Networks, 11(1),
87-95.
Dziuban, C. D., Moskal, P. D., & Dziuban, E. K. (2000).
Reactive behavior patterns go online. The
Journal of Staff, Program & Organizational Development, 17(3),
155-179.
Dziuban, C.D., Moskal, P.D., Juge, F., Truman-Davis, B., Sorg,
S. & Hartman, J. (2003). Developing a
web-based instructional program in a metropolitan university.
In B. Geibert & S. H. Harvey
(Eds.), Web-wise learning: Wisdom from the field (pp. 47-81).
Philadelphia, PA: Xlibris
Publications.
Dziuban, C., Moskal, P., Kramer, L., & Thompson, J. (2013).
Student satisfaction with online learning in
the presence of ambivalence: Looking for the will-o’-the-wisp.
Internet and Higher Education,
17, 1-8. doi: 10.1016/j.iheduc.2012.08.001
Dziuban, C. D., & Shirkey, E. C. (1974). When is a correlation
matrix appropriate for factor analysis?
Some decision rules. Psychological Bulletin, 81(6), 358-361.
doi: 10.1037/h0036316
Dziuban, C. D., & Shirkey, E. C. (November, 1993). S.D. 50—A
sequential psychometric criterion for the
number of common factors. Presented at The Annual Conference
for Florida Educational
Research Association, Destin, Florida.
Feldman, K. A. (1976). The superior college teacher from the
student’s view. Research in Higher
Education, 5, 243-288. doi: 10.1007/BF00991967
Feldman, K. A. (1993). College students’ views of male and
female college teachers: Part II— evidence
from students’ evaluation of their classroom teachers. Research
in Higher Education, 34(2), 151-
191. doi:10.1007/BF00992161
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical
inquiry in a text-based environment:
Computer conferencing in higher education. The Internet and
Higher Education, 2(2), 87-105.
doi: 10.1016/S1096-7516(00)00016-6
González-Gómez, F., Guardiola, J., Martín Rodríguez, Ó., &
Montero Alonso, M. Á. (2012). Gender
differences in e-learning satisfaction. Computers & Education,
58(1), 283-290. doi:
10.1016/j.compedu.2011.08.017
Greenwald, A.G., & Gilmore, G. M. (1997). Grading leniency is
a removable contaminant of student
ratings. American Psychologist, 52(11), 1209-1217. doi:
10.1037/0003-066X.52.11.1209
Guttman, L. (1953). Image theory for the structure of quantitative variates. Psychometrika, 18, 277-296.
doi:10.1007/BF02289264
Guttman, L. (1954). Some necessary conditions for common
factor analysis. Psychometrika, 19, 149-161.
doi:10.1007/BF02289162
Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. San Diego, CA: Academic
Press.
Hendrickson, A. E., & White, P. O. (1964). Promax: A quick
method for rotation to oblique simple
structure. British Journal of Statistical Psychology, 17(1), 65-
70. doi: 10.1111/j.2044-
8317.1964.tb00244.x
Hill, B. D. (2011). The sequential Kaiser-Meyer-Olkin procedure as an alternative for determining the
number of factors in common-factor analysis: A Monte Carlo simulation (Doctoral dissertation).
Oklahoma State University.
Online Learning Vol. 19 Issue 2 (2015) 134
Hochberg, Y. (1988). A sharper Bonferroni procedure for multiple tests of significance. Biometrika,
75(4), 800-802. doi:10.1093/biomet/75.4.800
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux.
Kaiser, H. F. (1958). The varimax criterion for analytic rotation
in factor analysis. Psychometrika, 23(3),
187-200. doi:10.1007/BF02289233
Kaiser, H., & Caffrey, J. (1965). Alpha factor analysis.
Psychometrika, 30(1), 1-14.
doi:10.1007/BF02289743
Kaiser, H. F., & Rice, J. (1974). Little jiffy, mark IV. Educational and Psychological Measurement,
34(1), 111-117. doi:10.1177/001316447403400115
Ke, F., & Kwak, D. (2013). Constructs of student-centered
online learning on learning satisfaction of a
diverse online student body: A structural equation modeling
approach. Journal of Educational
Computing Research, 48(1), 97-122. doi: 10.2190/EC.48.1.e
Keengwe, J., Diteeyont, W., & Lawson-Body, A. (2012).
Student and instructor satisfaction with e-
learning tools in online learning environments. International
Journal of Information and
Communication Technology Education (IJICTE), 8(1), 76-86.
doi:10.4018/jicte.2012010108
Kim, C., Damewood, E., & Hodge, N. (2000). Professor
attitude: Its effect on teaching evaluations.
Journal of Management Education, 24(4), 458-473.
doi:10.1177/105256290002400405
Kuo, Y. C., Walker, A. E., Belland, B. R., & Schroder, K. E.
(2013). A predictive study of student
satisfaction in online education programs. The International
Review of Research in Open and
Distance Learning, 14(1), 16-39.
Lakoff, G. (1987). Women, fire, and dangerous things.
Chicago: University of Chicago Press.
Liu, G. Z., & Hwang, G. J. (2010). A key step to understanding
paradigm shifts in e-learning: towards
context-aware ubiquitous learning. British Journal of
Educational Technology, 41(2), E1-E9. doi:
10.1111/j.1467-8535.2009.00976.
Long, W. A. (2011). Your predictable adolescent. Charleston,
SC: BookSurge Publishing.
Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A
comparative study of student satisfaction level
in distance learning and live classroom at higher education
level. Turkish Online Journal of
Distance Education (TOJDE), 13(1), 128-136.
Marsh, H. W., & Roche, L.A. (1997). Making students’
evaluations of teaching effectiveness effective:
The critical issues of validity, bias, and utility. American
Psychologist, 52(11), 1187-1197. doi:
10.1037/0003-066X.52.11.1187
McKeachie, W.J. (1997). Student ratings: The validity of use.
American Psychologist, 52(11), 1218-1225.
Norberg, A., Dziuban, C. D., & Moskal, P. D. (2011). A time-
based blended learning model. On the
Horizon, 19(3), 207-216. doi: 10.1108/10748121111163913
Raja, U., Johns, G., & Ntalianis, F. (2004). The impact of
personality on psychological contracts. The
Academy of Management Journal, 47(3), 350-367. doi:
10.2307/20159586
Richardson, J. C., & Swan, K. (2003). Examining social
presence in online courses in relation to students’
perceived learning and satisfaction. Journal of Asynchronous
Learning Networks, 7(1), 68-88.
Roberts, C. (2007). The unnatural history of the sea.
Washington DC: Island Press.
Rousseau, D. M. (1990). Normative beliefs in fund-raising
organizations linking culture to organizational
performance and individual responses. Group & Organization
Management, 15(4), 448-460. doi:
10.1177/105960119001500408
Rousseau, D. M. & Tijoriwala, S. A. (1998). Assessing
psychological contracts: Issues, alternatives and
measures. Journal of Organizational Behavior, 19, 679-695. doi:
10.1002/(SICI)1099-
1379(1998)19:1+<679::AID-JOB971>3.0.CO;2-N
Rubin, B., Fernandes, R., & Avgerinou, M. D. (2013). The
effects of technology on the community of
inquiry and satisfaction with online courses. The Internet and
Higher Education, 17, 48-57. doi:
10.1016/j.iheduc.2012.09.006
Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing
response rates and nonresponse bias in
web and paper surveys. Research in Higher Education, 44(4),
409-432.
doi:10.1023/A:1024232915870
Shirky, C. (2008). Here comes everybody: The power of organizing without organizations. New York:
Penguin.
Shirky, C. (2010). Cognitive surplus: Creativity and generosity in a connected age. New York: Penguin.
Spies, A. R., Wilkin, N. E., Bentley, J. P., Bouldin, A. S.,
Wilson, M. C., & Holmes, E. R. (2010).
Instrument to measure psychological contract violation in
pharmacy students. American Journal
of Pharmaceutical Education, 74(6), 1-11.
Stevens, J.P. (2002). Applied multivariate statistics for the
social sciences (4th ed.). Mahwah, NJ:
Lawrence Erlbaum Associates, Inc.
Stewart, L., Hong, E., & Strudler, N. (2004). Development and
validation of an instrument for student
evaluation of the quality of web-based instruction. The
American Journal of Distance Education,
18(3), 131-150. doi: 10.1207/s15389286ajde1803_2
Swan, K. (2001). Virtual interaction: Design factors affecting
student satisfaction and perceived learning
in asynchronous online courses. Distance education, 22(2), 306-
331.
doi:10.1080/0158791010220208
Tolstoy, L. (2004). Anna Karenina. (R. Pevear & L.
Volokhonsky, Trans.). New York, NY: Penguin.
(Original work published 1878).
Wade-Benzoni, K. A., Rousseau, D. M., & Li, M. (2006).
Managing relationships across generations of
academics: Psychological contracts in faculty-doctoral student
collaborations. International
Journal of Conflict Management, 17(1), 4-33. doi:
10.1108/10444060610734154
Wang, M. C., Dziuban, C. D., Cook, I. J., & Moskal, P. D.
(2009). Dr. Fox rocks: Using data-mining
techniques to examine student ratings of instruction. In M. C.
Shelley, L. D. Yore, & B. Hand
(Eds.), Quality research in literacy and science education:
International perspectives and gold
standards (pp. 383-398). Dordrecht, Netherlands: Springer.
doi:10.1007/978-1-4020-8427-0_19
Watts, D. J. (2011). Everything is obvious. New York: Crown
Publishing Group, Random House.
Weigert, A. J. (1991). Mixed emotions: Certain steps toward
understanding ambivalence. Albany: State
University of New York Press.
Young, B. R., & Dziuban, E. (2000). Understanding dependency
and passivity: Reactive behavior
patterns in writing centers. Writing Center Journal, 21(1), 67-
87.
Online Instruction, E-Learning, and Student
Satisfaction: A Three Year Study
Michele T. Cole, Daniel J. Shelley, and Louis B. Swartz
Robert Morris University, United States
Abstract
This article presents the results of a three-year study of
graduate and undergraduate
students’ level of satisfaction with online instruction at one
university. The study
expands on earlier research into student satisfaction with e-
learning. Researchers
conducted a series of surveys over eight academic terms. Five
hundred and fifty-three
students participated in the study. Responses were consistent
throughout, although
there were some differences noted in the level of student
satisfaction with their
experience. There were no statistically significant differences in
the level of satisfaction
based on gender, age, or level of study. Overall, students rated
their online instruction
as moderately satisfactory, with hybrid or partially online
courses rated as somewhat
more satisfactory than fully online courses. “Convenience” was
the most cited reason for
satisfaction. “Lack of interaction” was the most cited reason for
dissatisfaction.
Preferences for hybrid courses surfaced in the responses to an
open-ended question
asking what made the experience with online or partially online
courses satisfactory or
unsatisfactory. This study’s findings support the literature to
date and reinforce the
significance of student satisfaction to student retention.
Keywords: E-learning; instructional design; online education;
student retention;
student satisfaction
Online Instruction, E-Learning, and Student Satisfaction: A
Three Year Study
Cole, Shelley, and Swartz
Vol 15 | No 6 Creative Commons Attribution 4.0
International License Dec/14
112
Introduction
In their ten-year study of the nature and extent of online
education in the United States,
Allen and Seaman (2013) found that interest on the part of
universities and colleges in
online education shows no sign of abating. Online education
continues to expand at a
rate faster than traditional campus-based programs. The authors
reported the number
of students enrolled in at least one online course to be at an all-
time high of 32% of all
enrollments in participating institutions, representing an
increase of 570,000 students
from the previous year. Allen and Seaman also found that 77%
of university leaders
responding to the survey rated learning outcomes to be the
same, if not better, with
online education when compared with face-to-face learning.
Their results support the
no significant difference phenomenon that Russell (1999) found
in his comparative
study of student learning in the online and traditional classroom
environments.
Acknowledging that learning outcomes are equivalent, the question of how satisfied
students are with their e-learning experiences persists. This is important from the
standpoint of student retention, which is, of course, relevant to enrollment and to
maintaining institutional revenue streams. Analysis of student satisfaction may also
point to improvements in e-learning practices, which in turn could improve outcomes.
Literature Review
The Allen and Seaman (2013) report looked at online education,
including the growing
presence of massive open online courses (MOOCs), from the
institutional perspective,
not from the student’s. In their report, the authors noted that
the remaining barriers to
widespread acceptance of online education were lack of faculty
and employer
acceptance, lack of student discipline and low retention rates.
Of these, student
retention in online programs is particularly relevant to the
discussion of student
satisfaction with their online experience. Reinforcing the
instructor’s role in designing
satisfying online curricula, Kransow (2013) posited that if
students were satisfied with
their online experiences, they would be more likely to remain in
the program.
Kransow (2013) poses a critical question for instructors
working in the online
environment. How can online courses be designed to maximize
student satisfaction as
well as student motivation, performance and persistence?
Drawing on the literature,
Kransow emphasizes the importance of building a sense of
community in the online
environment. Yet, building an online community that fosters
student satisfaction
involves strategies that go beyond facilitating interaction with
course components.
Building community also requires, among other elements, interpersonal interaction,
that is, interaction between student and instructor and among students in
the course. Sher (2009),
in his study of the role such interactions play in student
learning in a Web-based
environment, found interaction between student and instructor
and among students to
be significant factors in student satisfaction and learning.
Interaction—between the student and the instructor, among
students, and with course
content and technology—was the focus of Strachota’s (2003)
study of student
satisfaction with distance education. In her study, learner-
content interaction ranked
first as a determinant of student satisfaction, followed by
learner-instructor and learner-
technology interaction. Interaction between and among students
was not found to be
significantly correlated with satisfaction. Bollinger (2004)
found three constructs to be
important in measuring student satisfaction with online courses:
interactivity,
instructor variables and issues with technology.
Palmer and Holt (2009) found that a student’s comfort level
with technology was
critical to satisfaction with online courses. Secondary factors
included clarity of
expectations and the student’s self-assessment of how well they
were doing in the online
environment. Drennan, Kennedy, and Pisarski (2005) also found
positive perceptions of
technology to be one of two key attributes of student
satisfaction. The second was
autonomous and innovative learning styles. Richardson and
Swan (2003) focused on
the relationship of social presence in online learning to
satisfaction with the instructor.
They found a positive correlation between students’ perceptions
of social presence and
their perceptions of learning and satisfaction. For Sahin (2007),
the strongest predictor
of student satisfaction was personal relevance (linkage of
course content with personal
experience), followed by instructor support, active learning and,
lastly, authentic
learning (real-life problem-solving).
Kleinman (2005) looked at improving instructional design to
maximize active learning
and interaction in online courses. Over a period of ten years,
Kleinman studied online
communities of learning, concluding that an online environment
which fosters active,
engaged learning and which provides the interactive support
necessary to help students
understand what is expected, leads to a satisfied learning
community. Swan (2001), too,
found that interactivity was essential to designing online
courses that positively affect
student satisfaction. Wang (2003) argued that to truly measure
student satisfaction
researchers must first assess the effectiveness of online
education.
Online education represents a major shift in how people learn and, in turn, how learners
are taught. There is therefore an increasing need to understand what contributes to
student satisfaction with online learning (Sinclaire,
2011). Student satisfaction is one of several variables
influencing the success of online
learning programs, along with the institutional factors that Abel
(2005) listed in his
article on best practices (leadership, faculty commitment,
student support, and
technology). Sener and Humbert (2003) maintained that
satisfaction is a vital element
in creating a successful online program.
There have been a number of studies of student satisfaction with
e-learning (Swan,
2001; Shelley, Swartz, & Cole, 2008, 2007), fully online as well
as with blended learning
models (Lim, Morris, & Kupritz, 2007). There have also been a
number of studies by
Arbaugh and associates on the predictors of student satisfaction
with online learning
(Arbaugh, 2000; Arbaugh, & Benbunan-Fich, 2006; Arbaugh, et
al., 2009; Arbaugh, &
Rau, 2007). Results from this study both support and expand on
earlier work.
Discussion about the role that MOOCs are destined to play in
higher education (Deneen,
2013; Shirky, 2013) serves to heighten educators’ interest in
providing quality online
courses that maximize student satisfaction. The controversy
over granting credit for
MOOC courses (Huckabee, 2013; Jacobs, 2013; Kolowich,
2013a; Kolowich, 2013b;
Kolowich, 2013c; Lewin, 2013; Pappano, 2012) reinforces the
relevance of student
satisfaction to successful online education.
This study reports on research into student satisfaction with
online education conducted
over three years. The research has focused largely on business
students at one university
in Southwestern Pennsylvania. The emphasis on student
satisfaction with e-learning
and online instruction is increasingly relevant for curriculum
development which in
turn is relevant for student retention. Understanding what makes
online instruction and
e-learning satisfactory helps to inform instructional design.
This study is an extension of previous research on student
satisfaction with online
education (Cole, Shelley, & Swartz, 2013, Swartz, Cole, &
Shelley, 2010, Shelley, Swartz,
& Cole, 2008, 2007). Researchers used a multi-item survey
instrument to assess how
well student expectations were met in selected online courses.
Graduate and
undergraduate students were asked first whether they were
satisfied with their
experience with e-learning. Following that, they were asked to
explain what made the
experience satisfactory or unsatisfactory. Student satisfaction is
defined as “the learner’s
perceived value of their educational experiences in an
educational setting” (Bollinger &
Erichsen, 2013, p. 5).
Research Questions
This study focused on two survey questions:
1. Please rate your level of satisfaction with the online and/or
partially online
courses you have taken.
2. What made your experience with the online course/s
satisfactory or
unsatisfactory?
Both survey questions were broken into two separate questions
for purposes of analysis,
resulting in four research questions:
1. How satisfied were students with their fully online courses?
2. How satisfied were students with their partially online
courses?
3. What factors contributed to students’ satisfaction with e-
learning?
4. What factors contributed to students’ dissatisfaction with e-
learning?
This paper presents the results of that analysis.
Method
Researchers used a Web-based survey created in Vovici, an
online survey software
program. Following a pilot study in spring, 2010, surveys were
sent to students in
graduate and undergraduate business courses over a period of
three years. Researchers
used a mixed-method analysis to evaluate responses to the
selected questions.
Descriptive statistics were used to summarize demographic data
and survey responses.
Results were transferred from Vovici to, and combined in, SPSS
to analyze the first two
research questions. Independent samples t-tests were conducted
on the scaled items.
Keyword analysis was used for the third and fourth research
questions. The survey was
anonymous.
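The quantitative side of this pipeline, an independent-samples t-test on the 0-4 satisfaction ratings, can be sketched as follows. This is an illustrative sketch only: the rating values and group labels below are invented, and the authors ran their tests in SPSS rather than Python.

```python
# Illustrative sketch of the analysis described above: an independent-samples
# t-test comparing satisfaction ratings (0 = very satisfied ... 4 = very
# dissatisfied) between two respondent groups. The ratings are hypothetical,
# not the study's data; the authors used SPSS, not Python.
from scipy import stats

# Hypothetical 0-4 satisfaction ratings for two respondent groups
group_a = [0, 1, 1, 2, 1, 3, 0, 2, 1, 1]   # e.g., one demographic group
group_b = [1, 2, 1, 3, 2, 1, 4, 2, 1, 2]   # e.g., the comparison group

# Two-sided test of equal means, assuming equal variances (the SPSS default)
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# A difference is "statistically significant" at the conventional level
# only if the two-tailed p-value falls below 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.3f}, significant: {p_value < 0.05}")
```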
Students in each of the business classes were offered extra
credit for taking the survey.
Credit was given based on notification to the instructor by the
student. The same
instructor taught each of the 19 courses in the second and third
study samples as well as
the business courses included in the initial study.
The initial survey instrument was approved by the University’s
Institutional Review
Board in 2010. Subsequent modifications to the survey were
minor and did not require
separate approvals in 2011/2012 or 2012/2013. The same script
was used seeking
participation in each of the surveys. Participation was solicited
via an e-mail from the
instructor. Each e-mail included the link to the Web-based
survey developed in Vovici.
Data from the completed surveys were transferred from Vovici
into SPSS. Independent
samples t-tests were conducted on the questions asking students
to rate their level of
satisfaction with online learning. Responses from males and
females, “Generation X”
and “Generation Y,” and from graduate and undergraduate
students were compared to
determine if there were any statistically significant differences
in the level of satisfaction
with online and partially online courses. Responses to the
question asking what
contributed to the respondents’ satisfaction or dissatisfaction
with online learning were
tabulated in Vovici. To analyze these responses, researchers
grouped keywords under
themes to form categories. The categories were: convenience,
interaction, structure,
learning style, and platform. “Interaction” included
“communication.” “Structure”
included “clarity” and “instructor’s role.” “Other” was included
to capture responses that
did not fall into any of the stated categories.
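The keyword-grouping step described above can be sketched as a simple lookup. The category names are the study's; the keyword lists and sample responses are hypothetical stand-ins for the researchers' actual coding scheme.

```python
# Sketch of grouping free-text survey responses into the study's categories.
# Category names come from the study; keyword lists are hypothetical examples.
CATEGORIES = {
    "convenience":    ["convenient", "convenience", "flexible", "schedule"],
    "interaction":    ["interaction", "communication", "discussion"],
    "structure":      ["structure", "clarity", "instructor", "organized"],
    "learning style": ["learning style", "self-paced", "hands-on"],
    "platform":       ["platform", "blackboard", "technology", "software"],
}

def categorize(response: str) -> str:
    """Assign a free-text response to the first matching category,
    falling back to 'other' when no keyword matches."""
    text = response.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other"

print(categorize("The schedule was very convenient for me"))      # convenience
print(categorize("There was no interaction with the professor"))  # interaction
```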
Sample and Participant Selection
The sample from the pilot study in spring, 2010 included
graduate students from the
MS in Instructional Technology and the MS in Nonprofit
Management programs,
undergraduate business majors, and Masters of Business
Administration (MBA)
students. No changes to the survey design were indicated as a
result of the pilot study.
The second study was conducted over three terms, summer,
2010, fall, 2010, and spring,
2011. This sample was composed of undergraduate students
enrolled in Legal
Environment of Business (BLAW 1050), taught in the fall 2010
term, and graduate
students enrolled in Legal Issues of Executive Management
(MBAD 6063), which was
taught in the summer 2010 and spring 2011 terms. The third
study was conducted over
four terms, fall, 2011, spring, 2012, fall, 2012, and spring,
2013. This sample was
composed of undergraduates in BLAW 1050 taught in the fall
2011, fall 2012, and spring
2013 terms and graduate students in MBAD 6063, taught in the
spring 2012 and spring
2013 terms. Both the graduate and undergraduate business
courses chosen for the study
were taught by the same instructor.
Thirty-three students participated in the spring 2010 survey, a
response rate of 58%.
One hundred and sixty-four students participated in the second
study, a response rate of
92%. Three hundred and fifty-six students participated in the
third study, a response
rate of 97%. Combined, the total number of participants was
553 of 603 enrolled
students, for a response rate of 92%.
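The combined response rate follows directly from the counts just reported; a quick check:

```python
# Check of the combined response rate reported above:
# 33 + 164 + 356 participants out of 603 enrolled students.
participants = 33 + 164 + 356   # three studies combined
enrolled = 603

rate = participants / enrolled
print(participants)        # 553
print(round(rate * 100))   # 92 (percent)
```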
Twelve males and 21 females participated in the first survey.
One hundred and three
males and 61 females responded to the survey in the second
study group. Two hundred
and seventeen males and 135 females responded to the survey in
the third study group
for a total of 332 males (60.5%) and 217 females (39.5%) who
participated in the
surveys. Not all participants in the third sample responded to
the question on gender.
Participants were asked to identify themselves as belonging to
one of the following age
groups:
• Traditional Workers (born before 1946)
• Baby Boomers (born between 1946 and 1960)
• Generation X (born between 1961 and 1979)
• Generation Y (born after 1979) (Recursos Humanos, 2010)
Eight participants identified themselves as belonging to the
Baby Boomer or the
Traditional Worker categories. Nine people checked “Other.”
Three participants did not
respond to the question on age. The remaining respondents self-
identified as belonging
to “Generation X” or “Generation Y.” Due to the limited sample
sizes for “Baby
Boomers” and “Traditional Workers,” only responses from
participants in the
Generation X and Generation Y categories were compared for
this study.
In the first survey, 22 respondents self-identified as members of
“Generation Y.” Eleven
respondents classified themselves as members of “Generation
X.” In the second study
group, 136 respondents self-identified as “Generation Y.”
Twenty-two respondents self-
identified as “Generation X.” In the third study group, 303
respondents self-identified
as “Generation Y.” Thirty-nine respondents self-identified as
“Generation X.” The total
number of respondents who self-identified as belonging to
“Generation Y” was 461.
Seventy-two respondents self-identified as “Generation X.” The
total number of
respondents belonging to either “Generation X” or “Generation
Y” was 533.
Two hundred and sixty graduate students participated in the
surveys. Two hundred and
eighty-one undergraduate students participated, for a total of
541. Some respondents
did not identify themselves clearly as being either graduate or
undergraduate students.
Table 1 presents the respondents’ demographic information.
Table 1
Respondent Sample Demographics*

Study   N (Response %)   Male   Female   Gen X   Gen Y   Grad   UG
I          33 (58%)        12      21      11      22      33     0
II        164 (92%)       103      61      22     136      89    73
III       356 (97%)       217     135      39     303     138   208
Total     553 (92%)       332     217      72     461     260   281

* Not all respondents answered each question on gender, age, or level of study
Procedure
Responses to the two questions on student satisfaction from
three surveys, Designing
Online Courses, Students’ Perceptions of Academic Integrity
and Enhancing Online
Learning with Technology, provided the data for the analysis.
Although survey
instruments used in the second and third studies were modified
slightly to gather data
for the studies on academic integrity and use of technology,
each survey asked:
1. Please rate your level of satisfaction with the online and/or
partially online
courses you have taken.
2. What made your experience with the online course/s
satisfactory or
unsatisfactory?
Researchers used a 5-point Likert scale for the first survey
question, asking students to
rate their level of satisfaction with fully online and/or partially
online courses. Zero was
equal to “very satisfied;” four was equal to “very dissatisfied.”
The second survey
question was designed as a follow-up query, asking what
contributed to the student’s
satisfaction or dissatisfaction with online learning.
To help inform the analysis of responses to the research
questions, researchers asked
students how many online or partially online courses they had
taken. To enable
comparisons by gender, age group, and level of study,
demographic questions were
included in each of the surveys.
Designing Online Courses was administered in the spring 2010
term. The survey was
composed of 12 questions. Students’ Perceptions of Academic
Integrity was conducted
in the summer 2010, fall 2010, and spring 2011 terms. This
survey was composed of 13
questions. The third survey, Enhancing Online Learning with
Technology, was
composed of 12 questions. This survey was administered in the
fall 2011, spring 2012,
fall 2012, and spring 2013 terms.
Results
The first survey question sought to capture respondents’ level of
experience with e-
learning. In the first two studies, students were asked if they
had taken or were taking
one or more fully online graduate courses, partially online
graduate courses, fully online
undergraduate courses, and/or partially online undergraduate
courses. Responses from
both studies were combined for analysis. There were 198
student responses. Since the
response categories were not mutually exclusive, a student
could select more than one
response. Some students had taken both graduate and
undergraduate-level fully online
and/or partially online courses. As a result, the total number of
responses to the
question (255) exceeds the number of respondents (198). Table
2 presents the results.
Table 2
Level of Experience with E-Learning – Studies I & II

Response                                                  Count (N=255)   % of Students (N=198)
As a graduate student in fully online courses                  65                32.8
As an undergraduate student in fully online courses            28                14.1
As a graduate student in partially online courses              73                36.8
As an undergraduate student in partially online courses        50                25.2
As a student taking courses outside of a degree program         5                 2.5
None                                                           24                12.1
Other                                                          10                 5.0
Elaboration on “other” included four instances of some
experience with online courses
that did not fit the categories in the question, and two
references to having had online
assignments. Four were unresponsive to the question.
The question asking for the respondent’s level of experience with fully or partially
online courses was phrased differently in the third study. In the final
surveys (from fall, 2011,
spring, 2012, fall, 2012, and spring, 2013), researchers asked
how many fully or partially
online courses the student had taken. There were 391 responses.
Students could choose
only one response. Table 3 illustrates the results.
Table 3
Level of Experience with E-Learning – Study III

Responses               Count (N=391)   % of Responses
1 course                      89             22.7
2-4 courses                  154             39.3
5-10 courses                  56             14.3
More than 10 courses          20              5.1
None                          35              8.9
Other                         37              9.4
RQ1 How satisfied were students with their fully online
courses?
In a two-part survey question, students were asked to rate their
level of satisfaction with
fully online courses taken and with partially online courses
taken. Students could
respond to either part of the question or to both. To the first
part, level of satisfaction
with fully online courses, there were 472 responses, 85% of the
total 553 participants. A
5-point Likert scale was used to measure responses ranging from
0 (very satisfied) to 4
(very dissatisfied). One hundred and six students or 22.5% of
the total responding said
that they were “very satisfied.” One hundred and seventy-one
(36.2%) said that they
were “satisfied.” One hundred and twenty-six (26.7%) were
“neutral.” Fifty-one (10.8%)
said that they were “dissatisfied.” Eighteen (3.8%) respondents
were “very dissatisfied”
with their experience with fully online courses.
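Given the 0-4 coding (0 = very satisfied, 4 = very dissatisfied), the distribution just reported implies an overall mean of about 1.37 for fully online courses, in line with the subgroup means reported below; a quick check:

```python
# Check of the overall mean satisfaction implied by the counts reported above
# for fully online courses (0 = very satisfied ... 4 = very dissatisfied).
counts = {0: 106, 1: 171, 2: 126, 3: 51, 4: 18}  # responses per scale point

total = sum(counts.values())                              # 472 responses
mean = sum(score * n for score, n in counts.items()) / total

print(total)           # 472
print(round(mean, 2))  # 1.37
```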
Independent samples t-tests were conducted on this question to
determine if there were
any significant differences in the levels of satisfaction based on
gender, age, or level of
study with regard to satisfaction with fully online learning.
There were no statistically
significant differences between males and females, between
members of “Generation X”
and “Generation Y,” or between graduate and undergraduate
students on the question.
Females, members of Generation X, and upper-level
undergraduate students were more
likely than males, members of Generation Y, and graduate
students to rate their
experiences with fully online courses as satisfactory. The mean
score for females was
1.31; the mean score for males was 1.41. The mean score for
members of Generation X
was 1.24; the mean score for members of Generation Y was
1.40. The mean score for
upper-level undergraduate students was 1.19; the mean score for
graduate students was
1.23. Table 4 presents the results.
Table 4

Student Satisfaction with Fully Online Courses

Variable        n    M     t       Sig. (2-tailed)
Female          177  1.31  -1.052  .293
Male            294  1.41
Generation X    63   1.24  -.989   .326
Generation Y    396  1.40
Undergraduates  26   1.19  -.146   .884
Grad. Students  105  1.23
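The independent samples t-tests reported above can be sketched as follows. A pooled-variance t statistic is computed by hand on hypothetical Likert responses (illustrative values, not the study's raw data), using only the Python standard library:

```python
import math
from statistics import mean, variance

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic for two groups."""
    n1, n2 = len(a), len(b)
    # Pooled estimate of the common variance across both groups
    sp2 = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical 0-4 Likert responses (0 = very satisfied, 4 = very dissatisfied)
females = [1, 1, 2, 0, 2, 1, 3]
males   = [2, 1, 2, 3, 0, 2, 1]
t_stat = independent_t(females, males)  # negative t: group "females" slightly more satisfied here
```

In practice the t statistic is compared against a t distribution with n1 + n2 - 2 degrees of freedom to obtain the two-tailed significance values shown in Table 4; statistical packages (e.g., scipy.stats.ttest_ind) perform that step directly.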
RQ2: How satisfied were students with their partially online courses?

There were 420 responses, 76% of the 553 total participants, to the second part of the question, which asked students to rate their level of satisfaction with partially online courses. The same 5-point Likert scale was used for both parts. Ninety-nine students (23.6% of those responding) said that they were "very satisfied," 136 (32.4%) said that they were "satisfied," 137 (32.6%) were "neutral," 43 (10.2%) said that they were "dissatisfied," and 5 (1.2%) were "very dissatisfied" with their experience with partially online courses.
Independent samples t-tests were conducted on this question to determine whether there were any significant differences in satisfaction with partially online learning based on gender, age, or level of study. As with the first research question, there were no statistically significant differences between males and females, between members of "Generation X" and "Generation Y," or between graduate and undergraduate students. However, unlike satisfaction with fully online courses, males were somewhat more satisfied than females, and graduate students were more satisfied than upper-level undergraduates, with the partially online courses taken. The mean score for males was 1.32 versus 1.34 for females, and the mean for graduate students was 1.11 versus 1.35 for upper-level undergraduates. As was the case with fully online courses, older students, members of Generation X, were more satisfied with their partially online courses than were members of Generation Y: the mean score for "Generation X" was 1.09 versus 1.37 for "Generation Y." Table 5 presents the results.
The Challenger.pdf DNHS Official PublicationThe Challenger.pdf DNHS Official Publication
The Challenger.pdf DNHS Official Publication
Delapenabediema
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
GeoBlogs
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
rosedainty
 
How to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERPHow to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERP
Celine George
 
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptxMARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
bennyroshan06
 
Cambridge International AS A Level Biology Coursebook - EBook (MaryFosbery J...
Cambridge International AS  A Level Biology Coursebook - EBook (MaryFosbery J...Cambridge International AS  A Level Biology Coursebook - EBook (MaryFosbery J...
Cambridge International AS A Level Biology Coursebook - EBook (MaryFosbery J...
AzmatAli747758
 
2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...
Sandy Millin
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
RaedMohamed3
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
Celine George
 
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXXPhrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
MIRIAMSALINAS13
 
Home assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdfHome assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdf
Tamralipta Mahavidyalaya
 

Recently uploaded (20)

Language Across the Curriculm LAC B.Ed.
Language Across the  Curriculm LAC B.Ed.Language Across the  Curriculm LAC B.Ed.
Language Across the Curriculm LAC B.Ed.
 
Introduction to Quality Improvement Essentials
Introduction to Quality Improvement EssentialsIntroduction to Quality Improvement Essentials
Introduction to Quality Improvement Essentials
 
Model Attribute Check Company Auto Property
Model Attribute  Check Company Auto PropertyModel Attribute  Check Company Auto Property
Model Attribute Check Company Auto Property
 
The Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdfThe Roman Empire A Historical Colossus.pdf
The Roman Empire A Historical Colossus.pdf
 
Chapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptxChapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptx
 
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdfESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
 
Digital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and ResearchDigital Tools and AI for Teaching Learning and Research
Digital Tools and AI for Teaching Learning and Research
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
Unit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdfUnit 8 - Information and Communication Technology (Paper I).pdf
Unit 8 - Information and Communication Technology (Paper I).pdf
 
The Challenger.pdf DNHS Official Publication
The Challenger.pdf DNHS Official PublicationThe Challenger.pdf DNHS Official Publication
The Challenger.pdf DNHS Official Publication
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
 
How to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERPHow to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERP
 
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptxMARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
 
Cambridge International AS A Level Biology Coursebook - EBook (MaryFosbery J...
Cambridge International AS  A Level Biology Coursebook - EBook (MaryFosbery J...Cambridge International AS  A Level Biology Coursebook - EBook (MaryFosbery J...
Cambridge International AS A Level Biology Coursebook - EBook (MaryFosbery J...
 
2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...2024.06.01 Introducing a competency framework for languag learning materials ...
2024.06.01 Introducing a competency framework for languag learning materials ...
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
 
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXXPhrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
Phrasal Verbs.XXXXXXXXXXXXXXXXXXXXXXXXXX
 
Home assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdfHome assignment II on Spectroscopy 2024 Answers.pdf
Home assignment II on Spectroscopy 2024 Answers.pdf
 

Working Capital Management: Alternative Working Capital Policies

Working Capital Terminology
Working capital: current assets.
Net working capital: current assets minus current liabilities.
Net operating working capital: current assets minus (current liabilities less notes payable).
Current assets investment policy: deciding the level of each type of current asset to hold, and how to finance current assets.
Working capital management: controlling cash, inventories, and A/R, plus short-term liability management.

Selected Ratios for SKI Inc.

                                SKI        Ind. Avg
Current ratio                   1.75x      2.25x
Debt/Assets                     58.76%     50.00%
Turnover of cash & securities   16.67x     22.22x
Days sales outstanding          45.63      32.00
Inventory turnover              4.82x      7.00x
Fixed assets turnover           11.35x     12.00x
Total assets turnover           2.08x      3.00x
Profit margin                   2.07%      3.50%
Return on equity                10.45%     21.00%

How does SKI's current assets investment policy compare with its industry?
Current assets investment policy is reflected in the current ratio, turnover of cash and securities, inventory turnover, and days sales outstanding.
These ratios indicate SKI has large amounts of working capital relative to its level of sales.
SKI is either very conservative or inefficient.
Is SKI inefficient or conservative?
A conservative (relaxed) policy may be appropriate if it leads to greater profitability.
However, SKI is not as profitable as the average firm in the industry.
This suggests the company has excessive current assets.

Working Capital Financing Policies
Moderate: match the maturity of the assets with the maturity of the financing.
Aggressive: use short-term financing to finance permanent assets.
Conservative: use permanent capital for permanent assets and temporary assets.

Moderate Financing Policy
[Chart: dollars versus years. Temporary current assets are financed with short-term loans; permanent current assets and fixed assets are financed long term with stock, bonds, and spontaneous current liabilities. A lower dashed line would be more aggressive.]

Conservative Financing Policy
[Chart: dollars versus years. Zero short-term debt; long-term financing (stock, bonds, and spontaneous current liabilities) covers permanent current assets and fixed assets, with marketable securities held when financing exceeds asset needs.]

Cash Conversion Cycle
The cash conversion cycle (CCC) focuses on the length of time between when a company makes payments to its creditors and when a company receives payments from its customers.
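The cycle nets the days tied up in inventory and receivables against the days financed by suppliers. A minimal sketch in Python using SKI's ratio-table figures and an assumed 30-day payables deferral period, with each component rounded to whole days:

```python
# Cash conversion cycle (CCC): days between paying suppliers and collecting from customers.
inventory_turnover = 4.82   # SKI, from the ratio table
dso = 45.63                 # days sales outstanding (average collection period)
payables_deferral = 30      # assumed payables deferral period, in days

inventory_conversion = 365 / inventory_turnover   # about 75.7 days of inventory on hand
ccc = round(inventory_conversion) + round(dso) - payables_deferral
print(ccc)  # 76 + 46 - 30 = 92 days
```

With these inputs the components round to 76, 46, and 30 days, giving a CCC of 92 days.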
Cash Conversion Cycle
CCC = Inventory conversion period + Average collection period - Payables deferral period

Minimizing Cash Holdings
Use a lockbox.
Insist on wire transfers and debit/credit cards from customers.
Synchronize inflows and outflows.
Reduce the need for a "safety stock" of cash: increase forecast accuracy, hold marketable securities, and negotiate a line of credit.

Cash Budget
Forecasts cash inflows, outflows, and ending cash balances.
Used to plan loans needed or funds available to invest.
Can be daily, weekly, or monthly forecasts: monthly for annual planning and daily for actual cash management.
SKI's Cash Budget for January and February

                   January        February
Collections        $67,651.95     $62,755.40
Purchases           44,603.75      36,472.65
Wages                6,690.56       5,470.90
Rent                 2,500.00       2,500.00
Total payments     $53,794.31     $44,443.55
Net cash flows     $13,857.64     $18,311.85
SKI's Cash Budget

                                January        February
Cash at start if no borrowing   $ 3,000.00     $16,857.64
Net cash flows                   13,857.64      18,311.85
Cumulative cash                 $16,857.64     $35,169.49
Less: Target cash                 1,500.00       1,500.00
Surplus                         $15,357.64     $33,669.49

How could bad debts be worked into the cash budget?
Collections would be reduced by the amount of the bad debt losses.
For example, if the firm had 3% bad debt losses, collections would total only 97% of sales.
Lower collections would lead to higher borrowing requirements.

Analyze SKI's Forecasted Cash Budget
Cash holdings will exceed the target balance for each month, except for October and November.
The cash budget indicates the company is holding too much cash.
SKI could improve its EVA by investing the excess cash in more productive assets or by returning it to shareholders.
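The arithmetic in the two budget tables, plus the 3% bad-debt adjustment just described, can be sketched as follows (the run_budget helper and its bad_debt_rate parameter are illustrative, not from the slides):

```python
# Monthly cash budget: net cash flow, cumulative cash, and surplus vs. a target balance.
start_cash, target_cash = 3_000.00, 1_500.00

months = {
    "January":  {"collections": 67_651.95, "payments": [44_603.75, 6_690.56, 2_500.00]},
    "February": {"collections": 62_755.40, "payments": [36_472.65, 5_470.90, 2_500.00]},
}

def run_budget(months, start, target, bad_debt_rate=0.0):
    """Return {month: (net cash flow, cumulative cash, surplus over target)}."""
    out, cum = {}, start
    for name, m in months.items():
        collections = m["collections"] * (1 - bad_debt_rate)  # e.g. 3% bad debts -> 97% collected
        net = collections - sum(m["payments"])
        cum += net
        out[name] = (round(net, 2), round(cum, 2), round(cum - target, 2))
    return out

print(run_budget(months, start_cash, target_cash))
# January: net 13,857.64, surplus 15,357.64; February: net 18,311.85, surplus 33,669.49
```

Passing bad_debt_rate=0.03 shows how lower collections shrink each month's surplus and raise any borrowing need.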
Why might SKI want to maintain a relatively high amount of cash?
If sales turn out to be considerably less than expected, SKI could face a cash shortfall.
A company may choose to hold large amounts of cash if it does not have much faith in its sales forecast, or if it is very conservative.
The cash may be used, in part, to fund future investments.
Inventory Costs
Carrying costs: storage and handling costs, insurance, property taxes, depreciation, and obsolescence.
Ordering costs: the cost of placing orders, shipping, and handling.
Costs of running short: lost sales or customer goodwill, and disrupted production schedules.
Reducing inventory levels generally reduces carrying costs, increases ordering costs, and may increase the costs of running short.

Is SKI holding too much inventory?
SKI's inventory turnover (4.82x) is considerably lower than the industry average (7.00x).
The firm is carrying a large amount of inventory per dollar of sales.
By holding excessive inventory, the firm is increasing its costs, which reduces its ROE.
Moreover, this additional working capital must be financed, so EVA is also lowered.

If SKI reduces its inventory without adversely affecting sales, what effect will this have on the cash position?
Short run: cash will increase as inventory purchases decline, reducing the financing needed or the target cash balance.
Long run: the company is likely to take steps to reduce its cash holdings and increase its EVA. The "excess" cash can be invested in more productive assets such as plant and equipment, increasing operating income and thus EVA. Alternatively, it can be distributed to shareholders through higher dividends or share repurchases, lowering the cost of capital and thus increasing EVA.

Do SKI's customers pay more or less promptly than those of its competitors?
SKI's DSO (45.6 days) is well above the industry average (32 days), so SKI's customers are paying less promptly.
SKI should consider tightening its credit policy to reduce its DSO.

Elements of Credit Policy
Credit period: how long customers have to pay. A shorter period reduces DSO and average A/R, but may discourage sales.
Cash discounts: lower the price, attract new customers, and reduce DSO.
Credit standards: restrictive standards tend to reduce sales but also reduce bad debt expense; fewer bad debts reduce DSO.
Collection policy: how tough to be. A restrictive policy will reduce DSO but may damage customer relationships.
Does SKI face any risk if it restricts its credit policy?
Yes, a restrictive credit policy may discourage sales.
Some customers may choose to go elsewhere if they are pressured to pay their bills sooner.
SKI must balance the benefit of fewer bad debts against the cost of possible lost sales.

If SKI reduces its DSO without adversely affecting sales, how would this affect its cash position?
Short run: if customers pay sooner, cash holdings increase, reducing the financing needed or the target cash balance.
Long run: over time, the company would ideally invest the cash in more productive assets or pay it out to shareholders. Both actions would increase EVA.

What is trade credit?
Trade credit is credit furnished by a firm's suppliers.
It is often the largest source of short-term credit, especially for small firms.
It is spontaneous and easy to get, but its cost can be high.
Terms of Trade Credit
A firm buys $3,000,000 net ($3,030,303 gross) on terms of 1/10, net 40.
The firm can forgo discounts and pay on Day 40 without penalty.

Breaking Down Trade Credit
Net daily purchases = $3,000,000/365 = $8,219.18.
Payables level if the firm takes discounts: $8,219.18(10) = $82,192.
Payables level if the firm takes no discounts: $8,219.18(40) = $328,767.

Credit breakdown:
Total trade credit      $328,767
Free trade credit        -82,192
Costly trade credit     $246,575

Nominal Cost of Trade Credit
The firm loses 0.01($3,030,303) = $30,303 of discounts to obtain $246,575 in extra trade credit:
rNOM = $30,303/$246,575 = 0.1229 = 12.29%
The $30,303 is paid throughout the year, so the effective cost of costly trade credit is higher.

Nominal Cost of Trade Credit Formula
rNOM = [Discount %/(100 - Discount %)] x [365/(Days credit outstanding - Discount period)]
     = (1/99)(365/30) = 0.0101(12.1667) = 12.29%

Effective Cost of Trade Credit
Periodic rate = 0.01/0.99 = 1.01%
Periods/year = 365/(40 - 10) = 12.1667
Effective cost = (1 + 0.01/0.99)^12.1667 - 1 = 13.01%
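Both cost figures follow directly from the discount terms; a minimal sketch for the 1/10, net 40 terms (variable names are illustrative):

```python
# Cost of forgoing the 1% discount on terms of 1/10, net 40:
# the buyer pays 1% of the gross price to keep the money for an extra 30 days.
discount_pct, discount_days, net_days = 1.0, 10, 40

periodic_rate = discount_pct / (100 - discount_pct)   # 0.01/0.99, about 1.01% per period
periods_per_year = 365 / (net_days - discount_days)   # 365/30 = 12.1667 periods

nominal_cost = periodic_rate * periods_per_year               # about 12.29%
effective_cost = (1 + periodic_rate) ** periods_per_year - 1  # about 13.01%
print(f"{nominal_cost:.2%}, {effective_cost:.2%}")
```

The effective figure exceeds the nominal one because the discount is forgone period by period throughout the year, compounding the cost.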
Bank Loans
The firm can borrow $100,000 for 1 year at an 8% nominal rate. Interest may be set under one of the following scenarios:
Simple annual interest
Installment loan, add-on, 12 months

Simple Annual Interest
Simple interest means no discount or add-on.
Interest = 0.08($100,000) = $8,000
rNOM = EAR = $8,000/$100,000 = 8.0%
For a 1-year simple interest loan, rNOM = EAR.

Add-on Interest
Interest = 0.08($100,000) = $8,000
Face amount = $100,000 + $8,000 = $108,000
Monthly payment = $108,000/12 = $9,000
Avg. loan outstanding = $100,000/2 = $50,000
Approximate cost = $8,000/$50,000 = 16.0%
To find the exact effective rate, recognize that the firm receives $100,000 and must make 12 monthly payments of $9,000 (an annuity).
Solving that annuity on a financial calculator (amounts in thousands): N = 12, PV = 100, PMT = -9, FV = 0, which gives I/YR = 1.2043.
rNOM = 12(0.012043) = 0.1445 = 14.45%
EAR = (1.012043)^12 - 1 = 15.45%
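The calculator step, solving for the monthly rate that equates $100,000 of proceeds with twelve $9,000 payments, can be reproduced with a simple bisection search; a minimal sketch (the helper names are illustrative):

```python
# Add-on loan: $100,000 received now, repaid with 12 monthly payments of $9,000.
# Find the monthly rate at which the present value of the payments equals the proceeds.
def pv_annuity(rate, pmt, n):
    """Present value of an ordinary annuity of n payments of pmt."""
    return pmt * (1 - (1 + rate) ** -n) / rate

def solve_monthly_rate(proceeds=100_000, pmt=9_000, n=12):
    lo, hi = 1e-9, 1.0                  # bracket the rate; PV falls as the rate rises
    for _ in range(100):                # bisection to high precision
        mid = (lo + hi) / 2
        if pv_annuity(mid, pmt, n) > proceeds:
            lo = mid                    # PV too high -> rate too low
        else:
            hi = mid
    return mid

r = solve_monthly_rate()                # about 0.012043 per month
print(f"rNOM = {12 * r:.2%}, EAR = {(1 + r) ** 12 - 1:.2%}")
```

This reproduces the slide's 14.45% nominal and 15.45% effective rates, and makes it easy to see how far the 16.0% approximation sits from the exact annuity-based rate.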
Cash conversion cycle for SKI (rounded): CCC = Inventory conversion period + Average collection period - Payables deferral period = 76 + 46 - 30 = 92 days.
Student Satisfaction with Online Learning: Is it a Psychological Contract?

Charles Dziuban, Patsy Moskal, Jessica Thompson, Lauren Kramer, Genevieve DeCantis and Andrea Hermsdorfer
Research Initiative for Teaching Effectiveness, University of Central Florida

Abstract

The authors explore the possible relationship between student satisfaction with online learning and the theory of psychological contracts. The study incorporates latent trait models using the image analysis procedure and computation of Anderson and Rubin factor scores with contrasts for students who are satisfied, ambivalent, or dissatisfied with their online learning experiences. The findings identify three underlying satisfaction components: engaged learning, agency, and assessment. The factor score comparisons indicate that students in the general satisfaction categories characterize important differences in engaged learning and agency, but not assessment. These
results lead the authors to hypothesize that predetermined, but unspecified expectations (i.e., psychological contracts) for online courses by both students and faculty members are important advance organizers for clarifying student satisfaction.

Introduction

From its inception, online learning has been confronted by concerns about quality from the established educational community and society at large (Carnaghan & Webb, 2007; Akdemir & Koszalka, 2008). Often, in addressing these concerns, students' perceptions of their course experience become a surrogate for learning engagement in the context of satisfaction (Swan, 2001; Arbaugh, 2001; Richardson & Swan, 2003; Bolliger, 2004). Because contemporary students view information as a commodity which can be traded openly among a community of learners, collaboration becomes fundamental to a variety of educational outcomes (Shirky, 2010; Dziuban et al., 2013).

Online Learning Vol. 19 Issue 2 (2015) 122
  • 35. preferences. The need for new and more authentic assessment techniques in addition to challenges to traditional educational structures (e.g. semester length time boundaries) raises issues about what moderates students’ academic expectations and satisfaction. Studies suggest that online students wish to decrease their ambivalence toward formal education by gaining some sense of a carefully delineated path to success (Dziuban & Dziuban, 1998; Dziuban, Moskal & Dziuban, 2000; Long, 2011; Young & Dziuban, 2000). Students prefer active, rather than passive learning environments, and, because they participate in a highly interactive world, they expect the same in their classes (Dziuban et al., 2003). Today’s learners require more outlets for creativity and collaboration which online learning environments can accommodate through a variety of instructional models that are provided anytime, anyplace. Researchers should not be surprised that identifying the defining elements for satisfaction has become much more dynamic and complex. The construct has multiple facets that tend to be stochastic as a particular course progresses. In this study, we attempt to clarify the underlying (latent) elements of student satisfaction in the context of overall course evaluation for students who respond positively to online experiences on end-of-course evaluation protocols. Feldman (1993) describes the assessment challenges we encounter as distributions of considerations when he argues that responses to survey questions provide only an estimate of the central tendency of an individual’s attitude or belief about a subject or object. Craig and Martinez (2005) summarize the issue: “in retrospect, it seems rather
  • 36. simplistic to think of attitudes as always being unidimensional. After all, who hasn’t experienced mixed feelings about people, places and things that we have encountered or visited in our lives?” (p. 1) Recent Studies on Student Satisfaction with Online Courses Multiple approaches define and assess student satisfaction. Rubin, Fernandes & Avgerinou (2013) extended research on the Community of Inquiry (Garrison, Anderson & Archer, 2000) which defines social, cognitive, and teaching presence as being essential to the student learning experience and, thus, student satisfaction. They determined that learning management system (LMS) features greatly impact perceptions of community according to the inquiry framework. In a related study, Mahmood, Mahmood and Malik (2012) argued that teaching presence plays the most critical role in how students evaluate online learning. The interaction construct plays an important role in both face- to-face and online learning modalities (Kuo, Walker, Belland & Schroder, 2013). In fact, many studies have found that both quantity and quality of student interactions are highly correlated with student satisfaction in almost any learning environment. However, investigators have noted that demographic and cultural considerations also impact the design of appropriate interaction techniques in online learning (González-Gómez, Guardiola, Martín Rodríguez & Montaro Alonso, 2012). Ke and Kwak (2013) identified five elements of student satisfaction: learner relevance, active learning, authentic learning, learner autonomy, and technology
  • 37. competence. Kuo et al. (2013) determined that learner-instructor interaction and learner-content interaction combined with technology efficacy are valid indicators of students’ positive perceptions. However Battalio (2007), using a criterion approach, argued that a positive course rating requires effective learner- instructor interaction. Keengwe, Diteeyont and Lawson-Body (2012) argued that students’ expectations influence the instructor’s design of effective technology tools in online courses and are the key to understanding the satisfaction construct. The authors concluded that satisfaction was most impacted by learning convenience combined with the effectiveness of e-learning tools. Dziuban, Moskal, Brophy-Ellison and Shea (2007) found six key elements that contribute to students’ satisfaction: an enriched learning Online Learning Vol. 19 Issue 2 (2015) 123 environment, well-defined rules of engagements, instructor commitment, reduced ambiguity, an engaging environment, and reduced ambivalence about the value of the course. Because colleges and universities have to be much more responsive to their student client base (Long, 2011; Bordelon, 2012; Allen & Seaman, 2013), ambivalence becomes particularly important. This implies satisfaction is an underlying indicator of success in various learning environments, especially online modalities. Satisfied students appear to be engaged, motivated and responsive; contribute to an
effective learning climate; and achieve at higher levels. Dissatisfied or ambivalent students contribute to environments where instructors appear to have much more difficulty facilitating effective learning situations. Faculty members in such circumstances have trouble relating to their students and may incorrectly assume that such difficulties are related primarily to student dissatisfaction with online learning (Dziuban et al., 2007). A precise configuration of student satisfaction with online learning is proving to be elusive because it might be context dependent (e.g., college, discipline, course level, institution, and, of course, instructor). Bowker and Star (1999) use the term “boundary object” to suggest that these items or ideas adapt to specific needs and constraints while maintaining a common identity. While boundary objects bring a community of practice together for communication and inquiry purposes, they are generally weak in the large cohort. According to these researchers, however, the object (student satisfaction, in this case) is much better defined within individual constituencies. These definitional issues appear to reflect what Watts (2011) calls confirmation bias—that is, accepting information that confirms our existing beliefs much more readily than information that does not. To express their degree of satisfaction, students react to things that they expect but that are never expressly stated (i.e., their predetermined psychological contract), or to what they have already assumed about the course. However, should dissonance with these expectations develop, students may encounter ambivalence characterized by simultaneous positive and negative feelings. These are the mixed emotions described by Weigert (1991) and Long (2011).
Factor Studies of Student Satisfaction with Online Learning

A small number of studies conducted by investigators seeking to identify the dimensionality of student satisfaction with online learning have emerged in the past few years. This work has been a natural extension of inquiry into student satisfaction in higher education (Abrami & d’Apollonia, 1991; Feldman, 1976; Feldman, 1993; Greenwald & Gilmore, 1997; Kim, Damewood & Hodge, 2000; Marsh & Roche, 1997; McKeachie, 1997). While prior studies have focused primarily on face-to-face teaching environments, online learning has provided a new dynamic and has re-energized interest in the topic. Arbaugh (2007) adopted factoring methods to validate the Community of Inquiry framework (Garrison et al., 2000) incorporating social, cognitive, and teaching presences. He retrieved these primary constructs and demonstrated that they exhibited excellent reliability. His work extended the original Community of Inquiry framework to a fourth dimension: course design and organization. Stewart, Hong, and Strudler (2004), using principal components analysis, found a fairly complex underlying dimensionality that defines the pattern of student satisfaction in online learning: the evaluative construct for students involved issues such as web page appearance, hyperlinks and navigation facility, technical constraints, online applications, instructional techniques and expectations, content delivery, and the interaction environment. Bangert (2006) found four underlying elements related to the evaluation of online and blended courses: interaction, active learning, time on task, and student cooperation. In a later study, he validated his previous findings using both exploratory and confirmatory
factor methods (Bangert, 2008). In a somewhat different approach to the identification of underlying dimensionality, Wang, Dziuban, Cook, and Moskal (2009) used classification and regression trees to predict student assessment of online courses and identified a series of validated if-then decision rules for predicting students’ perceptions of excellent teaching based on three constructs: facilitation of learning, ability of the instructor to communicate information and concepts, and the instructor’s respect and concern for students. Dziuban and Moskal (2011) conducted a study of the factor invariance in student satisfaction across online, blended, and face-to-face courses. Using Guttman’s (1954) image analysis, they found a single general component that remained constant across all modalities. The authors concluded that students do not use the modality of a course to differentiate elements of excellent instruction and course satisfaction. In a later study, Dziuban, Moskal, Kramer and Thompson (2013) used the alpha factoring procedure (Kaiser & Caffrey, 1965) to identify the underlying dimensionality of satisfaction under varying conditions of student ambivalence toward their online courses. Using overall satisfaction with the course, they classified students into five categories: negative non-ambivalent, negative ambivalent,
  • 41. ambivalent, positive ambivalent and positive non-ambivalent, corresponding with the 5-item Likert scale. By factoring the remaining items of the instrument stratified by those categories, they found the highest dimensionality for students in the ambivalent categories and the lowest dimensionality in the non- ambivalent classifications. The factors in the extreme categories (either positive or negative) were identical as were the factors in the positive ambivalent and negative ambivalent categories. The authors hypothesized that the students who appear to be least engaged in their courses (i.e., ambivalent) may be the most reflective and thoughtful about evaluating their educational experience. Psychological Contracts as a Basis for Understanding Satisfaction By definition, factor analysis studies imply latent dimensions– constructs that cannot be directly observed. Therefore, the underlying components identified in these kinds of studies relate closely to Argyris’s (1960) notion of a psychological contract. These contracts are formed by implicit understanding and are not bound by written or legal agreements of two parties within a reciprocal relationship. They consist of perceived obligations and expectations, and thus, are subjective and vary from person to person (Bordia, Hobman, Restubog, & Bordia, 2010). When broken, or breached, (due to perceived unfairness, inequality, or mistrust), satisfaction and performance decline and workforce turnover increases, consequently impacting attitudes and behaviors (Bordia et al., 2010). All workplace psychological contracts contain six features:
voluntary choice, mutual agreement, incompleteness, presence of numerous contract makers, plan for managing unsuccessful contract losses, and a relational model between employer and employee (Rousseau, 1990). Rousseau (1990) groups these six features into three contract types: relational, transactional, and balanced. Relational agreements are based on loyalty and stability and thus foster satisfaction (Raja, Johns & Ntalianis, 2004). Transactional agreements include fewer duties, are usually temporary or short in duration, and tend to produce behaviors that align with the contributions for which one is rewarded; these contributions are economic in nature and viewed as less significant. The balanced agreement is a hybrid: a generally open-ended contract that includes mutual concern between both parties but carries clear expectations (Raja et al., 2004). When analyzing and assessing psychological contracts, three forms of measurement are used: content-oriented, feature-oriented, and evaluation-oriented assessment. The content-oriented assessment examines the obligations of the agreement; the feature-oriented assessment compares one agreement to another based upon attributes of the contracts; and the evaluation-oriented assessment gauges the degree of fulfillment and the amount of change that results (Rousseau & Tijoriwala, 1998).

Psychological Contracts in Education

Although Argyris (1960) developed the theory of a psychological contract for the workplace, the idea has important implications for educational environments. Wade-Benzoni, Rousseau, and Li (2006),
for instance, contend that students view psychological contracts as a form of mutual exchange in the education process. The interactions between student and instructor are crucial and telling about ongoing perceptions of obligations (Wade-Benzoni et al., 2006). Often there is little to no explicit communication about these arrangements because they are informal and temporary. The power in the relationship within these contracts is predominately asymmetric, favoring faculty members who hold expectations about student performance and control resources such as grades, letters of recommendation, advice on careers, and, in some cases, dissertation research approval (Wade-Benzoni et al., 2006). Prior to viewing a syllabus, students begin to form expectations as they assess course offerings for academic development, decision-making input, challenges, feedback, and support (Spies et al., 2010). According to Spies et al. (2010), students pay close attention to the following categories: faculty, futuristic factors, student development, course and curricular content, learning opportunities, involvement, and facilities. These agreements tend to change and become more elaborate as the course progresses. Within a particular class, both students and faculty form a large number of contracts that present satisfaction challenges if, in the participants’ judgment, their implicit expectations are not met. This
  • 44. suggests that student satisfaction with online learning is, as Lakoff (1987) termed, a function of an idealized cognitive model—a construct fabricated to bring clarity and structure to a situation. Kahneman (2011) describes this thinking as “what you see is all there is.” Because of the complex interaction of these many constructs, however, student satisfaction with online learning appears to be an example of “there is much more than you can see directly.” The Survey and Sample The Research Initiative for Teaching Effectiveness (RITE) at the University of Central Florida (UCF) has been surveying UCF’s online students as part of an ongoing impact evaluation since 1996, when the university began offering Web courses. The longitudinal nature of the university’s examination of student attitudes has allowed for refinement and validation from the original survey. Ongoing evaluation allows researchers to accommodate rapid change in the online course environments and provide baseline data on items that may contribute to student satisfaction with these courses (Roberts, 2007). Response rates for online surveys are always a concern (Sax, Gilmartin & Bryant, 2003). The art of a student survey is the development of an instrument that addresses the target, yet requires a minimal time to complete. The current RITE instrument focuses specifically on the dynamics of student satisfaction with online learning and is presented in a 5-point Likert scale format, ranging from strongly agree to strongly disagree. Items related to the learning management system or to technology itself have
  • 45. been excluded in an effort to minimize survey fatigue. Survey items were validated by examining their psychometric properties from previous surveys in terms of central tendency, variability, skewness and kurtosis, looking for anomalies, and for their relevance to the current state of the online initiative. Once the original item pool was selected the survey was developed using Google Forms (https://drive.google.com). Students were sent an announcement through the UCF Knights student email platform. Student directions included the purpose of the study, their rights as survey participants, and contact information for both the survey administrators and the University’s Institutional Review Board. Students were advised that the survey was open only to undergraduates 18 years of age and older, and were reminded that the survey was voluntary. Students were free to omit any question they were not comfortable answering or stop at any time with no penalty. Students received no rewards for participation and there were no risks for non-participation. All student data remained anonymous when aggregated. Overall 1,217 surveys were returned. An examination of student responses indicated that 84% of students represented the millennial generation, 72% were female, and 76% were unmarried. Almost half of the responding students worked at least 10 hours per week and data reflected the ethnic demography of the university, with 70% of students having a grade point average of at least 3.0. Respondents were experienced with online learning—97% of students indicated taking at least one online course, with a median of five online courses. Students were predominately upperclassmen, with 83% of respondents being
juniors or seniors. The university has targeted the majority of its online offerings to the upper undergraduate level, thereby allowing for the transition of freshmen and sophomores to university life prior to extensive online learning. Our respondent sample of predominately upper undergraduates reflects this philosophy. Students who indicated they had not taken an online course were excluded from analyses, reducing the usable sample to 1,197 responses.

Methodology

Reliability and Domain Sampling

Prior to any analysis of the item responses collected in this sample, the psychometric quality of the information yielded by the instrument was assessed with validated techniques. Next, coefficient alpha (Cronbach, 1951) was used to determine the survey reliability. The psychometric sampling issue of how well the items comprise a reasonable sample from the domain of interest is an important aspect of analyzing constructs such as student satisfaction. Addressing this issue, Guttman (1953) developed a theoretical solution illustrating that the domain sampling properties of items improve when the inverse of the correlation matrix approaches a diagonal. Kaiser and Rice (1974) used this property to develop their measure of sampling adequacy. The index has an upper bound of one, with Kaiser offering decision rules for interpreting the value of the measure of sampling adequacy (MSA). If the value of the index is in
  • 47. the range .80 to .99, the investigator has evidence of an excellent domain sample. Values in the .70s signal an acceptable result, and those in the .60s indicate data that are unacceptable. MSA has been used for data assessment prior to the application of any factoring procedures. Computation of the MSA index gives the investigators a benchmark for the construct validity of the items. This procedure was recommended by Dziuban and Shirkey (1974) prior to any latent dimension analysis. An individual MSA for each variable gives the investigators an indication of whether or not a particular item belongs in the particular domain. Dimensionality of Student Responses The investigators sought to determine whether multiple dimensions underlie students’ satisfaction to their online learning environments. This is normally accomplished by the application of some version of the generalized factor analysis procedure. In this study the data were analyzed with Guttman’s (1954) image analysis. The procedure assumes that the data sets divide into two components. The first component is the portion of data that can be predicted from the remaining variables in the set (the image). The second component is the data that is not predictable from the remaining variables (the anti-image). The method is operationalized by predicting a standardized score on a variable for each individual from the remaining variables in the set. The image procedure derives underlying components found in the covariance matrix (the image matrix) of the standardized variables. The number of factors (components) retained in the final
  • 48. solution was determined by a procedure originally proposed by Dziuban and Shirkey (1993) and later validated by Hill (2011). The method involves the initial assessment of the dataset with the MSA followed by subsequent MSA computation on the matrix of partial correlations once the impact of the first, second, third etc. number of factors have been removed from the system. Once a value in the .60s has been reached, the majority of information from the system has been attained. The initial pattern matrix was transformed (rotated) according to the promax (Hendrickson & White, 1964) procedure. Pattern coefficients absolutely larger than .40 were used for interpretation purposes (Stevens, 2002). Once the final dimensionality of the data set was determined, factor scores for each subject in the sample were derived using the Anderson and Rubin (1956) method. These scores have a mean of zero and a standard deviation of one and are uncorrelated with each other. They also have a reasonably good relationship to the estimated factor validity. The final step in the handling of the data involved deriving a linear transformation of the standardized factor scores with T = (Z x 10) + 50 giving the scores a mean of 50 and standard deviation of 10 for ease of interpretation. The scores for each factor were used as dependent measures for a rescaled comparison variable related to overall online course satisfaction. Because the number of dissatisfied students was small, the comparison variable was declassified into satisfied, ambivalent, and dissatisfied and used as a factor in the hypothesis test. The investigators were concerned with trends and effect size differences among the dissatisfied (4%), ambivalent (5%), and satisfied (91%) groups
followed by Bonferroni post hoc comparisons (Hochberg, 1988).

Results

The promax transformed pattern matrix may be found in Table 1. The overall MSA for the variable set was .94, with an overall alpha reliability coefficient of .96. These values indicate excellent domain sampling and reliability. The individual MSAs indicate each item belongs to the family psychometrically. Upon extraction of three dimensions from the system using the Dziuban-Shirkey procedure, the MSA on the residual correlation matrix was .58, indicating that what remained in the system was essentially noise.

Table 1
Pattern Matrix for the Promax Transformed Image Analysis

Items                                                          Engaged Learning   Agency   Assessment   MSA
Generally, I am more engaged in my online courses                    .84            .04       -.07      .94
I have more opportunities to reflect on what I have
  learned in online courses                                          .79           -.05        .04      .94
Online learning helps me understand course material                  .76            .03        .05      .95
There are more opportunities to collaborate with other
  students in an online course                                       .67           -.14       -.03      .93
My online experience has increased my opportunity to
  access and use information                                         .66            .11        .06      .95
I am more likely to ask questions in an online course                .65           -.11        .01      .94
Generally, I understand course requirements better in an
  online course                                                      .64           -.09        .19      .96
Because of online courses, I am more likely to get a degree          .56            .09       -.03      .94
I can manage my own learning better in online courses                .54            .18        .17      .95
Take more online courses?                                            .47            .22        .04      .96
I am motivated to succeed                                           -.12            .56       -.03      .81
I have strong time management skills                                 .05            .53       -.07      .85
I am a multitasker                                                  -.05            .57        .05      .87
Assessment of my academic progress is more accurate in
  online courses                                                    -.19           -.04        .56      .92
I can more easily monitor my academic progress in online
  courses                                                            .14            .11        .51      .92
Response time from teachers and assistants is quicker in
  online courses                                                     .24           -.12        .43      .94

Overall MSA = .94   Residual MSA = .58   Average Item Correlation = .70   α = .96

From Table 1, the reader may observe that the first factor appears very general, covering a number of issues associated with online courses ranging from engagement through willingness to take another online course. However, upon closer examination, it is clear that what appears to be very general is quite specific in relation to what students evaluate in online courses. These elements include students’ abilities to engage, reflect, understand material, collaborate, find information, question, understand course requirements, manage their own learning, and increase opportunities for degree completion. This finding suggests students simultaneously evaluate multiple aspects of online courses to make decisions about their class experience. Furthermore, students may evaluate each element separately, especially when they are unsure of their satisfaction levels. We name this factor engaged learning (74% factor variance) and, in many respects, it conforms to Tolstoy’s (1878/2004) opening argument and Diamond’s (1999) contention that many elements must be accommodated if conditions are to be resolved satisfactorily. Conversely, any one or more of these elements might cause students to be less than satisfied with their educational experience. The second factor (17% factor variance) in the pattern matrix involves motivation, time management skills, and multitasking ability. This dimension
  • 52. suggests that students’ sense of agency—that is, students’ ability to initiate and control their own actions in the learning environment—plays a role in their satisfaction with their online learning experience. Students with a strong sense of agency assume responsibility for their learning and bring a sense of empowerment to their classes. Since the majority of students in this study indicated higher levels of satisfaction with online learning, we might reasonably assume they bring a higher sense of agency as well. Agency, however, may not be specifically related to course modality. The final factor (9% factor variance) depicts the manner in which the assessment process evolves in the online environment. Satisfied students are characterized by an ability to assess and monitor their progress, and indicate that a timely response by the instructor plays an important role in their satisfaction. Therefore, we find online students incorporate three dimensions into their evaluation process of online learning experiences: 1) engaged learning with various course elements, 2) a sense of agency, and 3) an efficient assessment of academic progress. The factor correlation matrix in Table 2 indicates that these student dimensions are highly and positively related in a generally satisfied population. This suggests that engaged learning, agency, and assessment factors form a highly interrelated lexicon for student satisfaction, with engaged learning most highly related to agency (r =.86) and agency most highly related to assessment (r =.77). Table 2 Factor Correlation Matrix
Factors             Engaged Learning   Agency
Agency                    .86
Assessment                .59            .77

The average correlation among the factors in Table 2 is .74 when computed by the method developed by Kaiser (1958). Table 3 contains the means, standard deviations, and significance levels for the three sets of factor scores for the declassified overall satisfaction variable. In addition, the table contains the pairwise combinations that proved significant on the Bonferroni comparison and the associated effect size calculated by the method outlined by Hedges and Olkin (1985). The factor scores for engaged learning and agency led to rejection of the null hypothesis; assessment did not. For engaged learning, dissatisfied versus ambivalent ratings produced an effect size of .53, dissatisfied versus satisfied ratings yielded a value of 2.01, and ambivalent versus satisfied ratings equaled 1.43. Bonferroni comparisons for the agency factor showed two significant differences, with dissatisfied versus satisfied ratings producing an effect size of 1.03, while ambivalent versus satisfied ratings yielded .77. Each of the above effect sizes is, by most standards, considered substantial.
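The effect sizes above are pooled-standard-deviation standardized mean differences (Hedges & Olkin, 1985). As an illustrative check, not part of the original study's analysis, the following minimal Python sketch recovers the engaged-learning values from the means, standard deviations, and group sizes reported for that factor; small last-digit discrepancies reflect rounding in the published table.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g: mean difference divided by the pooled standard deviation,
    with the small-sample bias correction of Hedges and Olkin (1985)."""
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    correction = 1 - 3 / (4 * df - 1)  # approaches 1 for large samples
    return correction * (mean2 - mean1) / pooled_sd

# Engaged-learning factor scores as (mean, SD, n) for each satisfaction group
dissatisfied = (33.63, 10.86, 46)
ambivalent = (38.92, 8.92, 56)
satisfied = (51.59, 8.86, 1016)

print(round(hedges_g(*dissatisfied, *satisfied), 2))   # ≈ 2.0 (reported as 2.01)
print(round(hedges_g(*ambivalent, *satisfied), 2))     # ≈ 1.43
print(round(hedges_g(*dissatisfied, *ambivalent), 2))  # ≈ 0.53
```

Because the satisfied group is so large, the bias correction barely changes the first two comparisons; it matters slightly for the small dissatisfied-versus-ambivalent contrast (df = 100).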
Table 3
Factor Score Difference by Overall Satisfaction

                      Dissatisfied (D)    Ambivalent (A)    Satisfied (S)
                      (n = 46)            (n = 56)          (n = 1016)
Factor Scores         Mean      S.D.      Mean      S.D.    Mean      S.D.     Sig.
Engaged Learning      33.63     10.86     38.92     8.92    51.59     8.86     .00
Agency                41.46     11.27     43.74     13.84   50.98     9.15     .00
Assessment            48.93     9.24      49.21     8.56    50.13     10.19    .63

Significant Bonferroni Pairwise Differences

Categories                     Engaged Learning   Agency   Assessment
Dissatisfied vs. Ambivalent          .00            ns         ns
Dissatisfied vs. Satisfied           .00            .00        ns
Ambivalent vs. Satisfied             .00            .00        ns

Effect Sizes - Hedges’ g

Categories                     Engaged Learning   Agency   Assessment
Dissatisfied vs. Ambivalent          0.53          0.18       0.03
Dissatisfied vs. Satisfied           2.01          1.03       0.12
Ambivalent vs. Satisfied             1.43          0.77       0.09

Limitations

There are a number of limitations in this study. Initially it
should be noted that the factors were derived from a one-time administration of the survey instrument during the semester. Therefore, the stability of the satisfaction factors over an entire semester has not been validated. Second, the study was conducted on individual item responses rather than scales. Although this has precedent in the literature, single items with presumed low reliability can be problematic in factor studies such as this because of their instability. Third, many aspects of exploratory factor analysis involve arbitrary decisions, for instance, the number of factors to extract, values for salience in the pattern matrix, rotational strategy, and naming the final dimensions. Fourth, online survey research using mass e-mailings to students has the possibility of introducing response bias into the data. This makes replication of studies much more difficult. Finally, although the investigators collected extensive demographic data on the responding students, there was no possibility for controlling for many of the student characteristics that might have influenced the results. This raises a more general limitation resulting from the ease with which survey instruments can be distributed in the electronic environment. This causes many students to suffer “survey fatigue” that can adversely impact response rates.

Conclusion

Student Satisfaction in the Online Environment

From its inception, the Sloan-Consortium (now the Online
Learning Consortium) established student satisfaction with online learning as one of its founding metaphoric pillars. In doing so, the organization demonstrated a commitment to the student voice as a component for understanding effective teaching and learning. This commitment by the Online Learning Consortium resulted in two decades of research devoted to understanding how students define excellence in their learning space. Satisfaction with online learning is becoming increasingly important in higher education for a number of reasons. The most important is the rapid adoption of this teaching and learning modality in colleges, universities, and community colleges across the country. However, another mediating issue is the growing sense of student agency in the educational process. Students can and do express their opinions about their educational experiences in formats ranging from end-of-course evaluation protocols to social networks of all varieties, making their voices more important than ever before.

Factor Studies

Online learning has redefined student satisfaction research. It has caused the education research community to reexamine traditionally held assumptions that learning primarily takes place within a metaphoric container called a “course.” In reviewing the studies that address student satisfaction from a factor analytic perspective, one point becomes obvious: this is a complex system with very little agreement. Even the most recent factor analytic studies have done little to resolve the lack of consensus about the dimensions that underlie satisfaction with online learning. This appears to be the factor invariance problem in full bloom, where differing contexts
mediate how students relate to their learning experiences because a common prototype for online courses has been elusive at best. There exists the possibility that each course incorporates several unique characteristics that make it difficult to identify common factors that are robust across contexts. Although the results of these studies differ in how many and what dimensions constitute satisfaction, their unifying objective was the same: identify the underlying theoretical perspective of student perception of online learning. In addition, all of them subscribed to latent trait theory, recognizing that the important dimensions that students differentiate when they express their opinions about online learning are formed by the combination of the original items that cannot be directly observed—that which underlies student satisfaction.

Psychological Contracts as a Lens for Student Satisfaction

Very often theories developed in one discipline inform work in another area. We contend that this is the case with psychological contracts and the factors that define student satisfaction with online learning. The theory of psychological contracts explains employee satisfaction through the perspectives of expectations for the workplace and employee interactions. These contracts may be common across employees, for instance, safety on the job, or they may be unique to individual employees, such as promotion. The elements of the contract are implicit in that they are never formally stated, but they are assumed by the individual holding them to be mutually agreed upon between the employee and the employer. Of course, this may or may not be so. Most importantly, a violation of the psychological
contract, either real or perceived, by either party, leads to workplace dissatisfaction. In factor analytic studies, items about student satisfaction with online learning correspond to the formation of a psychological contract. The survey responses are reconfigured into a smaller number of latent (non-observable) dimensions that are never really articulated by the students, but are, nonetheless, fully expected to be satisfied. Of course, instructors have contracts for students as well. Studies such as this identify the student psychological contract after the fact, not prior to the class; however, nothing prevents both from happening and/or a comparison of the two. The prior contract might be identified before the class and the satisfaction factors after the class. Table 4 depicts the relationship between online student satisfaction factors and the features of a psychological contract specified in the literature. Each factor translates into how it might be expressed in the student voice, followed by a corresponding contract feature and an assessment strategy. Engaged learning, the largest contributor to the factor pattern, indicates that students expect instructors to adopt a facilitative role in their teaching. This dimension corresponds to the relational contract where the learning environment is stable and organized with a clearly delineated path to success. Assessment in this situation is evaluation oriented, indexing fulfillment and change (i.e., students achieving learning outcomes).
The second factor, agency, characterizes satisfied students who recognize their abilities and accomplishments in a balanced contract arrangement that is assessed by the degree of agreement between them and the instructor (feature oriented). The final factor, assessment, corresponds to the transactional contract with its evaluation determined by the degree to which the obligations of the course have been met (content oriented). Although they have been developed in different contexts, workplace contracts and student satisfaction factors are similar. Both attempt to explain the underlying cause of satisfaction or lack thereof. Both are general and nonspecific, becoming more complex as the job, class, or classes evolve over time. They are limited in their scope and at best index a kind of average perception of the workplace or educational environment. Rarely are employees fully satisfied with their jobs or students completely satisfied with their classes. However, both contracts and factors frame blueprints for developing instructional design strategies that can improve learning climates. Online learning has unbundled the classroom, and the same technologies have done precisely the same to the workplace: no longer is either bound by physical space. Perhaps in a more traditional time, psychological contracts (predispositions) and student satisfaction elements (post dispositions) were somewhat separate in their context and orientation. However, it seems clear that information and instructional technologies are migrating into the same orientation space. This makes the questions “What did you expect on your way in?” and “Now that you
are finished, how was your experience?" part of the same climate assessment paradigm. By coalescing factors and psychological contracts, we might gain insights into more effective learning environments that are not possible when each theory is considered separately. Blending the two takes the best features of both and results in something entirely new, something more than can be seen in either theory alone.

Table 4
Correspondence Between Satisfaction and Psychological Contracts

Student Satisfaction Factor   Student Voice                                  Contract Feature   Contract Assessment
Engaged Learning              "Facilitate my learning"                       Relational         Evaluation Oriented
Agency                        "Recognize my abilities and accomplishments"   Balanced           Feature Oriented
Assessment                    "Let me know where I stand"                    Transactional      Content Oriented

References

Abrami, P. C., & d'Apollonia, S. (1991). Multidimensional students' evaluations of teaching effectiveness—generalizability of "N=1" research: Comment on Marsh. Journal of Educational Psychology, 83(3), 411-415. doi: 10.1037/0022-0663.83.3.411
Akdemir, O., & Koszalka, T. A. (2008). Investigating the relationships among instructional strategies and learning styles in online environments. Computers & Education, 50(4), 1451-1461. doi: 10.1016/j.compedu.2007.01.004 Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Newburyport, MA: Sloan Consortium. Anderson, T. W., & Rubin, H. (1956). Statistical inference in factor analysis. Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, 5, 111-150. Arbaugh, J. B. (2007). An empirical verification of the community of inquiry framework. Journal of Asynchronous Learning Networks, 11(1), 73-85. Arbaugh, J. B. (2001). How instructor immediacy behaviors affect student satisfaction and learning in web-based courses. Business Communication Quarterly, 64(4), 42-54. doi: 10.1177/108056990106400405 Argyris, C. (1960). Understanding organizational behavior. Homewood, IL: Dorsey. Bangert, A. W. (2006). Identifying factors underlying the quality of online teaching effectiveness: An exploratory study. Journal of Computing in Higher Education, 17(2), 79-99. doi: 10.1007/BF03032699 Bangert, A. W. (2008). The development and validation of the student evaluation of online teaching
  • 62. effectiveness. Computers in the Schools, 25(1), 35-47. doi: 10.1080/07380560802157717 Battalio, J. (2007). Interaction online: A reevaluation. Quarterly Review of Distance Education, 8(4), 339- 352. Bolliger, D. U. (2004). Key factors for determining student satisfaction in online courses. International Journal on E-Learning, 3(1), 61-67. Bordelon, D. E. (2012). Where have we been? Where are we going? The evolution of American higher education. Procedia-Social and Behavioral Sciences, 55(5), 100- 105. doi: 10.1016/j.sbspro.2012.09.483 Bordia, S., Hobman, E. V., Resubog, S. L. D., & Bordia, P. (2010). Advisor-student relationship in business education project collaborations: A psychological contract perspective. Journal of Applied Social Psychology, 40(9), 2360-2386. doi: 10.1111/j.1559-1816.2010.00662.x Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. Cambridge, MA: The MIT Press. Carnaghan, C., & Webb, A. (2007). Investigating the effects of group response systems on student satisfaction, learning, and engagement in accounting education. Issues in Accounting Education, 22(3), 391-409. doi: http://dx.doi.org/10.2308/iace.2007.22.3.391 Craig S. C. & Martinez M. D. (2005). Ambivalence and the
structure of political opinion. New York: Palgrave Macmillan. Cronbach, L. J. (1951). Coefficient alpha and the internal structure of tests. Psychometrika, 16(3), 297-334. doi:10.1007/BF02310555 Diamond, J. (1999). Guns, germs, and steel: The fates of human societies. New York: W. W. Norton & Company, Inc. Dziuban, C., & Dziuban, J. (1998). Reactive behavior patterns in the classroom. Journal of Staff, Program & Organization Development, 15(2), 85-31. Dziuban, C., McMartin, F., Morgan, G., Morrill, J., Moskal, P., & Wolf, A. (2013). Examining student information seeking behaviors in higher education. Journal of Information Fluency, 2(1), 36-54. Dziuban, C., & Moskal, P. (2011). A course is a course is a course: Factor invariance in student evaluation of online, blended and face-to-face learning environments. The Internet and Higher Education, 14(4), 236-241. doi: 10.1016/j.iheduc.2011.05.003 Dziuban, C., Moskal, P., Brophy-Ellison, J., & Shea, P. (2007). Student satisfaction with asynchronous learning. Journal of Asynchronous Learning Networks, 11(1), 87-95. Dziuban, C. D., Moskal, P. D., & Dziuban, E. K. (2000).
  • 64. Reactive behavior patterns go online. The Journal of Staff, Program & Organizational Development, 17(3), 155-179. Dziuban, C.D., Moskal, P.D., Juge, F., Truman-Davis, B., Sorg, S. & Hartman, J. (2003). Developing a web-based instructional program in a metropolitan university. In B. Geibert & S. H. Harvey (Eds.), Web-wise learning: Wisdom from the field (pp. 47-81). Philadelphia, PA: Xlibris Publications. Dziuban, C., Moskal, P., Kramer, L., & Thompson, J. (2013). Student satisfaction with online learning in the presence of ambivalence: Looking for the will-o’-the-wisp. Internet and Higher Education, 17, 1-8. doi: 10.1016/j.iheduc.2012.08.001 Dziuban, C. D., & Shirkey, E. C. (1974). When is a correlation matrix appropriate for factor analysis? Some decision rules. Psychological Bulletin, 81(6), 358-361. doi: 10.1037/h0036316 Dziuban, C. D., & Shirkey, E. C. (November, 1993). S.D. 50—A sequential psychometric criterion for the number of common factors. Presented at The Annual Conference for Florida Educational Research Association, Destin, Florida. Feldman, K. A. (1976). The superior college teacher from the student’s view. Research in Higher Education, 5, 243-288. doi: 10.1007/BF00991967 Feldman, K. A. (1993). College students’ views of male and female college teachers: Part II— evidence from students’ evaluation of their classroom teachers. Research
in Higher Education, 34(2), 151-191. doi:10.1007/BF00992161 Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87-105. doi: 10.1016/S1096-7516(00)00016-6 González-Gómez, F., Guardiola, J., Martín Rodríguez, Ó., & Montero Alonso, M. Á. (2012). Gender differences in e-learning satisfaction. Computers & Education, 58(1), 283-290. doi: 10.1016/j.compedu.2011.08.017 Greenwald, A. G., & Gilmore, G. M. (1997). Grading leniency is a removable contaminant of student ratings. American Psychologist, 52(11), 1209-1217. doi: 10.1037/0003-066X.52.11.1209 Guttman, L. (1953). Image theory for the structure of quantitative variates. Psychometrika, 18, 277-296. doi:10.1007/BF02289264 Guttman, L. (1954). Some necessary conditions for common factor analysis. Psychometrika, 19, 149-161. doi:10.1007/BF02289162 Hedges, L. V., & Olkin, I. (1985). Statistical methodology in meta-analysis. San Diego, CA: Academic Press. Hendrickson, A. E., & White, P. O. (1964). Promax: A quick method for rotation to oblique simple structure. British Journal of Statistical Psychology, 17(1), 65-70. doi: 10.1111/j.2044-8317.1964.tb00244.x Hill, B. D. (2011). The sequential Kaiser-Meyer-Olkin procedure as an alternative for determining the number of factors in common-factor analysis: A Monte Carlo simulation (Doctoral dissertation, Oklahoma State University). Hochberg, Y. (1988). A sharper Bonferroni procedure for multiple tests of significance. Biometrika, 75(4), 800-802. doi:10.1093/biomet/75.4.800 Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus and Giroux. Kaiser, H. F. (1958). The varimax criterion for analytic rotation in factor analysis. Psychometrika, 23(3), 187-200. doi:10.1007/BF02289233 Kaiser, H., & Caffrey, J. (1965). Alpha factor analysis. Psychometrika, 30(1), 1-14. doi:10.1007/BF02289743 Kaiser, H. F., & Rice, J. (1974). Little jiffy, mark IV. Educational and Psychological Measurement, 34(1), 111-117. doi: 10.1177/001316447403400115 Ke, F., & Kwak, D. (2013). Constructs of student-centered online learning on learning satisfaction of a diverse online student body: A structural equation modeling approach. Journal of Educational
  • 67. Computing Research, 48(1), 97-122. doi: 10.2190/EC.48.1.e Keengwe, J., Diteeyont, W., & Lawson-Body, A. (2012). Student and instructor satisfaction with e- learning tools in online learning environments. International Journal of Information and Communication Technology Education (IJICTE), 8(1), 76-86. doi:10.4018/jicte.2012010108 Kim, C., Damewood, E., & Hodge, N. (2000). Professor attitude: Its effect on teaching evaluations. Journal of Management Education, 24(4), 458-473. doi:10.1177/105256290002400405 Kuo, Y. C., Walker, A. E., Belland, B. R., & Schroder, K. E. (2013). A predictive study of student satisfaction in online education programs. The International Review of Research in Open and Distance Learning, 14(1), 16-39. Lakoff, G. (1987). Women, fire, and dangerous things. Chicago: University of Chicago Press. Liu, G. Z., & Hwang, G. J. (2010). A key step to understanding paradigm shifts in e-learning: towards context-aware ubiquitous learning. British Journal of Educational Technology, 41(2), E1-E9. doi: 10.1111/j.1467-8535.2009.00976. Long, W. A. (2011). Your predictable adolescent. Charleston, SC: BookSurge Publishing. Mahmood, A., Mahmood, S. T., & Malik, A. B. (2012). A comparative study of student satisfaction level in distance learning and live classroom at higher education level. Turkish Online Journal of
  • 68. Distance Education (TOJDE), 13(1), 128-136. Marsh, H. W., & Roche, L.A. (1997). Making students’ evaluations of teaching effectiveness effective: The critical issues of validity, bias, and utility. American Psychologist, 52(11), 1187-1197. doi: 10.1037/0003-066X.52.11.1187 McKeachie, W.J. (1997). Student ratings: The validity of use. American Psychologist, 52(11), 1218-1225. Norberg, A., Dziuban, C. D., & Moskal, P. D. (2011). A time- based blended learning model. On the Horizon, 19(3), 207-216. doi: 10.1108/10748121111163913 Raja, U., Johns, G., & Ntalianis, F. (2004). The impact of personality on psychological contracts. The Academy of Management Journal, 47(3), 350-367. doi: 10.2307/20159586 Richardson, J. C., & Swan, K. (2003). Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Journal of Asynchronous Learning Networks, 7(1), 68-88. Roberts, C. (2007). The unnatural history of the sea. Washington DC: Island Press. Rousseau, D. M. (1990). Normative beliefs in fund-raising organizations linking culture to organizational performance and individual responses. Group & Organization Management, 15(4), 448-460. doi: 10.1177/105960119001500408 Rousseau, D. M. & Tijoriwala, S. A. (1998). Assessing psychological contracts: Issues, alternatives and measures. Journal of Organizational Behavior, 19, 679-695. doi:
10.1002/(SICI)1099-1379(1998)19:1+<679::AID-JOB971>3.0.CO;2-N Rubin, B., Fernandes, R., & Avgerinou, M. D. (2013). The effects of technology on the community of inquiry and satisfaction with online courses. The Internet and Higher Education, 17, 48-57. doi: 10.1016/j.iheduc.2012.09.006 Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education, 44(4), 409-432. doi:10.1023/A:1024232915870 Shirky, C. (2010). Cognitive surplus: Creativity and generosity in a connected age. New York: Penguin. Shirky, C. (2008). Here comes everybody: The power of organizing without organizations. New York: Penguin. Spies, A. R., Wilkin, N. E., Bentley, J. P., Bouldin, A. S., Wilson, M. C., & Holmes, E. R. (2010). Instrument to measure psychological contract violation in pharmacy students. American Journal of Pharmaceutical Education, 74(6), 1-11. Stevens, J. P. (2002). Applied multivariate statistics for the social sciences (4th ed.). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
  • 70. Stewart, L., Hong, E., & Strudler, N. (2004). Development and validation of an instrument for student evaluation of the quality of web-based instruction. The American Journal of Distance Education, 18(3), 131-150. doi: 10.1207/s15389286ajde1803_2 Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance education, 22(2), 306- 331. doi:10.1080/0158791010220208 Tolstoy, L. (2004). Anna Karenina. (R. Pevear & L. Volokhonsky, Trans.). New York, NY: Penguin. (Original work published 1878). Wade-Benzoni, K. A., Rousseau, D. M., & Li, M. (2006). Managing relationships across generations of academics: Psychological contracts in faculty-doctoral student collaborations. International Journal of Conflict Management, 17(1), 4-33. doi: 10.1108/10444060610734154 Wang, M. C., Dziuban, C. D., Cook, I. J., & Moskal, P. D. (2009). Dr. Fox rocks: Using data-mining techniques to examine student ratings of instruction. In M. C. Shelley, L. D. Yore, & B. Hand (Eds.), Quality research in literacy and science education: International perspectives and gold standards (pp. 383-398). Dordrecht, Netherlands: Springer. doi:10.1007/978-1-4020-8427-0_19 Watts, D. J. (2011). Everything is obvious. New York: Crown Publishing Group, Random House. Weigert, A. J. (1991). Mixed emotions: Certain steps toward understanding ambivalence. Albany: State
University of New York Press. Young, B. R., & Dziuban, E. (2000). Understanding dependency and passivity: Reactive behavior patterns in writing centers. Writing Center Journal, 21(1), 67-87.

Online Instruction, E-Learning, and Student Satisfaction: A Three Year Study

Michele T. Cole, Daniel J. Shelley, and Louis B. Swartz
Robert Morris University, United States

Abstract
  • 72. This article presents the results of a three-year study of graduate and undergraduate students’ level of satisfaction with online instruction at one university. The study expands on earlier research into student satisfaction with e- learning. Researchers conducted a series of surveys over eight academic terms. Five hundred and fifty-three students participated in the study. Responses were consistent throughout, although there were some differences noted in the level of student satisfaction with their experience. There were no statistically significant differences in the level of satisfaction based on gender, age, or level of study. Overall, students rated their online instruction as moderately satisfactory, with hybrid or partially online courses rated as somewhat more satisfactory than fully online courses. “Convenience” was the most cited reason for satisfaction. “Lack of interaction” was the most cited reason for dissatisfaction. Preferences for hybrid courses surfaced in the responses to an open-ended question asking what made the experience with online or partially online courses satisfactory or unsatisfactory. This study’s findings support the literature to date and reinforce the significance of student satisfaction to student retention. Keywords: E-learning; instructional design; online education; student retention; student satisfaction
  • 73. Online Instruction, E-Learning, and Student Satisfaction: A Three Year Study Cole, Shelley, and Swartz Vol 15 | No 6 Creative Commons Attribution 4.0 International License Dec/14 112 Introduction In their ten-year study of the nature and extent of online education in the United States, Allen and Seaman (2013) found that interest on the part of universities and colleges in online education shows no sign of abating. Online education continues to expand at a rate faster than traditional campus-based programs. The authors reported the number of students enrolled in at least one online course to be at an all- time high of 32% of all enrollments in participating institutions, representing an increase of 570,000 students from the previous year. Allen and Seaman also found that 77% of university leaders responding to the survey rated learning outcomes to be the same, if not better, with online education when compared with face-to-face learning.
Their results support the no significant difference phenomenon that Russell (1999) found in his comparative study of student learning in the online and traditional classroom environments. Acknowledging that learning outcomes are equivalent, the question of how satisfied students are with their experiences with e-learning persists. This is important from the standpoint of student retention, which is, of course, relevant to enrollment and maintaining institutional revenue streams. Also, analysis of student satisfaction may point to improvements in e-learning practices which in turn could improve outcomes.

Literature Review

The Allen and Seaman (2013) report looked at online education, including the growing presence of massive open online courses (MOOCs), from the institutional perspective, not from the student's. In their report, the authors noted that the remaining barriers to widespread acceptance of online education were lack of faculty and employer acceptance, lack of student discipline, and low retention rates. Of these, student retention in online programs is particularly relevant to the discussion of student satisfaction with their online experience. Reinforcing the instructor's role in designing satisfying online curricula, Kransow (2013) posited that if students were satisfied with their online experiences, they would be more likely to remain in the program.
Kransow (2013) poses a critical question for instructors working in the online environment. How can online courses be designed to maximize student satisfaction as well as student motivation, performance and persistence? Drawing on the literature, Kransow emphasizes the importance of building a sense of community in the online environment. Yet, building an online community that fosters student satisfaction involves strategies that go beyond facilitating interaction with course components. Building community also requires, among other elements, interaction with each other, that is, between student and instructor and among students in the course. Sher (2009), in his study of the role such interactions play in student learning in a Web-based environment, found interaction between student and instructor and among students to be significant factors in student satisfaction and learning. Interaction—between the student and the instructor, among students, and with course content and technology—was the focus of Strachota's (2003) study of student satisfaction with distance education. In her study, learner-content interaction ranked first as a determinant of student satisfaction, followed by learner-instructor and learner-technology interaction. Interaction between and among students was not found to be significantly correlated with satisfaction. Bollinger (2004) found three constructs to be important in measuring student satisfaction with online courses: interactivity, instructor variables and issues with technology. Palmer and Holt (2009) found that a student's comfort level with technology was critical to satisfaction with online courses. Secondary factors included clarity of expectations and the student's self-assessment of how well they were doing in the online environment. Drennan, Kennedy, and Pisarski (2005) also found positive perceptions of technology to be one of two key attributes of student satisfaction. The second was autonomous and innovative learning styles. Richardson and Swan (2003) focused on the relationship of social presence in online learning to satisfaction with the instructor. They found a positive correlation between students' perceptions of social presence and their perceptions of learning and satisfaction. For Sahin (2007), the strongest predictor
  • 77. of student satisfaction was personal relevance (linkage of course content with personal experience), followed by instructor support, active learning and, lastly, authentic learning (real-life problem-solving). Kleinman (2005) looked at improving instructional design to maximize active learning and interaction in online courses. Over a period of ten years, Kleinman studied online communities of learning, concluding that an online environment which fosters active, engaged learning and which provides the interactive support necessary to help students understand what is expected, leads to a satisfied learning community. Swan (2001), too, found that interactivity was essential to designing online courses that positively affect student satisfaction. Wang (2003) argued that to truly measure student satisfaction researchers must first assess the effectiveness of online education. Online education represents a major shift in how people learn and in turn, how learners are taught. The argument is made that, therefore, there is an increasing need to understand what contributes to student satisfaction with online learning (Sinclaire, 2011). Student satisfaction is one of several variables influencing the success of online learning programs, along with the institutional factors that Abel (2005) listed in his article on best practices (leadership, faculty commitment, student support, and technology). Sener and Humbert (2003) maintained that
satisfaction is a vital element in creating a successful online program. There have been a number of studies of student satisfaction with e-learning (Swan, 2001; Shelley, Swartz, & Cole, 2008, 2007), fully online as well as with blended learning models (Lim, Morris, & Kupritz, 2007). There have also been a number of studies by Arbaugh and associates on the predictors of student satisfaction with online learning (Arbaugh, 2000; Arbaugh, & Benbunan-Fich, 2006; Arbaugh, et al., 2009; Arbaugh, & Rau, 2007). Results from this study both support and expand on earlier work. Discussion about the role that MOOCs are destined to play in higher education (Deneen, 2013; Shirky, 2013) serves to heighten educators' interest in providing quality online courses that maximize student satisfaction. The controversy over granting credit for
  • 79. MOOC courses (Huckabee, 2013; Jacobs, 2013; Kolowich, 2013a; Kolowich, 2013b; Kolowich, 2013c; Lewin, 2013; Pappano, 2012) reinforces the relevance of student satisfaction to successful online education. This study reports on research into student satisfaction with online education conducted over three years. The research has focused largely on business students at one university in Southwestern Pennsylvania. The emphasis on student satisfaction with e-learning and online instruction is increasingly relevant for curriculum development which in turn is relevant for student retention. Understanding what makes online instruction and e-learning satisfactory helps to inform instructional design. This study is an extension of previous research on student satisfaction with online education (Cole, Shelley, & Swartz, 2013, Swartz, Cole, & Shelley, 2010, Shelley, Swartz, & Cole, 2008, 2007). Researchers used a multi-item survey instrument to assess how well student expectations were met in selected online courses. Graduate and undergraduate students were asked first whether they were satisfied with their experience with e-learning. Following that, they were asked to explain what made the experience satisfactory or unsatisfactory. Student satisfaction is defined as “the learner’s perceived value of their educational experiences in an educational setting” (Bollinger & Erichsen, 2013, p. 5).
Research Questions

This study focused on two survey questions:

1. Please rate your level of satisfaction with the online and/or partially online courses you have taken.
2. What made your experience with the online course/s satisfactory or unsatisfactory?

Both survey questions were broken into two separate questions for purposes of analysis, resulting in four research questions:

1. How satisfied were students with their fully online courses?
2. How satisfied were students with their partially online courses?
3. What factors contributed to students' satisfaction with e-learning?
4. What factors contributed to students' dissatisfaction with e-learning?

This paper presents the results of that analysis.

Method

Researchers used a Web-based survey created in Vovici, an online survey software program. Following a pilot study in spring 2010, surveys were sent to students in graduate and undergraduate business courses over a period of three years. Researchers used a mixed-method analysis to evaluate responses to the selected questions. Descriptive statistics were used to summarize demographic data and survey responses. Results were transferred from Vovici to, and combined in, SPSS to analyze the first two research questions. Independent samples t-tests were conducted on the scaled items. Keyword analysis was used for the third and fourth research questions. The survey was anonymous. Students in each of the business classes were offered extra credit for taking the survey. Credit was given based on notification to the instructor by the student. The same instructor taught each of the 19 courses in the second and third study samples as well as the business courses included in the initial study. The initial survey instrument was approved by the University's
Institutional Review Board in 2010. Subsequent modifications to the survey were minor and did not require separate approvals in 2011/2012 or 2012/2013. The same script was used to seek participation in each of the surveys. Participation was solicited via an e-mail from the instructor. Each e-mail included the link to the Web-based survey developed in Vovici. Data from the completed surveys were transferred from Vovici into SPSS. Independent samples t-tests were conducted on the questions asking students to rate their level of satisfaction with online learning. Responses from males and females, "Generation X" and "Generation Y," and from graduate and undergraduate students were compared to determine if there were any statistically significant differences in the level of satisfaction with online and partially online courses. Responses to the question asking what contributed to the respondents' satisfaction or dissatisfaction with online learning were tabulated in Vovici. To analyze these responses, researchers grouped keywords under themes to form categories. The categories were: convenience, interaction, structure, learning style, and platform. "Interaction" included "communication." "Structure" included "clarity" and "instructor's role." "Other" was included to capture responses that did not fall into any of the stated categories.

Sample and Participant Selection
The sample from the pilot study in spring, 2010 included graduate students from the MS in Instructional Technology and the MS in Nonprofit Management programs, undergraduate business majors, and Masters of Business Administration (MBA) students. No changes to the survey design were indicated as a result of the pilot study. The second study was conducted over three terms, summer, 2010, fall, 2010, and spring, 2011. This sample was composed of undergraduate students enrolled in Legal Environment of Business (BLAW 1050), taught in the fall 2010 term, and graduate students enrolled in Legal Issues of Executive Management (MBAD 6063), which was taught in the summer 2010 and spring 2011 terms.
  • 84. 2011, fall 2012, and spring 2013 terms and graduate students in MBAD 6063, taught in the spring 2012 and spring 2013 terms. Both the graduate and undergraduate business courses chosen for the study were taught by the same instructor. Thirty-three students participated in the spring 2010 survey, a response rate of 58%. One hundred and sixty-four students participated in the second study, a response rate of 92%. Three hundred and fifty-six students participated in the third study, a response rate of 97%. Combined, the total number of participants was 553 of 603 enrolled students, for a response rate of 92%. Twelve males and 21 females participated in the first survey. One hundred and three males and 61 females responded to the survey in the second study group. Two hundred and seventeen males and 135 females responded to the survey in the third study group for a total of 332 males (60.5%) and 217 females (39.5%) who participated in the surveys. Not all participants in the third sample responded to the question on gender. Participants were asked to identify themselves as belonging to one of the following age groups: • Traditional Workers (born before 1946) • Baby Boomers (born between 1946 and 1960),
  • 85. • Generation X (born between 1961 and 1979) and, • Generation Y (born after 1979) (Recursos Humanos, 2010). Eight participants identified themselves as belonging to the Baby Boomer or the Traditional Worker categories. Nine people checked “Other.” Three participants did not respond to the question on age. The remaining respondents self- identified as belonging to “Generation X” or “Generation Y.” Due to the limited sample sizes for “Baby Boomers” and “Traditional Workers,” only responses from participants in the Generation X and Generation Y categories were compared for this study. In the first survey, 22 respondents self-identified as members of “Generation Y.” Eleven respondents classified themselves as members of “Generation X.” In the second study group, 136 respondents self-identified as “Generation Y.” Twenty-two respondents self- identified as “Generation X.” In the third study group, 303 respondents self-identified as “Generation Y.” Thirty-nine respondents self-identified as “Generation X.” The total number of respondents who self-identified as belonging to “Generation Y” was 461. Seventy-two respondents self-identified as “Generation X.” The total number of respondents belonging to either “Generation X” or “Generation Y” was 533.
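As a quick sanity check, the participation and gender figures reported above can be recomputed in a few lines of Python. This is our own sketch, not part of the study; the per-study counts and the combined enrollment of 603 are taken directly from the text, and the variable names are ours.

```python
# Cross-check of the participation figures reported in the text.
# Counts are taken from the paper; everything else is derived arithmetic.
participants = {"pilot": 33, "second_study": 164, "third_study": 356}
total_participants = sum(participants.values())   # 553, as reported
total_enrolled = 603                              # reported combined enrollment

combined_rate = round(100 * total_participants / total_enrolled)  # 92 (%)

males, females = 332, 217
male_share = round(100 * males / (males + females), 1)            # 60.5 (%)
```

The recomputed values (553 participants, a 92% combined response rate, and a 60.5% male share) agree with the figures the paper reports.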
Two hundred and sixty graduate students participated in the surveys. Two hundred and eighty-one undergraduate students participated, for a total of 541. Some respondents did not identify themselves clearly as being either graduate or undergraduate students. Table 1 presents the respondents' demographic information.

Table 1
Respondent Sample Demographics*

Study   N (Response %)   Male   Female   Gen X   Gen Y   Grad   UG
I       33 (58%)          12      21      11      22      33     0
II      164 (92%)        103      61      22     136      89    73
III     356 (97%)        217     135      39     303     138   208
Total   553 (92%)        332     217      72     461     260   281

* Not all respondents answered each question on gender, age, or level of study.

Procedure

Responses to the two questions on student satisfaction from
three surveys, Designing Online Courses, Students’ Perceptions of Academic Integrity, and Enhancing Online Learning with Technology, provided the data for the analysis. Although the survey instruments used in the second and third studies were modified slightly to gather data for the studies on academic integrity and use of technology, each survey asked:
1. Please rate your level of satisfaction with the online and/or partially online courses you have taken.
2. What made your experience with the online course/s satisfactory or unsatisfactory?
Researchers used a 5-point Likert scale for the first survey question, asking students to rate their level of satisfaction with fully online and/or partially online courses. Zero was equal to “very satisfied;” four was equal to “very dissatisfied.” The second survey question was designed as a follow-up query, asking what contributed to the student’s satisfaction or dissatisfaction with online learning. To help inform the analysis of responses to the research questions, researchers asked students how many online or partially online courses they had taken. To enable comparisons by gender, age group, and level of study, demographic questions were included in each of the surveys.
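Tabulating responses coded on this scale is straightforward; the sketch below shows the kind of count, percentage, and mean-score summary the results sections report. Only the 0–4 coding scheme comes from the article; the response list is hypothetical.

```python
from collections import Counter

# Tabulating 5-point Likert responses coded as in the surveys
# (0 = very satisfied ... 4 = very dissatisfied).
LABELS = ["very satisfied", "satisfied", "neutral",
          "dissatisfied", "very dissatisfied"]

responses = [0, 1, 1, 2, 0, 3, 1, 2, 4, 1]  # hypothetical coded answers

counts = Counter(responses)
n = len(responses)
for code, label in enumerate(LABELS):
    pct = 100 * counts[code] / n
    print(f"{label:<18} {counts[code]:>3} ({pct:.1f}%)")

mean_score = sum(responses) / n   # lower score = more satisfied
print(f"mean satisfaction score: {mean_score:.2f}")
```

Note that because 0 anchors “very satisfied,” a lower mean indicates a more satisfied group — the direction in which the subgroup means below should be read.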
Designing Online Courses was administered in the spring 2010 term. The survey was composed of 12 questions. Students’ Perceptions of Academic Integrity was conducted in the summer 2010, fall 2010, and spring 2011 terms. This survey was composed of 13 questions. The third survey, Enhancing Online Learning with Technology, was composed of 12 questions. This survey was administered in the fall 2011, spring 2012, fall 2012, and spring 2013 terms.

Results

The first survey question sought to capture respondents’ level of experience with e-learning. In the first two studies, students were asked if they had taken or were taking one or more fully online graduate courses, partially online graduate courses, fully online undergraduate courses, and/or partially online undergraduate
courses. Responses from both studies were combined for analysis. There were 198 student responses. Since the response categories were not mutually exclusive, a student could select more than one response. Some students had taken both graduate and undergraduate-level fully online and/or partially online courses. As a result, the total number of responses to the question (255) exceeds the number of respondents (198). Table 2 presents the results.

Table 2
Level of Experience with E-Learning – Studies I & II
(Response count N = 255; percentages based on N = 198 respondents)

Response                                                   Count      %
As a graduate student in fully online courses                 65   32.8
As an undergraduate student in fully online courses           28   14.1
As a graduate student in partially online courses             73   36.8
As an undergraduate student in partially online courses       50   25.2
As a student taking courses outside of a degree program        5    2.5
None                                                          24   12.1
Other                                                         10    5.0

Elaboration on “other” included four instances of some experience with online courses that did not fit the categories in the question, and two references to having had online assignments. Four were unresponsive to the question. The question asking for the respondent’s level of experience with online or partially online courses was phrased differently in the third study. In the final surveys (from fall, 2011,
spring, 2012, fall, 2012, and spring, 2013), researchers asked how many fully or partially online courses the student had taken. There were 391 responses. Students could choose only one response. Table 3 illustrates the results.

Table 3
Level of Experience with E-Learning – Study III

Responses               Count (N = 391)   % of Responses
1 course                      89               22.7
2-4 courses                  154               39.3
5-10 courses                  56               14.3
More than 10 courses          20                5.1
None                          35                8.9
Other                         37                9.4

RQ1: How satisfied were students with their fully online courses?

In a two-part survey question, students were asked to rate their level of satisfaction with fully online courses taken and with partially online courses taken. Students could respond to either part of the question or to both. To the first part, level of satisfaction with fully online courses, there were 472 responses, 85% of the total 553 participants. A 5-point Likert scale was used to measure responses ranging from 0 (very satisfied) to 4 (very dissatisfied). One hundred and six students, or 22.5% of the total responding, said that they were “very satisfied.” One hundred and seventy-one (36.2%) said that they
were “satisfied.” One hundred and twenty-six (26.7%) were “neutral.” Fifty-one (10.8%) said that they were “dissatisfied.” Eighteen (3.8%) respondents were “very dissatisfied” with their experience with fully online courses.

Independent samples t-tests were conducted on this question to determine if there were any significant differences in the levels of satisfaction based on gender, age, or level of study with regard to satisfaction with fully online learning. There were no statistically significant differences between males and females, between members of “Generation X” and “Generation Y,” or between graduate and undergraduate students on the question. Females, members of Generation X, and upper-level undergraduate students were more likely than males, members of Generation Y, and graduate students to rate their experiences with fully online courses as satisfactory. The mean score for females was 1.31; the mean score for males was 1.41. The mean score for members of Generation X was 1.24; the mean score for members of Generation Y was 1.40. The mean score for upper-level undergraduate students was 1.19; the mean score for graduate students was 1.23. Table 4 presents the results.

Table 4
Student Satisfaction with Fully Online Courses

Variable          n     M       t       Sig. (2-tailed)
Female           177   1.31   -1.052        .293
Male             294   1.41
Generation X      63   1.24    -.989        .326
Generation Y     396   1.40
Undergraduates    26   1.19    -.146        .884
Grad. Students   105   1.23

RQ2: How satisfied were students with their partially online courses?

There were 420 responses, 76% of the total 553 participants, to the second part of the question asking students to rate their level of satisfaction with partially online courses. The same 5-point Likert scale was used to measure both parts. Ninety-nine students, or 23.6% of the total responding, said that they were “very satisfied.” One hundred and thirty-six (32.4%) said that they were “satisfied.” One hundred
and thirty-seven (32.6%) were “neutral.” Forty-three (10.2%) said that they were “dissatisfied.” Five students (1.2%) said that they were “very dissatisfied” with their experience with partially online courses.

Independent samples t-tests were conducted on this question to determine if there were any significant differences in the levels of satisfaction based on gender, age, or level of study with regard to satisfaction with partially online learning. As with the first research question, there were no statistically significant differences between males and females, between members of “Generation X” and “Generation Y,” or between graduate and undergraduate students with regard to satisfaction with partially online courses. However, unlike satisfaction with fully online courses taken, males were somewhat more satisfied than females, and graduate students were more satisfied than upper-level undergraduates with partially online courses taken. The mean score for males was 1.32; for females, the mean was 1.34. The mean for graduate students was 1.11; for upper-level undergraduates, the mean was 1.35. As was the case with fully online courses, older students, members of Generation X, were more satisfied with their partially online courses than were members of Generation Y. The mean score for “Generation X” was 1.09; for “Generation Y,” the mean was 1.37. Table 5 presents the results.
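The independent samples t-tests used throughout these comparisons can be sketched in a few lines. This is a pooled-variance version of the test under the usual equal-variance assumption; the article does not say which software or variant the authors used, and the two groups of Likert scores below are hypothetical (the article reports only group sizes, means, t values, and p values).

```python
from statistics import mean, variance

# Pooled-variance independent-samples t-test, the kind of comparison
# the authors report for gender, generation, and level of study.
def independent_t(a, b):
    na, nb = len(a), len(b)
    pooled = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / (pooled * (1 / na + 1 / nb)) ** 0.5
    df = na + nb - 2
    return t, df  # look up the two-tailed p in a t table or stats package

group_x = [0, 1, 1, 2, 1, 0, 2, 1]  # hypothetical Generation X ratings
group_y = [1, 2, 1, 3, 2, 1, 0, 2]  # hypothetical Generation Y ratings

t, df = independent_t(group_x, group_y)
print(f"t = {t:.3f} on {df} degrees of freedom")
```

A negative t here simply means the first group’s mean rating is lower — i.e., more satisfied, given the 0-anchored coding — which matches how the negative t values in Table 4 should be read.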