Adaptive Courseware Vendor
Selection and Engagement
Karen Vignare, PhD, MBA
CEO, KV Consulting
kvignare@gmail.com
Agenda
• Introductions
• Vendor Engagement
• Selection
• University team
• Project Management
• Setting Expectations
• Changing Culture and Building Buy-In
• Iterating and Scaling
Karen Vignare, PhD, MBA
• 20 years' experience with online learning and emerging learning technologies
• Started online programs, worked with international universities
• Managed blended learning, MOOCs, and adaptive learning projects
• Extensive research portfolio
• More about me on LinkedIn: https://www.linkedin.com/in/karen-vignare
Selection process
• Short and simple…mission fit
• Tools exist to help make choices (Thanks Tyton and EdSurge)
• Efficacy Research exists (Thanks SRI)
• Many of you already have vendor partners
• Multiple tools could be useful if you can make that work
What We Did Not Have
• Lists of companies and categorization of tool type
• Sources: Tyton Partners; https://www.edsurge.com/higher-ed/special-projects/dl/orientation; http://coursewareincontext.org/
Adaptive Technology Integration Comparisons
(Next Generation Digital Learning Environment, NGDLE, vs. Courseware in Context Framework reported pilots)
• Interoperability and Integration: NGDLE, latest-version LTI compliance; reported pilots, most report easy interoperability
• Personalization: NGDLE, critical within the system; reported pilots, no clear standard match, so reports focus more on reported outcome success
• Analytics, Advising, and Learning Assessment: NGDLE, Caliper and LTI; reported pilots, needs improvement, as data exchange is not widely used (yet)
• Collaboration: currently not as important
• Accessibility and Universal Design: NGDLE, critical to scale; reported pilots, major concerns (not just a vendor issue)
The Situation before Pilots at UMUC
• Moving to all OER/free resources
• Needed LTI (latest version) for integration with BrightSpace (D2L)
• Preferred platform tools over others
• Involved in previous grants: Gates Foundation PLN, Breakthrough Models Incubators, and Digital Courseware
Evaluation Process
• Inputs → Outputs → Outcomes → Analyze → Iterate
Research Questions, Variables, and Analyses
1. Do completion rates differ by condition?
   Independent variable: condition (treatment vs. control); outcome measure: course completion; analysis: chi-square
2. Does student achievement differ by condition?
   Independent variable: condition (treatment vs. control); outcome measures: homework, project, midterm exam, final exam, and final course grades; analysis: t-test
3. Do students view the adaptive learning system as an effective learning tool?
   Measure: student survey; analysis: descriptives
4. Is there a relationship between time spent studying in the adaptive learning system and student achievement?
   Measures: total login time and final course grades; analysis: correlation
Research Questions, Variables, and Analyses (continued)
• Is there a relationship between the level of faculty engagement and student achievement?
  Independent variable: faculty engagement indicators; outcome measure: final course grades; analysis: correlation
• Is there a relationship between response time effort (RTE) and student achievement?
  Independent variable: RTE; outcome measures: quiz grades and homework grades; analysis: correlation
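The tests named above map onto standard statistical calls. Below is a minimal sketch, not UMUC's actual evaluation code, of how the chi-square, t-test, and correlation analyses might be run; the file name pilot_outcomes.csv and the column names (condition, completed, final_grade, total_login_minutes) are hypothetical.

```python
# Illustrative analysis sketch; data file and column names are assumptions.
import pandas as pd
from scipy import stats

df = pd.read_csv("pilot_outcomes.csv")  # hypothetical per-student export

# RQ1: Do completion rates differ by condition? (chi-square on a 2x2 table)
completion = pd.crosstab(df["condition"], df["completed"])
chi2, p_rq1, dof, _ = stats.chi2_contingency(completion)

# RQ2: Does achievement differ by condition? (t-test on final course grades)
treatment = df.loc[df["condition"] == "treatment", "final_grade"].dropna()
control = df.loc[df["condition"] == "control", "final_grade"].dropna()
t_stat, p_rq2 = stats.ttest_ind(treatment, control, equal_var=False)

# RQ4: Is time in the adaptive system related to achievement? (correlation)
paired = df[["total_login_minutes", "final_grade"]].dropna()
r, p_rq4 = stats.pearsonr(paired["total_login_minutes"], paired["final_grade"])

print(f"RQ1 chi-square:  chi2={chi2:.2f}, p={p_rq1:.3f}")
print(f"RQ2 t-test:      t={t_stat:.2f}, p={p_rq2:.3f}")
print(f"RQ4 correlation: r={r:.2f}, p={p_rq4:.3f}")
```

Welch's unequal-variance t-test is used here as a conservative default; the survey descriptives (RQ3) and the faculty engagement and RTE correlations would follow the same pattern with their own columns.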
Evaluations
• Randomized controlled trials are the gold standard
• But there are always variables that can’t be completely controlled in
education
• Repeated findings are important BUT
• It may not be that easy (Or it may)
• Looking for the silver bullet could deter leveraging the tool in new
ways
• Most technologies require more data/students to be effective
Resources: SRI, https://www.sri.com/work/projects/adaptive-learning-market-acceleration-program
University Selection Team
• Center for Innovation in Learning & Student Success
• Learning Design & Solutions
• Information Technology
• Undergraduate School representative
• Graduate School representative
• Procurement
Timing of Pilot Initiation
• Initial vendor discussions fall 2014
• Multiple demos, online and on site
• Both continuing and new grant activities were concurrent
• Goal to start in Summer 2015, especially with a design in place
• Timeline: Search & Demos (Fall 2014) → Evaluation (Winter 2015) → Contracting (S/S 2015) → Launch (Summer 2015)
Pilots Chosen
• A mix of courses identified by analytics as needing improvement (see the sketch after this list)
• Program chairs' interest/willingness
• Student support/services
• Grant obligations
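The "identified by analytics" criterion above can be as simple as filtering institutional outcome data for large, low-success courses. The sketch below is illustrative only: the file course_outcomes.csv, the columns (course_id, enrolled, dfw_rate), and the thresholds are assumptions, not UMUC's actual selection query.

```python
# Hypothetical screen for pilot candidates: high enrollment, high D/F/withdraw rate.
import pandas as pd

courses = pd.read_csv("course_outcomes.csv")  # assumed columns: course_id, enrolled, dfw_rate

# Placeholder thresholds; tune to your institution's definition of "needs improvement".
candidates = courses[(courses["enrolled"] >= 200) & (courses["dfw_rate"] >= 0.30)]
print(candidates.sort_values("dfw_rate", ascending=False)[["course_id", "enrolled", "dfw_rate"]])
```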
Vendors Selected at UMUC
Summer 2015 Launch
• Cogbooks
• RealizeIT
• EdReady
Grants
• OLI (CMU, started in 2009)
• OLI (Stanford), potentially with Open EdX (2015)
• Lumen Learning (2015)
Project Management
• Developed a workflow and estimated the types of personnel needed
• Used consultant from earlier grant to get documents & sites prepared
• Instructional Design team dedicated one person as project manager
• Instructional Design team dedicated three designers (not full-time)
• Information Technology dedicated one person
• Innovation team included three people
• This group then broke into the multiple projects which included
faculty and subject matter experts
• Established a steering committee
Project Manager
• Led documentation efforts (extensive at UMUC)
• Identified with lead ID vendor PM
• Set up training/meetings
• Updated documentation
Instructional Designers
• Learn technologies
• Identify existing content
• Work with IT on integration
• Set up pilot processes
• Work with faculty to design course map and content flow
• Work with vendors' ID staff
• Work with faculty to improve course design (more content &
activities)
Information Technology
• LTI integration with D2L (see the sketch after this list)
• Solve back-end issues
• Work with IDs whenever vendor meetings included technical
architecture/software discussions
• Integrate data into data warehouse
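To make the LTI integration bullet concrete: in an LTI 1.1 launch (the common version at the time), the LMS posts form parameters signed with OAuth 1.0 HMAC-SHA1, and the tool side verifies that signature with a shared consumer secret. The sketch below uses only the Python standard library and is a minimal illustration of that check, not UMUC's or D2L's integration code; the launch URL handling and secret management are assumptions.

```python
# Minimal sketch of verifying an LTI 1.1 launch signature (OAuth 1.0 HMAC-SHA1).
import base64
import hashlib
import hmac
from urllib.parse import quote


def _encode(value: str) -> str:
    # Percent-encode per RFC 3986 (only unreserved characters left bare).
    return quote(str(value), safe="~")


def expected_signature(method: str, launch_url: str, params: dict, consumer_secret: str) -> str:
    # launch_url is the base URL without query string; params are the posted form fields.
    pairs = sorted((_encode(k), _encode(v)) for k, v in params.items() if k != "oauth_signature")
    param_string = "&".join(f"{k}={v}" for k, v in pairs)
    base_string = "&".join([method.upper(), _encode(launch_url), _encode(param_string)])
    key = _encode(consumer_secret) + "&"  # LTI 1.1 uses no token secret
    digest = hmac.new(key.encode(), base_string.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


def launch_is_valid(method: str, launch_url: str, params: dict, consumer_secret: str) -> bool:
    sent = params.get("oauth_signature", "")
    return hmac.compare_digest(expected_signature(method, launch_url, params, consumer_secret), sent)
```

In production this check would also validate oauth_timestamp and oauth_nonce to prevent replay, and newer integrations would use LTI 1.3 (OIDC/JWT) instead.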
Center for Innovation in Learning
• Build shared knowledge base
• Responsible for scope & contracts
• Project Owner
• Co-manage with ID project progress
• Run Steering Committee & share progress (transparently)
• Report to senior academics
• Evaluations
• Presentations
Academics
• Assigned by program chairs
• Generally an adjunct and/or collegiate (full-time)
• Adjuncts were paid by contract
• Collegiates in lieu of teaching
• Build MORE content, assessments
Vendor
• Stay transparent
• Agree on definitions/taxonomy so you are on the same page
• Agree to the schedule but also remember delays happen
• Try to find others working with the vendor (create a community)
• If you don’t get answers, escalate
Challenges
• Staff consistency
• Project manager left university
• IDs needed to be transferred as other projects had ebbs & flows
• Vendor challenges (with above & some other issues)
• Projects chosen
• Often relied on a single SME/faculty
• Projects chosen needed to be high priority instead of having
competing priorities (OER, redesign)
• NEEDED more content than expected
Setting Expectations
• Be Transparent with everyone
• Be CLEAR about hypothesis and why this pilot
• Meet with team (even virtually) regularly
• Meet with vendor regularly
• Set evaluation expectations: by this date we expect X
• Be clear about the funding and where it is being spent
• Keep senior leadership informed
• Share your progress with the community
Iterating and Scaling
• Test and Learn
• There is often a vicious cycle of not having final evaluations in time to
iterate before the next launch
• Plan your schedule to allow for appropriate redesign
• Scaling is nonetheless critical for testing
Changing the Culture
Pisano, G. (2015). You Need an Innovation Strategy. Harvard
Business Review. June 2015
Changing the Culture
• Universities are not very good at scaling effective practices
• What are your innovation/piloting goals: research, student success, redesigned teaching? (no wrong answers)
• You need to build the case post grant
• You will need to create an innovation pipeline
• Leverage the tools in the community to get beyond one-off projects
Lessons Learned
The future of adaptive learning in higher education depends on the
commitment level of universities
• Preparation...really take the time to understand the power of the tool, take
a class
• Skill up faculty, instructional staff and technology team
• Pick your course(s) based on solving problems (is the content difficult, would more student practice help, does immediacy help students?)
• Build content maps; you will need more content than you currently have (unless you use pre-packaged content)
• Learn to use the dashboard
• Iterate probably at least three times....
Questions
Karen Vignare, PhD, MBA
kvignare@gmail.com
240.462.2160
