1. THE RACE TO CAPTURE EXPERIENTIAL LEARNING AND COMPETENCY-BASED EDUCATION (CBE)
NCES STATS Conference
July 10, 2015
Joellen Shendy, UMUC
Jeff Alderson, Eduventures
Patrick Elliott, UMUC/PESC
2. PRESENTERS
P20W Education Standards Council (PESC) members:
● Joellen Shendy, Associate Vice Provost and University Registrar, University of Maryland University College
● Jeff Alderson, Principal Analyst, Enterprise Software, Eduventures
● Patrick Elliott, Associate Registrar and PESC Board Member, University of Maryland University College
3. WHAT IS CBE?
● Competency-Based Education
● No standard definition
● CBE covers multiple approaches:
o Course-based vs. direct assessment
o All-you-can-learn and self-paced
o Prior learning assessments, co-ops, and credit by exam
Source: http://flex.wisconsin.edu
4. SO REALLY, WHAT IS CBE?
● A broad term covering multiple approaches
● Common characteristics:
o Change in pedagogy: “Not the sage on the stage, but the guide on the side.”
o Focus on assessment as a means to demonstrate learning
o Values what you know over where you obtained the knowledge
5. WHY IS CBE IMPORTANT (GENERAL)?
● Accessible
o Recognizes what a student already knows
o Not rigid to term structure
● Flexible
o Pace may vary
● Transparent
o Extended transcripts can provide deeper information
● Affordable
o Viewed as a potential solution to reduce the cost of education
● Relevant
o Industry-relevant competencies that can be aligned to workforce needs
6. WHY IS CBE IMPORTANT TO YOU?
● Different data
● Redefines key components of education
● Current systems and processes do not adequately support CBE
7. OTHER ISSUES
● Financial aid and other funding sources (VA, etc.)
● Regional accreditation guidelines recently released
● Relatively new, with a lack of benchmarks
8. WHAT IS EXPERIENTIAL LEARNING?
Experiential learning is learning that occurs through experience:
● Internships
● Clubs and Activities
● Study Abroad
● Research
● Service
9. WHY IS EXP. LEARNING IMPORTANT?
Experiential learning captures unique elements of the student's collegiate experience that are not usually documented on an official transcript, helping to form a more holistic view of the student.
● Distinguishes the student or graduate from other students for employment or further education pursuits
● Demonstrates knowledge, skills, and abilities that may not be apparent through other means
● Recognizes contributions and achievements outside of academic coursework
● May help brand an institution with distinct elements
10. WHY IS EXP. LEARNING IMPORTANT?
● Capturing and recording experiential learning provides additional insight into the learner’s knowledge, skills, and abilities for both learner and institution
● Supplements traditional academic work
● Brands the institution, providing a unique value-add
● Helps establish lifelong learning
11. CBE MODELS
● Course Connected
o CBE is delivered within courses that are connected to time- and credit-based metrics (Carnegie model)
● Direct Assessment
o CBE is delivered by assessment (projects, simulations, papers, etc.) absent of time and credit constraints
o Subscription model
o Must meet all competencies
● Hybrid Model
o CBE may be delivered by a mix of models, including direct assessment and credit-based learning
o Program transition hybrids
12. WHAT DOES THE DATA LOOK LIKE (CBE)?
● Different than current models
o Some CBE will retain elements of typical structures (may map to courses or carry units and a grade)
o CBE allows multiple attempts to satisfy a requirement rather than the “one and done” model we have now
● More expansive, detailed, and individualized than a typical course
o CBE structures may dive deeper than a typical learning outcome
o Artifacts and assessments may be more granular
o No fixed relation between the number of competencies and a particular program (no magic 120 credit hours happening)
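The points above can be sketched as a minimal competency record. This is an illustrative data-structure sketch, not any published standard: the class and field names are assumptions, and it shows how multiple assessment attempts (rather than a single grade) can satisfy a requirement.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a CBE competency record; names are illustrative,
# not drawn from CEDS, IMS Global, or any vendor schema.
@dataclass
class AssessmentAttempt:
    artifact: str   # evidence submitted, e.g. a project or paper
    result: str     # "mastered" or "not yet" rather than a letter grade

@dataclass
class CompetencyRecord:
    competency_id: str
    statement: str
    attempts: List[AssessmentAttempt] = field(default_factory=list)

    def is_satisfied(self) -> bool:
        # Multiple attempts are allowed; any successful attempt
        # satisfies the requirement (no "one and done").
        return any(a.result == "mastered" for a in self.attempts)

rec = CompetencyRecord("COMP-101", "Communicate effectively in writing")
rec.attempts.append(AssessmentAttempt("essay-v1", "not yet"))
rec.attempts.append(AssessmentAttempt("essay-v2", "mastered"))
print(rec.is_satisfied())  # True
```

Note there is no credit-hour or term field at all: the record is tied to evidence of mastery, not to time or place.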
13. WHAT DOES THE DATA LOOK LIKE (CBE)?
● CBE is not supported within the majority of current system structures
o The issues are pace and place
▪ Not tied to “time”
▪ Not tied to place: recognition that learning can occur anywhere
o How to capture data that does not fit within the current structural framework?
▪ Manual processes
▪ Categories are problematic
14. WHAT DOES THE DATA LOOK LIKE (EXP. LEARNING)?
Experiential learning complements traditional academic courses but may not be assessed the same way:
● No grades
● Different categories
● Not typically housed in a single system; when it is, it is usually not in the SIS where regular records are stored
● Verification of the learning can be official or unofficial, which poses issues if the data needs to be trusted
15. WHAT IS HAPPENING IN THE INDUSTRY?
● Which vendors are collecting, transferring, and storing data today?
o Ellucian: acquired the Helix LMS, focused on CBE; mapping LMS data models to ERP/SIS structures
o LoudCloud: artifacts stored locally to the LMS, using LTI/SSO to send data to SIS/assessment systems
o Civitas: data agnostic; mapping, aligning, and transforming data from multiple sources, with write-back to the SIS
o Kaplan University: Learning Outcome Manager and Repository; 1:1 credit hour equivalencies
o D2L/Brightspace: Adaptive Learning Engine and assessments integrated into the LMS
● Trends:
o LMS and assessment vendors continue to store learning outcomes locally in their own databases
o The LTI standard is not robust enough to transport outcomes data back to the SIS
o Caliper is an emerging standard for learning events, but it is not robust enough for entire objects to be transferred
16. WHERE DO WE THINK IT IS GOING?
● Alignment of CEDS elements to CBE applicability
o Summary of John Milam’s work on the current state of standards
o Q) Why is this important? A) The basic framework exists to save R&D dollars; don’t reinvent the wheel for portability of data between institutions for degree audit, transfer, and employment verification
o Issues for CBE standardization fall into three categories:
▪ Technology
▪ Policy
▪ Taxonomy
● Emergent need for transfer, portability, audit, mapping, and system integration?
● All major for-profit institutions are pursuing homegrown or vendor-led CBE data systems
● All CBE-focused LMS vendors
● Standards forums to share best practices around technology, policy, and taxonomy
17. ANALYSIS OF CEDS 5.0 FOR CBE IMPACT
• 232 distinct data elements of high CBE interest for review:
o Persons
o Programs
o Awards
o Learning Resources
o Learning Objectives
o Learning Experiences
o Competencies
o Verifications
o Rubrics
• However, CEDS illustrates how a course-based approach is outdated
• Taxonomies related to competency frameworks are still needed for CEDS and CBE systems, beyond CIP, SOC, and NAICS
18. CBE DOCUMENTS AND SYSTEMS
Extended Transcripts
● Provide greater evidence of a student’s learning journey
● IMS Global Work Group on CBE Data and Extended Transcript (eT)
o Development of CBE data standards for institutional systems
o Prototype and provide open-source code for a CBE digital eT
● A few sample schools providing them:
o Kaplan University (CBE)
o Capella University (CBE)
o Northern Arizona University (CBE)
o University of Wisconsin (CBE)
o Elon University (Experiential)
o Stanford University (Outcomes)
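The extended-transcript idea above can be sketched as a single document mixing record types. This is a hedged illustration only: the keys and values below are assumptions for the sketch and are not taken from the IMS Global eT specification.

```python
import json

# Illustrative extended-transcript entry combining traditional coursework,
# competency evidence, and experiential learning in one portable record.
# All field names are hypothetical, not from any published eT schema.
extended_transcript = {
    "student": {"id": "S0001", "name": "Jane Doe"},
    "entries": [
        {"type": "course", "code": "WRTG 101", "credits": 3, "grade": "A"},
        {"type": "competency", "id": "COMP-101",
         "statement": "Communicate effectively in writing",
         "status": "mastered", "evidence": ["essay-v2"]},
        {"type": "experiential", "category": "internship",
         "description": "Semester internship, verified by sponsor",
         "verified": True},
    ],
}

# Serializing keeps all three record types in one trusted, shareable document.
print(json.dumps(extended_transcript, indent=2))
```

The point of the sketch is the mix: grades, mastery evidence, and verified experiences sit side by side, which is exactly what a traditional transcript cannot show.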
19. CALL TO ACTION!
● Stakeholders need to join the conversations
● Participate in workgroups and the PESC task force:
o Tom Black, Stanford University, Chair
o Joellen Shendy, UMUC, Co-Chair
o Alex Jackl, Bardic Systems, Co-Chair
● For more info, contact jennifer.kim@pesc.org or visit http://www.pesc.org/interior.php?page_id=244
● What questions does the NCES group have? Takeaways to be addressed and followed up on.