Learning Assessment Systems That Work: A Follow-Up Report (2011)

Uploaded as Microsoft PowerPoint. Usage rights: CC Attribution License.
Presentation Transcript

    • LEARNING ASSESSMENT SYSTEMS THAT WORK! A FOLLOW-UP REPORT
      23rd Annual Conference, Denver, CO, October 28, 2011
      Presenters: Jeff Grann, George Lucas, Thayer Reed, Jennifer Stephens
      Moderator: Patricia Case
    • Capella University
      • Founded in 1993
      • Charter AQIP participant in 2000
      • Courses 100% online
      • For-profit university
      • Enrollment: 39,447 (average age 39): Doctoral 31%, Master's 47%, Baccalaureate 21%, Certificate 1%
      • Enrollment status: 94% part-time
      • Faculty: 1,398 (82% with doctoral degrees)
      © 2008 Capella University - Confidential - Do not distribute
    • Value Creation Model
    • Academic Program Definition
    • Online Course Design and Development
    • CRITERION LINK
    • Fully-Embedded Assessment Model
      • Extends outcome assessment throughout the program to develop actionable leading indicators.
      • Documents alignment of all scoring guide criteria to course competencies and program learning outcomes.
      • Projected university-wide completion by Q4 2012.
    • FEAM Worked Example: Principles of Organizational Communication
      Course Competencies:
      1. Analyze business communication situations.
      2. Analyze the interrelationships of communication within organizational systems.
      3. Communicate effectively.
      u03a1 Scoring Guide Criteria, Pre-FEAM:
      1. Provides Henry …
      2. Identifies Henry's educational/training needs in regard to communication competency.
      3. Compares the relationship between communication and motivation.
      4. Applies the theories of motivation.
      u03a1 Scoring Guide Criteria, Post-FEAM (aligned course competency in parentheses):
      1. Tailors coaching advice based on analysis of a direct report's motivational needs. (3)
      2. Validates approach relative to the theories of motivation. (1)
      3. Predicts stakeholder consequences of communication. (2)
      4. Identifies educational/training needs based on limited work sample. (1)
      5. Analyzes peer evaluations relative to communication-related competencies. (2)
    • Make Learning Matter Broadly
      • Implementing a fully-embedded assessment model (FEAM) has proven effective in promoting a coherent learning experience and faculty collaboration.
      • Students have a shared basis to view their learning experience as a long-term effort to demonstrate professionally relevant learning outcomes, not just a series of assignments and courses.
      • Faculty have a shared basis to establish a community of practice around their program's learning outcomes by sharing professional experiences, teaching practices, instructional strategies, performance expectations, technical innovations, and illustrative case studies.
    • WHO WE ARE: "Educating Those Who Serve"
      American Public University System has been dedicated to educating those who serve since our doors first opened in 1991. The American Public University System (APUS) is a fully online system encompassing both American Military University and American Public University. We are regionally accredited by the Higher Learning Commission (HLC) of the North Central Association.
      AMERICAN PUBLIC UNIVERSITY SYSTEM | EDUCATING THOSE WHO SERVE
    • 2011: Celebrating 20 Years of Service and Leadership
      Mission: To provide quality higher education with emphasis on educating the nation's military and public service communities by offering respected, relevant, accessible and affordable, student-focused online programs, which prepare them for service and leadership in a diverse, global society.
      History:
      • Founded as American Military University (AMU) in 1991 by James P. Etter, a Marine Corps Officer
      • Nationally Accredited (DETC) in 1995
      • In 2002, AMU expanded into the American Public University System (APUS), adding American Public University (APU)
      • Regionally Accredited (NCA) in 2006
      • Serving over 96,000 students worldwide
      Strategy:
      • Focus on quality and affordability
      • Graduate successful alumni who make important contributions to society and their professions
      • Enhance America's competitiveness by meaningfully increasing the population with college degrees
      • Collaborate with other universities to create solutions to U.S. higher education challenges
    • Data-Driven Culture
      "Quality begins on the inside . . . and then works its way out." – Bob Moawad
      • Internal and external benchmarking
      • Data tools for stakeholders: dashboards, program- and course-level fact books, data warehouse, student learning assessment reports
    • NATIONALLY BENCHMARKED INSTRUMENTS
      APUS utilizes a number of nationally benchmarked tests and validated instruments to measure student engagement, student satisfaction, and the achievement of student learning outcomes:
      • National Survey of Student Engagement
      • Community of Inquiry Survey
      • Proficiency Profile Tests
      • Major Field Tests
    • Direct and Indirect Measures of Assessment
    • Regular Dissemination of Data to Stakeholders: Dashboards
      "Experts often possess more data than judgment." – Colin Powell
    • Regular Dissemination of Data to Stakeholders: Course and Program Level Fact Books
      "In the spirit of science, there really is no such thing as a 'failed experiment.' Any test that yields valid data is a valid test." – Adam Savage
    • Student Learning Assessment Report: Alignment of Student Learning Outcomes
    • Student Learning Assessment Report: Culminating Experience
    • Student Learning Assessment Report: Key Course Embedded Assessments
    • Student Learning Assessment Report: Assessment Data
    • Student Learning Assessment Report: Checklist
    • Student Learning Assessment Report: Changes Based on Data
    • PROGRAM REVIEW PROCESS
      "Quality is never an accident; it is always the result of high intention, sincere effort, intelligent direction and skillful execution; it represents the wise choice of many." – William A. Foster
      "One of the great mistakes is to judge policies and programs by their intentions rather than their results." – Milton Friedman
      Review cycle: Data Collection → External Review → Program Review Meeting → Analysis → Three Year Plan → Follow-Up
    • Data Aggregation & Analysis
      Library and Learning Resources:
      • Course books
      • Electronic resources
      • Learning strategies
      Curriculum Assessment:
      • Student learning outcomes
      • Instructional strategies
      • Evaluation procedures
      External Reviewer Feedback:
      • Expert reviewer report
      • Industry Advisory Council report
      • Academic rigor
    • Data Aggregation & Analysis
      Faculty:
      • Analysis of faculty credentials and expertise to ensure breadth and diversity
      Students:
      • Student demographic information
      • Enrollment history
      • Growth trends
      Learning Outcomes Assessment:
      • Curricular mapping
      • Assessment measures
      • Fact books
      • Student learning assessment reports
    • Data Aggregation & Analysis
      Program Benchmarking:
      • Program benchmarking with similar programs and institutions
      Program Review Findings:
      • Evaluation of findings
      • Program recommendations
      • Three year proposed strategic plan
      Directory Summary:
      • Dean's observations
      • Meeting minutes
    • CONTACT INFORMATION
      Jennifer Stephens Helm, Ph.D., Vice President, Institutional Research and Assessment, jstephens@apus.edu
    • Kaplan University
      • Offering online programs since 2001
      • 58,000+ students online; 7,000+ students at campuses
      • For-profit
      • Certificates, associate's, bachelor's, and master's degrees
      • Incoming students have an average of four NCES-identified risk factors
      • 1,000+ courses
      • 4 tracks; 14+ term starts per year; overlapping terms
      • Outcomes-based institution
    • So We Do All This Assessment…
      Measuring activities is useful. Example: How many curriculum changes did we make this year?
      Assessing what is really going on is essential. Example: How many of those changes improved retention?
      Requirement: Sustainable infrastructure to support the assessment approach.
      Result: Meaningful analysis of data to generate actionable information.
    • Assessment-Related Infrastructure: How do we know if we are improving learning?
      Course Authoring (University):
      • Curriculum development
      • Structured content development and management
      • Centralized framework for consistency across courses
      • Content available in a variety of formats
      Course Delivery (Outside Vendor):
      • All course content
      Reporting and Analytics / Outcomes Package:
      • Data collection on outcomes
      • Real-time assessment data reporting to faculty
      • Consistency of outcome and rubric
      Outcome Repository:
      • Single source of truth for course outcomes and rubrics
      • Assignments mapped to outcomes and rubrics
      • Course-to-program mapping of outcomes
      • Course-to-course mapping
      • Versions of all outcomes and rubrics
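    The outcome repository described above can be pictured as a small data model. The sketch below is a hypothetical illustration, assuming a simple dictionary-backed store; the class, method, and identifier names (OutcomeRepository, "CM310-1", "PROG-Communication") are my own and not Kaplan's actual system. It shows the two mappings the slide names: assignments to course outcomes, and course outcomes to program outcomes.

    ```python
    from dataclasses import dataclass, field

    # Hypothetical sketch of an outcome repository: a single source of truth
    # mapping assignments to course outcomes and course outcomes to program
    # outcomes. All names are illustrative.
    @dataclass
    class OutcomeRepository:
        # course outcome id -> program outcome ids it supports
        course_to_program: dict = field(default_factory=dict)
        # assignment id -> course outcome ids it assesses
        assignment_to_outcomes: dict = field(default_factory=dict)

        def map_course_outcome(self, course_outcome, program_outcomes):
            self.course_to_program[course_outcome] = list(program_outcomes)

        def map_assignment(self, assignment, course_outcomes):
            self.assignment_to_outcomes[assignment] = list(course_outcomes)

        def program_outcomes_for_assignment(self, assignment):
            """Roll an assignment up to the program outcomes it evidences."""
            result = set()
            for co in self.assignment_to_outcomes.get(assignment, []):
                result.update(self.course_to_program.get(co, []))
            return result

    repo = OutcomeRepository()
    repo.map_course_outcome("CM310-1", ["PROG-Communication"])
    repo.map_course_outcome("CM310-2", ["PROG-Analysis"])
    repo.map_assignment("u03a1", ["CM310-1", "CM310-2"])
    print(repo.program_outcomes_for_assignment("u03a1"))
    ```

    A structure like this is what makes the slide's "real-time assessment data reporting" plausible: once every assignment is mapped, scoring data rolls up to program outcomes automatically.
    
    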
    • Sustainable Systems
    • Information at the Course Level
    • Information at the Institutional Level
      Example: Improvement as defined by improvement in outcomes achievement levels and student retention.
      Source: Reed, T. E., Levin, J., & Malandra, G. H. (2011). Closing the Assessment Loop by Design. Change: The Magazine of Higher Learning, 43(5), 44–52.
    • Assessment and Academic Quality at the Rockies
      George Lucas, Director of Academic Quality
    • University of the Rockies
      Private, independent graduate school of the social and behavioral sciences.
      Two schools: the School of Professional Psychology and the School of Organizational Leadership.
      Master's and doctoral degree programs, both on campus and online.
      Clinical specialization at the campus.
    • Academic Quality and Assessment at the Rockies are grounded in Learning Outcomes at three levels: Institutional, Programmatic, and Course.
    • Institutional: Our Mission
      Programmatic: Two Schools with Master's and Doctoral Programs
      Course: Our Curriculum and Syllabi
      Using established Learning Outcomes as a foundation, University of the Rockies has:
      • Acquired application tools and resources,
      • Incorporated and leveraged them in a coordinated manner,
      • Aligned our Learning Outcomes, and (this quarter)
      • Developed and launched a Comprehensive Assessment Plan
    • What are the steps we've taken to reach this point? What will we do in the coming months and in 2012?
    • Gates Model
    • Doctor of Psychology "Gates" Model for Assessment
      Gates, in sequence: Admission Process → Courses (early, mid, late) → Practicum 1 through 5 → Comprehensive Process → Dissertation → Graduation → Licensure → Post-Doc and Post-Internship
      Multiple assessment methods align with the student experience and learning context, including but not limited to holistic assessment, evaluations, grades, annual review, licensure exams, etc.
    • Gates Model / Curriculum Mapping
    • A successful and comprehensive Curriculum Development Process: Learning Outcomes are identified in all curriculum with our Development Template, Curriculum Mapping, and IRMA tables in …
    • Gates Model / Curriculum Mapping / "Convergence"
    • Convergence of:
      • Mapping, Outcomes Alignment
      • Usage Statistics
      • Quality Assurance, Curriculum Support
      • Integrated Rubrics, Holistic and Analytic Assessment
      • Faculty Development/Accreditation
      • Student Lifecycle
    • Gates Model / Curriculum Mapping / "Convergence" / Continuous Improvement
    • Cycle of Continuous Improvement
      Assessment that is "done right" never ends! (www.ala.org)
    • Gates Model / Curriculum Mapping / "Convergence" / Continuous Improvement
    • Fitting it all together…
    • Baseline Concepts of Our Assessment Strategy
      Quality and Assessment belong together. Our ongoing goal is to assess and continually improve institutional quality and improve the student experience.
      To continually improve, you must continually: Measure → Assess → Innovate → Evolve
    • University of the Rockies Assessment Plan
      Three dimensions:
      • Academic Achievement
      • Student Engagement
      • Faculty Engagement
      Seven (with potentially more) categories serving as data sources:
      • Surveys
      • Campus Vue
      • WayPoint Outcomes
      • eCollege
      • Library Writing Center
      • Curriculum Development
      • Quality Matters™
    • Assessment and Quality "Coding Table"
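    A "coding table" of this kind can be imagined as a simple cross-reference from data sources to assessment dimensions. The sketch below is a hypothetical illustration, assuming the sources and dimensions named on the previous slide; which dimension each source is coded to is my guess, not the university's actual table.

    ```python
    # Hypothetical coding table: tags each data source with the assessment
    # dimensions it informs. The dimension assignments are illustrative.
    CODING_TABLE = {
        "Surveys":                ["Academic Achievement", "Student Engagement", "Faculty Engagement"],
        "Campus Vue":             ["Academic Achievement"],
        "WayPoint Outcomes":      ["Academic Achievement"],
        "eCollege":               ["Academic Achievement", "Student Engagement"],
        "Library Writing Center": ["Student Engagement"],
        "Curriculum Development": ["Academic Achievement", "Faculty Engagement"],
        "Quality Matters":        ["Faculty Engagement"],
    }

    def sources_for(dimension):
        """List every data source coded to a given dimension."""
        return [src for src, dims in CODING_TABLE.items() if dimension in dims]

    print(sources_for("Faculty Engagement"))
    ```

    The value of such a table is that it makes coverage gaps visible: any dimension with few coded sources is a dimension the assessment plan is measuring thinly.
    
    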
    • Convergence of:
      • Mapping, Outcomes Alignment
      • Usage Statistics
      • Quality Assurance, Curriculum Support
      • Integrated Rubrics, Holistic and Analytic Assessment
      • Faculty Development/Accreditation
      • Student Lifecycle
    • LEARNING ASSESSMENT SYSTEMS THAT WORK! A FOLLOW-UP REPORT
      23rd Annual Conference, Denver, CO, October 28, 2011
      Presenters: Jeff Grann, George Lucas, Thayer Reed, Jennifer Stephens
      Moderator: Patricia Case