WPU Scholarship Day 2013

  1. It Can’t Be Done: Assessing Educational Quality Across Institutions
     Corbin M. Campbell, Assistant Professor
     Theresa Cruz Paul, Research Team Member
     Higher and Postsecondary Education Program, Teachers College, Columbia University
  2. Advisory Board Members
     • Jennifer Glaser, Director of Student Services, Fairfax High School
     • Dr. Wendell Hall, Deputy Director, Institute for Higher Education Policy
     • Dr. Karen Inkelas, Director, Center for the Advanced Study of Teaching and Learning, University of Virginia
     • Dr. Christine Keller, Executive Director, Voluntary System of Accountability; Associate Vice President for Academic Affairs, Association of Public and Land-grant Universities
     • Sharon La Voy, Director of Assessment, University of Maryland
     • Dr. Jennifer Lindholm, Special Asst. to the Dean & Accreditation Coordinator, University of California Los Angeles
     • Dr. George Mehaffy, Vice President for Academic Leadership and Change, American Assoc. of State Colleges and Universities
     • Dr. Jessica Mislevy, Research Social Scientist, SRI International
     • Dr. Daniel Newhart, Senior Researcher & Associate Director, Center for the Study of Student Life, The Ohio State University
     • Dr. Anna Neumann, Professor of Higher Education, Teachers College, Columbia University
     • Dr. KerryAnn O’Meara, Associate Professor of Higher Education, University of Maryland, College Park
     • Dr. Aaron Pallas, Professor of Sociology and Education, Teachers College, Columbia University
     • Dr. Stephen Porter, Professor of Higher Education, North Carolina State University
     • Dr. Priscilla Wohlstetter, Distinguished Research Professor, Teachers College, Columbia University
  3. Claim
     There is currently no comprehensive way for the public and prospective students and families to know about the quality of the education happening inside the walls of a college or university, or how that quality compares to the quality at other colleges and universities.
  4. The Black Box of Higher Education
     Extensive accountability data is largely unseen by the public:
     • Accreditation
     • Collegiate Learning Assessment (CLA)
     • Course-based learning outcomes
     • Course evaluations
  5. Impacts of the Black Box [& Rising Costs]
     • Spellings Commission
     • Academically Adrift (Arum & Roksa, 2010)
     • Numerous articles in top newspapers raising questions about academic rigor in higher education
     • 12/10/12: “Who will hold colleges accountable?” (Carey, NYT, p. A27)
     • “Affront on Higher Education”
  6. How do prospective students, parents, and the public decide which institution has the highest-quality education?
     • US News & World Report rankings
     • Other ranking venues: Princeton Review
     • Word of mouth: family, friends, high school guidance counselors
     • Reputation
     [Newer, but underused, sources: NSSE? VSA/College Portrait?]
  7. US News & World Report’s Formula
     1. Graduation & retention rates (20%)
        • Average graduation rate: 80%
        • Average freshman retention rate: 20%
     2. Financial resources (10%)
        • Average educational expenditure per student: 100%
     3. Alumni giving (5%)
     4. Graduation rate performance (5%)
     5. Peer assessment (25%)
     6. Student selectivity (15%)
        • Acceptance rate: 10%
        • High school ranking: 40%
        • SAT/ACT scores: 50%
     7. Faculty resources (20%)
        • Faculty compensation: 35%
        • % faculty with top terminal degrees: 15%
        • Percent full-time faculty: 5%
        • Student/faculty ratio: 5%
        • Class size, 1-19 students: 30%
        • Class size, 50 or more: 10%
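     To make the weighting concrete, here is a minimal sketch of how a composite score like this is assembled. Only the category and sub-indicator weights come from the slide; the indicator values for the fictional "Example U" and the assumption that each indicator is pre-normalized to a 0-100 scale are made up purely for illustration.

     ```python
     # Minimal sketch of a weighted composite ranking score.
     # Weights are the ones listed on the slide; all indicator values below
     # are hypothetical and assumed to be pre-normalized to a 0-100 scale.

     CATEGORY_WEIGHTS = {
         "graduation_retention": 0.20,
         "financial_resources": 0.10,
         "alumni_giving": 0.05,
         "graduation_rate_performance": 0.05,
         "peer_assessment": 0.25,
         "student_selectivity": 0.15,
         "faculty_resources": 0.20,
     }

     # Sub-indicator weights within each category (categories without
     # sub-indicators are scored directly).
     SUB_WEIGHTS = {
         "graduation_retention": {"graduation_rate": 0.80, "freshman_retention": 0.20},
         "financial_resources": {"expenditure_per_student": 1.00},
         "student_selectivity": {"acceptance_rate": 0.10, "hs_ranking": 0.40, "sat_act": 0.50},
         "faculty_resources": {
             "compensation": 0.35, "terminal_degrees": 0.15, "full_time": 0.05,
             "student_faculty_ratio": 0.05, "small_classes": 0.30, "large_classes": 0.10,
         },
     }

     def composite_score(scores: dict) -> float:
         """Combine normalized indicator scores (0-100) into one weighted number."""
         total = 0.0
         for category, weight in CATEGORY_WEIGHTS.items():
             value = scores[category]
             if category in SUB_WEIGHTS:
                 # Weighted average of the category's sub-indicators.
                 value = sum(SUB_WEIGHTS[category][k] * value[k] for k in SUB_WEIGHTS[category])
             total += weight * value
         return total

     # Hypothetical, pre-normalized scores for a fictional "Example U".
     example_u = {
         "graduation_retention": {"graduation_rate": 85, "freshman_retention": 90},
         "financial_resources": {"expenditure_per_student": 70},
         "alumni_giving": 40,
         "graduation_rate_performance": 75,
         "peer_assessment": 60,
         "student_selectivity": {"acceptance_rate": 55, "hs_ranking": 65, "sat_act": 72},
         "faculty_resources": {
             "compensation": 68, "terminal_degrees": 80, "full_time": 90,
             "student_faculty_ratio": 60, "small_classes": 50, "large_classes": 85,
         },
     }

     print(f"Composite score: {composite_score(example_u):.1f}")
     ```

     Note that none of these inputs directly measures teaching quality or academic rigor, which is the gap the following slides take up.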
  8. Possible unintended consequences of past and current measures
     Policy: No Child Left Behind (and the Collegiate Learning Assessment, CLA)
     • Teaching to the test
     • Altering curriculum
     Public: US News & World Report
     • Students and parents use rank instead of fit to select which college to attend
     Institutions: US News & World Report
     • Mission creep/striving
     • Over-reliance on SAT (GWU, for example)
     • Numbers manipulation: Cornell removed non-graduates from the alumni list
  9. Questions Absent in These Measures
     • What is the level of academic rigor?
     • What is the quality of teaching?
     • What are the educational practices that an institution employs that affect student learning?
     [Maybe a few examples of rankings that use surveys to measure these items, for example, Princeton Review]
  10. Why are these questions absent?
      • These data are difficult to obtain!
      • These data are expensive to obtain!
      • Colleges and universities are protective of academic freedom and are insular with data about the educational core
  11. Enter NSSE, CLA, VSA/College Portrait
      The National Survey of Student Engagement (NSSE): surveys institutions about their effective educational practices
      CLA: measures students’ critical thinking skills pre- and post-college via a standardized test
      Voluntary System of Accountability (VSA)/College Portrait: created to ward off externally imposed and mandated higher education testing; aimed at public transparency of higher education, it compiles several data sources (NSSE, CLA, graduation/retention)
      PROBLEMS:
      1) Concerns with validity
      2) Reliance on a single data collection method
      3) Not primarily intended for the public
      4) Assumes learning is due to the college environment
      5) Missing data / limited data
  12. Purpose
      This research agenda aims to create alternative, innovative, and comprehensive measures of educational quality across institutions that could contribute to public understanding of college and university quality.
  13. What are the intended consequences of this new educational quality measure?
      1) A stronger focus on the educational core of institutions: teaching, academic rigor, and educational experiences
      2) Public access to comprehensive data about teaching, academic rigor, and educational experiences in higher education at the institutional level
      3) Administrators having an in-depth understanding of how their institution compares to others in terms of teaching quality, academic rigor, and educational experiences
  14. Three Phases
      • Dual-Institution Pilot (Spring 2013)
        • One large, public, research-extensive institution
        • One medium, private, research-extensive institution
      • Multi-Institution Peer Benchmarking Pilot (Spring/Fall 2014)
      • National Study with Publicly Posted Data (Spring 2016)
  15. Dual-Institution Pilot
      1) Student and faculty surveys: NSSE & FSSE
      2) Syllabus analysis
      3) Experience sampling method
      4) Class observations
      5) Analyzing student work
      6) Course evaluations
      *Academic Rigor, Teaching Quality, Learning Outcomes
  16. Academic Rigor
      • Based on the cognitive complexity required of students in the coursework, as defined by the revised Bloom’s Taxonomy (Anderson & Krathwohl, 2001)
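      As one way to picture how this criterion might be operationalized, here is a minimal sketch that scores syllabus learning objectives by the revised Bloom’s Taxonomy level of the verbs they use. The six levels are standard (Anderson & Krathwohl, 2001), but the verb lists, the scoring rule, and the example objectives are illustrative assumptions, not the project’s actual coding instrument.

      ```python
      # Minimal sketch: rate learning objectives by revised Bloom's Taxonomy level.
      # Verb lists and scoring are illustrative assumptions only.
      import re

      # Levels of the revised taxonomy, lowest (1) to highest (6) cognitive complexity.
      BLOOM_LEVELS = {
          1: ("remember",   {"define", "list", "recall", "identify", "name"}),
          2: ("understand", {"explain", "summarize", "describe", "classify"}),
          3: ("apply",      {"apply", "use", "solve", "demonstrate", "implement"}),
          4: ("analyze",    {"analyze", "compare", "differentiate", "organize"}),
          5: ("evaluate",   {"evaluate", "critique", "judge", "justify"}),
          6: ("create",     {"design", "construct", "develop", "formulate", "create"}),
      }

      def objective_level(objective: str) -> int:
          """Return the highest Bloom level whose verbs appear in one objective (0 if none match)."""
          words = set(re.findall(r"[a-z]+", objective.lower()))
          matched = [level for level, (_, verbs) in BLOOM_LEVELS.items() if words & verbs]
          return max(matched, default=0)

      def syllabus_rigor(objectives: list[str]) -> float:
          """Average Bloom level across a syllabus's learning objectives (rough proxy)."""
          scored = [lvl for lvl in map(objective_level, objectives) if lvl > 0]
          return sum(scored) / len(scored) if scored else 0.0

      # Hypothetical objectives from a fictional syllabus.
      objectives = [
          "List the major theories of student development.",
          "Compare two accountability frameworks in higher education.",
          "Design an assessment plan for a general education course.",
      ]
      print(f"Average Bloom level: {syllabus_rigor(objectives):.1f}")  # levels 1, 4, 6 -> 3.7
      ```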
  17. Teaching Quality
      • Based on Anna Neumann’s claims about teaching and learning (2012)
      • According to this framework, quality teaching entails:
        (Part I) Orchestrating an encounter with subject matter ideas
        (Part II) Connecting students’ learning to prior knowledge
        (Part III) Supporting students in working through the cognitive and emotional features of encounters between their own long-held understandings and new ones gained during the course
  18. Essential Learning Outcomes
      • Based on the Association of American Colleges and Universities’ (AAC&U) Essential Learning Outcomes (ELO)
      • This framework was developed by AAC&U through engagement with hundreds of institutions, accreditors, and higher education stakeholders (AAC&U, 2004)
      • Four Parts:
        • ELO Part I: Knowledge of Human Cultures and the Physical and Natural World
        • ELO Part II: Intellectual and Practical Skills
        • ELO Part III: Personal and Social Responsibility
        • ELO Part IV: Integrative and Applied Learning
  19. Where we are now
      • At this point we have observed 100 classrooms
      • In the process of collecting 300+ syllabi
      • Collecting NSSE & FSSE data
      • Discovering that experience sampling may not be a viable option
      • Collecting student work
      Next steps
      • Preparing to collect data from a second pilot institution
      • Searching for a multi-site pilot institution that would like to compare data to defined peer institutions
  20. Questions? Comments?
      Corbin M. Campbell, Assistant Professor
      campbell2@tc.columbia.edu
      Theresa Cruz Paul, Research Team Member
      Higher and Postsecondary Education Program, Teachers College, Columbia University
