Assessment
David Kniola, Ph.D., Assistant Director, Office of Academic Assessment
 
http://www.youtube.com/watch_popup?v=6Cf7IL_eZ38&vq=medium
Rising tension within and between internal and external factors
Where might we be headed?
Within the next 2-3 years
• Continue current practices: LEAP essential learning outcomes and authentic assessment; New Leadership Alliance (http://www.newleadershipalliance.org/)
• Shift from quantitative measures to a balance with qualitative measures; the LEAP report cites employers wanting “context and substance.”
Within the next 2-3 years
• Keeping an eye on emerging trends:
• Discerning Learning from Lumina (http://www.luminafoundation.org/publications/)
• Degree Qualifications Profile (http://www.luminafoundation.org/publications/The_Degree_Qualifications_Profile.pdf)
• ETS e-rater, automated essay evaluation (http://www.ets.org/erater/about)
• Targeted analytics defined by the NRC doctoral program survey
Transparency Framework
• National Institute for Learning Outcomes Assessment (NILOA): http://www.learningoutcomeassessment.org/TransparencyFrameworkIntro.htm
Transformation in Assessment
• Diana Chapman Walsh, president of Wellesley College from 1993 to 2007; essay in Inside Higher Ed: “Toward a Science of Learning”
• Advances in learning sciences and fast-changing technology
• Transcending new and better measures of SLOs
• Assessing WHAT students have learned is less valuable than finding out HOW they learn
• Polanyi: “learning about” involves explicit knowledge; “learning to be” is more tacit; combine with “socially constructed understanding” in the digital age
Transformation in Assessment
• Define higher education and learning in the traditional university
• Walsh calls for a “science of improvement” and the creation of “highly intentional learning”
• What is quality? What is “value added”? Do we need improvement in educational output?
• We have seen quality-of-life improvements (residence halls, dining, fitness center), but these have not led to educational improvements
• What is the role of the land-grant mission? What do these mean for VT?
• How do we invent our future? Data, honest conversation, systematic research
Analytics
Learning Analytics
• Rise of learning and business analytics
• How do we know if a student is struggling? Do we know what makes a student successful?
• What data do we have? What data do we need?
Learning Analytics
• Learning analytics: utilizing performance data (grades, quizzes, tests) captured by the LMS, e.g., Purdue Signals (http://www.itap.purdue.edu/tlt/signals/)
• …but what if this data were augmented with data from “outside” the LMS?
• What if we could see patterns in data captured at every interaction a student has with the university?
• What if this could then be used by faculty as well as students to better understand learning?
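Purdue Signals combines performance and effort data into a red/yellow/green flag for each student. As a minimal illustrative sketch only (the weights, thresholds, and input fields below are invented for this example, not Signals' actual model):

```python
# Illustrative sketch of a Signals-style "traffic light" risk flag.
# All weights and thresholds are hypothetical, not Purdue's actual algorithm.

def risk_signal(grade_pct, logins_per_week, assignments_missed):
    """Combine LMS performance and effort data into a red/yellow/green flag."""
    score = 0.6 * (grade_pct / 100.0)               # performance component
    score += 0.3 * min(logins_per_week / 5.0, 1.0)  # effort: LMS activity
    score += 0.1 * (1.0 if assignments_missed == 0 else 0.0)
    if score >= 0.7:
        return "green"
    elif score >= 0.4:
        return "yellow"
    return "red"

print(risk_signal(88, 6, 0))   # engaged, high-performing student -> "green"
print(risk_signal(52, 1, 3))   # struggling, disengaged student -> "red"
```

The point of the sketch is the design, not the numbers: even a simple model that blends performance with effort data can surface a struggling student before the first exam does.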
Learning Analytics
Within the next 5-10 years
• Convergence of data sources (implementation of learning analytics)
• Portable and personalized assessment tools: a student or professor can deploy a bot to retrieve data and call it up on a mobile device; where does a student fit in relation to others?
• Modeling ontologies? Simulations?
• Networked assessment: move from department/discipline to university-wide; include PK-12? A social network for assessment?
• Is privacy an issue for the Facebook generation?
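The “convergence of data sources” above amounts to merging event streams from many campus systems into one profile per student. A hypothetical sketch, with source names and record shapes invented for illustration:

```python
# Hypothetical sketch of converging per-student data from multiple campus
# systems. The data sources and student IDs here are invented examples.
from collections import defaultdict

def build_profiles(event_streams):
    """Merge event records from many systems into one count profile per student."""
    profiles = defaultdict(lambda: defaultdict(int))
    for source, events in event_streams.items():
        for student_id in events:
            profiles[student_id][source] += 1
    return profiles

streams = {
    "lms_logins":     ["s01", "s01", "s02"],
    "library_visits": ["s01"],
    "dining_swipes":  ["s02", "s02"],
}
profiles = build_profiles(streams)
print(dict(profiles["s01"]))  # {'lms_logins': 2, 'library_visits': 1}
```

A real implementation would face exactly the questions the slide raises: which interactions to capture, who can query the profiles, and how privacy is protected.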
Within the next 15 years
Within the next 15 years
• Artificial intelligence and augmented intelligence
• Assessment hinges on how we define learning
• Computers readily retrieve “answers” (e.g., Watson, which will be commercialized by IBM)
• Human capabilities enhanced by interaction with computers (e.g., pattern recognition, devices as extensions of our senses, decisions based on data)
• Humans ask the questions; computers answer them
Within the next 15 years
• Where does that leave assessment? The possibility to explore deeper questions about student learning
• If learning is individualized, assessment will need to be as well
• Different questions are important to different users:
• Faculty: how and what are students learning in my class?
• Advisors: which students need attention and guidance?
• Students: what am I learning, and how do I compare to others?
• Administrators: where do we need to focus resources to support learning?
• Public: are they doing what they say they are doing?
Within the next 15 years
• Next Generation Learning Challenges (http://nextgenlearning.org/)
• Goal: scale the real-time use of learning analytics by students, instructors, and academic advisors to improve student success
• “That’s our challenge to you: Develop a model that identifies, improves, and scales existing solutions of learner analytics.”
Within the next 15 years
• Think back to the Corning video. Imagine this as a university campus. Imagine this is what we could do with our data. Can we build this at VT?
• An interdisciplinary “meeting of the minds”: instructional design, learning technologies, brain science, education, psychology, computer science, systems engineering, OAA, CIDER, others…


Editor's Notes

  • #3 In the year 2020….predicting the future after it happens.
  • #4 As you are watching this…think about a university. What if these technologies were available at a university? What if we could do these things with our data?
  • #5 Internal: (1) VP’s and the President’s commitment to assessment; (2) Deans’ perception of value in student learning outcomes; (3) linking assessment with budget and resource allocation decisions; and, (4) faculty’s willingness to participate in the assessment. How will budget reductions impact these?   External: Expectations of students/parents, Fed, and accreditation agencies (professional and regional) with regard to student learning assessment will likely not lessen. Focus will move toward institutional assessment to improve graduation (i.e., retention) and demonstrate quality and value.
  • #8 Defining “excellence” and paths to excellence. Searching for ways to measure student learning.
  • #9 Idea of transparency…move towards being more public with results of our assessments.
  • #10 This is somewhat getting away from assessment…but it further shows the need to define learning in conjunction with figuring out ways to measure it.
  • #12 Leads us to analytics. We know these companies operate on empirical data not hunches. They are very sophisticated and can “predict” and guide behavior.
  • #14 Retention of our students will be critical without lowering standards of quality. Data can help us better understand the student experience.
  • #16 An ontology describes the concepts and relationships that are important in a particular domain, providing a vocabulary for that domain as well as a computerized specification of the meaning of terms used in the vocabulary.
  • #18 Students already sit in class and Google everything we say to make sure it is “correct” (according to Wikipedia).
  • #21 An interdisciplinary “meeting of the minds”: instructional design, learning technologies, brain science, education, psychology, computer science, systems engineering, OAA, CIDER, others…