Measuring What Matters: Noncognitive Skills - GRIT

  1. 1. Measuring What Matters: The Role of Non-Cognitive Factors in Student Success. Dr. Mac Adkins, President. Provided by SmarterServices.
  2. 2. Question 1 • How do you determine who can be enrolled at your school? – Standardized test scores – Prior grade point averages – Admissions exams
  3. 3. Top Admissions Factors • The National Association for College Admission Counseling rated these factors. • CONSIDERABLY IMPORTANT – College prep course grades – Strength of high school curriculum – Standardized test scores – Overall GPA • MODERATELY IMPORTANT – Admissions essay – Letters of recommendation – Demonstrated interest – Class rank – Extracurricular commitment
  4. 4. Question 2 Why Do Students Drop Out? A study funded by the Bill and Melinda Gates Foundation ranked these reasons: 1. Conflict with work schedule 2. Affordability of tuition 3. Lack of support from family – financial and practical support 4. Lack of belief that a college degree is valuable 5. Lack of discipline – too much socializing, not enough studying http://www.publicagenda.org/pages/with-their-whole-lives-ahead-of-them
  5. 5. To Find Out What Matters Let’s Ask: Employers Colleges Faculty National Research Council US Department of Education Mothers
  6. 6. Skills that Employers Want National Association of Colleges and Employers Survey of Employers http://www.unl.edu/svcaa/documents/how_employers_see_candidates.pdf
  7. 7. Outcomes Schools Want Elements of Mission Statements From 35 Universities Michigan State University, 2004 1. Knowledge, learning, mastery of general principles 2. Continuous learning, intellectual interest, curiosity 3. Artistic cultural appreciation 4. Appreciation for diversity 5. Leadership 6. Interpersonal skills 7. Social responsibility, citizenship and involvement 8. Physical and psychosocial health 9. Career preparation 10. Adaptability and life skills 11. Perseverance 12. Ethics and integrity
  8. 8. Traits Online Faculty Want WICHE Cooperative for Educational Technologies, 2013
  9. 9. 2012 National Research Council COGNITIVE Problem solving Critical thinking Systems thinking Study skills Adaptability Creativity Meta-cognitive skills INTERPERSONAL Communication Social Intelligence Teamwork Leadership Cultural sensitivity Tolerance for diversity INTRAPERSONAL Anxiety Self-efficacy Self-concept Attributions Work ethic Persistence Organization Time management Integrity Life-long learning
  10. 10. US Department of Education “The test score accountability movement and conventional educational approaches tend to focus on intellectual aspects of success, such as content knowledge. However, this is not sufficient. If students are to achieve their full potential, they must have opportunities to engage and develop a much richer set of skills. There is a growing movement to explore the potential of the “noncognitive” factors — attributes, dispositions, social skills, attitudes, and intrapersonal resources, independent of intellectual ability—that high-achieving individuals draw upon to accomplish success.”
  11. 11. Parents Teach It
  12. 12. Are You Beginning To See The Picture? • Non-cognitive skills matter – Determine student retention – Determine employer satisfaction – Determine online course success – Federal agencies recognize their importance – They are the mission of many schools – Parents value them
  13. 13. “Years of schooling predicts labor market outcomes — cognitive skills account for only 20%; therefore 80% of the “years of schooling” benefit is due to noncognitive skills” (Bowles, Gintis, & Osborne, 2001) http://www.umass.edu/preferen/gintis/jelpap.pdf
  14. 14. Types of Data Used To Predict Learner Success APTITUDE ATTITUDE SITUATION
  15. 15. What Are Non-Cognitive Skills?
  16. 16. Can Non-Cognitive Skills Be Taught? You can’t change a tiger’s stripes, but you can teach that tiger to hunt in a different environment.
  17. 17. Recommended Uses of Non-Cognitive Skills Measures 1. Optic – A lens through which students can view their strengths and opportunities for improvement 2. Student Service – A tool to guide students toward available resources for support 3. Placement – Developmental / remedial course placement 4. Talking Points – A collection of statements which academic advisors can use to advise their students 5. Early Alert – A list of students who are likely to benefit from the instructor reaching out to them early in the course 6. Predictive Analytic – A set of data which can be analyzed at the individual and aggregate level to project student performance (see the sketch below)
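A minimal sketch of the "Predictive Analytic" use listed above: fitting a simple model that projects the probability of course success from readiness sub-scores and flags at-risk students for early alert. The file name, column names, and success flag are hypothetical placeholders, not the actual SmarterMeasure data layout.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

scores = pd.read_csv("readiness_scores.csv")          # hypothetical export
features = ["individual_attributes", "life_factors",
            "technical_knowledge", "learning_styles"]
X, y = scores[features], scores["completed_course"]   # 1 = passed, 0 = did not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y)

# Fit a logistic model and check how well it ranks students by risk.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Early-alert list: students whose predicted probability of success is low.
at_risk = X_test[model.predict_proba(X_test)[:, 1] < 0.5]
```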
  18. 18. Methods of Measurement • Instructor ratings – Time and task intensive for the faculty • Observer records – Expensive and time consuming • Letters of recommendation – Rarely objective • Interviews – Time consuming to conduct and code • Socioeconomic data – Beneficial mostly at the aggregate level due to exceptions and bias • Self-assessment – Yes, there are limitations, but it is the preferred method.
  19. 19. Construct Comparison Matrix – constructs measured by ACT Engage, ETS Success Navigator, Wonderlic Admissions Risk Profile, and SmarterMeasure: Individual Attributes (all four tools), Technical Skills (two tools), and Life Factors, Learning Styles, Reading Skills, Keyboarding Skills, and Custom Questions (one tool each)
  20. 20. SmarterMeasure Learning Readiness Indicator • A 124-item online skills test and attributes inventory that measures a student’s level of readiness for studying online • Used by over 500 Colleges and Universities • Since 2002 taken by over 2,500,000 students
  21. 21. What Does The Assessment Measure? INTERNAL – Individual Attributes (Motivation, Procrastination, Time Management, Help Seeking, Locus of Control) and Learning Styles (Visual, Verbal, Social, Solitary, Physical, Aural, Logical). EXTERNAL – Life Factors (Availability of Time, Dedicated Place, Reason, Support from Family). SKILLS – Technical (Technology Usage, Life Application, Tech Vocabulary, Computing Access), Typing (Rate, Accuracy), and On-Screen Reading (Rate, Recall).
  22. 22. Adjusting Readiness Ranges Adjusting the cut points can make the reporting a more accurate predictor of success.
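A hedged sketch of what adjusting readiness "cut points" can look like in practice: sweep candidate cut scores on one readiness scale and keep the one that best separates students who later succeeded from those who did not. The data file, column names, and the accuracy criterion are assumptions for illustration, not the vendor's actual procedure.

```python
import numpy as np
import pandas as pd

df = pd.read_csv("readiness_and_outcomes.csv")          # hypothetical export
scale, outcome = df["individual_attributes"], df["succeeded"]  # 0/1 outcome

best_cut, best_acc = None, 0.0
for cut in np.arange(scale.min(), scale.max(), 1.0):
    predicted_ready = (scale >= cut).astype(int)         # classify by cut score
    acc = (predicted_ready == outcome).mean()             # simple accuracy
    if acc > best_acc:
        best_cut, best_acc = cut, acc

print(f"Suggested cut score: {best_cut:.0f} (accuracy {best_acc:.2f})")
```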
  23. 23. How Do Schools Use It? • Orientation Course • Enrollment Process • Information Webinar • Public Website • Class Participation • Facebook • 68% of client schools administer the assessment to all students, not just eLearning students
  24. 24. Thermometer Analogy • More important than taking your child’s temperature is taking appropriate action based on their temperature. • More important than measuring student readiness is taking appropriate action based on the scores.
  25. 25. Progression of SmarterMeasure Data Utilization: Student Service, Descriptive, Comparison, Correlation, and Predictive
  26. 26. Research Ideas on the Research Page of the Website
  27. 27. Approaches to Research Projects: Internally Conducted, Company Assisted, and Professionally Assisted
  28. 28. Middlesex Community College • 6% to 13% more students failed online courses than on-ground courses. • Intervention Plan – Administer SmarterMeasure – Identify which constructs best predicted success – Provide “Success Tips” as identified, distributed by website, email, orientation course, records office, library, posters, and mail
  29. 29. Research Findings • Analyzed 3,228 cases over two years • Significant positive correlation between individual attributes and grades (motivation impacts grades; see the sketch below)
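A minimal sketch of the kind of correlation reported above: a Pearson correlation between an individual-attributes score (e.g., motivation) and final grades. The file and column names are assumptions, not the Middlesex study's actual variables, and the study's full analysis is not reproduced here.

```python
import pandas as pd
from scipy.stats import pearsonr

cases = pd.read_csv("middlesex_cases.csv")       # hypothetical data file
r, p = pearsonr(cases["motivation_score"], cases["final_grade"])
print(f"r = {r:.2f}, p = {p:.4f}")               # a positive, significant r supports the finding
```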
  30. 30. Results of Middlesex Research Before SmarterMeasure™ was implemented, 6% to 13% more students failed online courses than students taking on-ground courses. After the implementation, the gaps were narrowed: 1.3% to 5.8% more online students failed than on-ground students.
  31. 31. Results of Middlesex Research Failure rates reduced by as much as 10%
  32. 32. Action Plan • Empower eLearning staff, faculty advisors, and academic counselors with student data • Three areas of focus: Motivation, Self-Discipline, and Time Management
  33. 33. Project Summary “In summary, the implementation of SmarterMeasure has helped students to achieve better academic success by identifying their strengths and weaknesses in online learning.” In essence, with various strategies implemented to promote SmarterMeasure™, a “culture” was created during advising and registration for students, faculty, and support staff to know that there is a way for students to see if they are a good fit for learning online.
  34. 34. CEC - The Need • We need to know which students to advise to take online, hybrid or on-campus courses. • We need to know which students to direct to which student services to help them succeed. • We need to know how to best design our courses so that new students are not overwhelmed.
  35. 35. The Analysis • What is the relationship between measures of student readiness and variables of: – Academic Success - GPA – Engagement – Survey (N=587) – Satisfaction – Survey (Representative Sample based on GPA and number of courses taken per term) – Retention – Re-enrollment data
  36. 36. The Analysis • Phase One – Summer 2011 – Included data from all three delivery systems – online, hybrid and on-campus – Analyzed data at the scale level • Phase Two – Fall 2011 – Focused the research on online learners only – Analyzed data at the sub-scale level • A neutral, third-party research firm (Applied Measurement Associates) used the following statistical analyses in the project: – ANOVA, Independent Samples t-tests, Discriminant Analysis, Structural Equation Modeling, Multiple Regression, Correlation.
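For readers unfamiliar with the analyses listed above, here is an illustrative sketch of two of them (an independent-samples t-test and a multiple regression) under assumed column names. The actual analyses were run by Applied Measurement Associates on the institution's data and are not reproduced here.

```python
import pandas as pd
from scipy.stats import ttest_ind
import statsmodels.api as sm

df = pd.read_csv("cec_phase_two.csv")            # hypothetical online-only data set

# Independent-samples t-test: do retained and departed students differ
# on the Individual Attributes scale?
retained = df[df["retained"] == 1]["individual_attributes"]
left = df[df["retained"] == 0]["individual_attributes"]
t, p = ttest_ind(retained, left, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")

# Multiple regression: readiness scales as predictors of GPA.
X = sm.add_constant(df[["individual_attributes", "life_factors",
                        "technical_knowledge", "learning_styles"]])
print(sm.OLS(df["gpa"], X).fit().summary())
```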
  37. 37. The Findings • Academic Achievement – The scales of Individual Attributes, Technical Knowledge, and Life Factors showed statistically significant mean differences across levels of GPA.
  38. 38. The Findings • Retention – The measure of Learning Styles produced a statistically significant mean difference between students who were retained and those who left. • A 73% classification accuracy of this retention measure was achieved. – The scales of Individual Attributes and Technical Knowledge were statistically significant predictors of retention as measured by the number of courses taken per term.
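A hedged sketch of the discriminant-analysis step behind a classification-accuracy figure like the 73% reported above: classify retained versus departed students from readiness scores and report cross-validated accuracy. The file name, columns, and cross-validation scheme are illustrative assumptions, not the study's procedure.

```python
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

df = pd.read_csv("cec_retention.csv")            # hypothetical data set
X = df[["learning_styles", "individual_attributes", "technical_knowledge"]]
y = df["retained"]                               # 1 = retained, 0 = left

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5, scoring="accuracy").mean()
print(f"Mean classification accuracy: {acc:.0%}")  # the study reported about 73%
```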
  39. 39. The Findings • Engagement – The scales of Individual Attributes and Technical Competency had statistically significant relationships with the four survey items related to Engagement. – The scales of Life Factors, Individual Attributes, Technical Competency, Technical Knowledge, and Learning Styles were used to correctly classify responses to the survey questions related to engagement and satisfaction with up to 93% classification accuracy.
  40. 40. The Findings • Satisfaction – Structural equation modeling was used to create a hypothesized theoretical model to determine if SmarterMeasure scores would predict satisfaction as measured by the survey. – Results indicated that prior to taking online courses, student responses to the readiness variables were statistically significant indicators of later student satisfaction. – Therefore, the multiple SmarterMeasure assessment scores are a predictor of the Career Education survey responses.
  41. 41. The Findings • Statistically Significant Relationships: Individual Attributes – Academic Achievement, Engagement, and Retention; Technical Knowledge – Academic Achievement, Engagement, and Retention; Learning Styles – Engagement and Retention; Life Factors – Academic Achievement and Engagement; Technical Competency – Engagement
  42. 42. The Findings • Student Categorizations – Enrollment Status • Positive – active/graduated (34.3%) • Negative – withdrew/dismissed/transfer (65.7%) – Academic Success Status • Passing – A, B or C (48.9%) • Failing – D, F or Other (21.1%) – Transfer Credit – (21.8%) – Not reported – (8.2%)
  43. 43. The Findings - Correlates (readiness subscales associated with each outcome comparison):
  • Life Factor – Positive vs. Negative: Place, Reason, and Skills; Pass vs. Fail: Place
  • Learning Styles – Positive vs. Negative: Social and Logical; Pass vs. Fail: N/A
  • Personal Attributes – Positive vs. Negative: Academic, Help Seeking, Procrastination, Time Management, and Locus of Control; Pass vs. Fail: Time Management
  • Technical Competency – Positive vs. Negative: Internet Competency; Pass vs. Fail: Internet Competency and Computer Competency
  • Technical Knowledge – Positive vs. Negative: Technology Usage and Technical Vocabulary; Pass vs. Fail: Technical Vocabulary
  44. 44. The Findings - Predictors of GPA (significant readiness subscales, with F and p values):
  • Life Factor – Place and Skills (F = 12.35, p = .0001)
  • Learning Styles – Verbal and Logical (F = 3.95, p = .02)
  • Personal Attributes – Help Seeking, Time Management, and Locus of Control (F = 21.11, p = .0001)
  • Technical Competency – Computer and Internet Competency (F = 22.75, p = .0001)
  • Technical Knowledge – Technology Vocabulary (F = 38.76, p = .0001)
  45. 45. The Findings - Predictors of Credit Hours Earned (significant readiness subscales, with F and p values):
  • Life Factor – Place (F = 12.37, p = .0001)
  • Learning Styles – Visual (F = 6.81, p = .01)
  • Personal Attributes – Academic Attributes, Help Seeking, and Locus of Control (F = 13.40, p = .0001)
  • Technical Competency – Computer Competency and Internet Competency (F = 12.23, p = .0001)
  • Technical Knowledge – Technology Usage and Technology Vocabulary (F = 26.97, p = .0001)
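A minimal sketch of how F and p values like those in the two tables above can be produced: a one-way ANOVA comparing GPA across readiness bands on one scale. The banding rule, file name, and column names are assumptions for illustration only.

```python
import pandas as pd
from scipy.stats import f_oneway

df = pd.read_csv("cec_readiness_gpa.csv")        # hypothetical data set
# Band one readiness scale into low / mid / high groups.
bands = pd.cut(df["technical_knowledge"], bins=3, labels=["low", "mid", "high"])
groups = [df.loc[bands == b, "gpa"] for b in ["low", "mid", "high"]]
F, p = f_oneway(*groups)                         # one-way ANOVA on GPA by band
print(f"F = {F:.2f}, p = {p:.4f}")
```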
  46. 46. The Recommendations • We need to know which students to advise to take online, hybrid or on-campus courses. – A profile of a strong online student is one who: • Has a dedicated place to study online • Possesses strong time management skills • Demonstrates strong technical skills • Exhibits a strong vocabulary of technology terms
  47. 47. The Recommendations • We need to know which students to direct to which student services to help them succeed. – An online student who should be directed toward remedial/support resources is one who: • Has a weak reason for returning to school • Has weak prior academic skills • Is not likely to seek help on their own • Is prone to procrastinate • Has a low internal locus of control • Has weak technology skills
  48. 48. The Recommendations • We need to know how to best design our courses so that new students are not overwhelmed. – Limit advanced technology in courses offered early in a curriculum – Foster frequent teacher to student interaction early in the course – Require milestones in assignments to prevent procrastination – Clearly provide links to people/resources for assistance
  49. 49. Argosy University • Required in Freshman Experience course • Students reflect on scores and identify areas for improvement in their Personal Development Plan • Group reflection with others with similar levels of readiness
  50. 50. Argosy University - COMPARE • Compared the traits, attributes, and skills of the online and hybrid students. • Substantial differences between the two groups existed. • Changes were made to the instructional design process for each delivery system.
  51. 51. Argosy University - EXPLORE • Correlational analysis between SmarterMeasure scores and student satisfaction, retention, and academic success • Statistically significant factors: Technical Competency, Motivation, and Availability of Time
  52. 52. Argosy University - TREND • Aggregate analysis of SmarterMeasure data to identify mean scores for students. • Comparison made to the national mean scores from the Student Readiness Report.
  53. 53. Argosy University - APPLY • Findings were shared with the instructional design and student services groups and improvements in processes were made. For example, since technical competency scores increase as the students take more online courses, the instructional designers purposefully allowed only basic forms of technology to be infused into the first courses that students take.
  54. 54. J. Sargeant Reynolds Community College • Required as admissions assessment • Integral part of their QEP (Quality Enhancement Plan) • Computed correlations between grades and the SmarterMeasure sub-scales (Individual Attributes, Technical, Learning Styles, and Life Factors) for over 4,000 students.
  55. 55. Findings • Statistically significant correlations between scores and grades: – Dedicated place, support from employers and family, access to study resources, and academic skills (Life Factors) – Tech vocabulary (Technical Knowledge) – Procrastination (Individual Attributes)
  56. 56. Academic Success Rates (chart comparing success rates of high-scoring and low-scoring students for Skills, Resources, and Time). Less than 10% of students with low scores experienced academic success.
  57. 57. Five Schools What is the relationship between measures of online student readiness and measures of online student satisfaction?
  58. 58. Methodology Data from 1,611 students who completed both the SmarterMeasure Learning Readiness Indicator and the Priority Survey for Online Learners were analyzed (incoming readiness data compared with outgoing satisfaction data).
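A sketch of the data-handling step implied by the five-school methodology above: join readiness records to satisfaction-survey records by student and correlate the scales. The file names, join key, and column names are hypothetical; the actual study's matching and analysis procedures are not documented here.

```python
import pandas as pd
from scipy.stats import pearsonr

readiness = pd.read_csv("smartermeasure_scores.csv")     # hypothetical export
satisfaction = pd.read_csv("psol_responses.csv")         # hypothetical export
merged = readiness.merge(satisfaction, on="student_id", how="inner")

# Correlate each readiness scale with an overall satisfaction item.
for scale in ["individual_attributes", "life_factors", "technical_competency"]:
    r, p = pearsonr(merged[scale], merged["overall_satisfaction"])
    print(f"{scale}: r = {r:.2f}, p = {p:.4f}")
```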
  59. 59. Findings • There were statistically significant relationships between factors of readiness and satisfaction.
  60. 60. Comparison to Compass Scores North Central Michigan College - Petoskey, MI
  61. 61. National Data • 2013 Student Readiness Report • Data from 639,324 students from 275 colleges and universities
  62. 62. Online Learner Demographics • 69% were female • 54% were Caucasian/White • 54% had never taken an online course before • 40% were traditional aged college students • 53% were students at an associate’s level institution
  63. 63. Online Learner Demographics • Dominant Social learning style • Highly motivated • Moderate reading skills • Pressed for time • Increasing technical skills
  64. 64. Profile of a Successful Online Student • For five years in a row, four demographic variables have shown statistically significant differences in mean scores. Females scored higher in Individual Attributes, Academic Attributes, and Time Management. Males scored higher in Technical Knowledge.
  65. 65. Profile of a Successful Online Student • Caucasians have had the highest means for five years in Technical Knowledge. • Students who have taken five or more online courses have had the highest means for five years in Individual Attributes and Technical Knowledge.
  66. 66. Conclusion • Statistically significant relationships exist between measures of online student readiness and measures of academic success, engagement, satisfaction and retention.
  67. 67. SmarterMeasure.com
  68. 68. How important do you consider non-cognitive skills? How is your school measuring and using non-cognitive factors?
  69. 69. For More Info SmarterServices.com (877) 499-SMARTER info@SmarterServices.com
