Improving Student Achievement with New Approaches to Data


Presentation delivered at the WASC ARC conference on April 11, 2013, covering the CSU Data Dashboard and a Chico State learning analytics case study.

Chico State Case Study: Academic technologies collect highly detailed student usage data. How can this data be used to understand and predict student performance, especially of at-risk students? This presentation will discuss research on a high-enrollment undergraduate course exploring the relationship between LMS activity, student background characteristics, current enrollment information, and student achievement.

CSU Data Dashboard: By monitoring on-track indicators, institutional leaders can better understand not only which milestones students are failing to reach, but why they are not reaching them. Monitoring can also help campuses design interventions or policy changes to increase student success and gauge the impact of those interventions.

Published in: Education, Technology

  • Context: California State University & Graduation Initiative (5); Chico State Learning Analytics Case Study (20); CSU Data Dashboard Project (20); Next Steps (5); Q & A (10)
  • Kathy
  • John
  • John
  • John
  • Opportunity: if you have a large number of students not meeting a particular indicator, that gives you an opportunity to intervene
  • Overall graduation rates and goals; achievement gap. Shows that we’re “green” for retention rates, but yellow for rates by achievement gap
  • Drill into system – select multiple ethnicities. See the variation by overall ethnicity
  • Select Bakersfield campus – problem in second-year retention – but no problem by achievement gap
  • Bakersfield by gender – big problem for male students, especially URM males.
  • Comparison between campuses – and by cohort year
  • Kathy
  • Improving Student Achievement with New Approaches to Data

    1. Improving Student Achievement with New Approaches to Data: Learning Analytics & the CSU Data Dashboards. John Whitmer, Ed.D., Academic Technology Services, California State University, Office of the Chancellor. WASC ARC Conference, April 11, 2013.
    2. Agenda: 1. Context: California State University & Graduation Initiative; 2. Chico State Learning Analytics Case Study; 3. CSU Data Dashboard Project; 4. Next Steps; 5. Discussion
    3. CONTEXT
    4. The California State University: 23 campuses; 437,000 FTE students; 44,000 faculty and staff. Largest, most diverse, & one of the most affordable university systems in the country. Plays a vital role in the growth & development of California’s communities and economy.
    5. Achievement Gap
    6. Baseline 6-Year Graduation Rate: 46%; Target 6-Year Graduation Rate: 54%; Baseline Achievement Gap: 11%; Target Achievement Gap: 5.5%
    7. Approaches to Using Data: enable data-driven decision making for interventions earlier in the student experience by 1. integrating new data sources & variables; 2. disseminating findings to a broader audience; 3. providing the ability to interact with data analysis and conduct ad-hoc and custom reporting
    9. 200MB of data emissions annually! Source: “Augmented business: Smart systems will disrupt lots of industries, and perhaps the entire economy,” The Economist, 11/4/2010.
    10. Logged into course within 24 hours; interacts frequently in discussion boards; failed first exam; hasn’t taken college-level math; no declared major. (Image source: jisc_infonet @ Flickr.com)
    11. Study: Intro to Religious Studies. Undergraduate, introductory, high demand; redesigned to hybrid delivery format through “academy eLearning program”; enrollment: 373 students (54% increase on largest section); highest LMS (Vista) usage on the entire campus, Fall 2010 (>250k hits). Bimodal outcomes compared to traditional course: 10% increase on final exam; 7% & 11% increase in DWF; 54 F’s. Why? Can’t tell with aggregated data.
    13. Learning Analytics: “... measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.” (Siemens, 2011)
    14. Adoption of Learning Management Systems. Institution-Supported IT Resources and Tools. Reprinted from “The ECAR Study of Undergraduate Students and Information Technology,” Eden Dahlstrom, 2012, EDUCAUSE Center for Applied Research.
    15. Research Questions: 1. How is student LMS use related to academic achievement in a single course section? 2. How does that finding compare to the relationship of achievement with traditional student characteristic variables? 3. How are these relationships different for “at-risk” students (URM & Pell-eligible)? 4. What data sources, variables and methods are most useful to answer these questions?
    16. LMS Use Variables: 1. Administrative Activities (calendar, announcements); 2. Assessment Activities (quiz, homework, assignments, grade center); 3. Content Activities (web hits, PDF, content pages); 4. Engagement Activities (discussion, mail). Student Characteristic Variables: 1. Enrollment Status; 2. First in Family to Attend College; 3. Gender; 4. HS GPA; 5. Major-College; 6. Pell Eligible; 7. URM and Pell-Eligibility Interaction; 8. Under-Represented Minority; 9. URM and Gender Interaction.
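The four LMS use variables above are category-level counts derived from raw event logs. A minimal sketch of that aggregation, assuming a hypothetical event format and tool-to-category mapping (the tool names are illustrative, not the study's Vista schema):

```python
from collections import Counter

# Hypothetical mapping of raw LMS tool names to the four activity
# categories used in the study (tool names are invented for illustration).
CATEGORY = {
    "calendar": "administrative", "announcements": "administrative",
    "quiz": "assessment", "assignment": "assessment", "grade_center": "assessment",
    "web_link": "content", "pdf": "content", "content_page": "content",
    "discussion": "engagement", "mail": "engagement",
}

def categorize_hits(events):
    """Count each student's LMS hits per activity category.

    `events` is an iterable of (student_id, tool) pairs; returns
    {student_id: Counter mapping category -> hit count}.
    """
    per_student = {}
    for student_id, tool in events:
        cat = CATEGORY.get(tool)
        if cat is None:
            continue  # unrecognized tools are dropped; raw LMS data needs filtering
        per_student.setdefault(student_id, Counter())[cat] += 1
    return per_student

events = [("s1", "quiz"), ("s1", "discussion"), ("s1", "quiz"), ("s2", "pdf")]
counts = categorize_hits(events)
```

The filtering step echoes finding 2 of the study: raw LMS logs contain many events that must be screened out before the counts are usable.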
    17. Applications Used:
        Excel – early data exploration; simple sorting; tables for print/publication
        Tableau – complex data summaries and explorations; complex charts; presentation charts
        – final/formal descriptive data; statistical analysis; some charts (scatterplots)
        – statistical analysis (factor analysis)
        – statistical analysis (charts)
    18. Student Characteristics w/ Final Grade: scatterplot of HS GPA vs. course grade
    19. The trend between LMS use and final grade is _______ compared to student characteristics and final grade: a) 50% smaller; b) 25% smaller; c) the same; d) 200% larger; e) 400% larger
    21. LMS Use w/ Final Grade: scatterplot of assessment activity hits vs. course grade
    22. LMS & Student Characteristics
    23. Regression of Final Grade by LMS Use & Student Characteristic Variables: LMS use variables alone explain 25% of the change in final grade (r² = 0.25); adding student characteristic variables explains 10% more (combined r² = 0.35).
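The comparison on slide 23 amounts to fitting two nested ordinary least squares regressions and comparing explained variance. A sketch on synthetic data (the predictors, coefficients, and data below are invented; only the nested-model r² comparison mirrors the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 373  # the course enrollment from the study; the data itself is synthetic

# Synthetic predictors: four LMS-use hit counts and two student characteristics
lms_use = rng.poisson(lam=[40, 120, 200, 60], size=(n, 4)).astype(float)
student_chars = np.column_stack([
    rng.normal(3.2, 0.5, n),    # HS GPA (illustrative)
    rng.integers(0, 2, n),      # Pell-eligibility flag (illustrative)
])
grade = (0.004 * lms_use.sum(axis=1)
         + 0.5 * student_chars[:, 0]
         + rng.normal(0, 0.7, n))

def r_squared(X, y):
    """R^2 of an OLS fit with intercept, via least squares."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_lms = r_squared(lms_use, grade)  # LMS-use variables only
r2_both = r_squared(np.column_stack([lms_use, student_chars]), grade)  # combined
```

Because the models are nested, the combined in-sample r² can never be lower than the LMS-only r²; the interesting quantity is how much it rises, which is the "+10%" on the slide.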
    24. The trend between LMS use and final grade is ______ for “at-risk”* students compared to not-at-risk students: a) 50% smaller; b) 20% smaller; c) no difference; d) 20% larger; e) 100% larger. (The relationship indicates how strongly LMS use is correlated with final grade; a lower value means less impact. *At-risk = BOTH under-represented minority and Pell-eligible.)
    26. Question 3 Results: Regression by “At-Risk” Population Subsamples
    27. At-Risk Students: “Over-Working Gap”
    28. LMS Use by Pell and Grade: extra effort in content-related activities
    29. Findings: 1. LMS use is a better predictor of academic achievement than student characteristics; LMS use frequency is a proxy for effort. 2. LMS data requires extensive filtering to be useful; student variables need pre-screening for missing data. 3. LMS effectiveness for at-risk students may be caused by non-technical barriers. 4. Small strength magnitude suggests that better methods could produce stronger results.
    30. Next Generation Learning Analytics. Graphic courtesy Sasha Dietrichson, X-Ray Research SRL.
    31. Next Steps: potential for improved LMS analysis methods: time series analysis; social learning activity patterns; discourse content analysis. Group students by broader identity, with unique variables: continuing student (current college GPA, URM, etc.); first-time freshman (HS GPA, SAT/ACT, etc.)
    32. DATA DASHBOARD PROJECT
    33. FRAMEWORK: Advancing by Degrees: A Framework for Increasing College Completion, by Offenstein, Moore & Shulock. Institute for Higher Education Leadership and Policy and The Education Trust.
    34. This research describes academic patterns (or leading indicators) that occur early in the pipeline and can be tracked and monitored in real time against milestones on the graduation route.
    35. Leading indicators statistically improve predicted probabilities of completion over the use of student background characteristics alone.
    36. Milestones are measurable educational achievements that students reach along the path to degree completion.
    37. Milestones & Leading Indicators
        Milestones: year-to-year retention; transition to college-level coursework (English and Math); earn one year of college-level credits; complete General Education; complete degree
        Remediation: begin remedial coursework in the first term, if needed; complete needed remediation
        Gateway Courses: complete college-level math and/or English in the first or second year; complete a college-success course or other first-year experience program
        Credit Accumulation and Related Academic Behaviors: complete a high percentage of courses attempted (low rate of course dropping and/or failure); complete 20-30 credits in the first year; earn summer credits; enroll full time; enroll continuously, without stop-outs; register on time for courses; maintain adequate academic progress
    38. Questions for Dashboard: 1. What percentage of students reach each of the leading indicators? 2. What is the impact of reaching each of the leading indicators on success rate? 3. Does meeting any of the indicators reduce or eliminate gaps between student demographic groups?
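Each of the three dashboard questions maps onto a simple group-by computation over a student-level table. A hypothetical sketch on a tiny synthetic sample, for one indicator (the record layout and data are invented for illustration):

```python
# Each record: (met_indicator, graduated, urm) flags for one student.
# The sample below is synthetic, purely to show the computations.
students = [
    (True,  True,  False), (True,  True,  True), (True,  False, True),
    (False, False, False), (False, True,  False), (False, False, True),
    (True,  True,  False), (False, False, True),
]

def rate(flags):
    """Fraction of True flags; None for an empty group."""
    flags = list(flags)
    return sum(flags) / len(flags) if flags else None

# Q1: what percentage of students reach the indicator?
attainment = rate(met for met, _, _ in students)

# Q2: impact of reaching the indicator on the success (graduation) rate
grad_if_met = rate(grad for met, grad, _ in students if met)
grad_if_not = rate(grad for met, grad, _ in students if not met)

# Q3: among students who met the indicator, what is the URM / non-URM gap?
gap_if_met = (rate(g for m, g, u in students if m and not u)
              - rate(g for m, g, u in students if m and u))
```

Comparing `gap_if_met` against the same gap computed over students who did not meet the indicator answers whether the indicator narrows demographic gaps.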
    39. PROOF OF CONCEPT
    40. Demonstrate potential value of combined reporting and statistics; evaluate availability and integration of data; pilot potential tools in a real-world scenario. NOTE: the production system may be dramatically different from the POC, given lessons learned and scalability.
    43. Report Parameters
    44. Retention Rates
    45. Retention Rates by URM Status
    46. Data Export Options
    49. Male, 2nd-Year Persistence
    55. NEXT STEPS
    56. What’s Now … And Next: Conducting 3 multi-campus pilots: 1. mCURL, Moodle Common Usage Reporting & Learning Analytics (8 CSU & 2 UC campuses); 2. Blackboard Analytics for Learn (3 campuses); 3. LMS-agnostic campus surveys. Investigating an additional pilot with an LMS-agnostic tool to move beyond “clickometry” into social network analysis, discourse analysis, etc. Raises a question for MOOC research: the relationship between student intent/motivation, student characteristics/leading indicators, MOOC use, and achievement.
    57. Dashboard
    58. Questions? John Whitmer; Monograph; Twitter: johncwhitmer; Desdemona