
Learning Analytics and Knowledge (LAK) 14 Education Data Sciences

Full paper presentation


  1. Education Data Sciences: Framing Emergent Practices for Analytics of Learning, Organizations, and Systems. Philip J. Piety, Ed Info Connections (ppiety@edinfoconnections.com); Daniel T. Hickey, Learning Sciences, School of Education, Indiana University (dthickey@indiana.edu); MJ Bishop, Center for Innovation and Excellence in Learning & Teaching, University System of Maryland (mjbishop@usmd.edu)
  2. Acknowledgements
  3. Four Big Ideas:
     1. Sociotechnical paradigm shift
     2. Notion of Education Data Sciences (EDS): Academic/Institutional Analytics; Learning Analytics/Educational Data Mining; Learner Analytics/Personalization; Systemic Instructional Improvement
     3. Common features across these communities
     4. Framework for EDS
  4. SOCIOTECHNICAL PARADIGM SHIFT IN CONCEPTION OF DATA: from External/Distant/Artificial to Internal/Current/Contextual
  5. Paradigm Shifts
  6. The Educational Data Movement. Understanding how the organizational model of education is similar to and different from other fields is key to understanding the educational data movement. [Timeline figure, 1980-2010: Finance, Manufacturing, Retail, Health Care, Education]
  8. The Educational Data Movement
  9. The EDS Landscape: [figure: a grid plotting Scale of Educational Context (Individuals, Cohorts, Organizations, Systems) against Educational Level/Age (Early Childhood, K-12, Post-Secondary, Continuing/Career)]
  10. Academic/Institutional Analytics: [figure: the EDS landscape grid with the Academic/Institutional Analytics region highlighted]
  11. Academic/Institutional Analytics
  12. Systemic/Instructional Improvement: [figure: the EDS landscape grid with the Systemic/Instructional Improvement region added alongside Academic/Institutional Analytics]
  13. Systemic/Instructional Improvement. "In many ways, the practice of data use is out ahead of research. Policy and interventions to promote data use far outstrip research studying the process, context, and consequences of these efforts. But the fact that there is so much energy promoting data use and so many districts and schools that are embarking on data use initiatives means that conditions are ripe for systematic, empirical study." (Coburn, Cynthia E., and Erica O. Turner. "Research on data use: A framework and analysis." Measurement: Interdisciplinary Research & Perspectives 9.4 (2011): 173-206.)
  14. EDM/Learning Analytics: [figure: the EDS landscape grid with the EDM/Learning Analytics region added]
  15. EDM/Learning Analytics
  16. LearnER Analytics/Personalization: [figure: the EDS landscape grid with the Learner Analytics/Personalization region added as the fourth community]
  17. LearnER Analytics/Personalization
  18. Boundary Conditions: [figure: the EDS landscape grid with four boundary-crossing examples: A. School-to-College Analyses; B. Teacher Preparation Efficacy Evaluation; C. Early Warning Systems; D. Flipped Classrooms]
  19. COMMON FEATURES & FACTORS IN EDUCATIONAL DATA SCIENCES: a unified perspective for Educational Data Science
  20. Five Common Features in EDS:
     1. Rapidly changing: indicative of a sociotechnical movement
     2. Boundary issues: all communities touch on other communities
     3. Disruption in evidentiary practices: big data is disrupting all the sectors
     4. Visualization, interpretation, and culture: dashboards, representations, APIs, open data
     5. Ethics, privacy, and governance: FERPA & COPPA
  21. Four Factors that Make All Educational Data Unique:
     • Human/social creation: most requires human manipulation
     • Measurement imprecision: reliability issues are huge
     • Comparability challenges: validity creates "wicked problems"
     • Fragmentation: systems can't talk to each other
  22. SOME COMMON PRINCIPLES: a unified perspective for Educational Data Science
  23. Interdisciplinary Perspectives
  24. Recognize Social/Temporal Levels (table columns: Timescale | Context Targeted | Time Frame | Format of Educational Evidence | Appropriate Formative Function for Students | Ideal Formative Functions for Others):
     • Immediate | Curricular activity (lesson) | Minutes | Event-oriented observations (informal observations of the enactment of the activity) | Discourse during the enactment of a particular activity | Teacher: refining discourse during the enactment of a particular activity
     • Close | Curricular routines (chapter/unit) | Days | Activity-oriented quizzes (semi-formal classroom assessments) | Discourse following the enactment of the chapter quiz | Teacher: refining the specific curricular routines and providing informal remediation to students
     • Proximal | Entire curricula | Weeks | Curriculum-oriented exams (formal classroom assessments) | Understanding of primary concepts targeted in the curriculum | Teacher/curriculum developer: providing formal remediation and formally refining curricula
     • Distal | Regional/national content standards | Months | Criterion-referenced tests (external tests aligned to content standards) | (none listed) | Administrators: selection of curricula that have the largest impact on achievement in broad content domains
     • Remote | National achievement | Years | Norm-referenced external tests standardized across years (e.g., ITBS, NAEP) | (none listed) | Policy makers: long-term impact of policies on broad achievement targets
  25. Digital Fluidity: [figure: data flows among State Longitudinal Data Systems, District Data Warehouses and Teacher Evaluation Systems, and Learning Tools-Driven Analytics, connecting State Analysis, District Leaders, District Curriculum, School Leaders, School Teams, Teacher Planning, and Individual Students]
  26. Values in Design: [figure: Organizational and Political Context (routines, access to data, leadership, time, norms, power relations) and Infrastructure and Tools Context (data components, linkages, time span covered, infrastructure boundaries, data quality, technology features) shaping Processes of data use (noticing, interpreting, constructing implications)]
  27. Flashlights, Imperfect Lenses
  28. Four Big Ideas:
     1. Sociotechnical paradigm shift
     2. Notion of Education Data Sciences (EDS): Academic/Institutional Analytics; Learning Analytics/Educational Data Mining; Learner Analytics/Personalization; Systemic Instructional Improvement
     3. Common features across these communities
     4. Framework for EDS
  29. Education Data Sciences: Framing Emergent Practices for Analytics of Learning, Organizations, and Systems. Philip J. Piety, Ed Info Connections (ppiety@edinfoconnections.com); Daniel T. Hickey, Learning Sciences, School of Education, Indiana University (dthickey@indiana.edu); MJ Bishop, Center for Innovation and Excellence in Learning & Teaching, University System of Maryland (mjbishop@usmd.edu)
