NCME Big Data in Education

Opening/Framing Comments: John Behrens, Vice President, Center for Digital Data, Analytics, & Adaptive Learning, Pearson
A discussion of how the field of educational measurement is changing: how long-held assumptions may no longer be taken for granted, and how new terminology and language are coming into the field.
Panel 1: Beyond the Construct: New Forms of Measurement
This panel presents new views of what assessment can be and new species of big data that push our understanding of what can be used in evidentiary arguments.
 Marcia Linn (UC Berkeley) and Lydia Liu (ETS) discuss continuous assessment of science and new kinds of constructs that relate to collaboration and student reasoning.
 John Byrnes from SRI International discusses text and other semi-structured data sources and different methods of analysis.
 Kristen DiCerbo from Pearson discusses hidden assessments and the different student interactions and events that can be used in inferential processes.
Panel 2: The Test is Just the Beginning: Assessments Meet Systems Context
This panel looks at how assessments are not the end game, but often the first step in larger big-data practices at the district, state, and national levels.
 Gerald Tindal from the University of Oregon discusses State data systems and special education, including curriculum-based measurement across geographic settings.
 Jack Buckley, Commissioner of the National Center for Education Statistics, discusses national datasets where tests and other data connect.
 Lindsay Page and Will Marinell from the Strategic Data Project at Harvard discuss state and district datasets used for evaluating teachers, colleges of education, and student progress.
Panel 3: Connecting the Dots: Research Agendas to Integrate Different Worlds
This panel looks at how research organizations are viewing the connections between the perspectives presented in Panels 1 and 2: what is known, and what is still to be discovered in order to achieve the promise of big connected data in education.
 Andrea Conklin Bueschel, Program Director at the Spencer Foundation
 Ed Dieterle, Senior Program Officer at the Bill and Melinda Gates Foundation
 Edith Gummer, Program Officer at the National Science Foundation


Slide notes:
  • Your organizations have all invested heavily in the use of data in education. In which areas have these efforts been most difficult and most successful? (alterable variables; getting data to people so that it can be used; grain size)
  • Successful innovations in data structuring and visualization, and an offer for projects to consider the extent to which they need supercomputer capabilities
  • Closer to education is work on how to present data so that people at different levels can use it
  • Here, I’m going to stay focused on the view from education
  • What are a few of the areas you believe researchers, especially from the measurement community, should be thinking about for the next 2, 5, and 10 years?
  • The central goal of our K-12 strategy is for 80% of the class of 2025 to graduate high school capable of matriculation into a post-secondary institution without the need for remediation, which we call college ready. Many of the students currently in the pipeline are not on track to graduate college ready. To get them back on track will require accelerated learning. The class of 2025 entered kindergarten in Fall 2012. Every student, over their next 13 years of schooling, is presented with approximately 1 million instructional minutes (a back-of-the-envelope check of this figure appears after these notes). To utilize each minute wisely and maximize the learning experiences for each and every child will require increased levels of personalized learning. Personalized learning tailors what-is-taught, when-it-is-taught, and how-it-is-taught to the needs, skill levels, interests, dispositions, and abilities of the learner working individually and with others. Source: http://www.census.gov/hhes/school/data/cps/2011/tables.html
  • A confluence of breakthroughs is moving us closer to the personalization of learning for all learners. The Common Core State Standards provide a consistent, clear understanding of what students are expected to learn. Better measures of teaching, as revealed by the Measures of Effective Teaching study, have unlocked essential behaviors and practices associated with effective teaching, informing innovative forms of professional development and pre-service training. Systematic investigations of cognitive, intrapersonal, and interpersonal capacities have significantly advanced our knowledge of how people learn. The launch of inBloom represents the first multi-state, open-source cyberinfrastructure.
  • To make possible personalized learning for all students in the U.S. requires continuously capturing, deriving meaning from, and acting on data generated in the cyberinfrastructure (e.g., inBloom) by students with varying needs, skill levels, interests, dispositions, and abilities. Creating a talent base of education data scientists with deep analytical talent won’t happen overnight. It will require prioritizing resources, developing a professional infrastructure, and creating new research tools. It will necessitate changes in education policies and a new social contract that strikes an appropriate balance between protecting privacy and drawing on large volumes of learning data to advance education outcomes. And it will require strengthening collaboration among academy, industry, practice, government, and private foundations.
  • Learning requires engagement, and college readiness requires academic tenacity. Without engagement, students are not maximizing their likelihood of learning; without academic tenacity, students will likely not succeed in their academic pursuits over the long haul. Engagement and academic tenacity are measurable, teachable, and socializable, and are shaped by (a) physical, mental, and emotional development; (b) chemical processes within people; (c) personal interests and sociocultural influences; and (d) tasks and situations, cognitive challenge, arousal, expectancy, and incentive.
  • Assessments embedded within games unobtrusively, accurately, and dynamically measure how players are progressing relative to targeted competencies. This two-year investment supports the use of evidence-centered design (ECD) to develop assessments that measure three competencies: (a) conceptual physics, understanding Newton’s Laws of motion; (b) persistence, continuing to work hard despite challenging conditions; and (c) creativity, the ability to create novel solutions to various problems. Over the course of two years, Shute will test: (a) the degree to which the stealth assessments yield valid, reliable, and fair measures of the respective competencies; (b) the effects of gameplay in relation to the selected competencies (e.g., improving understanding of conceptual physics); and (c) the ease and challenge of re-using the evidence-based models in a second game. Reuse of the assessments in other games is important because the development costs of ECD-based assessments can be relatively high for complex competencies. Thus, an aim of this investment is to establish a proof-of-concept for creating stealth assessment models that can be used in related games.
  • Developing and investigating two learning games that cultivate academic tenacity in eighth-grade students: self-regulation of learning; the regulation of attention when completing cognitively challenging, academically oriented tasks; and pro-social behavior, especially being mindful and sensitive to others and skillful at building productive social relationships
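
As promised in the strategy note above, a quick back-of-the-envelope check of the "approximately 1 million instructional minutes" figure. This is a minimal sketch under assumed values: the 180-day school year and 6.5 instructional hours per day are my assumptions, not numbers from the session.

```python
# Back-of-the-envelope check of "approximately 1 million instructional minutes."
# 180 days/year and 6.5 instructional hours/day are assumptions, not session data.
YEARS = 13           # K-12
DAYS_PER_YEAR = 180  # assumed length of a U.S. school year
HOURS_PER_DAY = 6.5  # assumed instructional hours per day

minutes = YEARS * DAYS_PER_YEAR * HOURS_PER_DAY * 60
print(f"{minutes:,.0f} instructional minutes")  # 912,600 -- roughly 1 million
```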
Slide transcript

    1. National Council on Measurement in Education Sunday, April 28, 10:00 Grand Ballroom A, 3rd Floor
    2. John Behrens (Pearson, Center for Digital Data, Analytics, & Adaptive Learning) Framing comments Panel 1: Beyond the Construct: New Forms of Measurement • Marcia Linn (UC Berkeley): Interpreting student progress w/ embedded assessments • John Byrnes (SRI International): Text Analytics for Big Data • Kristen DiCerbo (Pearson): Invisible assessments in the digital ocean - Questions/discussion Panel 2: The Test is Just the Beginning: Assessments Meet System Context • Gerald Tindal (U of Oregon): Curriculum-based Measurement and State Data • Lindsay Page (Harvard University): The Strategic Data Project • Jack Buckley (NCES): Federal data efforts - Questions/discussion Panel 3: Connecting the Dots: Research Agendas to Integrate Different Worlds • Andrea Conklin Bueschel (Spencer Foundation) • Ed Dieterle (Bill and Melinda Gates Foundation) • Edith Gummer (National Science Foundation) - Questions/discussion
    3. BIG DATA AMERICAN STYLE: TECHNOLOGY, INNOVATION, AND THE PUBLIC INTEREST Monday, Apr 29 - 10:35am - 12:05pm, Building/Room: Parc 55 / Divisadero • Ryan Baker (Teachers College/Pres. Int. Ed. Data Mining Society): Educational Data Mining: Potentials and Possibilities • John T. Behrens (Pearson): Harnessing the Currents of the Digital Ocean • Aimee Rogstad Guidera (Data Quality Campaign): The 4 Ts of State Data Systems: Turf, Trust, Technology, and Time: Policy Perspective on Empowering Education Stakeholders with Data • Kathleen Styles (Chief Privacy Officer, Department of Education): Hold Your Horses! – Addressing Privacy and Governance for Big Data & Analytics • Phil Piety, John T. Behrens, Roy Pea: Educational Decision Sciences and Interpretive Skills • Barbara Schneider (Michigan State, AERA President for 2013-2014): Discussant
    4. • What is “BIG DATA”… really? • How does “Big data” relate to education? • How does “big data” impact the field of measurement? • How much of “BIG data” is hype, and how much is real change?
    5. “Big data exceeds the reach of commonly used hardware environments and software tools to capture, manage, and process it within a tolerable elapsed time for its user population.” - Teradata Magazine article, 2011 “Big data refers to data sets whose size is beyond the ability of typical database software tools to capture, store, manage and analyze.” - The McKinsey Global Institute, 2011 From “Steamrolled by Big Data” by Gary Marcus, New Yorker, April 3, 2013
    6. Copyright 2012, Cognizant. http://www.cognizant.com/InsightsWhitepapers/Big-Datas-Impact-on-the-Data-Supply-Chain.pdf
    7. Tavo De León: BigDataArchitecture.com http://bigdataarchitecture.com/wp-content/uploads/2012/02/Big-Data-New-Frontiers-for-IT-Management-AITP.pdf
    8. Mark Gahegan, Centre for eResearch & Computer Science, University of Auckland
    9. Mark Gahegan, Centre for eResearch & Computer Science, University of Auckland
    10. Tableau Software: http://www.tableausoftware.com/solutions/supply-chain-analysis
    11. Which one is Education?
    12. Which one is Education?
    13. BIG DATA is coming • Natural evolution with parallels to other fields • Education faces data differences – Error – Comparability – Human factors • Infrastructure challenges • Forward movement is inevitable
    14. Panel 1
    15. INTERPRETING STUDENT PROGRESS FROM EMBEDDED ASSESSMENTS: EXPANDING ITEM TYPES FOR ASSESSING INQUIRY • Marcia C. Linn, University of California, Berkeley • Ou Lydia Liu, Educational Testing Service • Kihyun (Kelly) Ryoo, University of North Carolina, Chapel Hill • Vanessa Svihla, University of New Mexico • Elissa Sato, University of California, Berkeley
    16. Invisible Assessment in the Digital Ocean Kristen DiCerbo, Ph.D. @kdicerbo April 28, 2013
    17. The Digital Ocean
    18. Invisible Assessment The ability to capture data from everyday events should fundamentally change how we think about assessment.
    19. Micro-level
    20. Macro-level [chart spanning Sept to June]
    21. Evidence-Centered Assessment Design • What complex of knowledge, skills, or other attributes should be assessed? • What behaviors or performances should reveal those constructs? • What tasks or situations should elicit those behaviors? [diagram: Student Model, Evidence Model(s) with Measurement and Scoring Models, and Task Model(s)] Mislevy, Steinberg, & Almond (2003)
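
To make the three ECD questions on this slide concrete, here is a minimal sketch of how a student model, an evidence rule, and evidence accumulation might fit together. This is a toy illustration, not Mislevy, Steinberg, & Almond's formalism; the names, likelihoods, and update rule are all invented for exposition.

```python
from dataclasses import dataclass, field

# Toy ECD wiring: the student model holds beliefs about latent proficiencies,
# an evidence rule turns a raw task log into scored observables, and evidence
# accumulation updates the belief. All names and numbers are invented.

@dataclass
class StudentModel:
    proficiencies: dict = field(default_factory=lambda: {"newtonian_physics": 0.5})

def identify_evidence(task_log: dict) -> dict:
    """Evidence identification: extract observables from a work product."""
    return {"solved": task_log["solution_correct"], "attempts": task_log["attempts"]}

def accumulate_evidence(model: StudentModel, obs: dict, skill: str) -> None:
    """Evidence accumulation: a crude Bayes update of the proficiency belief."""
    p = model.proficiencies[skill]
    like = 0.8 if obs["solved"] else 0.3  # toy likelihood of the observable given proficiency
    model.proficiencies[skill] = p * like / (p * like + (1 - p) * (1 - like))

model = StudentModel()
obs = identify_evidence({"solution_correct": True, "attempts": 2})  # task model output
accumulate_evidence(model, obs, "newtonian_physics")
print(model.proficiencies)  # {'newtonian_physics': 0.8} -- belief rises after a correct solution
```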
    22. We Don’t Know it All… Technical Issues: • How do we capture, store, and extract huge event log files? Measurement Issues: • How do we model changing proficiency? • How do we make sense of stream data? • How do we eliminate experience and interface effects? Design Issues: • How do we balance rich environments with the need to isolate skills? • How do we allow student control while observing what we need? • How do we communicate results? Implementation Issues: • Will teachers and parents trust the scores?
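
One established answer to "how do we model changing proficiency?" is Bayesian Knowledge Tracing, which the slide does not name but which is standard in educational data mining. A minimal sketch; the parameter values are illustrative, not calibrated to any dataset.

```python
# Bayesian Knowledge Tracing: update P(student knows the skill) after each
# scored response, letting estimated proficiency change over time.
P_INIT, P_LEARN, P_SLIP, P_GUESS = 0.3, 0.1, 0.1, 0.2  # illustrative values

def bkt_update(p_know: float, correct: bool) -> float:
    if correct:
        post = p_know * (1 - P_SLIP) / (p_know * (1 - P_SLIP) + (1 - p_know) * P_GUESS)
    else:
        post = p_know * P_SLIP / (p_know * P_SLIP + (1 - p_know) * (1 - P_GUESS))
    # after each opportunity the student may acquire the skill
    return post + (1 - post) * P_LEARN

p = P_INIT
for correct in [True, True, False, True]:  # a stream of scored events
    p = bkt_update(p, correct)
    print(f"P(knows skill) = {p:.2f}")
```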
    23. A Change in Thinking • Item paradigm to activity paradigm • Individual view to social ecosystem view • Assessment isolation to educational unification
    24. Thank you Kristen.DiCerbo@pearson.com http://researchnetwork.pearson.com
    25. Text Analytics for Big Data Big Data: New Opportunities for Measurement and Data Analysis National Council on Measurement in Education 2013 Meeting John Byrnes, Computer Scientist, SRI International 29 April 2013
    26. Automatic organization and identification of text • Collection analysis for review of National Science Foundation programs • Analysis of clinician notes for expert advisor for National Institutes of Health • Massive data analysis for the US Intelligence Community • Information extraction of names of: – persons, locations, organizations – ships, cargo, ports – scientific entities from text sources: – web forums, blogs – scientific journal articles
    27. Distributional Semantics
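
Distributional semantics rests on the idea that words appearing in similar contexts have similar meanings. A toy sketch of the approach (co-occurrence vectors compared by cosine similarity); this is not SRI's system, and the corpus and window size are invented for illustration.

```python
from collections import Counter
from math import sqrt

# Toy distributional semantics: represent each word by the counts of words
# co-occurring within a window, then compare words by cosine similarity.
corpus = "the student solved the physics problem the student read the physics text".split()

def context_vector(word: str, window: int = 2) -> Counter:
    vec = Counter()
    for i, w in enumerate(corpus):
        if w == word:
            for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
                if j != i:
                    vec[corpus[j]] += 1
    return vec

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# "solved" and "read" occur in near-identical contexts, so similarity is high
print(cosine(context_vector("solved"), context_vector("read")))
```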
    28. Automated Front End • Real-Time Concept Recognition – Custom hardware – Fiberoptic rate (2.4 Gbps) • Real-Time Language Identification – Separate platform – Web data without pre-processing
    29. Data as Subject-Matter Expert • Hypothesis generation for understanding premature birth • Medical diagnostics for pediatric kidney injury • User behavior modeling • Data fusion and integration [chart: age vs. weight]
    30. Thank You SRI International. Headquarters, Silicon Valley: 333 Ravenswood Avenue, Menlo Park, CA 94025-3493, 650.859.2000. Washington, D.C.: 1100 Wilson Blvd., Suite 2800, Arlington, VA 22209-3915, 703.524.2053. Princeton, New Jersey (SRI International Sarnoff): 201 Washington Road, Princeton, NJ 08540, 609.734.2553. Additional U.S. and international locations: www.sri.com
    31. Panel 2
    32. Data Management, Data Mining, and Data Utilization with Curriculum-Based Measurement Systems Gerald Tindal and Julie Alonzo Behavioral Research and Teaching (BRT) – College of Education, University of Oregon
    33. The Strategic Data Project: Annual Meeting of the National Council on Measurement in Education. Center for Education Policy Research at Harvard University | April 28, 2013 www.gse.harvard.edu/sdp
    34. MISSION Transform the use of data in education to improve student achievement.
    35. The SDP Family
    36. Core Strategies 1. Fellows: Place and support data strategists in agencies who will influence policy at the local, state, and national levels. 2. Diagnostic Analyses: Create policy- and management-relevant standardized analyses for districts and states. 3. Scale: Improve the way data is used in the education sector; achieve broad impact through wide dissemination of analytic tools, methods, and best practices.
    37. Diagnostic: Product + Process • Standard Analyses: human capital and college-going; ~35 analyses each; 10 CG analyses to be on Schoolzilla platform by year end • Customized Analyses: key issues identified by partner (Denver: course grades analysis; LA: on-track for A-G requirements) • Data Work: collect, clean, connect; often this is a huge lift; much discovery happens (laying the groundwork for better data collection and management strategies in the future); examples: course data, teacher hiring data • Teaching: set up, manage, support working groups; connect diagnostic to policy implications; change management; methods training; publishing findings and distribution
    38. What the diagnostics are not… • A set of specific recommendations about actions agencies should take to improve performance • A comprehensive collection of all that can be done with existing data • Root-cause analyses for specific issues • A ranking of agencies
    39. The SDP Human-Capital Diagnostic Pathway
    40. Illustrative Guiding Questions • Recruitment: When are teachers hired? How does teacher effectiveness vary with hire date? • Placement: Which students are assigned to new teachers? How does this compare to those assigned to veteran teachers? • Development: How do teachers develop in their level of effectiveness over time? • Evaluation: How much variation exists among teachers based on effectiveness measures from the agency’s traditional teacher evaluation system? Based on a value-added measure of teacher effectiveness? • Retention: What share of novice teachers remain in the same school and/or in the same district after five years?
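
As a sketch of the value-added idea behind the Evaluation question above: regress current scores on prior scores, then average residuals by teacher. Real SDP-style models condition on many more covariates and apply shrinkage; the two-teacher dataset here is invented purely to show the mechanics.

```python
# Toy value-added sketch: one-variable least-squares fit of current score on
# prior score, then mean residual per teacher. All data are invented.
students = [  # (teacher, prior_score, current_score)
    ("A", 50, 58), ("A", 70, 80), ("A", 60, 66),
    ("B", 55, 54), ("B", 65, 63), ("B", 75, 74),
]

priors = [p for _, p, _ in students]
currents = [c for _, _, c in students]
mp = sum(priors) / len(priors)
mc = sum(currents) / len(currents)
slope = (sum((p - mp) * (c - mc) for p, c in zip(priors, currents))
         / sum((p - mp) ** 2 for p in priors))
intercept = mc - slope * mp

residuals: dict[str, list[float]] = {}
for teacher, prior, current in students:
    residuals.setdefault(teacher, []).append(current - (intercept + slope * prior))

for teacher, rs in sorted(residuals.items()):
    print(teacher, round(sum(rs) / len(rs), 2))  # mean residual = crude value-added
```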
    41. The SDP College-Going Diagnostic Pathway
    42. Illustrative Guiding Questions • 9th to 10th transition: What share of students are on-track to graduate at the end of the first year of high school? Of those who are off track, what share is able to get back on track? • High school graduation: To what extent do graduation rates vary across high schools when comparing students with similar incoming achievement? • College enrollment: To what extent do highly college-qualified students fail to matriculate in college? • College persistence: To what extent does college persistence vary across post-secondary institutions?
    43. Illustrative Diagnostic Analysis
    44. Impact: Fulton County Schools Korynn Schooley, Chris Matthews. Summer PACE: • College-Going Diagnostic revealed 22% of “college-intending” high school graduates were not matriculating to college • Worked with faculty and staff to design a summer counseling intervention • Utilized a randomized control trial to rigorously assess the impact of the intervention
    45. Summer PACE Quick Facts • 7 weeks (June 6 – July 22, 2011) • 6 schools participated; selected based on 2010 estimated summer melt rates and geographic location: 3 in South county and 3 in North county with highest estimated rates • Randomized control trial • 2 counselors per school with caseload of 40 students each • $115/student
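
A sketch of how an impact estimate from a randomized trial like Summer PACE might be computed, as a difference in enrollment proportions. All counts are invented for illustration; they are not Fulton County's actual results.

```python
from math import sqrt

# Difference-in-proportions impact estimate for a two-arm randomized trial.
# Counts below are invented; they are not Summer PACE data.
treat_n, treat_enrolled = 240, 168      # offered the summer counseling intervention
control_n, control_enrolled = 240, 151  # business as usual

p_t = treat_enrolled / treat_n
p_c = control_enrolled / control_n
effect = p_t - p_c
se = sqrt(p_t * (1 - p_t) / treat_n + p_c * (1 - p_c) / control_n)

print(f"enrollment: treatment {p_t:.1%}, control {p_c:.1%}")
print(f"estimated effect = {effect:+.1%} (SE {se:.1%})")
```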
    46. Contact information: Lindsay Page lindsay_page@gse.harvard.edu Will Marinell william_marinell@gse.harvard.edu Thank you
    47. Federal Perspectives of Big Data Jack Buckley, Commissioner, National Center for Education Statistics
    48. Panel 3
    49. Big Data: New Opportunities for Measurement & Data Analysis – NSF Perspectives Edith Gummer, Program Officer, Division of Research on Learning, Directorate of Education and Human Resources, National Science Foundation
    50. NSF Investments – Data in STEM Education • Mathematics and Physical Sciences: fundamental and statistical research in the field of computational and data-enabled science and engineering • Social, Behavioral and Economic Sciences: Science of Learning Centers (multiple projects); Digging into Data Challenge; Methodology, Measurement, and Statistics
    51. NSF Investments – Data in STEM Education • Directorate for Computer & Information Science and Engineering (CISE): Computing Research Infrastructure program (data repositories and visualization capabilities); supercomputers whose mission also includes reserving capacity for education research users
    52. NSF Investments – Data in STEM Education • CISE Cyberlearning: a crosscutting program that studies learning in technology-enabled environments • Education and Human Resources: Research on Education and Learning (REAL); Discovery Research K-12 (DRK-12); Advancing Informal STEM Learning (AISL); Promoting Research and Innovation in Methodologies in Evaluation (PRIME) • SBE/EHR: Building Community Capacity for Data Intensive Research
    53. Success and Challenge • Success: an expanding diversity of learning environments in which a variety of theoretical, methodological, and research-to-practice perspectives inform the R&D field • Challenge: generating insights from data that inform learning, classroom practices, and pathways through education
    54. Future Directions • Expanded view of what it means to “know and be able to do” – Models of achievement • Common Core Standards in Mathematics and Next Generation Science Standards – connecting disciplinary knowledge and practice • NRC – Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century – Models of individual performance from group settings • Opportunity to learn connected to achievement • NRC – Monitoring Progress Toward Successful K-12 STEM Education: A Nation Advancing • Developing instructional systems databases that track not only achievement but what a student has experienced.
    55. NSF Funding Sources • EHR Core Research (ECR), NSF 13-555 – target date July 12, 2013 – 4 areas of research: Learning; Learning Environments; Workforce Development; Broadening Participation • SBE/EHR Building Community Capacity • EHR Ideas Lab to foster transformative approaches to teaching and learning
    56. Perspectives from the Spencer Foundation Andrea Conklin Bueschel, Senior Program Officer
    57. Ed Dieterle, Ed.D. Senior Program Officer for Research, Measurement, and Evaluation, US Program New Opportunities for Measurement & Data Analysis to Personalize Learning “For every complex question there is a simple answer – and it’s wrong.” - H.L. Mencken
    58. Personalized Learning at Scale A means to achieve our U.S. Education strategy goal: 80% of the class of 2025 graduating high school college ready. 55M students in the pipeline; 4.2M entering the pipeline. Goal: accelerate learning. Goal: use 1 million in-school minutes wisely.
    59. A confluence of breakthroughs is moving us closer to the personalization of learning for all learners: Common Core Standards, Measures of Effective Teaching, Science of How People Learn, Personalized Blended Learning Models, Digitally Born Learning Innovations, New Measures of Learning, Advanced Learning Analytics, inBloom
    60. Multiple Funders, One Workgroup, Multiple Sectors [diagram: the Learning Analytics Workgroup spans academy, industry, government/philanthropy, and practice; funders: Bill & Melinda Gates Foundation, MacArthur Foundation] There are urgent and growing global needs for the development of human capital, research tools and strategies, and professional infrastructure in the field of learning analytics
    61. Learning Analytics Workgroup Roy Pea | Stanford University Group of 30 experts from academy, government, industry, practice, and philanthropy • Provide a conceptual framework and define critical questions for understanding • Articulate and prioritize new tools, approaches, policies, markets, and programs of study • Determine resources needed to address priorities • Map how to implement the strategy and how to evaluate progress
    62. A confluence of breakthroughs is moving us closer to the personalization of learning for all learners: Common Core Standards, Measures of Effective Teaching, Science of How People Learn, Personalized Blended Learning Models, Digitally Born Learning Innovations, New Measures of Learning, Advanced Learning Analytics, inBloom
    63. Measures of Learning Cognitive, interpersonal, and intrapersonal factors associated with learning. Without reliable, valid, fair, and efficient measures collected from multiple sources, and analyzed by trained researchers applying methods and techniques appropriately, the entire value of a research study or a program evaluation is questionable, even with otherwise rigorous research designs and large sample sizes.
    64. All tools aren’t born equally: Analog, Digitally Reborn, Digitally Born. Differences stem from the activities they support, the outputs they generate, and what one can do with those outputs. Note: “Digitally Born” vs. “Digitally Reborn” was first articulated by Bernard Frischer, Professor of Art History and Classics at the University of Virginia
    65. Newton’s Playground Valerie Shute | Florida State University Measure three competencies unobtrusively through use of the Newton’s Playground simulation: (a) conceptual physics, understanding Newton’s Laws of motion; (b) persistence, continuing to work hard despite challenging conditions; (c) creativity, the ability to create novel solutions to various problems. Shute, V. J., & Ventura, M. (2013). Stealth assessment: Measuring and supporting learning in video games. Cambridge, MA: MIT Press.
    66. Data Analytics Studies of Engagement Ryan Baker | Columbia University Application of education data mining and field observations to develop sensors that detect: Engaged/Disengaged Behaviors – off-task, gaming the system, on-task solitary, on-task conversation; Relevant Affect – engaged concentration, boredom, frustration, confusion, delight. Platforms: ASSISTments (Worcester Polytechnic Institute), EcoMUVE (Harvard University), Newton’s Playground (Florida State University), Reasoning Mind.
    67. Mindfulness and Prosocial Games Richard Davidson | University of Wisconsin Madison Mindfulness game, Tenacity: by monitoring and controlling breathing, players grow flowers and learn to regulate their attention. Prosocial game, Krystals of Kaydor: players assess emotional facial expressions to perceive the emotional states of the inhabitants of an alien planet and engage in prosocial behavior appropriate to the setting where the emotion is encountered. Bavelier, D., & Davidson, R. J. (2013). Brain training: Games to do you good. Nature, 494(7438), 425–426. Davidson, R. J., & Begley, S. (2012). The emotional life of your brain: How its unique patterns affect the way you think, feel, and live--and how you can change them. New York, NY: Hudson Street Press.
    68. Mindfulness and Prosocial Games Richard Davidson | University of Wisconsin Madison Measures: • Mind/brain measures: Functional Magnetic Resonance Imaging (fMRI), Electroencephalograph (EEG), Galvanic Skin Response (GSR) • Best-in-class, self-report measures from psychology • Logfiles generated from activity with each game Goals: • Change brain function in specific attention and social behavior circuits in beneficial ways • Improve performance on cognitive tasks of attention and working memory and on measures of the perception of social cues and the propensity to share and behave altruistically
    69. A confluence of breakthroughs is moving us closer to the personalization of learning for all learners: Common Core Standards, Measures of Effective Teaching, Science of How People Learn, Personalized Blended Learning Models, Digitally Born Learning Innovations, New Measures of Learning, Advanced Learning Analytics, inBloom
    70. Ed Dieterle, Ed.D. Senior Program Officer for Research, Measurement, and Evaluation, US Program New Opportunities for Measurement & Data Analysis to Personalize Learning “If you're not failing every now and again, it's a sign you're not doing anything very innovative.” - Woody Allen
