Gobert, Dede, Martin, Rose "Panel: Learning Analytics and Learning Sciences" Presentation Transcript

  • 1. Panel: Learning Analytics and Learning Sciences. Janice Gobert (Worcester Polytechnic Institute), Chris Dede (Harvard University), Taylor Martin (Utah State University), Carolyn Rosé (Carnegie Mellon University). Summarized by Gaowei Chen, Faculty of Education, HKU, July 4, 2014.
  • 2. About the Keynote Speaker 1: Janice Gobert
      ◦ Professor of Educational Psychology at Simon Fraser University
      ◦ Canada Research Chair in self-regulated learning and learning technologies
      ◦ Research interests include self-regulated learning, metacognition, motivation, and adaptive software for researching and promoting self-regulated learning
  • 3. (image slide)
  • 4. Problems with Standardized Tests
      ◦ Standardized tests are not measuring the right stuff, and teachers do not have time to give kids feedback
      ◦ Skill assessment is very limited
      ◦ Educators cannot know who needs help
      ◦ Feedback is given too late to be formative
      ◦ Many students may struggle in silence
  • 5. How can we leverage technology & data mining to improve learning and assessment?
  • 6. Interactive labs have assessment potential, but with challenges
      ◦ Potential:
          ▪ Offer greater authenticity
          ▪ Generate rich log files
          ▪ Work on both products and inquiry processes
          ▪ Can scale to many learners
          ▪ Can blend learning and assessment
      ◦ Challenges:
          ▪ Complex tasks (not one-type measures)
          ▪ Students have more than one way to conduct inquiry
          ▪ Sub-tasks are not independent from each other
          ▪ Real-time features make traditional measurement methods hard to apply
          ▪ Theory is needed before aggregating data and designing categories
  • 7. An example solution: Inq-ITS
      ◦ An intelligent tutoring system
      ◦ Provides an assessment environment for middle school physics, life science, and earth science using microworlds
      ◦ Implemented during a content unit to provide formative data for teachers
      ◦ Offers assessment and real-time scaffolding
  • 8. How does the assessment model work?
      ◦ Students generate log files in real time
      ◦ These log files are analyzed using algorithms
      ◦ From these log files, two reports are generated (a teacher's report and students' reports)
      ◦ Teachers can walk around the classroom and help students in real time
      ◦ (A minimal sketch of this kind of log-to-report pipeline follows below.)
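The slide names the pipeline but not its internals. As a rough illustration only, here is a minimal Python sketch of a log-to-report flow, assuming a made-up log format and a simple hand-written "controlled trials" heuristic in place of the actual machine-learned Inq-ITS detectors (all names below are hypothetical):

```python
from collections import defaultdict

# Hypothetical log record: {"student_id", "action", "variables"}.
# The real Inq-ITS detectors are machine-learned; this rule-based
# stand-in only illustrates the log -> skill estimate -> report flow.
def detect_controlled_trials(trials):
    """Score the fraction of trials that change exactly one variable
    relative to the previous trial (a classic control-of-variables heuristic)."""
    controlled = 0
    for prev, curr in zip(trials, trials[1:]):
        changed = sum(1 for k in curr if curr[k] != prev.get(k))
        controlled += (changed == 1)
    return controlled / max(len(trials) - 1, 1)

def build_reports(log_events):
    """Group log events by student and produce per-skill estimates."""
    trials_by_student = defaultdict(list)
    for event in log_events:
        if event["action"] == "run_trial":
            trials_by_student[event["student_id"]].append(event["variables"])

    teacher_report = {}
    for student, trials in trials_by_student.items():
        score = detect_controlled_trials(trials)
        teacher_report[student] = {
            "designing_controlled_experiments": round(score, 2),
            "needs_help": score < 0.5,  # threshold chosen purely for illustration
        }
    return teacher_report

# Example usage with two simulated students:
log = [
    {"student_id": "s1", "action": "run_trial", "variables": {"mass": 1, "ramp": 30}},
    {"student_id": "s1", "action": "run_trial", "variables": {"mass": 2, "ramp": 30}},
    {"student_id": "s2", "action": "run_trial", "variables": {"mass": 1, "ramp": 30}},
    {"student_id": "s2", "action": "run_trial", "variables": {"mass": 2, "ramp": 45}},
]
print(build_reports(log))
```

A teacher-facing report would simply render this dictionary per class, which is what makes the real-time "walk around and help" use case possible.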
  • 9. What did Janice find?
      ◦ The algorithm captures and assesses the skills
      ◦ The assessment method has scalability implications:
          ▪ It provides automatic, rigorous scoring of inquiry processes
          ▪ It is generalizable to new students and new domains
      ◦ The approach has the potential to inform the design of future assessments for science inquiry skills
  • 10. About the Keynote Speaker 2: Chris Dede
  • 11. (image slide)
  • 12. His research on 'Immersive Learning'
      ◦ Examples: virtual reality, virtual environments, ubiquitous computing
      ◦ You can sit in a classroom physically, yet psychologically immerse yourself in a different world, such as a virtual world; or you can walk home and at the same time walk through an augmented reality
  • 13. Building virtual reality to teach Inquiry Skills
      ◦ Inquiry practices involve sub-skills (features of inquiry skills):
          ▪ Asking questions and defining problems
          ▪ Developing and using models
          ▪ Planning and carrying out investigations
          ▪ Analyzing and interpreting data
          ▪ ……
  • 14. Some of his virtual reality projects
      ◦ River City (1999-2009): for middle school students to sort out diseases that could break out in a town
      ◦ Pond Ecosystem (2008-2012): a digital immersive ecosystem created for middle school students
      ◦ Eco-mobile (recent): a set of mobile augmented-reality tools whose 'magic eyes' let students see different kinds of things posted in the physical environment
  • 15. These are open-ended environments; how can the data be used to inform learning?
      ◦ Unlike intelligent tutoring systems and microworlds:
          ▪ An ITS is highly constrained and microworlds are partly constrained
          ▪ A virtual world, however, is unstructured, which makes it extremely difficult to interpret the log files
      ◦ Actions as the basis for assessments: log files record, with timestamps,
          ▪ Where students went
          ▪ With whom they communicated and what they said
          ▪ What artifacts they activated
          ▪ What databases they viewed
          ▪ What data they gathered
      ◦ (A sketch of what one such timestamped log record might look like follows below.)
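To make the list above concrete, here is a minimal sketch of what one timestamped log record could look like. The schema and field names are assumptions for illustration, not the actual log format of River City or the other projects:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class VirtualWorldEvent:
    """One timestamped action in an open-ended virtual environment."""
    timestamp: datetime
    student_id: str
    event_type: str                   # e.g. "move", "chat", "activate_artifact",
                                      # "view_database", "collect_data"
    location: Optional[str] = None    # region of the world the student is in
    partner_id: Optional[str] = None  # who they communicated with, if anyone
    payload: dict = field(default_factory=dict)  # chat text, artifact id, data values...

# A tiny event stream. A constrained sub-region of the world (say, a
# water-sampling station) can be assessed by filtering on event_type/location.
events = [
    VirtualWorldEvent(datetime(2014, 7, 4, 9, 0, 5), "s1", "move", location="dock"),
    VirtualWorldEvent(datetime(2014, 7, 4, 9, 1, 12), "s1", "chat",
                      partner_id="s2", payload={"text": "let's test the water"}),
    VirtualWorldEvent(datetime(2014, 7, 4, 9, 2, 3), "s1", "collect_data",
                      location="dock", payload={"measure": "E. coli", "value": 40}),
]
samples = [e for e in events if e.event_type == "collect_data"]
print(len(samples), "data-collection actions")
```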
  • 16. To summarize the solution: while keeping the environment open-ended, you can look at the parts of the environment where actions are constrained and provide lots of diagnostic feedback for learners and teachers there.
  • 17. About the Keynote Speaker 3: Taylor Martin
  • 18. (image slide)
  • 19. Microgenetic Research and Learning Analytics (LA)
      ◦ Elements of microgenetic research:
          ▪ 1) The time span of the research covers the period when a competency is likely to develop or be learned
          ▪ 2) Observations of learning behavior are as dense as possible within this window
          ▪ 3) Analysis of learning behavior is conducted on an instance-by-instance basis
      ◦ EDM and LA can particularly improve (2) and (3) (see the sketch below)
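As a minimal sketch of points (2) and (3), assuming a hypothetical dense log with one record per practice attempt: logging supplies one observation per attempt, and the analysis then runs attempt by attempt rather than comparing only pre/post aggregates.

```python
# Hypothetical dense log: one record per practice attempt (instance) for one learner.
attempts = [
    {"attempt": 1, "correct": False, "seconds": 42},
    {"attempt": 2, "correct": False, "seconds": 35},
    {"attempt": 3, "correct": True,  "seconds": 30},
    {"attempt": 4, "correct": True,  "seconds": 21},
    {"attempt": 5, "correct": True,  "seconds": 18},
]

# Instance-by-instance analysis: a running success rate after each attempt,
# giving a trajectory of the competency as it develops.
running_correct = 0
trajectory = []
for a in attempts:
    running_correct += a["correct"]
    trajectory.append(running_correct / a["attempt"])

print(trajectory)  # -> [0.0, 0.0, 0.333..., 0.5, 0.6]
```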
  • 20. Some dimensions
      ◦ Theory driven vs. discovery driven
          ▪ It is more of a design cycle for research (i.e., design-based research for data analysis) than one approach versus the other
      ◦ Data size continuum
          ▪ The data are changing quickly
          ▪ With really big data, a machine-learning, discovery-driven approach is often needed even to know what features might be useful (see the sketch below)
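As an illustration of the discovery-driven end of that continuum, here is a minimal sketch that lets a model rank candidate log-derived features by importance. The feature names and data are synthetic, and this is one common technique, not the panelists' own method:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic log-derived features for 200 students (names are illustrative).
feature_names = ["n_trials", "time_on_task", "chat_turns", "help_requests"]
X = rng.normal(size=(200, 4))
# Synthetic outcome loosely driven by the first two features.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# Discovery-driven step: which candidate features even look useful?
for name, importance in sorted(zip(feature_names, model.feature_importances_),
                               key=lambda p: -p[1]):
    print(f"{name:>14}: {importance:.2f}")
```

The ranked features could then feed a more theory-driven modeling step, which is the "design cycle" framing the slide describes.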
  • 21. About the Keynote Speaker 4: Carolyn Rosé
  • 22. (image slide)
  • 23. Conversational data
      ◦ E.g., whole-class conversations, conversations within groups of learners
      ◦ Her recent thinking and research focus on a more theory-driven framework
      ◦ This new theoretical framework is based upon psychology, sociolinguistics, and language technology
  • 24. Theoretical Framework (new move)
      ◦ Basic concepts:
          ▪ We gain influence in interaction through the manipulation of horizontal and vertical social distance (which are social processes)
          ▪ These social processes can support learning and offer opportunities for learners, but sometimes they also hold learners back
      ◦ Models that embody these structures will be able to predict social processes from interaction data
  • 25. A 3-dimensional coding scheme: the SouFlé framework (Howley et al., 2013)
      ◦ Cognitive
      ◦ Engagement
      ◦ Social (vertical and horizontal distance)
      ◦ (A sketch of how turns coded on these dimensions might be represented follows below.)
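As an illustration of how conversational turns coded on three dimensions might be represented for analysis, here is a minimal sketch; the code labels are stand-ins, not the actual SouFlé category set:

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class CodedTurn:
    """One conversational turn coded on three dimensions (labels are illustrative)."""
    speaker: str
    text: str
    cognitive: str   # e.g. reasoning depth shown in the turn
    engagement: str  # e.g. on-task vs. off-task
    social: str      # e.g. positioning along vertical/horizontal distance

turns = [
    CodedTurn("s1", "I think mass doesn't matter here.", "reasoning", "on_task", "assertive"),
    CodedTurn("s2", "Why not? Let's test it.", "question", "on_task", "inviting"),
    CodedTurn("s2", "brb", "none", "off_task", "neutral"),
]

# Simple per-speaker profile over the social dimension; such profiles are the
# kind of interaction features a predictive model of social processes could use.
print(Counter((t.speaker, t.social) for t in turns))
```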
  • 26. Panel Q & A
      ◦ Question: Do these learning analytics and diagnostic assessments lead to improved scores on standardized tests (ST)?
      ◦ Janice:
          ▪ 1) There are big problems with ST; they have validity issues (the multiple-choice options are sometimes not related to the skills/practices needed in real life)
          ▪ 2) These methods improve learning. With knowledge, students can do multiple-choice questions better. After all, knowledge is your own knowledge, but rote learning does not generate patterns and models.
  • 27. Panel Q & A
      ◦ Chris:
          ▪ In the long run, the predictive validity of those ST is quite low (i.e., for predicting how people turn out when they come out of college)
          ▪ Performance needs to be tested in a variety of ways; it is time to drive test makers to move
  • 28. Thank You
      ◦ Video of the keynote speech is available at http://new.livestream.com/accounts/6514521/events/3105335