201101 affective learning
 

Presentation Transcript

• Automated Detection of Affective States to Measure Learning Experience
  Dr. Robert Atkinson, M. Robert Christopherson, M. Javier Gonzalez-Sanchez, M. Maria-Elena Chavez-Echeagaray
• Schedule
  1. Introduction
  2. Human Computer Interaction
  3. Anatomy of the System
  4. Learning Experience
  5. Software Architecture
  6. Analysis and Results
  7. Conclusions
• Introduction
  Learning Science Research Lab at Arizona State University.
  Research shows that learning is enhanced when empathic support is present. Various studies have linked interpersonal relationships between teachers and students to increased student motivation over the long term.
  Thus, great interest exists in developing systems that embed affective support into tutoring applications.
• Introduction: Learning, Empathic Support, Motivation
• Introduction: Tutoring applications, Support, Interaction
• Introduction
  University: universitas magistrorum et scholarium, a community of teachers and scholars; a group of interacting entities sharing a common location.
• Human Computer Interaction
  The design and use of systems and devices that deal with sensing and perception (affect recognition) will provide direct, customized instruction or feedback to students without the aid of human beings.
• Human Computer Interaction
• Anatomy of the System
  Providing the computer with the ability to "perceive" feelings, thoughts, or attitudes requires the implementation of additional sensing and perception mechanisms such as biofeedback and brain-computer interfaces, face-based emotion recognition systems, and eye-tracking systems.
  Using the information provided by these mechanisms as input, it is possible to measure the user experience in an objective way and to build user models that predict user behavior.
• Anatomy of the System
• Anatomy of the System: Wireless Emotiv® EPOC Headset
  The device reports data at intervals of approximately 125 ms.
  The output of the neuroheadset includes 14 sensors or channels (7 on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values for the acceleration of the head when leaning (gyrox and gyroy).
  From these signals the headset reports Engagement, Boredom, Excitement, Frustration, and Meditation.
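  A minimal sketch of what a single headset report could look like as a plain data record; the class and field names (EpocSample, channelValues, the affective-score fields) are illustrative assumptions, not the Emotiv SDK's actual API.

```java
// Hypothetical data record for one Emotiv EPOC report (arriving roughly every 125 ms).
// Class and field names are illustrative, not the vendor SDK.
import java.util.Arrays;

public class EpocSample {
    static final String[] CHANNELS = {
        "AF3", "F7", "F3", "FC5", "T7", "P7", "O1", "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"
    };

    final long timestampMs;        // arrival time of the report
    final double[] channelValues;  // one reading per EEG channel
    final double gyroX, gyroY;     // head-leaning acceleration
    final double engagement, boredom, excitement, frustration, meditation;

    EpocSample(long timestampMs, double[] channelValues, double gyroX, double gyroY,
               double engagement, double boredom, double excitement,
               double frustration, double meditation) {
        this.timestampMs = timestampMs;
        this.channelValues = Arrays.copyOf(channelValues, CHANNELS.length);
        this.gyroX = gyroX;
        this.gyroY = gyroY;
        this.engagement = engagement;
        this.boredom = boredom;
        this.excitement = excitement;
        this.frustration = frustration;
        this.meditation = meditation;
    }

    public static void main(String[] args) {
        double[] eeg = new double[CHANNELS.length];   // placeholder channel readings
        EpocSample s = new EpocSample(System.currentTimeMillis(), eeg,
                0.0, 0.0, 0.73, 0.10, 0.40, 0.12, 0.25);
        System.out.println(EpocSample.CHANNELS.length + " channels, engagement = " + s.engagement);
    }
}
```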
• Anatomy of the System: Tobii® Eye Tracker
  The device reports data at intervals of approximately 100 ms.
  It provides data concerning attention direction and time of focus during individual use of a computer.
  As part of the data collected from this system we also get a video stream of the whole session; this video is a recording of the computer screen during the experiment.
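  A minimal sketch of how the roughly 100 ms gaze reports could be turned into per-region time-of-focus totals; GazeAccumulator and the toy region mapping are hypothetical, not the Tobii SDK.

```java
// Hypothetical gaze sample handling: accumulate time of focus per screen region.
// Names and the left/right region mapping are illustrative only.
import java.util.HashMap;
import java.util.Map;

public class GazeAccumulator {
    // Accumulated time of focus (ms) per screen region.
    private final Map<String, Long> focusMs = new HashMap<>();

    // Called once per ~100 ms gaze report: map normalized (x, y) to a region and add the interval.
    public void addSample(double x, double y, long intervalMs) {
        String region = x < 0.5 ? "left-half" : "right-half";   // toy region mapping
        focusMs.merge(region, intervalMs, Long::sum);
    }

    public Map<String, Long> timeOfFocus() {
        return focusMs;
    }

    public static void main(String[] args) {
        GazeAccumulator acc = new GazeAccumulator();
        acc.addSample(0.2, 0.4, 100);   // gaze on the left half for one report
        acc.addSample(0.7, 0.5, 100);   // then on the right half
        System.out.println(acc.timeOfFocus());
    }
}
```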
• Anatomy of the System: MindReader Software from the MIT Media Lab
  This is about inferring a person's mental state from non-verbal cues. The vision system infers mental states from head gestures and facial expressions in a video stream in real time, at data intervals of approximately 100 ms.
  By capturing images of facial expressions and head movements it is possible to infer a person's emotions.
  The automated mind-reading system implements the model by combining top-down predictions of mental-state models with bottom-up vision-based processing of the face.
  With this system it is possible to infer six emotions beyond the basic emotions: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
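  A minimal sketch of one classifier frame that holds a probability per mental state and selects the most likely one; MentalStateFrame and its methods are assumed names, not the MindReader software's actual interface.

```java
// Hypothetical output of a face-based mental-state classifier: one probability
// per state, reported roughly every 100 ms. Names are illustrative only.
import java.util.EnumMap;
import java.util.Map;

public class MentalStateFrame {
    enum State { AGREEING, CONCENTRATING, DISAGREEING, INTERESTED, THINKING, UNSURE }

    private final Map<State, Double> probabilities = new EnumMap<>(State.class);

    void put(State s, double p) { probabilities.put(s, p); }

    // The most likely state in this frame, or null if no probabilities were set.
    State mostLikely() {
        return probabilities.entrySet().stream()
                .max(Map.Entry.comparingByValue())
                .map(Map.Entry::getKey)
                .orElse(null);
    }

    public static void main(String[] args) {
        MentalStateFrame frame = new MentalStateFrame();
        frame.put(State.CONCENTRATING, 0.62);
        frame.put(State.UNSURE, 0.21);
        System.out.println(frame.mostLikely()); // CONCENTRATING
    }
}
```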
• Anatomy of the System: Arousal Sensing
  For this we use a skin electrical conductance sensor. It measures the electrical conductance of the skin, which varies with its moisture level; moisture depends on the sweat glands, which are controlled by the sympathetic and parasympathetic nervous systems.
  The sensor is a wireless Bluetooth device that reports conductance data at intervals of approximately 500 ms.
  Hardware designed by the MIT Media Lab.
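  A minimal sketch of how the roughly 500 ms conductance reports could be expressed relative to a slowly moving baseline; ConductanceStream and the smoothing constant are illustrative assumptions, not the sensor's firmware or driver.

```java
// Hypothetical treatment of skin-conductance reports (~500 ms apart): keep a running
// baseline and express each reading relative to it. Names are illustrative only.
public class ConductanceStream {
    private double baseline = Double.NaN;
    private static final double ALPHA = 0.05;   // slow-moving baseline update rate

    // Returns the reading as a fraction above or below the running baseline.
    public double addReading(double microSiemens) {
        if (Double.isNaN(baseline)) {
            baseline = microSiemens;             // first report seeds the baseline
        } else {
            baseline = (1 - ALPHA) * baseline + ALPHA * microSiemens;
        }
        return (microSiemens - baseline) / baseline;
    }

    public static void main(String[] args) {
        ConductanceStream gsr = new ConductanceStream();
        System.out.println(gsr.addReading(2.0));   // 0.0: baseline just seeded
        System.out.println(gsr.addReading(2.6));   // positive: arousal above baseline
    }
}
```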
• Learning Experience
  The experiment was run with 21 subjects, undergraduate and graduate students of Arizona State University ranging from 18 to 25 years of age. For the purposes of the experiment we considered all levels of expertise with Guitar Hero, from novice to expert users; we also considered both regular and non-regular gamers, and both genders.
• Learning Experience
• Software Architecture
  Learning and Tutoring Systems Framework.
  Automated Detection of Affective States.
  > Software Design Patterns
  > Reusable Components
  > Architecture-Based
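  The slides list the design goals but show no code; below is a minimal observer-style sketch of how pluggable sensor sources could publish timestamped readings to a common listener, assuming hypothetical names (SensorListener, SensorSource, EngagementSource) rather than the framework's actual classes.

```java
// Minimal observer-style sketch: each sensor wrapper publishes timestamped readings
// to listeners, so the tutoring system consumes a uniform stream regardless of device.
// Interface and class names are hypothetical, not the framework's actual API.
import java.util.ArrayList;
import java.util.List;

interface SensorListener {
    void onReading(String sensorName, long timestampMs, double value);
}

abstract class SensorSource {
    private final List<SensorListener> listeners = new ArrayList<>();
    protected final String name;

    protected SensorSource(String name) { this.name = name; }

    void addListener(SensorListener l) { listeners.add(l); }

    protected void publish(double value) {
        long now = System.currentTimeMillis();
        for (SensorListener l : listeners) l.onReading(name, now, value);
    }
}

// One concrete source per device; only the report rate and meaning differ.
class EngagementSource extends SensorSource {
    EngagementSource() { super("epoc.engagement"); }
    void simulateReport(double engagement) { publish(engagement); }
}

public class ArchitectureSketch {
    public static void main(String[] args) {
        EngagementSource epoc = new EngagementSource();
        epoc.addListener((sensor, ts, v) ->
                System.out.println(sensor + " @" + ts + " = " + v));
        epoc.simulateReport(0.73);   // one simulated ~125 ms EPOC report
    }
}
```

  Under a sketch like this, adding a new device only means adding a new SensorSource subclass, which is one plausible way the reuse and modifiability goals could be met.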
• Software Architecture
  http://old.javiergs.com/paper/amt
• Analysis and Results
  Eureqa is used to discover mathematical expressions of the structural relationships in the data records; the records hold information about the physical and emotional behavior of an individual who was engaged in a single experimental setting.
  Example 1.
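  A sketch of how the per-device readings from one interval might be flattened into a single row of the kind a symbolic-regression tool such as Eureqa consumes; the column names are hypothetical, not the project's actual record format.

```java
// Hypothetical flattening of one experimental interval into a CSV row suitable for
// symbolic-regression analysis. Column names are illustrative only.
public class RecordRow {
    public static String toCsv(long tMs, double engagement, double frustration,
                               double conductance, long focusLeftMs, double concentrating) {
        return tMs + "," + engagement + "," + frustration + ","
                + conductance + "," + focusLeftMs + "," + concentrating;
    }

    public static void main(String[] args) {
        System.out.println("t_ms,engagement,frustration,conductance,focus_left_ms,concentrating");
        System.out.println(toCsv(1000, 0.73, 0.12, 2.03, 400, 0.62));
    }
}
```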
• Conclusions
  We present a software architecture for Automated Detection of Affective States that integrates emotional measures of learners as a foundational component: a software architecture that realizes portability, high reuse, modifiability, generality, and robustness as required software qualities.
  Sensor-network analyses of responses to digital-media experiences are beginning to map the relationships between interactions and emotions such as engagement, frustration, focus of attention, and a range of other physical and mental states. In the laboratory, a user plays with digital media while wearing a number of sensors that provide a stream of data, which is then analyzed for clustering and correlational patterns. This project reports on the laboratory setting, sensors, analyses, and initial findings from this exploratory research.
• Q+A