This document outlines a study that aims to develop systems that can automatically detect students' affective states during learning to improve the learning experience. The proposed system uses multiple sensors and tools - including a neuroheadset, eye tracker, facial expression recognition software, and skin conductance sensor - to collect data on engagement, emotions, attention, and arousal. The real-time data from these devices will be analyzed to understand the user experience and provide customized feedback or instruction to students without human aid.
1. Automated Detection of Affective States to Measure Learning Experience
Dr. Robert Atkinson, M. Robert Christopherson, M. Javier Gonzalez-Sanchez, M. Maria-Elena Chavez-Echeagaray
2. Schedule
1. Introduction
2. Human Computer Interaction
3. Anatomy of the System
4. Learning Experience
5. Software Architecture
6. Analysis and Results
7. Conclusions
Javier Gonzalez-Sanchez | Maria-Elena Chavez-Echeagaray
3. Introduction
Learning Science Research Lab at Arizona State University. Research shows that learning is enhanced when empathic support is present. Various studies have linked interpersonal relationships between teachers and students to increased student motivation over the long term. Thus, there is great interest in developing systems that embed affective support into tutoring applications.
6. Introduction
University, from the Latin universitas magistrorum et scholarium ("community of teachers and scholars"): a group of interacting entities sharing a common location.
7. Human Computer Interaction
The design and use of systems and devices that deal with sensing and perception (affect recognition) will provide direct, customized instruction or feedback to students without the aid of human beings.
8. Human Computer Interaction
9. Anatomy of the System
Providing the computer with the ability to "perceive" feelings, thoughts, or attitudes requires the implementation of additional sensing and perception mechanisms such as biofeedback and brain-computer interfaces, face-based emotion recognition systems, and eye-tracking systems. Using the information provided by these mechanisms as input, it is possible to measure the user experience in an objective way and to create user models that predict user behavior.
11. Anatomy of the System
The device reports data at intervals of approximately 125 ms. The output of the neuroheadset includes 14 sensors or channels (7 on each brain hemisphere: AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4) and two values for the acceleration of the head when leaning (gyrox and gyroy). From these it reports Engagement, Boredom, Excitement, Frustration, and Meditation.
Wireless Emotiv® EPOC Headset
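One reading of this kind can be sketched as a plain data record. This is an illustrative structure only, not the Emotiv SDK's actual API: the channel names and affective measures come from the slide, while the class and field names are assumptions.

```python
from dataclasses import dataclass
from typing import Dict

# The 14 EEG channel names listed on the slide (7 per hemisphere).
CHANNELS = ["AF3", "F7", "F3", "FC5", "T7", "P7", "O1",
            "O2", "P8", "T8", "FC6", "F4", "F8", "AF4"]

@dataclass
class HeadsetSample:
    """One reading from the neuroheadset, emitted roughly every 125 ms."""
    timestamp_ms: int
    channels: Dict[str, float]  # one raw value per EEG channel
    gyro_x: float               # head-leaning acceleration (gyrox)
    gyro_y: float               # head-leaning acceleration (gyroy)
    engagement: float           # derived affective measures
    boredom: float
    excitement: float
    frustration: float
    meditation: float
```

Keeping the timestamp on every sample is what later makes it possible to merge this stream with the other, slower sensors.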
12. Anatomy of the System
The device reports data at intervals of approximately 100 ms. It provides data concerning attention direction and time of focus during individual use of a computer. As part of the data collected from this system we also get a video stream of the whole session; this video is a recording of the computer screen during the experiment.
Tobii® Eye Tracker
13. Anatomy of the System
This is about inferring a person's mental state from non-verbal cues. The visual system infers mental states from head gestures and facial expressions in a video stream in real time, at data intervals of approximately 100 ms. By capturing images of facial expressions and head movements it is possible to infer a person's emotions. "The automated mind-reading system implements the model by combining top-down predictions of mental state models with bottom-up vision-based processing of the face." With this system it is possible to infer six different emotions beyond the basic emotions: agreeing, concentrating, disagreeing, interested, thinking, and unsure.
MindReader Software from MIT Media Lab
14. Anatomy of the System
Arousal sensing. For this we are using a skin electrical conductance sensor. This sensor measures the electrical conductance of the skin, which varies with its moisture level; that level depends on the sweat glands, which are controlled by the sympathetic and parasympathetic nervous systems. The sensor is a wireless Bluetooth device that reports conductance data at intervals of approximately 500 ms.
Hardware designed by MIT Media Lab.
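The four devices above report at different rates (roughly 125 ms, 100 ms, 100 ms, and 500 ms). A minimal sketch of polling each stream at its own rate while timestamping every reading, so the streams can be merged during analysis; the reader functions and names are hypothetical, not part of any vendor SDK.

```python
import time

# Approximate reporting intervals from the slides, in milliseconds.
INTERVALS_MS = {
    "neuroheadset": 125,   # EEG channels + affective measures
    "eye_tracker": 100,    # attention direction and time of focus
    "mind_reader": 100,    # facial-expression-based mental states
    "skin_sensor": 500,    # skin electrical conductance
}

def collect(readers, duration_ms, now_ms=lambda: int(time.time() * 1000)):
    """Poll each sensor at its own rate, tagging every reading with a
    timestamp. `readers` maps a sensor name to a zero-argument read
    function; returns a list of (timestamp, sensor_name, value) tuples."""
    start = now_ms()
    next_due = {name: start for name in readers}
    log = []
    while now_ms() - start < duration_ms:
        t = now_ms()
        for name, read in readers.items():
            if t >= next_due[name]:
                log.append((t, name, read()))
                next_due[name] += INTERVALS_MS[name]
        time.sleep(0.001)  # avoid busy-waiting between polls
    return log
```

Injecting the clock (`now_ms`) keeps the loop testable without real hardware or real time.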
15. Learning Experience
The experiment was run with 21 subjects, undergraduate and graduate students at Arizona State University, ranging from 18 to 25 years of age. For the purposes of our experiment we considered all levels of expertise, from novice to expert users of Guitar Hero; we also considered both regular and non-regular gamers, and both genders.
17. Software Architecture
Learning and Tutoring Systems Framework. Automated Detection of Affective States.
> Software Design Patterns
> Reusable Components
> Architecture-Based
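The slide names design patterns and reusable components as the basis of the architecture. A minimal sketch of one such pattern, the observer pattern, applied to sensor data so that new sensors or consumers plug in without changes to the rest of the system; the class names are illustrative, not the authors' actual framework classes.

```python
class SensorHub:
    """Decouples sensor producers from data consumers (observer pattern).
    Sensors publish readings; any number of listeners subscribe."""
    def __init__(self):
        self._listeners = []

    def subscribe(self, listener):
        """Register a callable(sensor_name, timestamp_ms, value)."""
        self._listeners.append(listener)

    def publish(self, sensor_name, timestamp_ms, value):
        """Forward one reading to every subscribed listener."""
        for listener in self._listeners:
            listener(sensor_name, timestamp_ms, value)

class ReadingLog:
    """Example consumer: keeps every reading for later analysis."""
    def __init__(self):
        self.rows = []

    def __call__(self, sensor_name, timestamp_ms, value):
        self.rows.append((timestamp_ms, sensor_name, value))
```

With this split, a visualizer, a live tutor-feedback module, and a logger can all consume the same streams independently, which is one way the stated qualities of reuse and modifiability can be realized.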
18. Software Architecture
http://old.javiergs.com/paper/amt
19. Analysis and Results
Eureqa is used to discover mathematical expressions of the structural relationships in the data records; the records hold information about the physical and emotional behavior of an individual who was engaged in a single experimental setting.
Example 1.
20. Conclusions
We present a software architecture for Automated Detection of Affective States that integrates emotional measures of learners as a foundational component: a software architecture that realizes portability, high reuse, modifiability, generality, and robustness as required software qualities.
Sensor network analyses of responses to digital media experiences are beginning to map the relationships between interactions and emotions such as engagement, frustration, focus of attention, and a range of other physical and mental states. In the laboratory, a user plays with digital media while wearing a number of sensors that provide a stream of data, which is then analyzed for clustering and correlational patterns. This project reports on the laboratory setting, sensors, analyses, and initial findings from this exploratory research.
21. Q+A