This document discusses representing and visualizing human emotions using wearable technologies. It outlines several existing projects that visualize emotions through wearables, such as devices that detect physiological signals and display emotions visually. The document proposes researching how to recognize, compare, and disseminate emotions in order to visualize them on the body. Example projects are described, such as one in which emotions from online and in-person audiences are aggregated and used to control a dancer's body and a multimedia environment. The goal is to better understand how emotions can be published and experienced through the physical body.
9. EMOTION DISCIPLINES
humanities
communication sciences
neuroscience
psychology
economics
law
anthropology
education
linguistics
design
political sciences
sociology
philosophy
robotics
computer science
artificial intelligence
systems theory
44. TALKERS PERFORMANCE
[Diagram: the web + social nets and the live audience feed an expert system; its output drives the dancer's body and a live multimedia environment.]
45. TALKERS PERFORMANCE
● people interact through touch interfaces, social networks, and mobile phones
● interfaces are designed to express emotions through music, graphics, etc.
● data is gathered and fed to an expert system, which aggregates the information
● aggregated emotions are grouped and visualized:
● the dancer's body interprets the information as electric shocks
● video projection 1 generates a representation using Plutchik's model of emotions
● video projection 2 generates interference on the video input
● speakers on the dancer's body generate words according to the monitored emotional state
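The aggregation step above could be sketched as follows. This is a minimal, hypothetical illustration, not the project's actual expert system: it assumes audience input arrives as labels drawn from Plutchik's eight primary emotions (the model the slide names for video projection 1) and reduces them to a normalized distribution plus a dominant emotion for the outputs to act on.

```python
from collections import Counter

# Plutchik's eight primary emotions (the model named on the slide).
PLUTCHIK = ["joy", "trust", "fear", "surprise",
            "sadness", "disgust", "anger", "anticipation"]

def aggregate_emotions(votes):
    """Aggregate raw audience emotion labels into a normalized
    distribution over Plutchik's eight primary emotions."""
    counts = Counter(v for v in votes if v in PLUTCHIK)
    total = sum(counts.values())
    if total == 0:
        return {e: 0.0 for e in PLUTCHIK}
    return {e: counts[e] / total for e in PLUTCHIK}

def dominant_emotion(distribution):
    """Pick the emotion that drives the shocks, projections, and speech."""
    return max(distribution, key=distribution.get)

# Example: labels gathered from touch interfaces and social networks.
votes = ["joy", "joy", "fear", "anger", "joy", "fear"]
dist = aggregate_emotions(votes)
print(dominant_emotion(dist))  # joy
```

In a real pipeline the dominant emotion (or the full distribution) would be broadcast to each output channel, which interprets it in its own medium.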
48. ONEAVATAR
Physical Body connects to the Digital Body of the Avatar
[Diagram: PHYSICAL BODY → senses → transforms → enacts → DIGITAL BODY, and in the reverse direction: DIGITAL BODY → senses → transforms → enacts → PHYSICAL BODY.]
Interactions transfer emotions across the digital/analog boundary
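The sense → transform → enact loop between the two bodies can be sketched in code. Everything here is an assumption for illustration: the `Body` class, the scaling inside `transform`, and the state keys are invented, since the slide only names the loop's three stages.

```python
from dataclasses import dataclass, field

@dataclass
class Body:
    """Hypothetical stand-in for either the physical or the digital body."""
    name: str
    state: dict = field(default_factory=dict)

    def sense(self):
        # Read this body's current emotional signals.
        return dict(self.state)

def transform(signal):
    # Map one body's signals into the other's representation.
    # The 0.5 attenuation is illustrative only.
    return {k: v * 0.5 for k, v in signal.items()}

def enact(body, signal):
    # Write the transformed signal onto the receiving body.
    body.state.update(signal)

physical = Body("physical", {"arousal": 0.8})
digital = Body("digital")

# physical -> digital direction
enact(digital, transform(physical.sense()))
# digital -> physical direction closes the loop
enact(physical, transform(digital.sense()))
```

Each direction runs the same three stages, which is what makes the avatar relation bidirectional: emotion enacted on one body becomes sensed input for the other.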