201209 An Introduction to Building Affective-Driven Self-Adaptive Software



One important characteristic of modern software systems is self-adaptation, the capability of monitoring and reacting to changes in the environment. A particular case of self-adaptation is affect-driven self-adaptation, which involves using sensing devices to measure physiological signals of changes in a human's affective state (emotions), learning the meaning of those changes, and then reacting (self-adapting) accordingly. Affect-driven self-adaptive systems take advantage of brain-computer interfaces, eye tracking, face-based emotion recognition, and sensors that measure physiological signals.
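The sense-learn-react loop described above can be sketched in a few lines. Everything here is an illustrative assumption (the function names, the threshold, the state labels), not part of any framework mentioned in the talk:

```python
# Minimal sketch of one iteration of an affect-driven self-adaptive loop.
# The threshold and labels are invented for illustration, not a real emotion model.

def infer_state(readings):
    """Map normalized physiological readings to a coarse affective state."""
    arousal = sum(readings) / len(readings)
    return "frustrated" if arousal > 0.7 else "engaged"

def adapt(state):
    """React to the inferred state, e.g. ease off when the user is frustrated."""
    actions = {"frustrated": "lower_difficulty", "engaged": "keep_difficulty"}
    return actions[state]

readings = [0.9, 0.8, 0.75]   # e.g. skin-conductance samples scaled to [0, 1]
state = infer_state(readings)
print(state, "->", adapt(state))
```

A real system would replace the threshold with a learned model and run this loop continuously against live sensor streams.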
Systems such as learning environments, health care systems, and video games can take advantage of affect-driven self-adaptive capabilities. Today these capabilities are brittle, costly to change, difficult to reuse, and limited in scope. A software factory approach has been suggested to make it feasible to add affect-driven self-adaptive capabilities to either new or existing systems. Software factories capture knowledge of how to produce applications that share common characteristics, make that knowledge available in the form of assets (patterns, models, frameworks, and tools), and systematically apply those assets to automate development, reducing cost and time while improving product quality.
This talk provides a sneak peek at affect-driven self-adaptive capabilities and explores how a software factory approach can be used to build affect-driven self-adaptive software.

Published in: Technology


  1. An Introduction to Building Affective-Driven Self-Adaptive Software through a Software Factory Approach. Javier Gonzalez-Sanchez, javiergs@asu.edu, www.javiergs.com
  2. Context: Human-Computer Interaction, Affective Computing, Software Architecture, Systems Engineering
  3. 1: Affective-Driven
  4. Affective-Driven. (A) Feeling, Emotion, Mood, Affective State, Affect. (B) Emotions are generally understood as representing a synthesis of a subjective experience, an expressive behavior, and a neurochemical activity. (C) Facilitation of social communication.
  5. Affective-Driven. Empathy: "to put one's self in another's shoes"
  6. Affective-Driven. Health Care, Education, Entertainment
  7. Affective-Driven. Empathy: SENSING, PERCEPTION, BELIEFS
  8. Affective-Driven
  9. Face-Based. Face-based emotion recognition systems infer affective states by capturing images of the users' facial expressions and head movements. We are going to show the capabilities of face-based emotion recognition systems using a simple 30 fps USB webcam and software from the MIT Media Lab [8]. [8] R. E. Kaliouby and P. Robinson, "Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures," Proc. Conference on Computer Vision and Pattern Recognition Workshop (CVPRW '04), IEEE Computer Society, June 2004, vol. 10, p. 154.
  10. Brain-Based
  11. Text-Based
  12. Eye Tracking
  13. Eye Tracking
  14. Other Sensors… Galvanic Skin Conductance, Posture Sensor, Nike+, Bodymedia, etc.
  15. Other Sensors… S. Mota and R. W. Picard, "Automated Posture Analysis for Detecting Learner's Interest Level," Proc. Computer Vision and Pattern Recognition Workshop (CVPRW 03), IEEE Press, June 2003, vol. 5, p. 49, doi:10.1109/CVPRW.2003.10047. Y. Qi and R. W. Picard, "Context-Sensitive Bayesian Classifiers and Application to Mouse Pressure Pattern Classification," Proc. International Conference on Pattern Recognition (ICPR 02), Aug. 2002, vol. 3, pp. 30448, doi:10.1109/ICPR.2002.1047973. M. Strauss, C. Reynolds, S. Hughes, K. Park, G. McDarby, and R. W. Picard, "The HandWave Bluetooth Skin Conductance Sensor," Proc. First International Conference on Affective Computing and Intelligent Interaction (ACII 05), Springer-Verlag, Oct. 2005, pp. 699-706, doi:10.1007/11573548_90.
  16. Machine Learning. I. Arroyo, D. G. Cooper, W. Burleson, B. P. Woolf, K. Muldner, and R. Christopherson, "Emotion Sensors Go to School," Proc. Artificial Intelligence in Education, IOS Press, July 2009, Frontiers in Artificial Intelligence and Applications, vol. 200, pp. 17-24.
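The sensor slides above feed machine learning that combines several modalities into one affective judgment. A toy late-fusion by majority vote gives the flavor; the cited work uses learned classifiers, and the modality names and labels here are made up for illustration:

```python
# Toy late-fusion of emotion labels from several modalities by majority vote.
# Real systems (such as the cited "Emotion Sensors" work) learn classifiers instead.
from collections import Counter

def fuse(labels):
    """Return the most common label across modality outputs."""
    return Counter(labels).most_common(1)[0][0]

modalities = {"face": "confused", "posture": "confused", "eye_tracker": "engaged"}
print(fuse(modalities.values()))
```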
  17. 2: Self-Adaptive
  18. Self-Adaptive. Adaptive: "change behavior"
  19. Self-Adaptive. Adaptive: LEARNING, HAVING OPTIONS (internal software structure), DYNAMIC REAL-TIME, DECISION THEORY
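"Having options" plus "decision theory" on slide 19 amounts to picking the adaptation with the highest expected utility under the current belief about the user's state. A sketch with invented numbers (the beliefs and utility table are illustrative; a real system would learn them):

```python
# Sketch: decision-theoretic choice among adaptation options.
# Beliefs and utilities are invented for illustration; a real system learns them.

def choose_action(beliefs, utilities):
    """Return the option maximizing expected utility over the belief state."""
    def expected(option):
        return sum(p * utilities[option][s] for s, p in beliefs.items())
    return max(utilities, key=expected)

beliefs = {"frustrated": 0.7, "engaged": 0.3}
utilities = {
    "show_hint":  {"frustrated": 0.9, "engaged": 0.2},   # EU = 0.7*0.9 + 0.3*0.2 = 0.69
    "do_nothing": {"frustrated": 0.1, "engaged": 0.8},   # EU = 0.7*0.1 + 0.3*0.8 = 0.31
}
print(choose_action(beliefs, utilities))
```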
  20. Closed-Loop
  21. Closed-Loop Framework (i.e., Components, Tools)
  22. Closed-Loop
  23. 3: Software Factories
  24. Software
  25. Software
  26. Software. Software ARCHITECTURE, ENGINEERING: "reducing complexity through abstraction and separation of concerns"
  27. Not One… But Many
  28. Not One… But Many. Process, Rules, and Regulations. Software FACTORY: manufacturing using TOOLS… industrial production… COMPONENTS are transformed or assembled…
  29. Not One… But Many. Software ARCHITECTURE. MODELING: Software Design Patterns, Pattern Languages, Model-Driven Design, Component-Based Engineering. IMPLEMENTATION: Frameworks, Tools. VALIDATION: ATAM, Empirical Software Methods.
  30. ABE Framework
  31. A Pattern Language: Design Patterns
  32. A Pattern Language. Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R. and Burleson, W. (2011) Affective Computing Meets Design Patterns: A Pattern-Based Model of a Multimodal Emotion Recognition Framework. Proceedings of the 16th European Conference on Pattern Languages of Programs. Irsee, Germany. July 2011. Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Atkinson, R. and Burleson, W. (2012) Towards a Pattern Language for Affective Systems. Proceedings of the 19th Conference on Pattern Languages of Programs. Tucson, Arizona, USA. October 2012. In press.
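The pattern-based approach in these papers builds on classic design patterns; for example, an Observer-style publisher decouples emotion sources from the components that react to them. A minimal sketch, with class and method names that are illustrative and not taken from the published pattern language:

```python
# Observer-pattern sketch: an emotion source publishes, adapters subscribe.
# Names are illustrative, not from the cited pattern language.

class EmotionSource:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a component to be notified of emotion events."""
        self._subscribers.append(callback)

    def publish(self, emotion):
        """Notify every subscriber of a detected emotion."""
        for callback in self._subscribers:
            callback(emotion)

log = []
source = EmotionSource()
source.subscribe(lambda e: log.append(f"game adapts to {e}"))
source.subscribe(lambda e: log.append(f"tutor adapts to {e}"))
source.publish("boredom")
print(log)
```

The point of the pattern here is reuse: new sensing modalities or adaptation targets plug in without changing the source.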
  33. Closed-Loop Framework (i.e., Components, Tools)
  35. 4: Case Studies
  36. Videogames. Bernays, R., Mone, J., Yau, P., Murcia, M., Gonzalez-Sanchez, J., Chavez-Echeagaray, M.E., Christopherson, C., Atkinson, R. (2012) "Lost in the Dark: Emotion Adaption," ACM Symposium on User Interface Software and Technology 2012. Cambridge, MA, USA. October 2012.
  37. Visualization
  38. Visualization
  39. Visualization
  40. Visualization
  41. Questions
  42. Contact: Javier Gonzalez-Sanchez, javiergs@asu.edu, www.javiergs.com
  43. Acknowledgements. This research was supported by the Office of Naval Research under Grant N00014-10-1-0143, awarded to Dr. Robert Atkinson, and by the National Science Foundation, Award 0705554 (IIS/HCC, Affective Learning Companions: Modeling and Supporting Emotion During Teaching), awarded to Dr. Beverly Woolf and Dr. Winslow Burleson.