INREDIS and Machine Learning (NIPS)

This presentation addresses some of the research lines in machine learning that can foster accessibility in ICT design.


Slide 1. Machine learning applied to multi-modal interaction, adaptive interfaces and ubiquitous assistive technologies
  December 10, 2009
  Jaisiel Madrid Sánchez, R&D Consultant, INREDIS project
Slide 2. Technosite (who are we?)
  • A technology company belonging to the ONCE Foundation.
  • Over 70% of Technosite's staff are people with disabilities.
  • It is precisely this aspect that has boosted our competitive edge:
    • Our technological development follows accessibility criteria.
    • A business area focused on social studies: users' needs, preferences and expectations.
  • Social Spaces for Research and Innovation (SSRIs): exchanging information and networking among users, designers and stakeholders in ICT development.
Slide 3. Transforming the Assistive Technology Ecosystem
  • The INREDIS project is developing basic technologies for communication and interaction channels between people with disabilities and their ICT environment (INterfaces for RElationships between people with DISabilities and their ICT environment).
  • Accessibility: technologies must be designed for diversity (design for all):
    • Interoperability.
    • Adaptability.
    • Multimodality.
    • Ubiquity.
Slide 5. Accessibility and Machine Learning: INREDIS
  • Interoperability and ubiquity (cloud computing): structured data sharing.
  • Adaptability → machine learning:
    • Adaptive user interfaces (personalization): accessibility becomes a special case of adaptation.
  • Multimodality → machine learning:
    • Multimodal interaction (detection): accessibility becomes natural interaction according to user capabilities.
  • Little to say about particular learning methods, but rather about specific setups in which to apply them.
Slide 6. Adaptive user interfaces and multimodal assistive technologies
  • Multimodal interaction is achieved by multimodal assistive technologies (executed as local/remote services) that:
    • vary the interaction channel or perform a code translation;
    • are considered "interaction resources" of the user interface (to be adapted).
  • Examples: Text to Speech, Speech to Text, ECAs (Embodied Conversational Agents), Text to Augmentative Communication, Text to Sign Language, Sign Language to Text, etc.
Slide 7. Adaptive user interfaces and multimodal assistive technologies
  • Levels of adaptation of the user interface (accessibility resources on the user interface):
    • Lexical level: navigation windows, button sizes, figures with reduced detail, textual descriptions of non-textual resources, etc.
    • Interaction level: multimodal assistive technologies.
      • Selection of the type of multimodal AT and its configuration options: "ready from the first moment".
Slide 8. Adaptive user interfaces
  • Data for adaptation:
    • Persistent features (off-line adaptation):
      • User profile: needs, preferences*, expectations*.
      • Technological profile: user device, target service/device.
    • Non-persistent features (on-line adaptation):
      • User profile: user experience, affective detection (and other activity response systems: brain, eye, …).
      • Context profile: wearable sensors, complex event processing (INREDIS platform level).
Slide 9. Adaptive user interfaces
  • Knowledge organization for data-adaptation matching:
    • The INREDIS ontology organizes concepts, their properties and their relations.
    • Populating the ontology is a difficult task: machine learning serves as a tool to discover instances and enrich the ontology.
      • Persistent features (user profile: needs, preferences, expectations): implicit interaction systems (vs. explicit user input, e.g. an on-line form).
      • Non-persistent features: user profile (user experience, affective detection) and context profile (wearable sensors, complex event processing).
    • Evolving the ontology: new concepts and relations according to experience, by means of machine learning.
Slide 10. Persistent user features: implicit interaction systems
  • Sources feeding the persistent user profile: multimodal games, social analysis, interaction logs.
Slide 11. Persistent user features: implicit interaction systems
  • Multimodal (natural) interaction games:
    • "Tell me and I forget, show me and I remember, involve me and I understand" (Chinese proverb).
    • Goals:
      • Capture the persistent user profile: needs and preferred adaptations (provide personal predictions for each user).
      • Reflect the user's actual practices, not the user's beliefs (forms, etc.).
      • "Static over time": explicitly reconfigured by the user.
      • Multimodal: accessible from the first interaction.
    • The game involves vision, auditory, motor and cognitive problems.
Slide 12. Persistent user features: implicit interaction systems
  • The game actively interacts with the user, generating queries and examples that evaluate user needs and preferences (following a consistent goal).
  • The system collects traces of user decisions and applies machine learning to these traces to construct a persistent user profile model (needs, preferences and expectations).
  • This profile will be used for future interface adaptations (non-persistent updates).
  • Dynamic modeling:
    • Users provide different feedback for similar situations according to needs, preferences and expectations.
    • The agent might ask questions to learn more effectively from the feedback given, and select a subset of observed samples.
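The trace-to-profile step described above can be sketched as follows. This is a minimal, illustrative model, not the INREDIS implementation: it assumes traces arrive as (situation, chosen_adaptation) pairs, and the situation and adaptation names are hypothetical.

```python
from collections import Counter, defaultdict

def build_profile(traces):
    """Estimate a persistent user profile from game interaction traces.

    Each trace is a (situation, chosen_adaptation) pair, e.g.
    ("text_output", "large_font").  The profile maps each situation to
    the adaptation the user chose most often: a simple maximum-likelihood
    preference model that a richer learner could replace.
    """
    counts = defaultdict(Counter)
    for situation, choice in traces:
        counts[situation][choice] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

# Hypothetical traces collected during the game:
traces = [
    ("text_output", "large_font"),
    ("text_output", "large_font"),
    ("text_output", "speech"),
    ("alerts", "visual_flash"),
]
profile = build_profile(traces)
# profile["text_output"] is "large_font"; profile["alerts"] is "visual_flash"
```

The profile produced this way reflects observed practice rather than self-reported preferences, matching the goal stated on slide 11.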
Slide 13. Persistent user features: implicit interaction systems
  • The complexity of the tasks can be extended:
    • Additional modalities (incorporated into the model).
    • Media contents.
    • Real time.
  • Choosing the right problems: designers choose different questions depending on user profiles and agent performance, while keeping interactions minimal.
    • Measure of efficiency: number of interactions (clicks, etc.) needed to complete the game.
    • Measures of quality: several criteria (different users differ in the relative importance they assign to such criteria, according to their expectations).
  • ML literature (connections): advisory systems based on information filtering, multi-task learning, etc.
Slide 14. Persistent user features: implicit interaction systems
  • Social network analysis:
    • Finding relevant information by monitoring social networks.
    • Relevant information: accessibility and usability features.
    • Helps increase the accuracy of the persistent user profile, so that more relevant interface resources are matched to the user.
    • Feedback focuses on user interests, feelings, needs, preferences and expectations about accessibility features (instead of functionality features):
      • At the level of single experiences in 2.0 portals and blogs (targeting individuals based on expressed preferences).
      • At the level of related user groups: improving the relevancy and trustworthiness of opinion data for interface resource recommendation.
Slide 15. Persistent user features: implicit interaction systems
  • Incorporating the experience of those who used particular accessibility resources before: opinion mining.
  • Grouping 2.0 content based on natural-language expressions of what users like and dislike about accessibility and usability features: categorization of interests.
  • Taking into account inconsistencies in the opinions of conflicting authors (by determining the reputation of authors).
  • Requires specific semantic technology to represent the original semantic structure of the authors' information (with different needs and reputations): a parse tree plus semantic rules that navigate these trees.
  • ML connections: text categorization using Support Vector Machines.
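To make the text-categorization step concrete, here is a tiny bag-of-words classifier separating accessibility-related opinions from other content. For brevity a perceptron stands in for the Support Vector Machine the slide names (both learn a linear separator); all example texts and labels are invented.

```python
import re
from collections import defaultdict

def tokens(text):
    """Lowercase word tokenizer (bag-of-words features)."""
    return re.findall(r"[a-z']+", text.lower())

def train_perceptron(examples, epochs=10):
    """Train a linear classifier: +1 = opinion about accessibility
    features, -1 = other content.  A perceptron update is applied on
    each misclassified example."""
    w = defaultdict(float)
    for _ in range(epochs):
        for text, label in examples:
            score = sum(w[t] for t in tokens(text))
            if (1 if score > 0 else -1) != label:
                for t in tokens(text):
                    w[t] += label
    return w

def predict(w, text):
    return 1 if sum(w[t] for t in tokens(text)) > 0 else -1

# Invented training examples:
examples = [
    ("the screen reader works well with this menu", 1),
    ("captions are missing on the video player", 1),
    ("the delivery arrived two days late", -1),
    ("great price for this phone", -1),
]
w = train_perceptron(examples)
```

In practice one would use a real SVM with richer features (the parse-tree structure mentioned above); the point is only the setup: opinions in, accessibility-relevance labels out.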
Slide 16. Persistent user features: implicit interaction systems
  • User interaction logs: see, within the symposium schedule, "Data Mining based user modeling systems for web personalization applied to people with disabilities", J. Abascal, O. Arbelaitz, J. Munguerza and I. Perona.
Slide 17. Non-persistent user features
  • User experience:
    • The first adaptation of the interface has already been done (using persistent features): off-line adaptation.
    • The learned knowledge should reflect preferences over individual interface resources: personalized assistive technologies.
    • On-line adaptation of the user interface according to user experience, each time interaction with the interface occurs (on-line learning, which contrasts with work on data mining).
    • INREDIS aims to construct an interaction manager that makes recommendations to the user, or generates actions on the interface resources (both lexical and interaction-level), which the user can always override; these update the persistent user profile.
    • Collaborative filtering: find similar user profiles and suggest on-line accessibility resources that they liked but the current user has not yet used.
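The collaborative-filtering idea on this slide can be sketched in a few lines: score profile similarity, then suggest resources the most similar user rated positively. User names, resource names and ratings below are illustrative assumptions.

```python
import math

def cosine(u, v):
    """Cosine similarity between two {resource: rating} profiles."""
    common = set(u) & set(v)
    dot = sum(u[r] * v[r] for r in common)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(profiles, user):
    """Suggest resources liked (rating > 0) by the most similar other
    user that `user` has not yet tried."""
    target = profiles[user]
    best = max((p for p in profiles if p != user),
               key=lambda p: cosine(target, profiles[p]))
    return sorted(r for r, score in profiles[best].items()
                  if score > 0 and r not in target)

# Hypothetical accessibility-resource ratings:
profiles = {
    "ana":  {"screen_reader": 1, "high_contrast": 1, "captions": 0},
    "ben":  {"screen_reader": 1, "high_contrast": 1, "magnifier": 1},
    "carl": {"captions": 1, "sign_language_video": 1},
}
# For "ana", the most similar profile is "ben", so "magnifier" is suggested.
```

A production recommender would aggregate over many neighbours and weight by similarity; the single-neighbour version just shows the setup.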
Slide 18. Non-persistent user features
  • Affective detection (attentive interfaces):
    • Goal (the ability to simulate empathy: natural interaction):
      • To accept or reject on-line modifications (from explicit interactions) of the interface resources according to implicit feedback (the user's behaviour), in order to improve the user experience.
      • To generate new modifications from implicit (emotional) user interaction, in order to better meet dynamic usability goals.
    • INREDIS affective intelligent agent:
      • Multimodal: speech and facial detection (hypoacusis, cognitive impairments, etc.).
      • Combined with eye activity detection and brain response.
      • Negative, neutral and positive emotions (Litman and Forbes-Riley, 2004).
Slide 19. Non-persistent user features
  • Video, audio and fusion classifiers ("unambiguity").
  • Support Vector Machines.
  • ML literature: detection of up to 40 emotions.
  • Essential step: training on specific users (multimodal games may supply this off-line information).
  • Affective visual output system.
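The fusion classifier mentioned above can be illustrated with a simple late-fusion scheme: each per-modality classifier emits class scores, which are combined with a reliability weight before taking the argmax. The scores and weights below are made up; real systems would learn both.

```python
def fuse(modality_scores, weights):
    """Late fusion of per-modality emotion classifiers: combine each
    modality's class scores with a reliability weight and return the
    emotion with the highest fused score."""
    fused = {}
    for modality, scores in modality_scores.items():
        w = weights.get(modality, 1.0)
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + w * score
    return max(fused, key=fused.get)

# Hypothetical per-class probabilities from each classifier:
scores = {
    "video": {"negative": 0.2, "neutral": 0.5, "positive": 0.3},
    "audio": {"negative": 0.6, "neutral": 0.3, "positive": 0.1},
}
weights = {"video": 0.4, "audio": 0.6}
# fuse(scores, weights) returns "negative": the audio channel dominates.
```

Training per user, as the slide recommends, would amount to fitting both the per-modality classifiers and the fusion weights on that user's data.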
Slide 20. Non-persistent context features
  • Wearable sensors:
    • Context awareness: interface adaptation should behave in a context-sensitive way (with respect to the person or the computing device).
      • Reminder: INREDIS focuses on lexical and interaction adaptations.
    • Collect data from a dynamic and unknown environment: the context (of the user or the device).
    • Standard machine learning methods are generally used to integrate and interpret the sensor traces collected from multiple sources of information (see the literature on "learning from multiple sources").
    • Context-sensitive adaptations: "non-persistent disabilities" …
Slide 21. Non-persistent context features
  • Context-sensitive adaptations: "non-persistent disabilities":
    • Noisy context: hypoacusis → visual alternative (text, graphics).
    • Light reflecting on the screen: low vision → magnifier / auditory alternative.
    • Cold temperature/gloves, or walking/driving: motor impairment → voice interaction.
    • Surrounding people (ATM): hearing impairment → visual alternative.
    • etc.
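The context-to-adaptation mapping on this slide is essentially a rule table, which can be sketched directly. Sensor names and thresholds here are illustrative assumptions; in INREDIS such rules would be informed by the ontology and learned context models.

```python
def adapt(context):
    """Map sensed context readings to interface adaptations, mirroring
    the 'non-persistent disabilities' examples: each environmental
    condition is treated like the corresponding impairment."""
    adaptations = []
    if context.get("noise_db", 0) > 70:          # noisy room: as hypoacusis
        adaptations.append("visual_alternative")
    if context.get("screen_glare", False):       # glare: as low vision
        adaptations.append("magnifier_or_audio")
    if context.get("moving", False) or context.get("gloves", False):
        adaptations.append("voice_interaction")  # hands busy: as motor impairment
    if context.get("people_nearby", False):      # e.g. a queue at an ATM
        adaptations.append("visual_alternative")
    return sorted(set(adaptations))
```

A learned version would replace the hand-written thresholds with classifiers trained on the fused sensor traces from slide 20.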
Slide 22.
  • Non-persistent features → "non-persistent disabilities".
  • "Every day we can have the same needs as a person with disabilities."
Slide 23. Multimodal assistive technologies
  • INREDIS: multimodal remote services:
    • Image/text/audio/haptic processing.
    • Fusion and synchronization of multimodal streams.
    • High-dimensional data: SVM.
    • E.g., a Spanish sign language classifier.
Slide 24. Interoperability. Adaptability. Multimodality. Ubiquity.
Slide 25. Thank you for your attention
  Jaisiel Madrid Sánchez, [email_address], www.technosite.es
  Machine learning applied to multi-modal interaction, adaptive interfaces and ubiquitous assistive technologies
