This presentation discusses developing personalized emotion models using associative classifiers. It begins with background on adaptive user interfaces and computational emotions. A new corpus was created with video recordings of people expressing basic emotions and affects. Initial analysis of Kinect data for happy expressions across subjects is shown. The approach explores generating individualized sets of rules linking facial action units using tree/rule classifiers like M5, with the goal of personalized emotion detection models.
Mind Control to Major Tom: Is It Time to Put Your EEG Headset On? (Steve Poole)
JavaOne 2016 talk
Using your mind to interact with computers is a long-standing desire. Advances in technology have made it more practical, but is it ready for prime time? This session presents practical examples and a walkthrough of how to build a Java-based end-to-end system to drive a remote-controlled droid with nothing but the power of thought. Combining off-the-shelf EEG headsets with cloud technology and IoT, the presenters showcase what capabilities exist today. Beyond mind control (if there is such a concept), the session shows other ways to communicate with your computer besides the keyboard. It will help you understand the art of the possible and decide if it's time to leave the capsule to communicate with your computer.
What's on your mind? Exploring new methods for measuring emotional engagement... (UXPA International)
Measuring emotional engagement is becoming an increasingly important aspect of interaction design. In order to truly measure the emotional impact that an experience is creating, UX researchers and designers need to have an understanding of methods that can accurately capture this information. Equally important to knowing what emotional responses are being created are ways to enhance or focus the emotional experience. A new branch of user experience research that includes physiological responses is on its way. This workshop will start a dialog on how these new tools can be applied to the UX field.
Smart Cities and Big Data - Research Presentation (annegalang)
Research presentation on smart cities (sensor technology) and big data, presented in a graduate course I took on Transmedia Design and Digital Culture.
Thelxinoë: Recognizing Human Emotions Using Pupillometry and Machine Learning (mlaij)
In this study, we present a method for emotion recognition in Virtual Reality (VR) using pupillometry. We analyze pupil diameter responses to both visual and auditory stimuli via a VR headset and focus on extracting key features in the time-domain, frequency-domain, and time-frequency domain from VR-generated data. Our approach utilizes feature selection to identify the most impactful features using Maximum Relevance Minimum Redundancy (mRMR). By applying a Gradient Boosting model, an ensemble learning technique using stacked decision trees, we achieve an accuracy of 98.8% with feature engineering, compared to 84.9% without it. This research contributes significantly to the Thelxinoë framework, aiming to enhance VR experiences by integrating multiple sensor data for realistic and emotionally resonant touch interactions. Our findings open new avenues for developing more immersive and interactive VR environments, paving the way for future advancements in virtual touch technology.
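The abstract's feature-selection step, Maximum Relevance Minimum Redundancy (mRMR), greedily picks features that correlate strongly with the target while overlapping little with features already chosen. A minimal numpy-only sketch of that greedy idea follows, using absolute Pearson correlation as a stand-in for the mutual information the method formally uses; the toy data and function name are illustrative, not from the paper.

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy mRMR sketch: at each step pick the feature with the best
    relevance-minus-redundancy score (the 'MID' variant).
    Relevance and redundancy are approximated here by |Pearson correlation|;
    full mRMR uses mutual information."""
    n_features = X.shape[1]
    # relevance of each feature to the target
    relevance = np.array([abs(np.corrcoef(X[:, j], y)[0, 1])
                          for j in range(n_features)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # average redundancy with the features already selected
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# toy data: feature 0 drives the label, feature 1 is a noisy copy of
# feature 0 (redundant), feature 2 is pure noise (irrelevant)
rng = np.random.default_rng(0)
f0 = rng.normal(size=200)
X = np.column_stack([f0,
                     f0 + 0.5 * rng.normal(size=200),
                     rng.normal(size=200)])
y = (f0 > 0).astype(float)
sel = mrmr_select(X, y, 2)
print(sel)
```

On this toy data the redundancy penalty keeps the noisy copy (feature 1) out of the selection even though it is far more relevant than the noise feature. A classifier such as gradient boosting would then be trained on `X[:, sel]`.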
The series of presentations contains the information about "Management Information System" subject of SEIT for University of Pune.
Subject Teacher: Tushar B Kute (Sandip Institute of Technology and Research Centre, Nashik)
http://www.tusharkute.com
Emotional Multi-Agents System for Peer to peer E-Learning (Karlos Svoboda)
The document describes EMASPEL, an emotional multi-agent system for peer-to-peer e-learning. The system uses multiple agent types, including interface agents, emotional agents, curriculum agents, tutor agents, and emotional embodied conversational agents. Emotional agents analyze learners' facial expressions to recognize emotions, while embodied conversational agents express emotions through facial animations. Together this allows the system to personalize instruction based on learners' cognitive and emotional states.
This presentation summarizes a thesis proposal on detecting human emotion on social media based on textual data. The proposal will use a classifier model to identify emotions from social media texts. It will cluster text data into 8 emotion classes to train the classifier. The goal is to analyze social media posts to understand public sentiment on issues and help inform decisions. While the approach only uses text data in English, identifying emotion across languages and media poses challenges.
This document provides an overview of artificial intelligence (AI) including definitions, approaches, foundations, capabilities, and comparisons to human intelligence. It defines AI as the study of intelligent behavior in machines, discusses the four main approaches of acting humanly, thinking humanly, thinking rationally, and acting rationally. The foundations of AI are explained including contributions from fields like philosophy, mathematics, psychology, neuroscience, and more. Both strong AI which aims to truly replicate human reasoning and weak AI which focuses on narrow domains are described. Current capabilities of AI systems in areas such as games, robotics, diagnosis, and planning are summarized. Finally, differences between human and machine intelligence are outlined.
This document provides an overview of artificial intelligence techniques. It begins with definitions of AI and discusses branches of AI like logical AI, search, pattern recognition, knowledge representation, inference and more. It also discusses AI applications, problems in AI and the levels of modeling human intelligence. Several examples are then provided to illustrate increasingly sophisticated AI techniques for playing tic-tac-toe and answering questions to demonstrate moving towards knowledge representations that generalize information and are more extensible.
Adaptive Emotional Personality Model based on Fuzzy Logic Interpretation of F... (CSCJournals)
In recent years, emotional personality has found important applications in human-machine interaction, with examples including computer games, interface agents, and human-robot interaction. However, few systems in this area include a model of personality, although personality plays an important role in differentiating how agents experience emotions and how they behave. Personality simulation has always been a complex issue due to the complexity of human personality itself and the difficulty of modelling human psychology on an electronic basis. Current efforts at emotion simulation rely either on predefined sets of inputs and matched responses or on classical models that are simple approximations with proven flaws. This paper presents an emotional simulation system that draws on recent psychological theories to design a complex dynamic system reacting to any environment without being pre-programmed on fixed input sets. The design relies on fuzzy logic to simulate human emotional reaction, increasing accuracy by more closely emulating the human brain and removing the predefined mapping from inputs to outputs.
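The core fuzzy-logic mechanism the abstract describes, replacing a fixed input-to-output table with graded membership in overlapping emotional categories, can be sketched with triangular membership functions. The fuzzy set names and boundary values below are illustrative assumptions, not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    if x < a or x > c:
        return 0.0
    if x == b:
        return 1.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# illustrative fuzzy sets over a normalised arousal signal in [0, 1]
AROUSAL_SETS = {
    "calm":    (0.0, 0.0, 0.5),
    "alert":   (0.2, 0.5, 0.8),
    "excited": (0.5, 1.0, 1.0),
}

def fuzzify(x):
    """Degree of membership of x in each fuzzy emotional set."""
    return {name: tri(x, *abc) for name, abc in AROUSAL_SETS.items()}

# a single input belongs partially to several sets at once,
# rather than mapping to one predefined output
degrees = fuzzify(0.4)
dominant = max(degrees, key=degrees.get)
print(degrees, dominant)
```

Because the sets overlap, an input like 0.4 is simultaneously somewhat "calm" and mostly "alert", which is the graded behaviour a crisp lookup table cannot express.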
This document discusses emotion detection from text. It presents an emotion detection model that extracts emotion from text at the sentence level without relying on existing affect lexicons. The model detects emotion by searching for direct emotional keywords and emotion-affect words/phrases. Experiments show the method achieves over 77% accuracy in detecting Ekman's six basic emotions from text. The document also reviews related work on emotion detection approaches, including keyword-based, rule-based, and machine learning methods. It discusses challenges like the lack of large annotated training data and limitations of dictionary-based approaches.
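The keyword-search step the model starts from, scanning a sentence for direct emotional keywords mapped to Ekman's six categories, can be sketched in a few lines. The lexicon below is a tiny hand-built illustration; the document's actual keyword lists and matching rules are not reproduced here.

```python
# tiny illustrative lexicon mapping keywords to Ekman's six basic emotions;
# a real system would use a far larger, curated word list
LEXICON = {
    "happy": "joy", "glad": "joy", "delighted": "joy",
    "sad": "sadness", "miserable": "sadness",
    "angry": "anger", "furious": "anger",
    "afraid": "fear", "terrified": "fear",
    "disgusted": "disgust",
    "shocked": "surprise",
}

def detect_emotions(sentence):
    """Return the set of basic emotions whose keywords occur in the sentence."""
    words = sentence.lower().replace(",", " ").replace(".", " ").split()
    return {LEXICON[w] for w in words if w in LEXICON}

print(detect_emotions("I was so happy, though a bit afraid at first."))
```

This also makes the document's noted limitations concrete: negation ("not happy"), emotion expressed without any keyword, and lexicon coverage all defeat a pure keyword match, which is why the model adds emotion-affect phrases and why others turn to machine learning.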
By thinking about systems thinking, our aim was to provide other system thinkers with a mind map for the key elements of the thinking that took place.
The sepsis problem is highly complex and spans not just the biological system, but also the healthcare enterprise. Sepsis is the context in which systems thinking has been applied and examined.
Sepsis is an emergency situation that, if left unrecognised and untreated in its early stage, leads to multiple organ dysfunction and death. This case study highlights the attitudes, comments on the systems approach, and puts forward the cognitive concepts.
All these concepts are integrated in an overall mind map shaped like a tree: the branches represent the systems thinker's attitudes, and the roots represent features of systems that are commonly considered when thinking about systems.
This document discusses practical emotional neural networks. It summarizes two existing approaches - brain emotional learning (BEL) based networks and emotional backpropagation (EmBP) based networks. BEL networks are inspired by the limbic system and amygdala in the brain, modeling fast emotional processing via short neural pathways. EmBP networks apply emotional states and appraisals to neural network learning. The document proposes a new limbic-based artificial emotional neural network (LiAENN) that models emotional situations like anxiety and confidence during learning. It applies anxious confident decayed brain emotional learning rules and achieves higher accuracy than BEL and EmBP networks on facial detection and emotion recognition tasks.
The document provides an introduction to artificial intelligence (AI), including its key concepts, scope, components, types, and applications. It defines AI as the science and engineering of creating intelligent machines, especially computer programs. The main types of AI discussed are narrow/weak AI, which can perform specific tasks, and general AI, which aims to create human-level intelligence. The document also outlines the core components of AI in areas like logic, cognition, and computation, and how these combine to form knowledge-based systems. Common applications of AI mentioned include gaming, natural language processing, and robotics.
1) Affective computing aims to expand human emotional intelligence to machines by creating socially intelligent machines that can respond appropriately according to the situation and interlocutor.
2) There are two main approaches to modeling emotions in affective computing: discrete theories that identify basic emotions like Ekman's six emotions, and continuous theories that describe emotions along dimensions of arousal and valence.
3) Empath's goal is to recognize emotions from speech regardless of language, which presents challenges of combining speech processing with emotion recognition from voice cues alone. Empath is developing methods to extract pitch, intensity, and speech rate from voice samples to train models to classify emotions.
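The continuous (dimensional) view in point 2 can be sketched as a mapping from a valence/arousal coordinate to a coarse emotion region. The quadrant labels below are illustrative conveniences, not Ekman's categories or Empath's taxonomy.

```python
def quadrant(valence, arousal):
    """Coarse emotion region for a point on the valence/arousal circumplex.

    valence: how pleasant the state is, in [-1, 1]
    arousal: how activated the state is, in [-1, 1]
    """
    if valence >= 0:
        return "joy/excitement" if arousal >= 0 else "contentment/calm"
    return "anger/fear" if arousal >= 0 else "sadness/boredom"

# e.g. an unpleasant, highly activated state
print(quadrant(-0.7, 0.9))
```

Discrete and continuous theories meet here: each of Ekman's basic emotions can be placed as a region in this plane, while a continuous model can output any intermediate point.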
This document discusses different definitions and approaches to artificial intelligence (AI). It begins by defining AI as helping machines solve complex problems like humans by applying human-like algorithms. It then discusses AI's links to other fields and its history. The rest of the document explores definitions of AI and different goals or approaches in AI research, including systems that think or act like humans and systems that think or act rationally. It focuses on the Turing Test approach of acting humanly and the cognitive modeling approach of thinking humanly by modeling human cognition.
The document discusses the definitions and goals of artificial intelligence, including attempting to match or surpass human intelligence (strong AI), or accomplishing specific tasks without full human cognitive abilities (weak AI). It also covers the components of intelligence like reasoning, learning, and problem solving, as well as the history and importance of AI research in areas like philosophy, mathematics, psychology and its applications in tasks like games, scientific analysis and medical diagnosis.
This document provides information about an Artificial Intelligence course. The key details are:
- The course is CSC 343, taught over 3 lecture hours and 2 lab hours
The document discusses a mind reading computer that can infer mental states from facial expressions and gestures. It works by using functional near-infrared spectroscopy to measure blood oxygen levels in the brain while the user performs tasks. The technology has advantages like helping disabled people but also risks like privacy breaches if used improperly. Researchers are working to develop this technology further so computers can interact based on brain activity readings.
This document discusses behavioral emotional intelligence (EQ). It begins by providing background on the development of EQ models over time. It then introduces the Behavioral EQ Model, which considers both emotional intelligence (ability to perceive and understand emotions) and behavioral intelligence (ability to recognize how emotions impact behavior and use that awareness). The rest of the document outlines an EQ profile and assessment that measures 15 constructs related to emotional and behavioral skills through ratings from multiple people. It aims to gather data to analyze the profile's reliability and validity.
This presentation discusses designing an English language compiler to detect emotion from text. It begins with an introduction to emotion and common emotion models. It then outlines the objectives and architecture of the emotion detection system. Key aspects covered include language processing techniques like keyword analysis and parsing, semantic analysis, and the word-processing and sentence analysis modules. Challenges in developing such a system are also discussed. Finally, potential future work and references are presented.
This document discusses new approaches to natural language processing systems and how they can be improved. It notes that current NLP systems have limitations in areas like translation, information retrieval, understanding context and searching for relations. It suggests that NLP systems could be enhanced by reviewing current tools, understanding how humans are able to process language more effectively, and incorporating human-like characteristics like continual learning, motivation and the ability to learn from any source. Next steps proposed include developing new ways to store and access knowledge, understanding how humans learn, and creating systems that can understand users' intentions.
Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While the origins of the field may be traced as far back as early philosophical enquiries into emotion ("affect" is essentially a synonym for "emotion"), the modern branch of computer science originated with Rosalind Picard's 1995 paper on affective computing. A motivation for the research is the ability to simulate empathy: the machine should interpret the emotional state of humans and adapt its behavior to them, giving an appropriate response to those emotions.
Face expression recognition using Scaled-conjugate gradient Back-Propagation ... (IJMER)
International Journal of Modern Engineering Research (IJMER) is Peer reviewed, online Journal. It serves as an international archival forum of scholarly research related to engineering and science education.
International Journal of Modern Engineering Research (IJMER) covers all the fields of engineering and science: Electrical Engineering, Mechanical Engineering, Civil Engineering, Chemical Engineering, Computer Engineering, Agricultural Engineering, Aerospace Engineering, Thermodynamics, Structural Engineering, Control Engineering, Robotics, Mechatronics, Fluid Mechanics, Nanotechnology, Simulators, Web-based Learning, Remote Laboratories, Engineering Design Methods, Education Research, Students' Satisfaction and Motivation, Global Projects, Assessment, and many more.
Similar to Cognitive Reasoning and Inferences through Psychologically based Personalised Modelling of Emotions Using Associative Classifiers (20)
This document summarizes an invited talk given by Dr. Aladdin Ayesh on artificial intelligence topics. The talk covered definitions of AI, major AI fields like machine learning, planning, natural language processing and computer vision. It also discussed applications of AI such as intelligent interfaces, personalization, smart services and analytics. Throughout the talk, examples and potential future directions were provided for different AI topics.
Creativity Conversations - 2007
Bret Battey Vs. Aladdin Ayesh
Part of Institute Of Creative Technology Creative Conversations.
Video and Photos available from:
http://creem.dmu.ac.uk/CreativityConversations/res171007.htm
Social Robots: From Emotional Consciousness to Buddy DevicesAladdin Ayesh
This document summarizes Dr. Aladdin Ayesh's invited talk at Eton College on social robots. The talk covers several topics: introducing social robots and examples like AIBO and NAO robots; the challenges of creating emotional, personality and social norm abilities in robots; moving from theories of artificial consciousness to practical buddy devices; and ethical issues with increasingly social technologies. The talk explores concepts like intelligent spaces, virtual worlds, medical cyborg applications, and how everyday devices could become more social.
Multi-Agent Modelling With applications to robotics and cognitionAladdin Ayesh
This document summarizes a keynote talk on multi-agent modeling and its applications to robotics and cognition. The talk discusses what constitutes an agent and examines cognition from the perspectives of senses, thinking, emotions, and cognitive architectures. It also explores two types of agent embodiment: robots, which impose challenges related to physical limitations and neurology; and avatars, which raise questions about virtual bodies. The talk aims to bring together different areas of research in developing cognitive systems and modeling human behavior.
Emotions Modelling and Synthetic CharactersAladdin Ayesh
Dr. Aladdin Ayesh presents on emotions modelling and synthetic characters. He discusses some key issues in emotions modelling like selecting emotions to represent, expressing emotions, and recognizing emotions from facial expressions and other cues. He outlines some of his research projects in this area, including computational models of emotion and emotionally expressive communication languages.
Complex Systems Approach to Emotionally-aware Learning EnvironmentsAladdin Ayesh
Agent technologies have emerged from Artificial Intelligence and Social Sciences studies to become an emergent filed within main stream Software Engineering. Their characteristics of scalability, robustness, and easy maintainability made them attractive technique for modelling and implementing variety of systems especially Complex Systems. Whilst their similarity to objects in object-oriented systems make them easy to model and implement using extended versions of existing tools, their association with AI and human notion of agency keep them an ever evolving field.
In this talk, we will be looking at some of our current projects on emotions and intelligent learning environments to a proposal for a new more concentrated effort for emotionally-aware learning environments by utilising intelligent agents concepts and complex systems principles. The presentation will give general background on Cognitive Agents and Emotions Modelling, and on some interesting recent developments in Complex Systems.
Agent-Oriented Systems: From the Primitive to the EmotionalAladdin Ayesh
The document summarizes a talk given by Dr. Aladdin Ayesh on agent-oriented systems from primitive to emotional. It discusses the notion of agency, modeling emotions using different psychological theories, and applying agent models to projects like eLearning and text mining. It also describes collaboration between De Montfort University and Valencia University on modeling emotions in eLearning environments.
State of Artificial intelligence Report 2023kuntobimo2016
Artificial intelligence (AI) is a multidisciplinary field of science and engineering whose goal is to create intelligent machines.
We believe that AI will be a force multiplier on technological progress in our increasingly digital, data-driven world. This is because everything around us today, ranging from culture to consumer products, is a product of intelligence.
The State of AI Report is now in its sixth year. Consider this report as a compilation of the most interesting things we’ve seen with a goal of triggering an informed conversation about the state of AI and its implication for the future.
We consider the following key dimensions in our report:
Research: Technology breakthroughs and their capabilities.
Industry: Areas of commercial application for AI and its business impact.
Politics: Regulation of AI, its economic implications and the evolving geopolitics of AI.
Safety: Identifying and mitigating catastrophic risks that highly-capable future AI systems could pose to us.
Predictions: What we believe will happen in the next 12 months and a 2022 performance review to keep us honest.
Analysis insight about a Flyball dog competition team's performanceroli9797
Insight of my analysis about a Flyball dog competition team's last year performance. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
The Ipsos - AI - Monitor 2024 Report.pdfSocial Samosa
According to Ipsos AI Monitor's 2024 report, 65% Indians said that products and services using AI have profoundly changed their daily life in the past 3-5 years.
Global Situational Awareness of A.I. and where its headedvikram sood
You can see the future first in San Francisco.
Over the past year, the talk of the town has shifted from $10 billion compute clusters to $100 billion clusters to trillion-dollar clusters. Every six months another zero is added to the boardroom plans. Behind the scenes, there’s a fierce scramble to secure every power contract still available for the rest of the decade, every voltage transformer that can possibly be procured. American big business is gearing up to pour trillions of dollars into a long-unseen mobilization of American industrial might. By the end of the decade, American electricity production will have grown tens of percent; from the shale fields of Pennsylvania to the solar farms of Nevada, hundreds of millions of GPUs will hum.
The AGI race has begun. We are building machines that can think and reason. By 2025/26, these machines will outpace college graduates. By the end of the decade, they will be smarter than you or I; we will have superintelligence, in the true sense of the word. Along the way, national security forces not seen in half a century will be un-leashed, and before long, The Project will be on. If we’re lucky, we’ll be in an all-out race with the CCP; if we’re unlucky, an all-out war.
Everyone is now talking about AI, but few have the faintest glimmer of what is about to hit them. Nvidia analysts still think 2024 might be close to the peak. Mainstream pundits are stuck on the wilful blindness of “it’s just predicting the next word”. They see only hype and business-as-usual; at most they entertain another internet-scale technological change.
Before long, the world will wake up. But right now, there are perhaps a few hundred people, most of them in San Francisco and the AI labs, that have situational awareness. Through whatever peculiar forces of fate, I have found myself amongst them. A few years ago, these people were derided as crazy—but they trusted the trendlines, which allowed them to correctly predict the AI advances of the past few years. Whether these people are also right about the next few years remains to be seen. But these are very smart people—the smartest people I have ever met—and they are the ones building this technology. Perhaps they will be an odd footnote in history, or perhaps they will go down in history like Szilard and Oppenheimer and Teller. If they are seeing the future even close to correctly, we are in for a wild ride.
Let me tell you what we see.
ViewShift: Hassle-free Dynamic Policy Enforcement for Every Data LakeWalaa Eldin Moustafa
Dynamic policy enforcement is becoming an increasingly important topic in today’s world where data privacy and compliance is a top priority for companies, individuals, and regulators alike. In these slides, we discuss how LinkedIn implements a powerful dynamic policy enforcement engine, called ViewShift, and integrates it within its data lake. We show the query engine architecture and how catalog implementations can automatically route table resolutions to compliance-enforcing SQL views. Such views have a set of very interesting properties: (1) They are auto-generated from declarative data annotations. (2) They respect user-level consent and preferences (3) They are context-aware, encoding a different set of transformations for different use cases (4) They are portable; while the SQL logic is only implemented in one SQL dialect, it is accessible in all engines.
#SQL #Views #Privacy #Compliance #DataLake
Predictably Improve Your B2B Tech Company's Performance by Leveraging DataKiwi Creative
Harness the power of AI-backed reports, benchmarking and data analysis to predict trends and detect anomalies in your marketing efforts.
Peter Caputa, CEO at Databox, reveals how you can discover the strategies and tools to increase your growth rate (and margins!).
From metrics to track to data habits to pick up, enhance your reporting for powerful insights to improve your B2B tech company's marketing.
- - -
This is the webinar recording from the June 2024 HubSpot User Group (HUG) for B2B Technology USA.
Watch the video recording at https://youtu.be/5vjwGfPN9lw
Sign up for future HUG events at https://events.hubspot.com/b2b-technology-usa/
Beyond the Basics of A/B Tests: Highly Innovative Experimentation Tactics You...Aggregage
This webinar will explore cutting-edge, less familiar but powerful experimentation methodologies which address well-known limitations of standard A/B Testing. Designed for data and product leaders, this session aims to inspire the embrace of innovative approaches and provide insights into the frontiers of experimentation!
Cognitive Reasoning and Inferences through Psychologically based Personalised Modelling of Emotions Using Associative Classifiers
1. Cognitive Reasoning and Inferences through Psychologically based Personalised Modelling of Emotions Using Associative Classifiers
Aladdin Ayesh (De Montfort University)
Miguel Arevalillo-Herráez (University of Valencia)
Francesc J. Ferri (University of Valencia)
Presentation given at ICCI*CC 2014, August 18, 2014
2. Agenda
1 Context
  Adaptive User Interfaces
  Computational Emotions
2 Towards Dynamic Personalised Emotion Models
  Corpus Development
  Data Analysis
  Personalized Model
3 Critical Review
  Findings
  Future Work
4 Conclusion
3–8. eLearning and Adaptive User Interfaces
eLearning includes:
Managed Learning Environments (MLE)
Intelligent Tutoring Systems (ITS)
Virtual Classrooms and Social Networking
In the first two, the primary interaction happens between human and computer. While other humans facilitate the interaction in the third case of virtual classrooms and social networking, in MLEs and ITSs the system itself has to have emotional intelligence.
but why ...
and would it be possible to have emotionally intelligent systems?
9–15. Emotion Model
Modelling emotions is the first step in the growing field of computational emotions research, and it enables us to develop further processes such as emotion detection and expression.
Preliminaries
The most common emotion model is the Darwinian set of 6 basic emotions,
helped by Ekman's FACS system (Ekman's basic emotions).
The novelty of the FACS system is its standardisation of the different muscle movements into recognisable Action Units (AUs).
The FACS system defines more AUs than are necessary to produce an automated system for facial expression and emotion detection.
A few researchers have already noticed this and attempted to identify a small set of sufficient AUs.
Also, facial expressions alone are not a sufficient means to determine a given user's emotions.
16–17. Emotion Detection: Kinect and FACS
The Kinect SDK can detect only a small set of 6 AUs, but to a high degree of accuracy.
It uses the CANDIDE model¹ to map the face, and thus detects AUs, head postures, etc.
Relating Kinect AUs to FACS AUs:
FACS AU   Kinect AU
AU10      AU0
AU26      AU1
AU20      AU2
AU4       AU3
AU15      AU4
AU2       AU5
¹ The CANDIDE model itself implements a subset of the FACS system AUs.
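The relation between the two labelling schemes is a straight lookup. A minimal sketch in Python: the mapping pairs are taken from the slide's table, the comments give the standard FACS names for each unit, and the dict/function names are ours.

```python
# Kinect SDK animation-unit label -> FACS Action Unit, per the slide's table.
KINECT_TO_FACS = {
    "AU0": "AU10",  # FACS AU10: upper lip raiser
    "AU1": "AU26",  # FACS AU26: jaw drop
    "AU2": "AU20",  # FACS AU20: lip stretcher
    "AU3": "AU4",   # FACS AU4:  brow lowerer
    "AU4": "AU15",  # FACS AU15: lip corner depressor
    "AU5": "AU2",   # FACS AU2:  outer brow raiser
}

def to_facs(kinect_au: str) -> str:
    """Translate a Kinect SDK AU label into its FACS equivalent."""
    return KINECT_TO_FACS[kinect_au]
```

Keeping the translation in one place lets later analysis code speak FACS throughout, regardless of which sensor produced the raw AU readings.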
18–24. Why a new corpus?
Requirements
We aim for a system that works with minimum constraints in a natural user environment.
The system's success rate is expected to be low, but
sufficient for an intelligent adaptation of the user's interface.
The developed corpus consists of two parts:
The first part includes video recordings of 5 volunteers expressing the 6 basic emotions of Happy, Sad, Anger, Disgust, Surprise and Fear.
The second part was developed in a similar manner for detecting expressions of affects.
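The first part of the corpus has a simple cross-product layout: every volunteer records every basic emotion. A small sketch of that layout, assuming a "subject-N" naming convention that is ours, not from the slides:

```python
# The 6 basic emotions recorded in part one of the corpus, per the slide.
BASIC_EMOTIONS = ["Happy", "Sad", "Anger", "Disgust", "Surprise", "Fear"]

# One recording per (volunteer, emotion) pair: 5 volunteers x 6 emotions.
recordings = [
    (f"subject-{s}", emotion)
    for s in range(1, 6)           # 5 volunteers
    for emotion in BASIC_EMOTIONS  # 6 basic emotions each
]
```

This yields 30 recordings for part one; part two (affect expressions) would follow the same pattern with its own label set.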
25. Initial Analysis of Kinect Data
[Figure: Happy expression across 3 subjects]
26–30. Classifiers Approach
What if ...
we could generate a set of associative rules over AUs per individual, i.e. an individualised expert system?
We explored some popular tree/rule classifiers:
they are mostly regression algorithms (evolving around the M5 classifier);
the number of rules generated differs from one algorithm to the next, but ...
general trends are maintained;
concrete numbers limit the generalisation of the generated rules.
31–32. Generated Rules: Examples
Two rules generated for AU0 from subject-1 Anger data:
LM 46: AU0 = 0.1579·AU1 − 0.053·AU2 − 0.115·AU3 − 0.306·AU4 − 0.3503·AU5 + 0.0975
LM 47: AU0 = 0.2874·AU1 − 0.0072·AU2 − 0.1478·AU3 − 0.2311·AU4 − 0.0311·AU5 + 0.3731
Examples of AU5 rules:
LM 40: AU5 = −0.3099·AU0 + 0.534·AU1 − 0.0305·AU2 − 0.2681·AU3 − 0.2012·AU4 − 0.0643
LM 41: AU5 = −0.2827·AU0 + 0.4997·AU1 − 0.0314·AU2 − 0.2681·AU3 − 0.2012·AU4 − 0.0724
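Because each generated rule is a plain linear model over the other AU intensities, it can be evaluated directly. A minimal sketch: the coefficients are copied from the example rules above, while the function names and the idea of probing with all-zero inputs are ours for illustration.

```python
def lm46_au0(au1, au2, au3, au4, au5):
    """LM 46 for AU0 (subject-1, Anger); coefficients from the slide."""
    return (0.1579 * au1 - 0.053 * au2 - 0.115 * au3
            - 0.306 * au4 - 0.3503 * au5 + 0.0975)

def lm40_au5(au0, au1, au2, au3, au4):
    """LM 40 for AU5 (subject-1, Anger); coefficients from the slide."""
    return (-0.3099 * au0 + 0.534 * au1 - 0.0305 * au2
            - 0.2681 * au3 - 0.2012 * au4 - 0.0643)

# With all other AU intensities at zero, each model returns its intercept.
print(lm46_au0(0, 0, 0, 0, 0))  # 0.0975
```

In an M5-style model tree, several such linear models (e.g. LM 46 and LM 47) sit at different leaves, and the tree's split conditions decide which one applies to a given frame of AU readings.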
Findings
There are clear inter-relationships between AUs that could allow us to determine the presence and intensity of a given AU from raw data.
AU relationships are not reflexive with respect to emotions, i.e. the contribution of AU5 in the rules for AU0 is not necessarily of the same value and significance as that of AU0 in the rules for AU5.
The number of rules required for each AU to determine a given emotion differs for each user and for each emotion.
M5 and M5P are the most promising candidates for generating a personalised rule-based system for emotion detection.
The generated rules will require partial generalisation to reduce their number and widen their applicability.
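The per-user aspect of these findings can be sketched with a toy fit: given one subject's AU readings, derive a linear rule relating two AUs, in the spirit of the M5 leaf models above. This is an illustrative least-squares fit on synthetic data, not the Weka M5/M5P pipeline the study actually used:

```python
# Deriving one rule of a personalised model: a linear relation
# between two AUs, fitted from (synthetic) per-user observations.

def fit_linear_rule(xs, ys):
    """Ordinary least squares: returns (slope, intercept) for y ~ a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Synthetic data for one user: their AU5 tracks AU1 with gain 0.5.
au1 = [0.0, 0.2, 0.4, 0.6, 0.8]
au5 = [0.5 * x - 0.07 for x in au1]

a, b = fit_linear_rule(au1, au5)
# Recovered rule for this user: AU5 = 0.5 * AU1 - 0.07.
assert abs(a - 0.5) < 1e-9 and abs(b - (-0.07)) < 1e-9
```

Fitting the same form on another user's data would generally yield different coefficients, which is exactly why the rule sets are personalised.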
Future plans
Explore classifier algorithms further and generalise the generated rules.
Automate the process to dynamically generate a personalised set of rules for emotion detection.
Explore alternative rule representations, e.g. rules based on fuzzy and multi-valued logic, which may also help with generalisation.
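A fuzzy representation would replace the crisp linear rules with graded ones. A minimal sketch of what such a rule could look like; the membership function shape, thresholds, and AU pairing are illustrative assumptions, not results from the paper:

```python
# Illustrative fuzzy rule for an AU relationship:
# IF AU1 is high THEN AU5 is present (to the same degree).

def tri(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def au1_high(x):
    # Assumed fuzzy set "AU1 is high" over the AU intensity range.
    return tri(x, 0.4, 1.0, 1.6)

def au5_present(au1):
    """Firing strength of the rule for a given AU1 intensity."""
    return au1_high(au1)

assert au5_present(1.0) == 1.0        # rule fully fired
assert au5_present(0.2) == 0.0        # rule not fired
assert 0.0 < au5_present(0.7) < 1.0   # rule partially fired
```

Because one fuzzy rule covers a whole band of intensities, a fuzzy rule base could subsume many crisp rules, which is one way it might help with the generalisation goal above.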
Conclusion
Dr. Aladdin Ayesh
aayesh@dmu.ac.uk – dr.aladdin.ayesh@ieee.org
www.aladdin-ayesh.info
Referencing paper
A. Ayesh, M. Arevalillo-Herráez, and F. J. Ferri, "Cognitive reasoning and inferences through psychologically based personalised modelling of emotions using associative classifiers," in ICCI*CC 2014 Proceedings. London: IEEE, 2014, pp. 67–72.