1. People have long been fascinated with understanding themselves through objective analysis of their surroundings, and tracking tools provide insights into activities, behaviors, and opportunities for improvement.
2. Tracking technologies have evolved from simple devices like pedometers to more advanced sensors in devices that can track a variety of daily activities, both active and sedentary.
3. Emerging areas of focus include location-based tracking and combining multiple sensory data streams to gain a more holistic view of behaviors.
The document discusses observations from the 2013 Consumer Electronics Show in Las Vegas. It notes that sensors will soon be integrated into many everyday objects, generating vast amounts of user data. It also emphasizes that mobility is becoming a top priority, as people increasingly rely on mobile devices and expect connectivity anywhere. The hyper-connected world it envisions is one where all of our devices, appliances, and surroundings interact seamlessly and are made smarter through sensors and data analysis.
Sixth Sense is a recent wearable gestural interface that augments the physical world around us with digital information.
The document describes a proposed idea called Track-O-Shoes, which uses sensors in shoes to track fitness metrics and location of the user. The shoes would contain a GPS sensor to track location in real-time, as well as proximity, accelerometer, gyroscope, and LED sensors to calculate step count, distance, activity time, calories burned, and provide light in dark conditions. The data would be sent to a mobile app via microcontroller. The app could then be used for fitness tracking and live location tracking of the user. Key components proposed include the sensors, microcontroller, and a mobile app compatible with Android devices.
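The step-counting logic such shoes rely on can be illustrated with a toy threshold-crossing detector. This is only a sketch under simplified assumptions (a fixed threshold in g, clean synthetic data); real trackers use filtered peak detection and adaptive thresholds:

```python
import math

def count_steps(samples, threshold=1.2):
    """Count steps as upward crossings of an acceleration-magnitude
    threshold (in g). `samples` is a list of (ax, ay, az) tuples."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1          # rising edge = one candidate step
            above = True
        elif mag <= threshold:
            above = False
    return steps

# Synthetic walk: rest samples (1.0 g) alternating with impact spikes (1.5 g)
walk = [(0, 0, 1.0), (0, 0, 1.5)] * 5 + [(0, 0, 1.0)]
print(count_steps(walk))  # 5 spikes -> 5 steps
```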
Twiny is a small device that contains sensors to monitor vital signs and mood. It connects to apps via Bluetooth to track activity, health observations, and allow data sharing. The device can be worn as a patch, elastic band, or pin and collects information on things like skin conductance, temperature, and heart rate. This data helps users learn how to control their mood, reduce stress, and improve their overall well-being and quality of life. The goal is to create awareness of one's physical and mental states through an eco-friendly, non-intrusive device.
Sixth sense technology allows users to interact with digital information using natural hand gestures by augmenting the physical world. It uses a camera, colored markers, mobile device, mirror, and projector. Some applications include making calls by projecting a keypad, navigating maps with finger gestures, telling time by drawing a circle on the wrist, and taking photos by forming a square with fingers. The technology recognizes objects and displays information automatically. Future enhancements could eliminate color markers, incorporate the camera and projector into mobile devices, enable 3D gesture tracking, and help disabled individuals.
Sensors are devices that measure physical quantities and convert them to signals that can be read by instruments or observers. Modern cellphones contain many sensors like microphones, cameras, GPS, touchscreens, accelerometers, and gyroscopes that have replaced the need for separate devices. The document discusses how common sensors like infrared sensors, accelerometers, gyroscopes, touchscreens, and cameras work, as well as where they are used. It notes that sensors have improved over time by becoming smaller, faster, better, and cheaper due to advances in processing technology and manufacturing.
The document describes Sirui (Quincy) Liu's PhD dissertation on modeling and simulating human activities in smart spaces using the Persim 3D system. The system allows defining environments, objects, sensors and character models to visualize activities. It was validated through similarity comparisons between synthesized and real sensor datasets, and a user study found the visualizations had average realism ratings between 6-8 on a 10-point scale. The system aims to enable low-cost activity experimentation and testing of smart home technologies without human subjects.
This document discusses key concepts in research methods and statistics. It covers variables, levels of measurement, experimental design, operational definitions, types of variables, scales of measurement, and order of operations. The major focus is on measurement, variables, experimental design, and understanding different scales and levels of measurement which are fundamental aspects of research methods and statistics.
The document discusses various methods for measuring liquid level, including direct and indirect methods. Direct methods involve devices that come into direct contact with the liquid, such as sight glasses, dipsticks, floats, and displacers. Indirect methods measure liquid level without contact, including hydrostatic pressure devices, electrical methods like capacitance probes, and technologies using lasers, microwaves, or ultrasound. Each method has advantages and limitations depending on the application and type of liquid.
This document discusses different types of variables: independent variables that can cause changes in other variables, dependent variables that can change in response to independent variables, and different levels of measurement for variables - nominal (qualitative categories without ranking), ordinal (ranked categories without defined distances), interval (ranked categories with equal distances), and ratio (ranked categories with an absolute zero point).
The document discusses methods for measuring liquid and solid levels in containers. There are two main categories: continuous level monitoring and single point sensing. Continuous monitoring constantly measures levels, while single point sensing detects levels only when they reach a predetermined point. Direct sensing devices like level gauges and transmitters measure actual levels, while indirect devices like differential pressure transmitters sense a liquid property such as pressure to determine level. Common direct sensing devices include tubular and reflex type level gauges as well as float switches.
The document summarizes several common level measurement methods: float type, RF capacitance, RF impedance, conductance, hydrostatic head, radar, and ultrasonic. It provides details on how each method works, including explanations of concepts like dielectric constants, time of flight measurements, and guided wave radar. Radar level measurement can be done through air, using through air radar, or with contact devices like guided wave radar. Ultrasonic level measurement also uses time of flight principles with top-mounted transducers. Choosing a measurement method depends on factors like vessel dimensions, product composition, and process conditions.
Replication allows data from a MySQL master database to be synchronized with one or more slave databases. The master records all data changes in its binary log. Slave databases connect to the master and receive the binary log transactions, which they then apply locally to stay synchronized with the master database. Replication can be used for load balancing reads across multiple slave servers or for high availability by failing over to a slave if the master fails.
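The replication flow described above can be sketched as a toy model. This is an illustration only, not MySQL's implementation (which uses separate I/O and SQL threads, a persistent binary log, and positions or GTIDs):

```python
class Master:
    """Toy model of binary-log replication: the master appends every
    change to its binlog; slaves replay the log from their own offset."""
    def __init__(self):
        self.data = {}
        self.binlog = []          # ordered list of (key, value) changes

    def write(self, key, value):
        self.data[key] = value
        self.binlog.append((key, value))

class Slave:
    def __init__(self):
        self.data = {}
        self.pos = 0              # replication position in the binlog

    def sync(self, master):
        for key, value in master.binlog[self.pos:]:
            self.data[key] = value   # apply each change locally, in order
        self.pos = len(master.binlog)

master, replica = Master(), Slave()
master.write("user:1", "alice")
master.write("user:2", "bob")
replica.sync(master)
print(replica.data == master.data)  # True
```

Because slaves track their own log position, several replicas can sync from one master independently, which is what makes read load balancing and failover possible.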
Level measuring devices are used to accurately measure the volume of fluid in containers on a continuous basis. Common level measuring instruments include level gauges, float devices, those that measure hydrostatic pressure, displacement type, echo type, and capacitive type. Level gauges provide a direct visual indicator by using a sealed cavity with a transparent wall. Float devices use floats that move up and down with the liquid level. Hydrostatic pressure instruments measure pressure at the bottom of a fluid column to indicate level. Displacement instruments use Archimedes' principle to measure the weight of displaced fluid. Echo type instruments measure the time of flight of waves reflected off the liquid surface. Capacitive instruments measure changes in capacitance between a probe and vessel walls.
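Two of these principles reduce to one-line formulas: hydrostatic level follows from P = ρgh, and echo-type level from the round-trip time of flight. A sketch, assuming water and sound in air (real instruments compensate for fluid density, temperature, and vapor effects):

```python
RHO_WATER = 1000.0   # kg/m^3, assumed fluid density
G = 9.81             # m/s^2

def level_from_pressure(p_pascal, rho=RHO_WATER, g=G):
    """Hydrostatic method: P = rho*g*h, so h = P / (rho*g)."""
    return p_pascal / (rho * g)

def level_from_echo(round_trip_s, tank_height_m, wave_speed=343.0):
    """Echo (time-of-flight) method: the top-mounted transducer's pulse
    travels down to the surface and back, so the air gap is v*t/2."""
    gap = wave_speed * round_trip_s / 2
    return tank_height_m - gap

print(round(level_from_pressure(19620.0), 2))  # 19620 Pa -> 2.0 m of water
print(round(level_from_echo(0.01, 3.0), 3))    # 1.715 m gap -> 1.285 m level
```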
Quantified Self: The what, the why and the brave new future (Alja Isakovic)
Slides from a lecture on the quantified self trend, prepared for communications students in May 2016. The lecture is based on a series of blog posts I wrote on the topic: https://medium.com/exploring-the-quantified-self
Possibilities and perils of the data-driven world (joshuakauffman)
I gave this lecture and led a discussion at the Future Insight summit in Oslo, Norway, March 13, 2014.
This was an introduction to subjects relating to the data-driven world, including a lengthier bit on the Quantified Self.
I improvised from the presenter notes. They give a pretty good sense of the contour of the talk.
In the Q and A session, people were mostly concerned about privacy implications of personal data collection.
My short answer is that I am also concerned, and think we need to broaden the discussion of privacy so that it transcends the concept of unwanted exposure and recenters itself on questions relating to the terms of exchange of personal data as they relate to social and economic value.
The document is a proposal for a student project called "The Mood Tracker" which is a bracelet that tracks the wearer's emotional levels throughout the day using sensors. It will detect and record the user's energy levels and emotions on a scale of 1-10 to help people understand what affects their moods. The proposal describes the technology behind the bracelet, its design intended to be stylish, its syncing capabilities with a computer program, and the goal of helping users understand their emotions and find more positive experiences.
1) The document discusses using wearable sensors and augmented reality to capture expert experiences and knowledge and allow trainees to re-enact and experience those experiences.
2) Key aspects that are captured include gaze direction, videos, audio, gestures, physiological data, and positioning of the expert in an environment. This data is stored and synchronized to create an experience recording.
3) Trainees can then re-enact the expert's experience in real-time by having the expert's contextualized data augmented onto their own experience through methods like displaying the expert's position or where they were looking. This allows the trainee to experience the presence of the expert during tasks.
This document describes a study to design a smart plantar to support correct posture and movement during sports activities. The plantar would use sensors to monitor a user's movements and provide data on any incorrect behaviors via a mobile application. An early evaluation would involve users testing a prototype plantar during activities while providing feedback. The goal is to help users identify postural issues and motivate behavioral changes through personalized data visualization.
Society is currently going through a phase of having an adversarial relationship with personal data. Our data is gathered by third parties ranging from companies like Facebook and Google to governments and their agencies, and although in theory we ourselves own our data, we don't manage it, get value from it, or use it ourselves. The only times we encounter our own data is when we read about abuses of it, or when we get confused trying to understand what GDPR means. One day we will live in a world where we actually own our own data and it will be managed for us, with our interests at heart, by trusted third parties, analogous to how banks manage our wealth. Those third parties may increase the value of our data by pooling it, equivalent to banks lending money, and by sharing it with organisations like social media companies, educational institutions, entertainment companies, etc. In such a world we would be delighted, rather than afraid, to gather data and to have data gathered about ourselves and used for our benefit. In such a world, what are the data points that can be gathered, what is our digital footprint? In this talk I will present an overview of what data can be, and is, gathered by people about themselves. I will cover off-the-shelf and popular sensors as well as the more unusual and uncommon, and as a focus I will give an overview of sleep, how it can be measured, and what use that can be. Gathering data about oneself is also known as lifelogging or the quantified self, and I will draw inspiration and case studies from the work we have done in the area of lifelogging over the last 15 years. (Thanks to Cathal Gurrin for some of the slides.)
The document discusses perception and how it relates to optical illusions. It provides definitions of perception as the process by which sensory information is organized and interpreted by the brain. It describes the three steps of perception - selection, organization, and interpretation. It discusses how perception is shaped by individual factors rather than objectively reflecting reality. Our perception can be influenced by optical illusions and differs from how things actually are in the world.
Lifelogging is the practice of tracking personal data generated by our own behavioral activities in continuous digital streams. As it is slowly becoming mainstream, it raises a lot of intriguing questions and thoughts.
Lifelogging and self-tracking are altering the Futures of:
Memory,
Remembering,
Forgetting,
Storytelling,
Privacy,
Law enforcement,
Governance,
Bodies,
and our very Humanness.
This report explores these questions, thoughts and futures.
Designing a Future We Want to Live In - UX STRAT USA 2017 (Andrew Hinton)
The document discusses the importance of user experience strategy in the context of new technologies like artificial intelligence. It argues that as technologies become more pervasive in our environments and able to perceive and act on their own, it is crucial to understand them not just as products but as "users" themselves that experience the world differently than humans. The document advocates taking a holistic, service design approach to understand how technologies fit into and shape human contexts and experiences. It also stresses the need for UX professionals to engage at strategic, organizational levels and consider all stakeholders to ensure technologies are developed and used in truly human-centered ways.
This document discusses designing for the Internet of Things (IoT). It begins by defining the IoT as networks of physical objects with embedded sensors and actuators that communicate with other objects, databases, and people. It then discusses some challenges in designing for the IoT, including creating new interaction paradigms that leverage sensing capabilities while accommodating human behaviors. The document outlines characteristics of natural user interfaces for the IoT, such as considering context, cognitive load, social aspects, and movement. It provides examples of techniques for designing IoT interfaces, like bodystorming, gestural studies, prototyping, and usability testing.
This document discusses mind reading technology that can analyze a person's facial expressions and infer their mental state in real time using computer vision and machine learning. It works by tracking 24 feature points on the face and modeling the relationship between facial displays and mental states over time. Potential applications include monitoring driver attention and improving human-computer interfaces, but issues around privacy and predicting future behavior need to be addressed. Research is ongoing to develop less intrusive methods like using headbands that detect blood oxygen levels to read thoughts.
This document summarizes an interview study of 22 people who use fitness tracking technologies like apps and devices. The following key points are made:
- People track a variety of health metrics like walking, exercise, eating, weight, and sleep, often using multiple trackers for different purposes like training, weight loss, and sleep.
- Tracking is usually not long-term, but rather focused on specific events or daily goals. Data is rarely shared on social media due to perceptions of egotism.
- Tracking is often a social activity done with families, partners, or coworkers to compare data and progress together.
- Future tracker design should support co-emergence of activities and tracking over time.
Human Activity Recognition using Smartphone's sensor (Pankaj Mishra)
Human activity recognition plays a significant role in the medical field and in security systems. In this project we designed a model which recognizes a person's activity based on a smartphone.
The smartphone's three-axis accelerometer and gyroscope sensors are used to collect time-series signals, from which 26 features are generated in the time and frequency domains. The activities are classified using two different machine learning methods: the k-nearest neighbor algorithm and the decision tree algorithm.
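The pipeline described (windowed sensor features, then a distance-based classifier) can be sketched in miniature. A toy with hand-made windows and three time-domain features only; the actual project uses 26 time- and frequency-domain features:

```python
import statistics

def features(signal):
    """Time-domain features from one accelerometer window:
    mean, standard deviation, and mean absolute change."""
    diffs = [abs(b - a) for a, b in zip(signal, signal[1:])]
    return (statistics.mean(signal),
            statistics.pstdev(signal),
            statistics.mean(diffs))

def nearest_neighbor(sample, training):
    """1-NN classifier: return the label of the closest feature vector."""
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(training, key=lambda item: dist(item[0], sample))[1]

# Synthetic windows: walking is high-variance, sitting is nearly flat
walking = features([1.0, 1.6, 0.9, 1.7, 1.0, 1.5])
sitting = features([1.0, 1.01, 0.99, 1.0, 1.0, 1.01])
train = [(walking, "walking"), (sitting, "sitting")]

query = features([1.1, 1.5, 1.0, 1.6, 0.9, 1.4])
print(nearest_neighbor(query, train))  # "walking"
```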
This document discusses key concepts in research methods and statistics. It covers variables, levels of measurement, experimental design, operational definitions, types of variables, scales of measurement, and order of operations. The major focus is on measurement, variables, experimental design, and understanding different scales and levels of measurement which are fundamental aspects of research methods and statistics.
The document discusses various methods for measuring liquid level, including direct and indirect methods. Direct methods involve devices that come into direct contact with the liquid, such as sight glasses, dipsticks, floats, and displacers. Indirect methods measure liquid level without contact, including hydrostatic pressure devices, electrical methods like capacitance probes, and technologies using lasers, microwaves, or ultrasound. Each method has advantages and limitations depending on the application and type of liquid.
This document discusses different types of variables: independent variables that can cause changes in other variables, dependent variables that can change in response to independent variables, and different levels of measurement for variables - nominal (qualitative categories without ranking), ordinal (ranked categories without defined distances), interval (ranked categories with equal distances), and ratio (ranked categories with an absolute zero point).
Today's document discusses methods for measuring liquid and solid levels in containers. There are two main categories: continuous level monitoring and single point sensing. Continuous monitoring constantly measures levels while single point sensing detects levels only when they reach a predetermined point. Direct sensing devices like level gauges and transmitters measure actual levels while indirect devices like differential pressure transmitters sense a liquid property like pressure to determine level. Common direct sensing devices include tubular and reflex type level gauges as well as float switches.
The document summarizes several common level measurement methods: float type, RF capacitance, RF impedance, conductance, hydrostatic head, radar, and ultrasonic. It provides details on how each method works, including explanations of concepts like dielectric constants, time of flight measurements, and guided wave radar. Radar level measurement can be done through air, using through air radar, or with contact devices like guided wave radar. Ultrasonic level measurement also uses time of flight principles with top-mounted transducers. Choosing a measurement method depends on factors like vessel dimensions, product composition, and process conditions.
Replication allows data from a MySQL master database to be synchronized with one or more slave databases. The master records all data changes in its binary log. Slave databases connect to the master and receive the binary log transactions, which they then apply locally to stay synchronized with the master database. Replication can be used for load balancing reads across multiple slave servers or for high availability by failing over to a slave if the master fails.
Level measuring devices are used to accurately measure the volume of fluid in containers on a continuous basis. Common level measuring instruments include level gauges, float devices, those that measure hydrostatic pressure, displacement type, echo type, and capacitive type. Level gauges provide a direct visual indicator by using a sealed cavity with a transparent wall. Float devices use floats that move up and down with the liquid level. Hydrostatic pressure instruments measure pressure at the bottom of a fluid column to indicate level. Displacement instruments use Archimedes' principle to measure the weight of displaced fluid. Echo type instruments measure the time of flight of waves reflected off the liquid surface. Capacitive instruments measure changes in capacitance between a probe and vessel walls
Quantified Self: The what, the why and the brave new futureAlja Isakovic
Slides from a lecture on the quantified self trend, prepared for communications students in May 2016. The lecture is based on a series of blog posts I wrote on the topic: https://medium.com/exploring-the-quantified-self
Possibilities and perils of the data-driven world.joshuakauffman
I gave this lecture and led a discussion at the Future Insight summit in Oslo, Norway, March 13, 2014.
This was an introduction to subjects relating to the data-driven world, including a lengthier bit on the Quantified Self.
I improvised from the presenter notes.They give a pretty good sense of the contour of the talk.
In the Q and A session, people were mostly concerned about privacy implications of personal data collection.
My short answer is that I am also concerned, and think we need to broaden the discussion of privacy so that it transcends the concept of unwanted exposure and recenters itself on questions relating to the terms of exchange of personal data as they relate to social and economic value.
The document is a proposal for a student project called "The Mood Tracker" which is a bracelet that tracks the wearer's emotional levels throughout the day using sensors. It will detect and record the user's energy levels and emotions on a scale of 1-10 to help people understand what affects their moods. The proposal describes the technology behind the bracelet, its design intended to be stylish, its syncing capabilities with a computer program, and the goal of helping users understand their emotions and find more positive experiences.
1) The document discusses using wearable sensors and augmented reality to capture expert experiences and knowledge and allow trainees to re-enact and experience those experiences.
2) Key aspects that are captured include gaze direction, videos, audio, gestures, physiological data, and positioning of the expert in an environment. This data is stored and synchronized to create an experience recording.
3) Trainees can then re-enact the expert's experience in real-time by having the expert's contextualized data augmented onto their own experience through methods like displaying the expert's position or where they were looking. This allows the trainee to experience the presence of the expert during tasks.
This document describes a study to design a smart plantar to support correct posture and movement during sports activities. The plantar would use sensors to monitor a user's movements and provide data on any incorrect behaviors via a mobile application. An early evaluation would involve users testing a prototype plantar during activities while providing feedback. The goal is to help users identify postural issues and motivate behavioral changes through personalized data visualization.
Society is currently going through a phase of having an adversarial relationship with personal data. Our data is gathered by third parties ranging from companies like Facebook and Google to governments and their agencies and although in theory we ourselves own our data, we don’t manage, get value from it, or use it ourselves. The only times we encounter our own data is when we read about abuses of it, or we get confused when we try to understand what GDPR means. One day we will live in a world where we actually own our own data and it will be managed for us, with our interests at heart, by trusted third parties analogous to how banks manage our wealth. Those third parties may increase the value of our data by pooling it, equivalent to banks lending money, and by sharing it with organisations like social media companies, educational institutions, entertainment companies, etc. In such a world we would be delighted rather than afraid, to gather data and to have data gathered about ourselves and used for our benefit. In such a world, what are the data points that can be gathered, what is our digital footprint ? In this talk I will present an overview of what data can, and is gathered by people about themselves. I will cover off-the-self and popular sensors as well as the more unusual and uncommon and as a focus I will give an overview of sleep, how it can be measured and what use that can be. Gathering data about oneself is also known as lifelogging or the quantified self and I will draw inspiration and case studies from the work we have done in the area of lifelogging over the last 15 years. (thanks to Cathal Gurrin for some of the slides).
The document discusses perception and how it relates to optical illusions. It provides definitions of perception as the process by which sensory information is organized and interpreted by the brain. It describes the three steps of perception - selection, organization, and interpretation. It discusses how perception is shaped by individual factors rather than objectively reflecting reality. Our perception can be influenced by optical illusions and differs from how things actually are in the world.
Lifelogging is the practice of tracking personal data generated by our own behavioral activities in continuous digital streams. As it is slowly becoming mainstream, it raises a lot of intriguing questions and thoughts.
Lifelogging and self-tracking are altering the Futures of:
Memory,
Remembering,
Forgetting,
Storytelling,
Privacy,
Law enforcement,
Governance,
Bodies,
and our very Humanness.
This report explores these questions, thoughts and futures.
Designing a Future We Want to Live In - UX STRAT USA 2017Andrew Hinton
The document discusses the importance of user experience strategy in the context of new technologies like artificial intelligence. It argues that as technologies become more pervasive in our environments and able to perceive and act on their own, it is crucial to understand them not just as products but as "users" themselves that experience the world differently than humans. The document advocates taking a holistic, service design approach to understand how technologies fit into and shape human contexts and experiences. It also stresses the need for UX professionals to engage at strategic, organizational levels and consider all stakeholders to ensure technologies are developed and used in truly human-centered ways.
This document discusses designing for the Internet of Things (IoT). It begins by defining the IoT as networks of physical objects with embedded sensors and actuators that communicate with other objects, databases, and people. It then discusses some challenges in designing for the IoT, including creating new interaction paradigms that leverage sensing capabilities while accommodating human behaviors. The document outlines characteristics of natural user interfaces for the IoT, such as considering context, cognitive load, social aspects, and movement. It provides examples of techniques for designing IoT interfaces, like bodystorming, gestural studies, prototyping, and usability testing.
This document discusses mind reading technology that can analyze a person's facial expressions and infer their mental state in real time using computer vision and machine learning. It works by tracking 24 feature points on the face and modeling the relationship between facial displays and mental states over time. Potential applications include monitoring driver attention and improving human-computer interfaces, but issues around privacy and predicting future behavior need to be addressed. Research is ongoing to develop less intrusive methods like using headbands that detect blood oxygen levels to read thoughts.
This document summarizes an interview study of 22 people who use fitness tracking technologies like apps and devices. The following key points are made:
- People track a variety of health metrics like walking, exercise, eating, weight, and sleep, often using multiple trackers for different purposes like training, weight loss, and sleep.
- Tracking is usually not long-term, but rather focused on specific events or daily goals. Data is rarely shared on social media due to perceptions of egotism.
- Tracking is often a social activity done with families, partners, or coworkers to compare data and progress together.
- Future tracker design should support the co-emergence of activities and tracking practices over time.
Human Activity Recognition Using Smartphone Sensors - Pankaj Mishra
Human activity recognition plays a significant role in the medical field and in security systems. In this project we designed a model that recognizes a person's activity based on smartphone data.
The smartphone's 3-axis accelerometer and gyroscope sensors are used to collect time-series signals, from which 26 features are generated in the time and frequency domains. The activities are classified using two different supervised learning methods: the k-nearest neighbour algorithm and the decision tree algorithm.
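The pipeline described above — windowed sensor signals, hand-crafted features, then a nearest-neighbour classifier — can be sketched in miniature. This is a stdlib-only illustration on synthetic accelerometer magnitudes with just three time-domain features (mean, standard deviation, energy); the actual project uses 26 time- and frequency-domain features and also evaluates a decision tree, neither of which is reproduced here.

```python
import math
import random

def extract_features(window):
    """Three time-domain features from one window of accelerometer magnitudes."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    energy = sum(x * x for x in window) / n  # mean signal power
    return (mean, math.sqrt(var), energy)

def knn_predict(train, query, k=3):
    """Classify a feature vector by majority vote of its k nearest neighbours."""
    dists = sorted((math.dist(feat, query), label) for feat, label in train)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Synthetic data: "walking" windows fluctuate far more than "resting" windows.
random.seed(0)
def make_window(amplitude):
    return [1.0 + random.gauss(0, amplitude) for _ in range(64)]

train = [(extract_features(make_window(0.05)), "resting") for _ in range(10)]
train += [(extract_features(make_window(0.6)), "walking") for _ in range(10)]

print(knn_predict(train, extract_features(make_window(0.05))))  # resting
print(knn_predict(train, extract_features(make_window(0.6))))   # walking
```

In practice the same structure scales up: each fixed-length window of raw sensor readings becomes one feature vector, and any off-the-shelf classifier can replace the toy k-NN shown here.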
Digital Marketing First 2014 - Context Aware Computing and Cross Channel Pers... - Argus Labs
The document discusses context-aware computing and how Argus Labs is addressing it. Argus Labs has created a sensor fusion platform that can understand context, behavior, and mood using deep learning. It can profile users based on sensors to understand habits and predict human behavior. Argus Labs is applying this across industries like insurance, healthcare, advertising, and more to engage users based on their context in a personalized manner.
Netention is a system for interactively describing a community's present situation and exploring potential futures. People create networks of things, ideas, sentiments, intentions, assets, interests, tasks, locations, messages, parts… that compose semantic stories or processes waiting to become reality: a team, a product, a symphony, a diagnosis, a learning journey... and Netention helps them discover opportunities that are mutually satisfying.
More info on http://www.automenta.com/.
This document discusses mind reading technology that can analyze a person's facial expressions in real time to infer their mental state. It works by tracking facial feature points and using dynamic Bayesian networks to model the relationship between expressions and mental states. Potential applications include improving human-computer interaction, monitoring human interactions, and detecting driver states like drowsiness. However, issues around privacy and predicting future behavior must still be addressed.
The Architecture of Understanding (World IA Day Chicago Keynote) - Stephen Anderson
Keynote for World IA Day, answering the question "When, Where and How does Understanding occur?" Specifically, this talk discussed (1) interactions (and embodiment), (2) how new technology is changing the "information environments" we design for, and (3) a bit about perception and cognition.
The document discusses the Blue Eyes technology, which aims to develop computers that can understand users' emotions, identity, and presence through techniques like facial recognition and speech recognition. The technology uses non-obtrusive sensing methods to gather physiological data from users to determine their emotional states. This would allow computers to interact more naturally with humans. Experimental results showed that measures of skin conductivity, heart rate, finger temperature, and mouse movements can reliably predict a user's emotional state. Future work aims to improve these techniques with smaller, less intrusive sensors.
6. “But if life itself is good and pleasant (...) and if one who sees is conscious that he sees, one who hears that he hears, one who walks that he walks, (...) whenever we perceive, we are conscious that we perceive, and whenever we think, we are conscious that we think, and to be conscious that we are perceiving or thinking is to be conscious that we exist...” - Aristotle
8. We strive toward an objective analysis of our surroundings in
order to understand our shortcomings and get better.
9. In 1883, Nietzsche described the idea of the Übermensch/
Superman as a goal for humanity.
10. Nietzsche's Overman or Superman is a human being who
generates values in accordance with data that he collects from
his environment. He employs his intuition (regarding good and
evil) to form values and then tests them empirically and without
prejudice. That which works, promotes his welfare and
happiness and helps him realize his full range of potentials - is
good. And everything - including values and the Superman
himself - everything - is transitory, contingent, replaceable,
changeable and subject to the continuous scrutiny of Darwinian
natural selection. His values are: self-realization, survival in
strength, and continual re-invention. Overcoming is not only a
process or a mechanism - it constitutes the reason to live.
11. A hundred years later, Marshall McLuhan analyzes the relationship
between us and our surroundings and proclaims that “the
medium is the message”.
12. We become what we behold. We shape our tools and then our
tools shape us. - Marshall McLuhan
18. And ends with the discovery of new information that not only helps us
perform tasks faster and better, but also anticipates our future needs.
19. We’ve started whole trends in product design around the tools that can
track us, and we even coined a term to describe the drive towards
objective knowledge of oneself - the Quantified Self.
20. The Quantified Self as a term and as a group was formed in 2007 when
Kevin Kelly and Gary Wolf, former Wired contributors, began looking at
some new practices that seemed, loosely, to belong together: life logging,
personal genomics, location tracking, biometrics. These new tools were
being developed for many different reasons, but all of them had
something in common: they added a computational dimension to ordinary
existence.
21. 1.Why the fascination with sensing devices?
2.How do we track activity?
3.Where does my project fit?
22. An odometer for measuring distance was first described by Vitruvius
between 27 and 23 BC. The Roman Empire needed to measure its
roads and thus understand the size of its provinces.
23. We also have evidence of a Chinese odometer in the form of a mechanical carriage.
At each li traversed, a mechanically driven wooden figure strikes a drum; when ten li
are traversed, another wooden figure strikes a bell with its mechanically operated arm.
24. Pedometers were popular in the 18th and 19th centuries.
The modern-day pedometer is commonly attributed to Thomas Jefferson.
25. Today pedometers come in thousands of packages and within
products, helping us measure our all-day activity.
26. Tracking is a powerful tool.
Measuring accomplishments makes heroes out of all of us.