Dan Berlin, Jon Strohl, David Hawkins and I presented this at UXPA 2013. Eye tracking is well known and accepted in the UX community. Here we present preliminary evidence for the usefulness of adding electrodermal activity (EDA), continuous dial ratings, etc. to user experience research.
Beyond Eye Tracking: Bringing Biometrics to Usability Research - Dan Berlin
User experience research has traditionally relied upon qualitative techniques that entail users telling us their feelings, wants, and needs. This creates an inherent cognitive bias – the data is filtered through the participant’s cognition. That is, we may not be hearing the participants’ true feelings: they may be trying to please the moderator, or they may simply be unable to articulate the cause of their emotions. Yet researchers and stakeholders alike are thirsty for quantitative data that complements the qualitative. Luckily, we live in exciting times – two technologies that are becoming more accessible will help usability researchers break through cognitive bias and provide that ever-tantalizing quantitative data: eye tracking and biometrics. Eye tracking equipment has only recently become affordable to almost anyone who wants to use it. Researchers must now get up to speed on eye tracking methodology and analysis. When is it appropriate? How can we turn the data into actionable findings? What the heck do I do with all of this new data?! More importantly, we should find new research techniques that will break through cognitive bias.
This is where the second technology comes in: biometrics. Psychophysiology is the study of how emotions produce changes in the body. Changes in heart rate, breathing rate, heart rate variability, and galvanic skin response (GSR), among others, have all been shown to be accurate indicators of a person’s emotions. Just as with eye tracking, the equipment to measure these biometrics is only now becoming accessible to usability researchers. Until very recently, the equipment to gather this data was rather obtrusive and invasive. This not only affected participant comfort but also did not lend itself to conducting “discount” usability research. New technology, however, allows the collection of biometrics in non-invasive ways. For instance, Affectiva’s Q Sensor is worn on the wrist and wirelessly gathers a participant’s GSR. The challenge of integrating psychophysiological data into usability research is that individual researchers will need to come up with not only the algorithms to interpret the biometrics but also the technology to temporally marry the biometrics to the eye tracking data. These are no small tasks. There are companies that will collect and interpret the data for you, for a hefty fee. But this technique should be in every usability researcher’s toolkit. As such, we should come together as a research community to figure this out. We need an open dialogue. We need to share techniques and stories.
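The "temporal marriage" described above can be sketched simply: given two timestamped streams that share a clock, pair each eye-tracking fixation with the GSR sample nearest in time. Below is a minimal, stdlib-only sketch; the function name, data shapes, and gap threshold are illustrative assumptions, not part of any vendor's API.

```python
import bisect

def align_gsr_to_fixations(fixations, gsr_samples, max_gap=0.5):
    """Pair each fixation with the GSR sample nearest in time.

    fixations:   list of (timestamp_sec, area_of_interest), sorted by time
    gsr_samples: list of (timestamp_sec, microsiemens), sorted by time
    max_gap:     drop pairings more than this many seconds apart
    """
    gsr_times = [t for t, _ in gsr_samples]
    aligned = []
    for t_fix, aoi in fixations:
        i = bisect.bisect_left(gsr_times, t_fix)
        # Candidates: the sample just before and just after fixation onset.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gsr_times)]
        j = min(candidates, key=lambda k: abs(gsr_times[k] - t_fix))
        if abs(gsr_times[j] - t_fix) <= max_gap:
            aligned.append((aoi, gsr_samples[j][1]))
    return aligned

# Hypothetical data: three fixations, GSR sampled every 0.5 s.
fixations = [(0.10, "logo"), (0.55, "nav"), (2.00, "footer")]
gsr = [(0.0, 4.1), (0.5, 4.3), (1.0, 4.9)]
print(align_gsr_to_fixations(fixations, gsr))
# → [('logo', 4.1), ('nav', 4.3)]  (the footer fixation is >0.5 s from any sample)
```

In practice the two devices rarely share a clock, so a real pipeline would first synchronize the streams (e.g., via a shared event marker) before this nearest-neighbor pairing.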
Systems thinking - a new approach for decision making - Juhana Huotarinen
Systems thinking helps us understand why people behave the way they do. It is a tool for modern decision making and suits the Agile mindset well. Originally presented at Mini Italian Agile Day, 05/2018.
Business analysts have several key skills that make them invaluable to their organizations and the projects they work on. One of those skills is problem analysis. These slides cover the five steps you need to take to be effective at problem analysis.
Attention Approximation: From the web to multi-screen television - Caroline Jay
The move towards the provision of television content over two or more screens represents an enormous opportunity and a considerable challenge. A scientific understanding of what causes people to switch attention between the main screen and a 'second screen' mobile device during television viewing is key to the development of this technology. This seminar describes how ‘attention approximation’, a technique we have used to model visual attention and design screen reader presentation of Web content, can be used to investigate viewing behaviour, and ultimately drive the provision of content across multiple screens.
This presentation explains why most innovation programs fail: they fail to understand what innovation really means. It argues that the power of modern-day innovation, through the various technologies available, lies in the concept of agile validation.
This is a slide deck I originally presented at ALE2011 in Berlin about A3 thinking and Kaizen in the context of a large Lean & Agile enterprise transition.
At LESS2011 in Stockholm a few weeks later, it was awarded overall "best session of the conference" (in addition to "best in the Organisation Transformations category").
(also presented at SDC12 and lssc12)
Nov 2011: I want to humbly dedicate this work to Grant (PG) Rule, who suddenly left this world due to a tragic accident.
Mar 2013: Latest news: As a follow-up to the presentation, I created an iPhone/iPad app called A3 Thinker! It's a set of brainstorming cards to help people create substantially better A3s. See http://a3thinker.com for details. A set of physical cards will soon follow.
24 July 2013: ...and yesterday I released the Android version on Google Play!
Sep 2013: Finally! The formidable A3 Thinker action deck is live! http://a3thinker.com/deck
A SYSTEM is a collection of objects such as people, resources, concepts, and procedures intended to perform an identifiable function or to serve a goal
Eye-tracking Glasses Help Define Shop Layout and Record Visitor Experiences -... - User Vision
In this presentation we talk about novel techniques we employed using eye tracking glasses in our field research. Our client wanted to better understand the needs of visitors and how effectively the layout of their Tourist Information Centre answers those needs. Eye tracking was employed to help understand how visitors to the Information Centre engage with it, which sections of the literature and merchandise shelving were looked at the most, and whether the signage in the centre was noticed. We asked visitors to wear eye-tracking glasses and to use the centre to accomplish the goals of their visit. The findings allowed the client both to take remedial action in areas where the experience was not as effective as it could be, and to take advantage of the insight to maximise the revenue potential of various areas of the centre.
A lightning talk I did for UPA 2011 covering why I think eye-tracking is not worth my money. Hint: if you're good enough to use it, you don't need to use it.
This presentation is based on an HBR case study, P&G: Marketing Capabilities, made by A Ankit Rao during an internship under Prof. Sameer Mathur, IIM Lucknow.
Online Eye Tracking and Facial Coding Solutions - EyeSee Research
EyeSee is revolutionizing the market research industry! Our unique platform for tracking people’s eyes and facial expressions with their own laptop and webcam at home enables the delivery of fast, cost-effective, and actionable insights. Learn how to increase the impact of marketing communication by gaining insight into your customers' perspective.
User Centric is now a part of GfK! Read about our eye tracking services by visiting http://www.gfk.com/solutions/ux/eye-tracking/Pages/Eye-tracking.aspx
It’s a well-known fact that eye tracking can provide some interesting insight into how people process information. But how can user experience professionals determine if eye tracking is indeed a useful addition to their studies? Our complimentary webinar, “No, But Really, Do I Need Eye Tracking?,” addressed this subject by discussing the benefits of eye tracking and the proper application of the method.
During the webinar, Aga Bojko, VP, User Experience, spoke candidly about when to use and, perhaps more importantly, when not to use eye tracking. Bojko described both qualitative and quantitative types of findings that can be obtained with eye tracking research, and explained how to decide whether or not stakeholders benefit from this method. This presentation outlines example situations in which eye tracking is most effectively utilized, from determining the ease of new drug label differentiation from existing labels to evaluating which package design will be most effective on a shelf.
Work by Sara Chueca, Miriam Ortiz, Eduardo Fariño, Sandra García, Paula Baeta, and Miguel Valenzuela for the course “Elementos de la publicidad y RR.PP” (Elements of Advertising and PR), taught by Professor Jose Antonio Gabelas Barroso, Journalism degree, Universidad de Zaragoza.
The marketing team decided to boost sales of Nescafe with a TV campaign. They received two commercials from HQ and wanted to know which one would increase 1) top of mind (brand visibility) and 2) association with the number of flavors. They also wanted to understand which scenes should be improved, and how.
Usability Testing for Survey Research: How to and Best Practices - egeisen
This presentation describes how usability testing of surveys can be used to improve data quality and reduce respondent burden. We describe what kinds of surveys can be tested and when. We also provide practical advice for planning, conducting, and analyzing usability tests of surveys.
UX Field Research Toolkit - Updated for Big Design 2018 - Kelly Moran
Looking for practice with in-depth UXR fieldwork methods? You may have read about these techniques in the past, but methods must be practiced to be understood. projekt202 has been employing the experience research craft with great success since 2003. This workshop is your opportunity to try these tools of the trade in a structured environment without pressing deadlines or looming stakeholders. Our experienced research and design professionals will share industry tips and tricks that will help you put theory to practice.
The workshop will be hands-on and interactive; instructional elements will be reinforced with stories of impact to real projects. We will not only cover methods of gathering user data, but the importance of spending time internalizing and analyzing the data through activities such as affinity diagramming, persona building, and journey mapping. Participants will gain exposure to these important practices in a low-pressure atmosphere and with the guidance of experienced professionals.
Designing for maximum usability – the goal of interaction design
Principles of usability
general understanding
Standards and guidelines
direction for design
Design patterns
capture and reuse design knowledge
What is Lean UX? Come get introduced to the topic of Lean UX and learn the fundamentals of this approach, and how it is revolutionizing the field of UX with UserTesting. Discover how constant iterating through cycles and learning from each cycle can create products which can overcome business challenges and meet customer needs, while saving big bucks, resources, and time.
We will cover the basic principles of Lean UX, and how UserTesting fits into this model of research.
2 hours training on Mobile UX with Farah Nuraini, Interaction Designer at Traveloka, Indonesia
45 min theory: Research, Analysis, Design solutions and Testing
+ 1h15 min of hands-on exercises with the 5 facilitators from Traveloka.
UX Field Research Toolkit - A Workshop at Big Design 2017 - Kelly Moran
Workshop Description:
Looking for practice with in-depth user-experience research methods? You may have read about techniques in the past, but methods must be practiced to be understood. projekt202 has been employing these methodologies with great success since 2003. This workshop is your opportunity to try these tools in a structured environment without pressing deadlines or looming stakeholders. Our experienced research and design professionals will share industry tips and tricks that will help you put theory to practice.
The workshop will be hands-on and interactive; instructional elements will be reinforced with stories of impact to real projects. We will not only cover methods of gathering user data, but the importance of spending time internalizing and analyzing the data through activities such as affinity diagramming. Participants will gain exposure to these important practices in a low-pressure atmosphere and with the guidance of experienced professionals.
Processing Speed and Vocabulary are Related to Older Adults' Internet Experie... - Jennifer Romano Bergstrom
Some cognitive declines commonly occur with aging, yet they are seldom taken into account by website designers and User Experience (UX) researchers. In this empirical study, we compared younger adults, middle-age adults, high-functioning older adults, and low-functioning older adults to examine whether there is a relationship between aspects of cognition and performance when using a website. Performance was measured by accuracy (percent of tasks completed successfully), efficiency (mean time to complete tasks), and self-rated satisfaction, three commonly used usability metrics. Results suggest that processing speed and vocabulary may be related to Internet performance. Specifically, older adults with faster processing speed and/or high vocabulary may perform better than their lower-functioning counterparts. More importantly, these older adults perform similarly to younger adults.
Whether surveys or forms are your final product or a part of your website/app, creating usable surveys and forms is crucial to a strong user experience for both the user entering information and the user receiving the information. In this session, you will learn about UX principles that drive a strong user experience when completing surveys/forms. The session will focus on understanding the key components of surveys/forms, what they are used for, and how to use them effectively. Topics include using labels to make forms and surveys easier, writing clear instructions, reducing respondent burden, and determining appropriate input fields such as check boxes versus radio buttons. Examples will include findings from usability studies and empirical research, some of which include eye tracking. Usability testing, eye tracking, and other user experience research methods will be discussed.
The way we ask questions and behave during UX sessions affects the data we collect and the interpretations of our findings. In order to collect good UX data, it is important for the moderator to be neutral, structured, and unbiased while setting a comfortable stage for participants to share their thoughts and reactions. In this interactive 45-minute session, you will learn about the importance of structured, unbiased methods to collecting user feedback. We will discuss different methods (e.g., in-lab testing, remote moderated/unmoderated testing, surveys, card sorting, focus groups) and pros/cons of each. You will learn about different data that can be collected from usability tests, including subjective (e.g., what participants verbalize about their experience), behavioral (e.g., what participants do) and implicit (e.g., what participants think but cannot explain) data. We will discuss how to ask participants questions in ways that do not introduce biases, and how additional methods, such as eye tracking, may be valuable in understanding the users’ experience. You will learn how to ensure the data we get from UX tests are reliable and valid.
User-Centered Research on the Paying for College Website and Tools - EDUI 2014 - Jennifer Romano Bergstrom
The Paying for College website is designed to help consumers make informed decisions about college finance. The Consumer Financial Protection Bureau (CFPB) began development with a user-centered design process for this tool-set, which is now in its 4th iteration. The college cost and financial aid comparison tool, a central feature of these resources, supports efforts by the Department of Education to standardize financial aid disclosures.
During this session, we’ll cover the most recent rounds of usability testing conducted with multiple groups across the U.S. We’ll highlight difficulties when designing and testing for multiple audiences with different needs as well as testing and iterating with live and prototype versions of the site. Data will also be emphasized as we share collection methodologies (click paths, eye tracking, questionnaires, etc.) and the importance of each. We’ll also provide insights into planning, execution, and reporting and how these findings informed major changes on the website.
A workshop for UXPA DC on April 12, 2014, entitled "All this UX data! Now what?" Attendees learned how to deal with large amounts of user experience data from tests, and how to combine certain data to tell a succinct story.
This talk briefly covers usability and the user experience and then discusses posting to social media in a way that is consistent with how people use it.
The visual design of surveys and other types of online data collection tools impacts how users perceive, understand and navigate the instrument as well as the responses they provide. Two key considerations that impact how users experience online data collection tools are the device they are using (e.g., smartphone, tablet, computer) and the method of interaction (e.g., website, app or both). When designing online data collection tools, creating a common user experience across different devices and methods of interaction is important to create a consistent user experience and to minimize measurement differences.
In this talk, we will compare the user experience across four different combinations of device and method of interaction for a survey: (1) desktop PC-website, (2) smartphone-app, (3) tablet-app, and (4) tablet-website. Through performance and eye-tracking data, we identify UX elements that must be unified across all devices as well as elements that might need customization for different devices or methods of interaction.
Age-Related Differences in Search Strategy and Performance When Using a Data-... - Jennifer Romano Bergstrom
Erica Olmsted-Hawala presented these findings at HCII 2013 in Las Vegas. Data are from a lab-based experimental usability study, in which we showed that older adults have greater difficulties with cognitively challenging tasks. However, even young adults have difficulties with complex data tables that are often found on government Web sites.
Jonathan Mendelson presented this talk at HCII 2013 in Las Vegas. Data are from a probability-based online panel of US adults over the age of 25. We found that QR code awareness, knowledge, and usage were highest for young adults and lowest for older adults. See the slides for more details, and see Jonathan's blog post about this talk at: http://www.forsmarshgroup.com/index.php/blog/post/hcii-2013-preview-age-and-qr-codes
Caitlin Krulikowski presented this at ESOMAR 2013 in Boston. Results are based on a probability-based paper survey of American youth. Find out what youth think about and do on Pinterest... See Caitlin's blog post about this presentation here: http://www.forsmarshgroup.com/index.php/blog/post/esomar-pinterest-preview
Jon Strohl organized an Ignite session in which many of us "pitted" different UX methods against each other. In mine, I argue for why remote UX testing is the best UX method. (Of course, I like many methods, and each is "the best" in different situations, but for the sake of this presentation, I had some fun...) See my blog post about this: http://www.forsmarshgroup.com/index.php/blog/post/uxpa-recap-part-ii-why-remote-testing-is-the-most-preferred-ux-method.
Slides for a short course I taught for UXPA DC on Feb 27, 2013. This is UX 101 - the basics if you are new to UX and usability. The focus is on desktop websites, but many of these principles apply to other products (e.g., surveys, apps) and devices (e.g., tablets, smartphones). Stay tuned for an updated version that is mobile-heavy.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Kubernetes & AI - Beauty and the Beast!?! @KCD Istanbul 2024 - Tobias Schneck
As AI technology pushes into IT, I found myself wondering, as an “infrastructure container Kubernetes guy”, how this fancy AI technology gets managed from an infrastructure operations view. Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and guide you on a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premise strategy we may need to apply it to our own infrastructure and get it to work from an enterprise perspective. I will give an overview of the infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I have already gotten working for real.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... - Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
JMeter webinar - integration with InfluxDB and Grafana - RTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
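As a rough sketch of the kind of wiring such an integration involves (the hostname, database name, and application label below are placeholders, not values from the webinar): in the JMeter test plan you add a Backend Listener, choose the InfluxDB implementation that ships with JMeter, and point it at InfluxDB's write endpoint. Grafana then reads the same database as a data source.

```properties
# Backend Listener implementation (bundled with Apache JMeter):
#   org.apache.jmeter.visualizers.backend.influxdb.InfluxdbBackendListenerClient
# Listener parameters (values are placeholders):
influxdbUrl = http://localhost:8086/write?db=jmeter
application = my-webapp
measurement = jmeter
summaryOnly = false
percentiles = 90;95;99
```

With this in place, JMeter streams response-time and throughput samples into the `jmeter` database during the load test, and Grafana dashboards query that measurement for real-time visualization.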
PHP Frameworks: I want to break free (IPC Berlin 2024) - Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent use of PHP frameworks, moving towards a more flexible and future-proof style of PHP development.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at the SYNERGY workshop at AVI 2024, Genoa, Italy, 3rd June 2024.
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Search and Society: Reimagining Information Access for Radical FuturesBhaskar Mitra
The field of Information retrieval (IR) is currently undergoing a transformative shift, at least partly due to the emerging applications of generative AI to information access. In this talk, we will deliberate on the sociotechnical implications of generative AI for information access. We will argue that there is both a critical necessity and an exciting opportunity for the IR community to re-center our research agendas on societal needs while dismantling the artificial separation between the work on fairness, accountability, transparency, and ethics in IR and the rest of IR research. Instead of adopting a reactionary strategy of trying to mitigate potential social harms from emerging technologies, the community should aim to proactively set the research agenda for the kinds of systems we should build inspired by diverse explicitly stated sociotechnical imaginaries. The sociotechnical imaginaries that underpin the design and development of information access technologies needs to be explicitly articulated, and we need to develop theories of change in context of these diverse perspectives. Our guiding future imaginaries must be informed by other academic fields, such as democratic theory and critical theory, and should be co-developed with social science scholars, legal scholars, civil rights and social justice activists, and artists, among others.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis slides at 30.5.2024 DASA Connect conference. We discuss about what is testing, then what is agile testing and finally what is Testing in DevOps. Finally we had lovely workshop with the participants trying to find out different ways to think about quality and testing in different parts of the DevOps infinity loop.
Let's dive deeper into the world of ODC! Ricardo Alves (OutSystems) will join us to tell all about the new Data Fabric. After that, Sezen de Bruijn (OutSystems) will get into the details on how to best design a sturdy architecture within ODC.
Empowering NextGen Mobility via Large Action Model Infrastructure (LAMI): pav...
Beyond Eye Tracking: Using User Temperature, Rating Dials, and Facial Analysis to Understand the User Experience
1. Beyond Eye Tracking
Using user temperature, rating dials, and facial analysis to understand the user experience
Jen Romano Bergstrom, Jon Strohl, David Hawkins, Dan Berlin
UXPA 2013 | Washington, DC
@romanocog @forsmarshgroup @banderlin
3. Client's needs
• Traditionally…
– What works well
– What needs help
• Measure the UX
Observations: selection/click behavior, contextual observations, time to complete task, reaction time, accuracy, ability to complete tasks
4. Task efficiency and accuracy

                  Accuracy   Steps to Complete Task*   Time to Complete Task*
Users             10%        8                         170 seconds
Admins            21%        8.3                       32 seconds
All Participants  15%        8.2                       101 seconds
5. Session observations
• Observational click behavior
• Facial expressions of frustration
• Fidgeting and other observations of emotion
Areas of the website that participants explored first.
7. Think aloud protocol
• Rooted in cognitive psychology and the study of thinking
• Makes explicit what is implicitly present to participants
• Concurrent vs. retrospective
"This is really confusing!"
8. Satisfaction questionnaires & difficulty ratings
• Assess users' subjective satisfaction
• Consistent questionnaire used across interfaces, or customized for an interface's features and capabilities
• Structured vs. unstructured

Satisfaction Questionnaire
Please circle the numbers that most appropriately reflect your impressions about using this Web-based instrument.
1. Overall reaction to the Web site (terrible … wonderful): 1 2 3 4 5 6 7 8 9 | not applicable
2. Screen layouts (confusing … clear): 1 2 3 4 5 6 7 8 9 | not applicable
3. Use of terminology throughout the Web site (inconsistent … consistent): 1 2 3 4 5 6 7 8 9 | not applicable
4. Information displayed on the screens (inadequate … adequate): 1 2 3 4 5 6 7 8 9 | not applicable
5. Arrangement of information on the screen (illogical … logical): 1 2 3 4 5 6 7 8 9 | not applicable
6. Tasks can be performed in a straight-forward manner (never … always): 1 2 3 4 5 6 7 8 9 | not applicable
7. Organization of information on the site (confusing … clear): 1 2 3 4 5 6 7 8 9 | not applicable
8. Forward navigation (impossible … easy): 1 2 3 4 5 6 7 8 9 | not applicable
9. Client's needs
• For this project…
– What grabs attention?
– What is engaging?
– What is a turn-off?
– What about the videos?
– Good parts? Bad?
– Is green better than…?
11. Client's needs
• For this project…
– What grabs attention?
– What is engaging?
– What is a turn-off?
– What about the videos?
– Good parts? Bad?
– Is green better than…?
Explicit: post-task satisfaction questionnaires, moderator follow-up, in-session difficulty ratings, verbal responses, real-time +/- dial
Observations: selection/click behavior, contextual observations, time to complete task, reaction time, accuracy, ability to complete tasks
12. Implicit measures
• Physiological responses are difficult to control
• Implicit responses are unfiltered
• Responses occur before explicit measures
Definition: Underlying reactions (e.g., eye tracking, arousal) that people are unaware of, cannot control, or cannot express at a granular level
Stimulus → Implicit Responses → Thought Processes → Explicit Responses
13. Why don't we measure the implicit?
• Very difficult, if even possible, to communicate the subconscious.
• Responses occur in a very short time interval.
• A lot of noise in the signal.
• Unfamiliar lexicon used in the literature.
• The technology is just beginning to become usable by a wider audience.
• Analyses appear overwhelmingly time consuming and complicated.
• It's difficult to justify the ROI.
14. Why should we measure the implicit?
• Evaluates thought processes and emotions (not what the participant tells you)
• Quantifiable data that goes beyond task performance
• Moment-by-moment interaction
• Cause-and-effect triggers
• Deeper insights
15. Why should we measure the implicit? (cont.)
Traditional research is good at explaining what people say and do, not what they think and feel.
16. The Complete UX
Observations: selection/click behavior, ethnography, time to complete task, reaction time, accuracy, ability to complete tasks
Explicit: post-task satisfaction questionnaires, moderator follow-up, in-session difficulty ratings, verbal responses, real-time +/- dial
Implicit: eye tracking, electrodermal activity (EDA), behavioral analysis, pupil dilation, facial expression coding, implicit associations, linguistic analysis of verbalizations, heart rate variability
18. Neuroimaging metrics
• Indirectly or directly measures activity in the brain.
• Typically measures the hemodynamic response or brain electrical activity.
• Examines what "people are thinking."
23. What is eye tracking?
• Observing and recording eye movements as a participant interacts with a product
– Allows us to gain deeper insight into how users perform tasks
• Allows UX researchers to collect objective behavioral data
• Doesn't include observing pupil dilation, blink rate, or facial recognition
25. Qualitative heat maps
• Aggregate of fixation count or duration across participants
Example:
• Participants have similar fixation counts across links
• Displays uncertainty of where to click to get started
26. Qualitative gaze plots
• Plot of fixations for a single participant
Example:
• Participant fixates back and forth between two different sections
• Displays uncertainty on how to use the sections
• The instructional paragraph did not facilitate web reading
27. Qualitative gaze plots (cont.)
Example:
• Participant has repeated fixations in the upper right-hand corner
• Participant said that he/she was looking for a search tool on the page
• The search tool was contained within a disappearing banner on the page
28. Quantitative eye-tracking data
• Attention
– Time to first fixation: are users finding the important content quickly?
– Total number of fixations in an area of interest (AOI)
– Percentage of fixations in an AOI compared to the total page: are users spending an inordinate amount of time looking at a single area?
• Processing
– Fixation duration: are users spending a long period of time in this area?
• Efficiency
– Repeat fixations: is information clear and presented efficiently?
29. Quantitative eye tracking
• Break the page up into separate "areas of interest" (AOIs)
• Compare the fixation data between important areas and less important ones
– Or compare data between designs
Areas of Interest
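The per-AOI metrics listed above (time to first fixation, fixation counts, percentage of fixations per AOI) can be sketched in a few lines of Python; the fixation-log format and AOI names here are hypothetical illustrations, not the output of any particular eye-tracking package:

```python
# Sketch: summarize fixation metrics per area of interest (AOI).
# The fixation log format and AOI names are hypothetical examples.

def aoi_metrics(fixations):
    """fixations: list of (timestamp_ms, aoi, duration_ms) in viewing order."""
    metrics = {}
    for t, aoi, dur in fixations:
        m = metrics.setdefault(aoi, {"first_fixation_ms": t, "count": 0, "total_ms": 0})
        m["count"] += 1          # total number of fixations in this AOI
        m["total_ms"] += dur     # total fixation duration in this AOI
    total = sum(m["count"] for m in metrics.values())
    for m in metrics.values():
        # share of all fixations landing in this AOI
        m["pct_of_fixations"] = round(100.0 * m["count"] / total, 1)
    return metrics

fixations = [
    (120, "nav", 200), (350, "hero", 450), (900, "nav", 180),
    (1200, "search", 300), (1600, "hero", 500),
]
m = aoi_metrics(fixations)
print(m["hero"]["first_fixation_ms"])   # time to first fixation on "hero"
print(m["nav"]["pct_of_fixations"])     # share of fixations landing on "nav"
```

Comparing these per-AOI summaries between important and unimportant areas (or between two designs) is exactly the comparison the slide describes.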
30. Combining quantitative and qualitative data
• Using multiple sources of data makes the evidence more compelling
• Example: "LAUNCH" was expected to be the most clicked
• Heat map supports the quantitative eye-tracking data
31. Beyond eye tracking
• Eye tracking is just one type of biometric measure
• It tells us where participants are looking
• It does not tell us:
– Emotional state
– Level of arousal
– Level of mental workload
38. What is it?
• Electrodermal activity (EDA) encompasses skin conductance responses and body temperature.
• Nerve fibers release sweat in response to a stimulus.
• Sweat facilitates the travel of an electrical signal.
• After a stimulus onset, glands return to a baseline status.
• Sweat secretion is related to sympathetic nervous system activity.
39. Who cares?
• Skin conductance is an established measure of arousal
• Arousal can indicate engagement, fear, frustration, or other emotional changes
• Continuously measure changes in arousal throughout a test
• Establish benchmarks and use them to compare against previous iterations
• Determine whether the design produced typical levels of arousal or whether there were specific triggers
40. EDA in UX research
• EDA can indicate usability problems
• Assess "good" and "bad" interfaces and compare biometrics (Ward & Marsden, 2002)
– The "bad" interface caused higher skin conductivity, lower blood volume, and increased pulse rate
• Assess frustration while playing a game (Lin and Hu, 2005)
41. How do I do it?
• The electrodes on an EDA sensor measure the resistance electricity faces when traveling across the skin.
• Electrodes can be placed in three locations:
– Best option: palm
– Good option: finger
– Acceptable option: wrist
• Wired and wireless sensors are available
EDA recording device & analysis software
44. Dial Rating
FMG Rating Dial
• Continuous real-time feedback on videos and commercials
• Researcher can choose anchors for the ratings
• Teardrop-shaped knob allows the participant to remain focused on the video
• Time sensitive
Dial Recorder Software tracks the position of the dial over time, including its maximum and minimum positions.
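A dial recorder emits timestamped samples. One minimal way to make traces comparable across participants is to bucket each trace into per-second means and then average across participants; the -1 to +1 dial range and sample data below are hypothetical:

```python
# Sketch: reduce continuous dial traces to a common per-second time base
# and compute the mean trace across participants. The sample data and
# the -1..+1 dial range are hypothetical.

def per_second_means(trace):
    """trace: list of (time_s, dial_value) samples for one participant."""
    buckets = {}
    for t, v in trace:
        buckets.setdefault(int(t), []).append(v)   # group samples by whole second
    return {sec: sum(vs) / len(vs) for sec, vs in buckets.items()}

def mean_trace(traces):
    """traces: per-second dicts for several participants -> mean per second."""
    seconds = sorted(set().union(*traces))
    return {s: sum(tr[s] for tr in traces if s in tr) /
               sum(1 for tr in traces if s in tr) for s in seconds}

p1 = per_second_means([(0.2, 0.0), (0.7, 0.2), (1.1, 0.6)])
p2 = per_second_means([(0.4, -0.2), (1.3, 0.4)])
print(mean_trace([p1, p2]))
```

Averaging only over the participants who have data in a given second keeps a dropped sample from biasing the group trace toward zero.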
47. Processing the EDA signal
• Tonic and phasic activity
– Tonic activity is a slow, state-based level of arousal
– Phasic activity is a rapid, stimulus-based change in arousal
• EDA activity is long periods of gradual change with a series of peaks in activity.
[Chart: EDA signal in µS (2.6–3.0) over 30 seconds]
48. Analyzing EDA data
• The phasic response begins 1–4 seconds after onset of the stimulus
• The signal is analyzed in discrete time intervals
• The area under the curve is analyzed to determine changes
[Chart: EDA signal in µS (2.6–3.0) over 30 seconds, annotated with response onsets, delayed peak, and return to baseline]
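The analysis steps above (a phasic window 1–4 seconds after onset, discrete intervals, area under the curve) could be sketched like this; the sampling rate, window bounds, and sample signal are hypothetical, not values prescribed by any EDA package:

```python
# Sketch: quantify a phasic EDA response as the area under the curve above
# a pre-stimulus baseline, inside a window starting 1-4 s after stimulus
# onset. Sampling rate, window bounds, and the sample signal are hypothetical.

def phasic_auc(samples, hz, onset_s, start_s=1.0, end_s=4.0):
    """samples: EDA values in microsiemens at `hz` samples/second."""
    onset = int(onset_s * hz)
    # mean of the 1 s preceding the stimulus serves as the local baseline
    baseline = sum(samples[max(0, onset - hz):onset]) / hz
    lo, hi = onset + int(start_s * hz), onset + int(end_s * hz)
    dt = 1.0 / hz
    # trapezoidal integration of (signal - baseline), clipped at zero
    area = 0.0
    for a, b in zip(samples[lo:hi], samples[lo + 1:hi + 1]):
        area += dt * (max(a - baseline, 0) + max(b - baseline, 0)) / 2
    return area

# 4 Hz signal: flat 2.6 uS baseline, then a rise after the onset at t=2 s
signal = [2.6] * 12 + [2.6, 2.7, 2.8, 2.9, 3.0, 2.9, 2.8, 2.7, 2.6, 2.6, 2.6, 2.6, 2.6]
print(round(phasic_auc(signal, hz=4, onset_s=2.0), 3))
```

Clipping at zero means the score reflects only the rise above baseline, matching the idea that the phasic peak, not the dip back, indicates the response.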
50. Explicit rating of attention: Please indicate how much you agree with the following statements
Response options: 1 (Not at all) | 2 | 3 | 4 | 5 | 6 | 7 (Extremely)

P    "I found my mind wandering while the advertisement was on"  "While the advertisement was on, I found myself thinking about other things"  "I had a hard time keeping my mind on the advertisement"  Average
P1   1  1  1  1.0
P2   1  2  1  1.3
P3   1  1  1  1.0
P4   3  3  3  3.0
P5   2  2  2  2.0
P6   2  2  2  2.0
51. Explicit rating of emotion: Please indicate how much you experienced each of the following while viewing the advertisement
Response options: 1 (Not at all) | 2 | 3 | 4 | 5 | 6 | 7 (Extremely)

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   2 1 1 1 1 3 2 1 1 1 1
P2   2 3 1 1 1 1 1 1 1 1 1
P3   4 1 1 1 2 3 3 1 1 1 2
P4   1 2 1 1 1 1 1 1 1 1 1
P5   4 1 1 1 3 4 4 1 1 1 1
P6   5 1 1 1 3 4 4 1 1 1 2
52. Unanswered Questions
• When?
– When did minds start to wander?
– When were people engaged?
• What?
– What did people focus on?
– What did people miss?
– What caused the negative/positive emotions?
• Was it something specific or overall?
54. Visa Video Ad Example: Traditional Likert-Scale Overall Rating vs. New Continuous Dial Rating
Question: Please indicate how much you experienced each of the following while viewing the advertisement.
Response options: Not At All | A little bit | Moderately | Quite a bit | Extremely

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   2 1 1 1 1 3 2 1 1 1 1
P2   2 3 1 1 1 1 1 1 1 1 1
P3   4 1 1 1 2 3 3 1 1 1 2
P4   1 2 1 1 1 1 1 1 1 1 1
P5   4 1 1 1 3 4 4 1 1 1 1
P6   5 1 1 1 3 4 4 1 1 1 2

[Chart: continuous dial ratings (-1.1 to 1.1) over time for P1–P6 and the mean]
55. Electrodermal Activity: Visa Video Ad
[Chart: mean EDA (1.6–2.1 µS) from 5 seconds before video onset through 32 seconds, with arousal peaks (+) marked at annotated events]
Annotated events: [music only, screen change from bright to dark]; [dramatic screen change to black with white words, "without the worry of currency exchange"; music consistent]; [almost falls in water]; [tail end of previous screen, which appeared for several seconds and then changed to the first mention of the brand]; [middle of second screen change; music changes]; [music change]; [scene bright and beachy]
56. Visa Video Ad Example: Traditional Likert-Scale Overall Rating vs. New Physiological Measure of Arousal
Question: Please indicate how much you experienced each of the following while viewing the advertisement.
Response options: Not At All | A little bit | Moderately | Quite a bit | Extremely

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   2 1 1 1 1 3 2 1 1 1 1
P2   2 3 1 1 1 1 1 1 1 1 1
P3   4 1 1 1 2 3 3 1 1 1 2
P4   1 2 1 1 1 1 1 1 1 1 1
P5   4 1 1 1 3 4 4 1 1 1 1
P6   5 1 1 1 3 4 4 1 1 1 2

[Chart: mean EDA (1.6–2.1 µS) from 5 seconds before video onset through 32 seconds]
58. Artery Video Ad Example: Traditional Measures
Traditional Likert-Scale Overall Rating
Question: Please indicate how much you experienced each of the following while viewing the advertisement.
Response options: Not At All | A little bit | Moderately | Quite a bit | Extremely

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   1 1 2 1 1 1 1 1 1 1 1
P2   1 1 5 1 1 1 1 2 1 1 4
P3   3 1 3 1 1 2 1 1 1 3 3
P4   1 3 5 1 1 3 1 3 1 1 5
P5   1 1 3 1 1 3 1 2 1 1 1
P6   1 1 5 1 1 1 1 1 1 1 3
59. Artery Video Ad Example: Traditional Likert-Scale Overall Rating vs. New Continuous Dial Rating
Question: Please indicate how much you experienced each of the following while viewing the advertisement.
Response options: Not At All | A little bit | Moderately | Quite a bit | Extremely

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   1 1 2 1 1 1 1 1 1 1 1
P2   1 1 5 1 1 1 1 2 1 1 4
P3   3 1 3 1 1 2 1 1 1 3 3
P4   1 3 5 1 1 3 1 3 1 1 5
P5   1 1 3 1 1 3 1 2 1 1 1
P6   1 1 5 1 1 1 1 1 1 1 3

[Chart: continuous dial ratings (-1.2 to 0.2) over 30 seconds for P2–P6 (video 1) and the mean]
60. Continuous dial rating: Artery video
[Chart: continuous dial ratings (-1.2 to 0.2) over 30 seconds for P2–P6 (video 1) and the mean]
Annotated events: [sound of rushing air]; "this much was found stuck to the aorta..."; "every cigarette is doing you damage"
61. Electrodermal activity: Artery video
Traditional Likert-Scale Overall Rating vs. New Physiological Measure of Arousal
Question: Please indicate how much you experienced each of the following while viewing the advertisement.
Response options: Not At All | A little bit | Moderately | Quite a bit | Extremely

Columns: amused, fun-loving, or silly | angry, irritated, or annoyed | disgust, distaste, or revulsion | guilty, repentant, or blameworthy | inspired, uplifted, or elevated | interested, alert, or curious | joyful, glad, or happy | sad, downhearted, or unhappy | scared, fearful, or afraid | sympathy, concern, or compassion | surprised, amazed, or astonished
P1   1 1 2 1 1 1 1 1 1 1 1
P2   1 1 5 1 1 1 1 2 1 1 4
P3   3 1 3 1 1 2 1 1 1 3 3
P4   1 3 5 1 1 3 1 3 1 1 5
P5   1 1 3 1 1 3 1 2 1 1 1
P6   1 1 5 1 1 1 1 1 1 1 3

[Chart: EDA (0.0–7.0 µS) for P1–P6 and the mean]
62. Electrodermal activity: Artery video
[Chart: EDA over the course of the video, with arousal peaks (+) marked at annotated events]
Annotated events: "...the main artery from the heart"; "every cigarette is doing you damage"; [voice, pace change]; "authorized by the Australian government"; "this much was found stuck to the aorta..."; [sound of rushing air]; [first fatty deposits emerge]; [sound effect; no text]; "age 32"; [heartbeats]; [sound of crackling embers]
63. EDA does not capture valence
[Chart: EDA (1.6–2.1 µS) from 5 seconds before onset through 32 seconds, comparing P1 on the Artery ad (negative emotion) and P1 on the Visa ad (positive emotion)]
64. Continuous Dial Rating: Artery vs. Visa
[Charts: Visa dial ratings (-1.1 to 1.1) and Artery dial ratings (-1.2 to 0.2) over 30 seconds for participants and the mean]
65. EDA advantages and disadvantages
• Advantages
– Continuous measure of automatic physiological response
– Sensitive to minor changes in arousal
– Informs order of magnitude
• Disadvantages
– Does not inform valence
– Peak of physiological response is slow
– Sometimes difficult to collect
[Chart: mean intrusiveness rating (0–2.5) for the dial, eye tracker, and EDA]
Debriefing question: On a scale of 1 to 5, how intrusive was ____ while you were trying to complete the tasks and watch videos?
Dial: Two participants rated the dial as very intrusive (4): "I was having to concentrate on what my reaction was, not just have it." "It's not something I normally do, or something I do consciously."
EDA: Three participants rated the wrist band as moderately intrusive (3): "It was itchy." "I had to remember not to move it." "I didn't know where to put it."
67. We need to be taking a collaborative approach
• Disparate measures of physiological response can tell a cohesive story!
• By analyzing different streams of data we can uncover a very rich level of analysis.
69. Combining implicit measures for meaningful insights
[Charts: simulated pupil diameter, heart rate variability, and EDA data plotted on a shared timeline]
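Plotting several streams on a shared timeline first requires aligning them on a common clock, since each device records on its own schedule. A minimal sketch of nearest-timestamp alignment, with hypothetical stream names and made-up sample data:

```python
# Sketch: align separately recorded biometric streams (e.g., EDA and dial)
# on a shared master clock by nearest-timestamp matching, so events can be
# compared across measures. Stream names and sample data are hypothetical.

import bisect

def nearest(ts, values, t):
    """Value from (ts, values) whose timestamp is closest to t; ts is sorted."""
    i = bisect.bisect_left(ts, t)
    cands = [j for j in (i - 1, i) if 0 <= j < len(ts)]
    j = min(cands, key=lambda j: abs(ts[j] - t))
    return values[j]

def align(streams, clock):
    """streams: name -> (sorted timestamps, values); clock: master timestamps."""
    return [{"t": t, **{name: nearest(ts, vs, t) for name, (ts, vs) in streams.items()}}
            for t in clock]

streams = {
    "eda": ([0.0, 0.5, 1.0, 1.5], [2.6, 2.7, 2.9, 2.8]),
    "dial": ([0.1, 0.9, 1.6], [0.0, 0.4, 0.2]),
}
rows = align(streams, clock=[0.0, 1.0])
print(rows[1])  # both measures sampled nearest to t=1.0
```

Once the rows share a clock, a peak in one stream can be read off against the others at the same moment, which is the comparison the slide is after.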
70. EDA: promising future
• Promising results
– When data is good, EDA provides a continuous, "objective" arousal measure
– There is consistency between:
• The Likert scale and the continuous dial data
• Self-reported emotion overall and EDA data
– EDA provides additional data above and beyond self-report measures
– The most complete story can be told with a combination of measures.
71. Lessons learned
• Data Analyses
– Compare to baseline – a different baseline per person and per stimulus
– How does pupil dilation data compare with EDA?
– Reduce the intrusiveness ratings for all metrics
• Dial
– If eye tracking is not used, allow participants to look at the dial when making responses
– Include a simple practice task to increase familiarity
• Eye Tracker
– Instruct participants to visually search as if they were at home on their own computer
• EDA
– Improve the quality of EDA data; explore equipment
– Provide a cushion/pad to rest the arm
– Over-recruit
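The "compare to baseline, per person and per stimulus" lesson can be sketched as z-scoring each participant's stimulus-window EDA against their own pre-stimulus baseline, which makes arousal changes comparable across people with very different resting levels; the data here are hypothetical:

```python
# Sketch: normalize each participant's EDA against their own pre-stimulus
# baseline (z-scores), so arousal changes are comparable across people with
# different resting skin conductance. The sample data are hypothetical.

import statistics

def baseline_z(baseline, stimulus):
    """z-score stimulus-window samples against the pre-stimulus baseline."""
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return [(x - mu) / sd for x in stimulus]

# Two participants with very different resting levels but similar responses
p1 = baseline_z([2.0, 2.1, 2.0, 2.1], [2.4, 2.6])
p2 = baseline_z([6.0, 6.1, 6.0, 6.1], [6.4, 6.6])
print([round(z, 2) for z in p1])
print([round(z, 2) for z in p2])
```

Both participants produce essentially the same z-scores despite a threefold difference in raw conductance, which is the point of per-person baselining.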
72. Select your measure carefully
• Where are participants dwelling on instructions and tasks?
– Eye tracking
• Which specific elements on a page are particularly stressful?
– Eye tracking, EDA
• Which content is very engaging for the user?
– Eye tracking, EDA, satisfaction questions, debriefing interview
• Which design causes more stress on the user?
– EDA, debriefing interview
78. Pushing our research further
• There are lessons to be learned from neuromarketing
– Neuromarketing researchers have used EDA, heart rate variability, and even fMRI and EEG in an attempt to determine how users experience an advertisement.
• UX has a different set of requirements
– To become more usable for practitioners, we need:
• Portable technology that can be taken when traveling
• Software that has a short learning curve
• Customizations that allow for sensors to be wrist-mounted, and more literature to substantiate the use of this sensor location
• Analysis protocols that can be completed in a short period of time
79. Issues to keep in mind
• We want to mimic real-world experiences during a usability study
• Complex setup will confound our experimental design
• Participant comfort is paramount
• Concurrent think-aloud vs. retrospective think-aloud
• A talking participant is a distracted participant
• We always need to provide support for ROI
80. Where do we go from here?
• We need to:
– Collaborate to move our field forward
– Share methods and analysis protocols
– Empirically test our hypotheses
– Continually provide proof for ROI
81. Thank you!
Jennifer Romano Bergstrom
jbergstrom@forsmarshgroup.com | @romanocog
Dan Berlin
dberlin@madpow.net | @banderlin
Jon Strohl
jstrohl@forsmarshgroup.com | @jonstrohl
David Hawkins
dhawkins@forsmarshgroup.com | @dHawk87
UXPA 2013 | Washington, DC