- what is UX?
- why is it important?
- a brief history and future of UX
- general ux principles
- enterprise ux
- ux project approach
- ui design principles
- ux tools
UX is omnipresent nowadays and will increasingly become the tool of innovation. Companies are becoming aware of how vital it is to adopt UX from the start. The Importance of UX is a presentation of how we, as a UX design team, implement UX in projects.
UX 101: A quick & dirty introduction to user experience strategy & design, by Morgan McKeagney
A quick & dirty intro to UX strategy & design. Some context, some fundamentals, some current & emerging trends, and some useful resources for the absolute beginner.
First delivered @ the NDRC Launchpad startup accelerator in Dublin, Ireland, 16/10/2014. (www.ndrc.ie)
Stop UX Research being a Blocker. How to fit UX research into agile teams.
UX research can’t be rushed but it also can’t be uncapped.
Some research activities will take longer than others, but it’s most important to differentiate between research that provides specific value in the moment vs. research that pays off strategically in the long run.
Foundational research methods will help you decide where you want to go, while directional methods will give you turn-by-turn directions for how to get there.
A presentation on UX Experience Design: Processes and Strategy by Dr Khong Chee Weng from Multimedia University at the UX Indonesia-Malaysia 2014 that was conducted on the 26th April 2014 in the Hotel Bidakara, Jakarta, Indonesia.
Those who don't learn from history are doomed to NOT repeat it.
We know the old adage, but the other reality is that there's nothing new under the sun. The same goes for the practice of User Experience (UX) and it goes back further than you might think.
History can be fun – especially when we see how it relates to our ever-expanding and shifting industry of today. This presentation is geared to new practitioners who want to understand the foundations of our field and veterans who would like to see a different perspective on our profession. Let's look at the practice of UX through a historical lens at some of man's most creative pursuits and demonstrate the parallels between the past and today's design trends.
An Introduction to the World of User Research, by Methods
What is user research? Why do we do it? How do we do it? User research consultants Dr Jennifer Klatt and Ben Smith from Methods Digital (https://methodsdigital.co.uk/) have kindly put together this slide deck to take you through the basics.
Going from Here to There: Transitioning into a UX Career, by dpanarelli
A lot of people are curious about transitioning into the field of User Experience Design (UX). In this talk, I cover a few different ways that you can transition into a UX career, be it grad school, night classes, or the ol' school of hard knocks, backed up by case studies. This talk was given at NoVA UX Meetup in the offices of AddThis, hosted by organizer Jim Lane.
On June 25, TryMyUI hosted a webinar with speaker Ritvij Gautam on collaborative UX analysis. This is the slide deck from that webinar.
Full recording of the webinar:
https://www.youtube.com/watch?v=9g05rGMnmYs
As designers, we know we should be doing research, testing and iteration as part of our design process, but those things are often what gets bypassed in favour of ‘getting it done’.
This talk will discuss the different types of research we can perform on digital projects, how to conduct effective research, and how to test our designs with real users to uncover important insights into our projects.
I’ll also discuss how to sell the added time and cost to bosses and clients, and how to do it on a shoestring budget.
I’ve performed UX research and testing for a number of our clients at Headspace and I’ve also applied it to our own software startup, so I have good case studies to show a practical application of the knowledge.
The presentation will provide an overview of the key forms of research that are frequently conducted to support the UX design process – and how they are traditionally conducted. The presentation will then focus on two forms of UX research and the positive impact that moving these into the digital realm has on the quality, timing, and cost of the research – as well as highlighting the challenges it brings.
This overview details the UI/UX design process at our company, Propeller Labs. We pride ourselves on partnering with leading companies to create digital solutions. Innovative design, through effective process, has positioned us to become a leading partner in building digital products.
Better understand how to involve your target audiences during the design phase. Learn more about the research methods needed to ensure your target users will understand your product and can use it with ease before you invest time and money into the costly development phase.
Topics:
- Setting research objectives for the design phase
- Bringing your users into hands-on collaborative design activities such as paper-prototyping and card sorting
- Evaluating your design with users through usability testing, including in-person and remote testing
- Some of the tools available, including automated testing tools
In the last episode of Putting Users in UX, Steven and Terry dove into the mechanics of effective user research.
We began with tips for planning your research, including setting research objectives, choosing the right research methods, and recruiting participants.
Then we got into conducting research: the set-up, facilitating the sessions, and guiding participants appropriately to ensure you’re getting the insights you need.
Finally, we showed you how to capture and analyze your findings so that your research can be easily understood and used by the rest of the project team.
Rethinking UX Research - Design4Drupal 2014 keynote presentation, by Perfetti Media
How do you really know what your users want? How do you ensure your designs work for your prospects and customers? How can you be confident that your design changes improve your site?
There are those who believe that user research must be a complex and scientific process that takes a lot of time, money, and resources. However, in the real world, most designers and developers don't have the luxury of spending weeks and months on their user research.
It's possible to get useful results without the time-consuming expense of traditional user research methods. In this presentation, Christine Perfetti will share proven strategies and techniques for successfully integrating UX research into your process. You'll learn how to answer essential design questions using methods that take only a day—and sometimes only 10 minutes!
Blockchain, Predictive Analytics and Healthcare, by Ruchi Dass
Episode-of-care payment and comprehensive care payment systems can help providers prevent health problems; prevent the occurrence of acute episodes among individuals who have health conditions; prevent poor outcomes during major acute episodes, such as infections, complications, and hospital readmissions; and reduce the costs of successful treatment.
Using cryptography to keep exchanges secure, blockchain provides a decentralized database, or “digital ledger”, of transactions that everyone on the network can see. This network is essentially a chain of computers that must all approve an exchange before it can be verified and recorded.
Learn more about Blockchain in healthcare here.
Learn how to use prototyping and usability testing as a means to validate proposed functionality and designs before you invest in development. Sometimes there is a huge disconnect between the people who make a product and the people who use it. Usability testing is vital to uncovering the areas where these disconnects happen. In this symposium you will learn the steps to conduct a successful usability test. This includes tips and real-life examples on how to plan the tests, recruit users, facilitate the sessions, analyze the data, and communicate the results.
This presentation shares the journey I’ve been on, from trying to shape and influence a user’s path, to creating sandbox environments in which people can play and amaze us!
______
Designers are trained to guide users toward predetermined outcomes, but is there a better use of this persuasive psychology? What happens if we focus less on influencing desired behaviors and focus more on designing ‘sandboxes’: open-ended, generative systems? And how might we go about designing these spaces? It’s still “psychology applied to design”, but in a much more challenging and rewarding way!
In this talk, I’ll share the journey I’ve been on, from trying to shape and influence a user’s path, to creating these sandbox environments. You’ll learn why systems such as Twitter, Pinterest, and Minecraft are so maddeningly addictive, and what principles we can use to create similar experiences. We’ll look at education and the work of Maria Montessori, who wrote extensively about how to create learning environments that encourage exploration and discovery. And we’ll look at game design, considering all the varieties of games, especially those carefully designed to encourage play — a marked contrast with progression games designed to move you through a series of ever-increasing challenges, each converging upon the same solution. Finally, we’ll look at web applications, and I’ll share how this thinking might influence your work, from how you respond to new feature requests to how you design for behavior change in a more mature way.
Best Practice For UX Deliverables - Eventhandler, London, 05 March 2014, by Anna Dahlström
TAKE THIS WORKSHOP ONLINE & GET 20% OFF WITH CODE 'SLIDESHARE'
https://school.uxfika.co/p/best-practice-for-ux-deliverables/?product_id=325265&coupon_code=SLIDESHARE
---
Slides from my 'Best practice for UX deliverables' workshop that I ran for Eventhandler in London on the 5th of March 2014.
http://www.eventhandler.co.uk/events/uxnightclass-uxdeliverables3
---
Please note that, for copyright reasons and client privacy, the examples in this presentation are slightly different from those used in the workshop. The examples included are for reference only, in terms of what I talked through in the 'Good examples' section.
-----
ABSTRACT
The work we do is not meant to be hung on a wall for people to admire, but nor is it meant to be put in a drawer and forgotten about. Just as we make the products and services we design easy to use, the UX of UX is about communicating your thinking in a way that ensures that what you've defined is easy to understand for the reader. It's about adapting the work you do to the project in question and finding the right balance of making people want to look through your work whilst not spending unnecessary time on making it pretty.
Who is it for?
This workshop is suitable for anyone starting out in UX, or who's worked with it for a while but is looking to improve the way they present their work.
What you'll learn
In this hands-on workshop we'll walk through real-life examples of why the UX of UX deliverables matters. We'll cover how who the reader is affects the way we should present our work, both on paper and verbally, and how to ensure that the work you do adds value. Coming out of the workshop you'll have practical examples and hands-on experience with:
// How to adapt and sell your UX deliverable to the reader (from clients, your team, in house and outsourced developers)
// Guiding principles for creating good UX deliverables (both low and high fidelity)
// Best practice for presentations, personas, user journeys, flows, sitemaps, wireframes and other documents
// Simple, low effort but big impact tools for improving the visual presentation of your UX deliverables
The goal of this presentation is to give attendees a deeper understanding of usability testing so they can leverage it in their own work. The material will shed light on what is important to the research buyer and will help the research provider to better understand how to plan, moderate, and report on a usability study. It will also provide information on where they can go to learn more about this very practical qualitative method.
Kay will cover what a usability test is and when to use it, the key planning steps, the language around it, and the unique insights this method produces. She will also discuss the various approaches a market researcher can take when running a usability study at different points in a product’s development (e.g., concept, early prototype, released product).
Usability testing (or user testing) involves measuring the ease with which users can complete common tasks on your website. The results of the analysis are a huge eye-opener and their implementation often leads to:
- Increased sales and task completion, and a high rate of return site visitors
- A greatly improved understanding of your customers' needs
- A significant reduction in call centre enquiries
- A much more user-focused in-house development team
Source: http://www.wbcsoftwarelab.com/wbcblog/read-basics-of-usability-testing
Simple Ways of Planning, Designing and Testing Usability of a Software Product, by Karolina Zmitrowicz
Originally presented at QS-Tag 2016
https://www.qs-tag.de/en/abstracts/tag-1/simple-ways-of-planning-designing-and-testing-usability-of-a-software-product/
http://www.imran.xyz
Cox Automotive: Testing Across Multiple Brands, by Optimizely
Cox Automotive, the world’s leader in automotive remarketing services, and parent company to such brands as Autotrader, Kelley Blue Book, Manheim, and Dealer.com, has more than 40,000 auto dealer clients across five continents.
Cox Auto focuses on continually improving its products to create faster vehicle transactions and to enable consumers to have a seamless online-to-offline experience. Testing has a natural role to play here, as Cox Automotive's businesses have learned to scale experimentation to optimize the design of their digital experiences.
In this webinar, Frances Reyes, Seth Stuck, and Sabrina Ho will discuss how Cox Automotive is building a culture of experimentation and testing across their digital properties.
You’ll learn:
- The impetus for testing at Cox Automotive
- How they leverage and share information across their business units, creating shared goals despite different business priorities
- How they created a framework for data-driven decisions across the company
The Consumer Research Process
- The importance of the consumer research process
- Largely influenced by psychology, sociology, and anthropology
- Developing research objectives
- Secondary data
- Designing primary research
- Qualitative collection method: depth interview
Qualitative Research vs Quantitative Research - a QuestionPro Academic Webinar, by QuestionPro
Hosted on October 14, 2020, this QuestionPro Academic-focused webinar delved into the differences between qualitative and quantitative research and how you can carry out both using the QuestionPro research platform. We spoke about heatmap and hotspot analysis, card sorting, online focus groups using video discussions, and even a beta feature coming soon, LiveCast, which uses NLP to build real-time analytics from video survey questions. Our speaker was Dan Fleetwood, the President for Research and Insights at QuestionPro.
[UPDATE] Udacity webinar on Recommendation Systems, by Axel de Romblay
A one-hour webinar on recommendation systems for the Udacity Nanodegree Program "How to become a Data Scientist": https://in.udacity.com/course/data-scientist-nanodegree--nd025.
The link to the ipynb : https://www.kaggle.com/axelderomblay/udacity-workshop-on-recommendation-systems
It is possible for a product to pass quality assurance tests and acceptance testing without being user-friendly. It is also too easy for those of us who build digital products to make assumptions about what our users need. As a design thinker, I strive to bring the authentic voices of complex audiences into the product lifecycle through pragmatic research.
A sound design research process not only shapes digital products to be more usable, it also adds value to drive engagement.
Yongsan FM radio broadcast with Professor Choi Byung-ho
I. Introducing Professor Choi Byung-ho: personal introduction, key contributions, interview
II. The Fourth Industrial Revolution: what is it really?, case studies, and challenges for solving social problems
III. Social-impact AI cases: Sensee, SuperBin, Testworks
IV. Professor Choi's future: instructor who cultivates great people, accelerator creating social impact, evangelist for solving literacy problems, writer who loves people and nature
Through insight into AI trends we explore industry-disrupting AI business models, and through AI-centered NEW THINKING we consider the great leadership that will change human life.
Exploring industry-disrupting AI business models through AI adoption trends
Exploring strategies for creating new paradigms through AI case analysis
AI challenges
NEW THINKING
Offline store strategies for small business owners
Disease-prevention system strategy
Blue ocean strategy (1): No virus & No wait
Marketing persona strategy
Gentrification prediction strategy
Blue ocean strategy (2): Noise masking
Strategies for small business products
Demand forecasting strategy
Sales-channel forecasting strategy
Manufacturing support strategy for artisans and aspiring artisans
Financial support strategy for small business owners
Intelligent credit-evaluation and financial support strategy for small business owners
Artificial Intelligence (AI) and User Experience (UX)
Discourse I. AI & UX as seen through drama
Discourse II. AI & UX as seen through challenges
Discourse III. AI & UX as seen through questions from the periphery
Case study #1-1. Intelligent fashion profiling and UX
Case study #1-2. Intelligent fashion recommendation systems and UX
Case study #2. Intelligent senior-tailored UX
The age of AI?! What is happening right now? What should we be questioning and gaining insight into?, by Billy Choi
EPISODE #1. What can AI do for dementia patients?
EPISODE #2. Intelligent HCI/UX for the home of the future?
EPISODE #3. Intelligent HCI/UX for seniors?
EPISODE #4. What can AI do for fashion artisans?
SCENARIO #1. AI and business modeling
SCENARIO #2. AI and philosophy
I. Social innovation discourse and gamification
Social-economy apartments and gamification?
Smart anchor facilities and gamification?
Community-based integrated care services for seniors and gamification?
Social-economy special zones and gamification?
Social-problem-solving innovation projects and gamification?
II. HCI/UX theories that can drive behavior change, and gamification
(The secret of the human interaction that gets seniors to keep wearing a smart band?)
Getting people to try something new
Triggering intrinsic motivation
Starting sustainability: attempting automation
Entering sustainability in earnest, launch: a 'continuous reinforcement schedule'
Entering sustainability in earnest, short-term acceleration: a 'fixed-ratio schedule'
Maintaining sustainability: a 'variable-ratio schedule'
Maintaining sustainability: forming an addictive loop
Start with something very small
Almost everything discussed so far comes down to the power of habit
Story Editing
Expert Accessory Dwelling Unit (ADU) Drafting Services, by ResDraft
Whether you're looking to create a guest house, a rental unit, or a private retreat, our experienced team will design a space that complements your existing home and maximizes your investment. We provide personalized, comprehensive expert accessory dwelling unit (ADU) drafting solutions tailored to your needs, ensuring a seamless process from concept to completion.
Top 5 Indian Style Modular Kitchen Designs, by Finzo Kitchens
Get the perfect modular kitchen in Gurgaon at Finzo! We offer high-quality, custom-designed kitchens at the best prices; wardrobes and home & office furniture are also available, along with a free consultation. All types of modular kitchens are available: U-shaped, L-shaped, G-shaped, inline, and Italian modular kitchens.
Hello everyone! I am thrilled to present my latest portfolio on LinkedIn, marking the culmination of my architectural journey thus far. Over the span of five years, I've been fortunate to acquire a wealth of knowledge under the guidance of esteemed professors and industry mentors. From rigorous academic pursuits to practical engagements, each experience has contributed to my growth and refinement as an architecture student. This portfolio not only showcases my projects but also underscores my attention to detail and to innovative architecture as a profession.
Can AI do good? at 'offtheCanvas' India HCI prelude, by Alan Dix
Invited talk at 'offtheCanvas' IndiaHCI prelude, 29th June 2024.
https://www.alandix.com/academic/talks/offtheCanvas-IndiaHCI2024/
The world is being changed fundamentally by AI and we are constantly faced with newspaper headlines about its harmful effects. However, there is also the potential both to ameliorate these harms and to use the new abilities of AI to transform society for the good. Can you make the difference?
White Wonder, work developed by Eva Tschopp, shared by Mansi Shah
A tale about our culture around the use of fertilizers and pesticides, based on visits to small farms around Ahmedabad in Matar and Shilaj.
You could be a professional graphic designer and still make mistakes. There is always the possibility of human error. On the other hand, if you're not a designer, the chances of making some common graphic design mistakes are even higher, because you don't know what you don't know. That's where this blog comes in. To make your job easier and help you create better designs, we have put together a list of common graphic design mistakes that you need to avoid.
Dive into the innovative world of smart garages with our insightful presentation, "Exploring the Future of Smart Garages." This comprehensive guide covers the latest advancements in garage technology, including automated systems, smart security features, energy efficiency solutions, and seamless integration with smart home ecosystems. Learn how these technologies are transforming traditional garages into high-tech, efficient spaces that enhance convenience, safety, and sustainability.
Ideal for homeowners, tech enthusiasts, and industry professionals, this presentation provides valuable insights into the trends, benefits, and future developments in smart garage technology. Stay ahead of the curve with our expert analysis and practical tips on implementing smart garage solutions.
20. 4 Common Biases in Customer Research
• Confirmation Bias
• Framing Effect
• Observer-expectancy Effect
• Recency Bias
21. Confirmation Bias: your tendency to search for or interpret information in a way that confirms your preconceptions or hypotheses.
22. Framing Effect: when you and your team draw different conclusions from the same data based on your own preconceptions.
23. Observer-expectancy Effect: when you expect a given result from your research, which makes you unconsciously manipulate your experiments to give you that result.
24. Recency Bias: results from disproportionate salience attributed to recent observations (your very last interview), or the tendency to weight more recent information more heavily than earlier observations.
30. You need to gather:
• Factual information
• Behavior
• Pain points
• Goals
You can document this on the persona validation board, as well as photos, video, audio, journals… document everything.
41. Methodology: Contextual Observation / Ethnography
► Business problem: How are people actually using products versus how they were designed?
► Description: In-depth, in-person observation of tasks & activities at work or home. Observations are recorded.
► Benefits: Access to the full dimensions of the user experience (e.g. information flow, physical environment, social interactions, interruptions, etc.)
► Limitations: Time-consuming research; travel involved. The smaller sample size does not provide statistical significance, and data analysis can be time-consuming.
► Data: Patterns of observed behavior and verbatims based on participant responses, transcripts and video recordings
► Tools: LiveScribe (for combining audio recording with note-taking)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
45. Methodology: Remote Ethnography
► Business problem: How are people actually using products in their environment in real time?
► Description: Participants self-record activities over days or weeks with pocket video cameras or mobile devices, based on tasks provided by the researcher.
► Benefits: Allows participants to capture activities as they happen and where they happen (away from the computer), without the presence of observers. Useful for longitudinal research & geographically spread participants.
► Limitations: Dependence on participants' ability to articulate and record activities; relatively high ratio of data analysis to small sample size.
► Data: Patterns based on participant responses, transcripts and video recordings
► Tools: Qualvu.com
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
46. Methodology: Large-Sample Online Behavior Tracking
► Business problem: Major redesign of a large, complex site that is business-critical?
► Description: 200-10,000+ respondents do tasks using online tracking / survey tools
► Benefits: Large sample size, low cost per respondent, extensive data possible
► Limitations: No direct observation of users; survey design is complex, among other issues
► Data: You name it (data exports to professional analysis tools).
► Tools of choice: Keynote WebEffective, UserZoom
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
47. Methodology: Lab-Based UX Testing
► Business problem: Are there show-stopper (CI) usability problems with your user experience?
► Description: 12-24 respondents undertake structured tasks in a controlled setting (lab)
► Benefits: Relatively fast, moderate cost, very graphic display of major issues
► Limitations: Small sample, study design, recruiting good respondents
► Data: Summary data in tabular and chart format PLUS video out-takes
► Tools: Leased testing room, recruiting service and Morae (industry standard)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
49. Methodology: Eye-Tracking
► Business problem: Do users see critical content, and in what order?
► Description: Respondents view content on a specialized workstation or glasses.
► Benefits: Very accurate tracking of eye fixations and pathways.
► Limitations: Relatively high cost, analysis is complex, data can be deceiving.
► Data: Live eye fixations, heat maps…etc.
► Tools of choice: Tobii, SMI
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
50. Methodology: Automated Online Card Sorting
► Business problem: Users cannot understand where the content they want is located?
► Description: Online card sorting based on terms you provide (or users create)
► Benefits: Large sample size, low cost, easy to field
► Limitations: Sorting tools can confuse users; data can be hard to understand
► Data: Standard cluster analysis charts and more
► Tools of choice: WebSort…and others
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
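The "standard cluster analysis charts" produced by card-sorting tools are typically built from a card-pair similarity matrix: for each pair of cards, the fraction of participants who placed them in the same group. A minimal sketch of that first step, using hypothetical card names and sort results (not data from any tool named above):

```python
from itertools import combinations

# Each participant's sort: a list of groups, each group a set of card names.
# Hypothetical results from three participants.
sorts = [
    [{"pricing", "billing"}, {"login", "profile"}],
    [{"pricing", "billing", "profile"}, {"login"}],
    [{"pricing", "billing"}, {"login", "profile"}],
]

def similarity_matrix(sorts):
    """Fraction of participants who placed each pair of cards together."""
    cards = sorted({card for sort in sorts for group in sort for card in group})
    sim = {}
    for a, b in combinations(cards, 2):
        together = sum(
            any(a in group and b in group for group in sort) for sort in sorts
        )
        sim[(a, b)] = together / len(sorts)
    return sim

sim = similarity_matrix(sorts)
# "pricing" and "billing" were grouped together by every participant:
print(sim[("billing", "pricing")])  # 1.0
```

A dendrogram is then obtained by agglomeratively clustering cards on (1 − similarity); card-sorting tools automate exactly this pipeline.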
51. Methodology: fMRI (Brain Imaging)
► Business problem: What areas of the brain are being activated by a UX design?
► Description: Respondents are given visual stimuli while in an fMRI scanner
► Benefits: Maps design variables to core functions of the human brain
► Limitations: Expensive, and data can be highly misleading
► Data: Brain scans
► Tools: Major medical centers and research services (some consultants)
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
52. Methodology: Professional Heuristics
► Business problem: Rapid feedback on UX design based on best practices or opinions
► Definition: "A heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions (same root as: eureka)"
► Benefits: Fast, low cost, can be very effective in some applications
► Limitations: No actual user data; the analysis is only as good as the expert doing the audit
► Data: Ranging from verbal direction to highly detailed recommendations
► Tools of choice: Written or verbal descriptions and custom tools by each expert
Cost / respondent: NA
Statistical validity: None – Some – Extensive
53. Methodology: Focus Groups
► Business problem: What are the perceptions and ideas around products/concepts?
► Description: Moderated discussion group to gain concept/product feedback and inputs; can include screens, physical models and other artifacts
► Benefits: Efficient method for understanding end-user preferences and for getting early feedback on concepts, particularly for physical or complex products that benefit from hands-on exposure and explanation
► Limitations: Lacks realistic context of use; influence of participants on each other
► Data: Combination of qualitative observations (like ethnographic research) with quantitative data (e.g. ratings, surveys)
► Tools: See qualitative data analysis
Cost / respondent: Low – Moderate – High
Statistical validity: None – Some – Extensive
54. A/B Testing
What: A testing procedure in which two (or more) different designs are evaluated in order to see which one is the most effective. Alternate designs are served to different users on the live website.
Why: Can be valuable in refining elements on a web page. Altering the size, placement, or color of a single element, or the wording of a single phrase, can have dramatic effects. A/B testing measures the results of these changes.
Resources: A/B testing is covered in depth in the book Always Be Testing: The Complete Guide to Google Website Optimizer by Bryan Eisenberg and John Quarto-von Tivadar. http://www.testingtoolbox.com/
You can also check out the free A/B testing tool Google Website Optimizer. https://www.google.com/analytics/siteopt/preview
Photo: http://www.flickr.com/photos/danielwaisberg/
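To decide whether a measured difference between variants is a real effect rather than noise, A/B results are commonly compared with a two-proportion z-test. A minimal sketch on hypothetical conversion counts (the numbers and the 0.05 threshold are illustrative, not from the slide):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0: no difference
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: does variant B's reworded button convert better?
z, p = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # reject H0 at the 0.05 level if p < 0.05
```

Dedicated tools (such as the Google Website Optimizer mentioned above) perform an equivalent significance calculation behind their dashboards.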
55. Kano Analysis
What
Survey method that determines
how people value features and
attributes in a known product
domain. Shows what features are
basic must-haves, which features
create user satisfaction, and which
features delight.
Why
Allows quantitative analysis of
feature priority to guide
development efforts and
specifications. Ensures that
organization understands what is
valued by users. Less effective for
new product categories.
Kano Analysis
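Kano surveys typically ask a functional question ("How do you feel if the feature is present?") and a dysfunctional question ("...if it is absent?"), each on a five-point scale, and map each answer pair to a category. A minimal classifier sketch assuming that standard questionnaire format (the "dark mode" responses are made up):

```python
from collections import Counter

ANSWERS = ["like", "expect", "neutral", "tolerate", "dislike"]

def classify(functional, dysfunctional):
    """Map one respondent's (functional, dysfunctional) answer pair to a
    Kano category via the standard evaluation table:
    A=Attractive, O=One-dimensional, M=Must-be, I=Indifferent,
    R=Reverse, Q=Questionable."""
    f, d = ANSWERS.index(functional), ANSWERS.index(dysfunctional)
    if f == 0:  # likes the feature being present
        return "Q" if d == 0 else ("O" if d == 4 else "A")
    if f == 4:  # dislikes the feature being present
        return "Q" if d == 4 else "R"
    if d == 4:  # neutral about presence, hates absence
        return "M"
    return "R" if d == 0 else "I"

def kano_category(responses):
    """Majority category across respondents for one feature."""
    counts = Counter(classify(f, d) for f, d in responses)
    return counts.most_common(1)[0][0]

# Hypothetical survey data for a "dark mode" feature
responses = [("like", "neutral"), ("like", "tolerate"),
             ("neutral", "neutral"), ("like", "dislike")]
print(kano_category(responses))  # majority category is "A" (Attractive)
```

Must-be ("M") features belong in the first release; Attractive ("A") features are the delighters worth spreading across later releases.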
56. Six Thinking Hats
What
A tactic that helps you look at decisions
from a number of different perspectives.
The white hat focuses on data; the red on
emotion; the black on caution; the yellow
on optimism; the green on creativity; and
the blue on process.
Why
Can enable better decisions by
encouraging individuals or teams to
abandon old habits and think in new or
unfamiliar ways. Can provide insight into
the full complexity of a decision, and
highlight issues or opportunities which
might otherwise go unnoticed.
Resources
Lateral thinking pioneer Edward de Bono
created the Six Thinking Hats method.
http://www.edwdebono.com/
An explanation from Mind Tools.
http://www.mindtools.com/pages/article/newTED_07.htm
Six Thinking Hats
http://www.flickr.com/photos/daijihirata/
60. What is Ethnography?
• Defined as:
– a method of observing human interactions in social
settings and activities (Burke & Kirk, 2001)
– as the observation of people in their ‘cultural context’
– the study and systematic recording of human cultures;
also : a descriptive work produced from such research
(Merriam-Webster Online)
• Rather than studying people from the outside, you
learn from people from the inside
61. (Anderson, 1997; Malinowski, 1967; 1987; Kuper 1983)
Who Invented Ethnography?
• Invented by Bronislaw Malinowski in 1915
– Spent three years on the Trobriand Islands (New
Guinea)
– Invented the modern form of fieldwork and
ethnography as its analytic component
62. (Salvador & Mateas, 1997)
Traditional VS Design Ethnography
Traditional
• Describes cultures
• Uses local language
• Objective
• Compare general
principles of society
• Non-interference
• Duration: Several Years
Design
• Describes domains
• Uses local language
• Subjective
• Compare general
principles of design
• Intervention
• Duration: Several
Weeks/Months
63. Contextual inquiry is a field data-gathering technique
that studies a few carefully selected individuals in
depth to arrive at a fuller understanding of the work
practice across all customers.
Through inquiry and interpretation, it reveals
commonalities across a product’s customer base.
What is contextual inquiry?
~ Beyer & Holtzblatt
64. Contextual Inquiry:
When to do it
Every ideation and design cycle should start with a
contextual inquiry into the full experience of a customer.
Contextual inquiry clarifies and focuses the problems a
customer is experiencing by discovering:
• The precise situation in which the problems occur.
• What the problems entail.
• How customers go about solving them.
65. What is your focus?
Who is your audience?
Recruit & schedule participants
Learn what your users do
Develop scenarios
Conduct the inquiry
Interpret the results
Evangelize the findings
Rinse, repeat (at least monthly)
Contextual Inquiry:
How to do it
66. (Nielsen, 2002)
Dos & Don’ts
Don’t
• Ask simple Yes/No
questions
• Ask leading questions
• Use unfamiliar jargon
• Lead/guide the ‘user’
Do
• Ask open-ended questions
• Phrase questions properly
to avoid bias
• Speak their language
• Let user notice things on
his/her own
67. Analyzing
the results
“The output from customer research is not a
neat hierarchy; rather, it is narratives of
successes and breakdowns, examples of use
that entail context, and messy use artifacts”
Dave Hendry
68. Research Analysis
What are people’s values?
People are driven by their social and cultural contexts as much as their rational
decision making processes.
What are the mental models people build?
When the operation of a process isn’t apparent, people create their own models of
it.
What are the tools people use?
It is important to know what tools people use since you are building new tools to
replace the current ones.
What terminology do people use to describe what they do?
Words reveal aspects of people’s mental models and thought processes
What methods do people use?
Flow of work is crucial to understanding what people’s needs are and where
existing tools are failing them.
What are people’s goals?
Understanding why people perform certain actions reveals an underlying
structure of their work that they may not be aware of themselves.
69. Affinity
Diagrams
“People from different teams engaged in affinity
diagramming is as valuable as tequila shots and
karaoke. Everyone develops a shared
understanding of customer needs, without the
hangover or walk of shame”
70. Research Analysis: Affinity Diagrams
Creates a hierarchy of all observations, clustering them
into themes.
From the video observations, 50-100 singular
observations are written on post-its
(observations ranging from tools, sequences, interactions,
work-arounds, mental models, etc)
With the entire team, notes are categorized by relation into
themes and trends.
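The result of an affinity session is essentially a mapping from themes to observations. The themes come from the team's discussion, not from software, but the resulting hierarchy can be captured as simple data; a toy sketch with made-up observation notes:

```python
from collections import defaultdict

# Hypothetical post-it notes tagged with the theme the team agreed on
# during the affinity session. The clustering itself is a human activity;
# this only records and summarizes its outcome.
notes = [
    ("uses Excel to track orders", "work-arounds"),
    ("re-enters the same data twice", "work-arounds"),
    ("thinks 'sync' means 'backup'", "mental models"),
    ("keeps a paper checklist", "tools"),
]

themes = defaultdict(list)
for observation, theme in notes:
    themes[theme].append(observation)

for theme, obs in themes.items():
    print(f"{theme} ({len(obs)} notes): {obs}")
```

Counting notes per theme is a quick way to see which trends carried the most observational weight before writing up findings.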
75. Users record thoughts,
comments, etc. over time
http://www.flickr.com/photos/vanessabertozzi/877910821
http://www.flickr.com/photos/yourdon/3599753183/
http://www.flickr.com/photos/stevendepolo/3020452399/
http://www.flickr.com/photos/jevnin/390234217/
Interview users
Gather feedback, data
Organise and analyse
(affinity maps, analytics)
76. Participants keep a record of
“When” data
Date & time
Duration
Activity / task
“What” data
Activity / task
Feelings / mood
Environment / setting
77. No one right way to collect data
Structured
Yes/no
Select a category
Date & time
Multiple choice
Unstructured
Open-ended
Opinions / thoughts / feelings
Notes / comments
Combine / mix & match
http://www.flickr.com/photos/roboppy/9625780/
http://www.flickr.com/photos/vanessabertozzi/877910821
78.
79. “Hygiene” aspects
At the beginning
•Introduction / get-to-know-you
•Demographics & psychographics, profiling
•Instructions / Setting expectations
At the end
•Follow-up
•Thanks / token gift
•Reflection
92. Usability Tests
1. Identify 3-5 tasks to test
2. Observe test participants performing the tasks
3. Identify the 2-3 easiest things to fix
4. Make changes to the site
5. Start a new test
93. Identify Tasks for the Test
• Known problem areas
• Most common activities
• Popular pages
• New pages or services
96. Staff of One
• Recruits test participants
• Runs the test
• Records the test (screen recording software & mic)
• Preps the test environment before & after the test
97. Staff of Two
#1:
• Recruits test participants
• Runs the test
#2:
• Observes the test
• Preps the test environment before & after each test
100. New technologies and
techniques allow for
Remote:
– Moderated testing
– Unmoderated testing
– Observation
Irrelevance of
Place
101. Remote Moderated Testing
Products like GoToMeeting connect the test (or
observation) computer over the Internet; VoIP can
carry voice cheaply.
For screen sharing: LiveMeeting, WebEx, GoToMeeting
For VoIP audio: Skype, GoogleTalk
Setup roles: Moderator, Participant, Observers, Translator
102. Remote Unmoderated Testing
‘Task-based’ Surveys
> Online/remote Usability Studies
(unmoderated)
> Benchmarking (competitive /comparison)
> UX Dashboards (measure ROI)
Online Card Sorting
> Open or closed
> Stand-alone or
> Integrated with task-based
studies & surveys
Online Surveys
> Ad hoc research
> Voice of Customer studies
> Integrated with Web Analytics data
User Recruiting Tool
> Intercept real visitors (tab or layer)
> Create your own private panel
> Use a panel provider*
Robust Set of Services
103. Why Should You Care?
• Saves time
o A lab study takes 2-4 weeks from start to finish; unmoderated typically takes hours to
a few days*
• Saves money
o Participant compensation is typically a lot less ($10 vs. $100)
o Tools are becoming very inexpensive
• Reliable metrics
o The only (reasonable) way to collect UX data from large sample sizes
• Geography is not a limitation
o Collect feedback from customers all over the world
• Greater customer insight
o The richest dataset about the customer experience
104. Overview
Common Research Questions:
• What are the usability issues, and how big are they?
• Which design is better, and by how much?
• How do customer segments differ?
• What are users’ design preferences?
• Is the new design better than the old design?
• Where are users most likely to abandon a transaction?
Types of Studies:
• Comprehensive evaluation
• UX benchmark
• Competitive evaluation
• Live site vs. prototype comparison
• Feature/function test
• Discovery
Typical Metrics:
• Task success
• Task time
• Self-report ratings such as ease of use,
confidence, satisfaction
• Click paths
• Abandonment
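Task success from the small-to-medium samples typical of these studies is usually reported with a confidence interval rather than a bare percentage. One common choice is the adjusted-Wald interval; a minimal sketch with hypothetical numbers:

```python
import math

def adjusted_wald_ci(successes, n, z=1.96):
    """95% adjusted-Wald (Agresti-Coull) confidence interval for a
    task-success rate; behaves well even for small usability samples."""
    n_adj = n + z * z                       # inflate n by z^2
    p_adj = (successes + z * z / 2) / n_adj # shift the point estimate
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    return max(0.0, p_adj - margin), min(1.0, p_adj + margin)

# Hypothetical: 7 of 10 participants completed the checkout task
low, high = adjusted_wald_ci(7, 10)
print(f"observed 70% success, 95% CI roughly {low:.0%} to {high:.0%}")
```

The wide interval for n=10 is the point: it tells stakeholders how much (or little) the benchmark number can be trusted before comparing designs.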
113. STEPS IN A CARD SORT
1. Decide what you want to learn
2. Select the type of card sort (open vs closed)
3. Choose suitable content
4. Choose and invite participants
5. Conduct the sort (online or in person)
6. Analyze the results
7. Integrate the results
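The analysis step usually starts from a card-by-card co-occurrence count (how often two cards landed in the same pile), which cluster analysis is then run on. A minimal sketch with made-up sorts from two participants:

```python
from collections import defaultdict
from itertools import combinations

def cooccurrence(sorts):
    """Count how often each pair of cards was placed in the same group,
    across all participants' sorts. This matrix is the input that
    hierarchical cluster analysis is typically run on."""
    pairs = defaultdict(int)
    for groups in sorts:              # one participant's sort
        for group in groups.values():  # the cards in one pile
            for a, b in combinations(sorted(group), 2):
                pairs[(a, b)] += 1
    return pairs

# Hypothetical open sorts: group label -> cards
sorts = [
    {"HR": ["Vacation Policy", "Pay Days"], "Events": ["Christmas Party"]},
    {"People": ["Vacation Policy", "Pay Days", "Christmas Party"]},
]
pairs = cooccurrence(sorts)
for pair, n in sorted(pairs.items(), key=lambda kv: -kv[1]):
    print(pair, n)
```

Pairs with high counts (here "Vacation Policy" and "Pay Days") are strong candidates for living under the same IA category.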
114. WHAT DO YOU WANT TO LEARN?
• New Intranet vs Existing?
• Section of Intranet?
• Whole organization vs single department?
• For a project? For a team?
115. OPEN VS CLOSED
Example cards: Product Targets, CRM Project Review, CRM Organization Chart,
Christmas Party, Walkathon Results, Year in Review Meeting, Vacation Policy,
Pay Days, Vacation request form.
OPEN SORT: participants group the cards and name the groups themselves
(e.g. Company News, Departments, Human Resources, Projects).
CLOSED SORT: participants sort the same cards into predefined categories
(e.g. Company News, Events, Human Resources, Projects).
116. SELECTING CONTENT
Do’s
•30 – 100 Cards
•Select content that can be
grouped
•Select terms and concepts
that mean something to
users
Don’ts
• More than 100 cards
• Mix functionality and
content
• Include both detailed and
broad content
118. LOOK AT
• What groups were created
• Where the cards were placed
• What terms were used for labels
• Organization scheme used
• Whether people created accurate or inaccurate groups
119. INTEGRATE RESULTS: CREATE YOUR IA
Our Company: Executive Blog, New York, Vancouver, Mission and Values
Projects: Project Name 1, Project Name 2, Project Name 3, Project Name 4
Departments: Executive, Operations, Operations Support, Vessel Planning, Yard Planning, Rail Planning, Finance & Administration, Human Resources, Corporate Communications, IT
Community & Groups: Events, Charitable Campaigns, Vancouver Carpool
Employee Resources: Vacation & Holidays, Expenses, Travel, Health & Safety, Wellness, Benefits, Facilities, Payroll, Communication Tools
Centers of Excellence: Project Management Professionals, Engineering, Terminal Technologies, NAVIS, Lawson, IT, Yard Planning
122. Card Sorting is as common as Lab-based Usability
Testing
Source: 2011 UXPA Salary Survey
123. Terms & Concepts
• Open Sort: Users sort items into groups and give the
groups a name.
• Closed Sort: Users sort items into previously defined
category names.
• Reverse Card Sort (Tree Test) : Users are asked to locate
items in a hierarchy (no design)
• Most Users Start Browsing vs Searching: Across 9
websites and 25 tasks we found on average 86% start
browsing
http://www.measuringusability.com/blog/card-sorting.php
http://www.measuringusability.com/blog/search-browse.php
125. Set-up of an eye tracking test
User tests are often run in 45 to 60
minute sessions with 6 to 15
participants:
1. Participants are given a number of
typical tasks to complete, using the
website, design or product you want
to test.
2. The user’s intuitive interaction is
observed, comments and reactions
are recorded.
3. The participant’s impressions are
captured in an interview at the end
of the test.
126. Eye tracking results: Heatmaps
Heatmaps show what participants
focus on.
In this example, ‘hot spots’ are the
picture of the shoes, the central entry
field and the two right-hand tiles
underneath.
The data of all participants is
averaged in this map.
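Under the hood, a heatmap is just fixation data from all participants aggregated onto a grid. A simplified sketch with made-up fixation points (real eye-tracking software also applies Gaussian smoothing before rendering colors; here cells simply accumulate fixation duration):

```python
def heatmap(fixations, width, height, cell=100):
    """Bin (x, y, duration_ms) fixations from all participants into a
    coarse grid of cells; high-count cells are the 'hot spots'."""
    cols, rows = width // cell, height // cell
    grid = [[0] * cols for _ in range(rows)]
    for x, y, duration_ms in fixations:
        row = min(y // cell, rows - 1)
        col = min(x // cell, cols - 1)
        grid[row][col] += duration_ms
    return grid

# Hypothetical fixations pooled across participants: (x, y, duration in ms)
fixations = [(120, 80, 300), (130, 90, 250), (620, 400, 180)]
g = heatmap(fixations, width=800, height=600)
for row in g:
    print(row)
```

Two nearby fixations land in the same cell and reinforce it, which is exactly how averaging over participants produces the concentrated hot spots described above.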
127. Eye tracking results: Gazeplot
Gaze plots show the ‘visual path’ of
individual participants. Each bubble
represents a fixation.
The bubble size denotes the length
or intensity of the fixation.
Additional results are available in
table format for more detailed
analysis.
128. The key visual and a box at the bottom
Note: Telstra Clear have since re-designed their homepage.
The key
visual got
lots of
attention.
Surprising: This box got
heaps of attention. It
reads:
“If you are having trouble
getting through to us on
the phone, please click
here to email us, we’ll get
back to you within 2
business days”.
Participants got the
impression that Telstra Clear
has trouble with their
customer service.
The main
navigation and
its options got
almost no
attention.
129. The Face effect – an example
bunnyfoot
Yep, there’s attention on certain… areas.
The face, however, is the strongest point of focus!
130. Using the Face effect
humanfactors.com
Eye tracking results for ad Version
A:
We see a face effect: The model’s face
draws a lot of attention.
The slogan is the other hot spot of the
design. Participants will likely have read
it.
The product and its name get some,
but not a lot of attention.
131. Using the Face effect
Eye tracking results for ad Version
B:
Again, we see a strong face effect. BUT:
In this version, the model’s gaze is in line
with the product and its name.
The product image and name get
considerably more attention!
Additionally, even the product name at
the bottom is noticed by a number of
participants.
humanfactors.com
132. Ways to focus attention
usableworld.com.au
Same effect: If the baby faces you, you’ll look at the baby. But if the baby faces the ad
message, you pay attention to the message. You basically follow the baby’s gaze.
133. Banner blindness
… or are they?
In this test, participants were
given a task: Find the nearest
ATM.
Participants focused on the
main navigation and the
footer navigation – this is
where they found the ‘ATM
locator’.
So, when visiting a site with a
task in mind – as you
normally do – the central
banner can be ignored!
134. Compare the visual paths: Task versus browse
When browsing, the central banner gets lots of attention. But how often do you visit a bank
website just to browse?
Participant was asked just to look at the homepage / Participant was given a task (‘Find the nearest ATM’)
135. Main focus: Navigation options
Eye tracking results show:
When looking for
something on a
website, the main
focus of attention are
the navigation options.
Maybe users have learned
that they’re unlikely to
find what they’re looking
for in a central banner
image.
Task: ‘What concerts are happening in Auckland this month?’ / Task: ‘You want to send an email to customer service’
136. Task: ‘You want to get in touch with customer service’
When do users look at banners?
In this example, participants looked at the banner even though they were looking for
something specific. What’s different?
Participant was asked just to look at the homepage
139. 1. Visibility of system status
2. Match between system and real world
3. User control and freedom
4. Consistency and standards
5. Error prevention
6. Recognition rather than recall
7. Flexibility and efficiency of use
8. Aesthetic and minimalist design
9. Help users recognize, diagnose, and recover from errors
10. Help and documentation
J. Nielsen and R. Mack, eds. Usability Inspection Methods, 1994
Nielsen’s 10 heuristics
141. HE output
• A list of usability problems
• Tied to a heuristic or rule of practice
• A ranking of findings by severity
• Recommendations for fixing problems
• Oh, and the positive findings, too
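One way to keep such a findings list sortable by severity is to record each finding as structured data. A hypothetical sketch (the two findings are invented examples, using Nielsen's 0-4 severity scale where 4 is a usability catastrophe):

```python
from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    heuristic: str       # which of Nielsen's 10 heuristics it violates
    severity: int        # 0 = not a problem ... 4 = usability catastrophe
    recommendation: str

# Made-up findings for illustration
findings = [
    Finding("No feedback after saving a record",
            "Visibility of system status", 3,
            "Show a confirmation message after save"),
    Finding("'Commit' used where users expect 'Save'",
            "Match between system and real world", 2,
            "Use the users' own vocabulary"),
]

# Rank findings by severity, worst first, as an HE report does
ranked = sorted(findings, key=lambda f: -f.severity)
for f in ranked:
    print(f"[{f.severity}] {f.heuristic}: {f.description} -> {f.recommendation}")
```

Tying each problem to a named heuristic and a concrete recommendation is what turns a raw list of complaints into the actionable HE output described above.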
143. HE output (example finding)
Finding Description: Objectives/goals for the modules. Covers: reason content is being presented; conciseness of presentation; definitions required to work with the module/content; evaluation criteria and methods; direct tie between content and assessment measure; sequence of presentation follows logically from introduction; quizzes challenge users.
Recommendation: Develop a consistent structure that defines what’s noted in the points above. Avoid generic statements that don’t focus users on what they will be accomplishing. Advise that there is an assessment used for evaluation and indicate if it’s at the end or interspersed in the module. Connect ideas in the goals and objectives with outcomes in the assessment. Follow the order of presentation defined at the beginning. Develop interesting and challenging questions. Re-frame goals/objectives at the end of the module.
Severity Rating: 3
Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives.
H = Hyperspace; C = Cardiac Arrest; S = Shock
Business question: Is anyone working on a major (large-scale) site launch or redesign that your company depends on for survival?
Audience question: “How many of you have a project at the point where it is ready for major commitment? (round A, new release, major new upgrade)” I have a web site, software or product and I am about to commit major funding or resources to the next phase of development. Do I have usability problems with the user experience that are basically show-stoppers? Users cannot download the application. Users cannot log in. Users cannot set up a profile page. Users cannot navigate to critical content. 1-3 critical tasks in 60 min.
Business question: How are users actually viewing your content (in what order, for how long and in what specific pattern or pathways)? Audience question: Have you wondered if critical links, buttons or content messaging is being viewed on a critical page? Description: this methodology is very useful when trying to determine why certain homepage metrics from analytics programs are of concern (not clicking on the value proposition element, etc.). The respondent sits at a specialized computer screen and undergoes a simple calibration sequence. The respondent is given a stimulus question or task (active or passive), e.g. show the homepage for a set period of time (15 seconds). The system tracks eye pathways and fixations and produces a data file from that task. Important things to know about eye-tracking: Tobii is not designed for web sites or changing visual stimuli. This makes actual testing of web navigation (changing from page to page) very complex to analyze and not accurate. Very effective for single stimulus presentations of fixed durations. Excellent for detailed analysis of home pages or critical landing pages and forms. Very insightful for assessing the impact of advertising on homepage visual scanning.
Business problem: How do I organize information like content, navigation and the overall IA so that users understand it? Description: this is an automated version of the classic card sorting studies where you give users a pile of index cards with your content descriptions on them and ask them to sort the cards into groups according to how they relate to the content. Example: if I have a bunch of content categories, how do I determine what the groupings are and the high-level navigation labels? Let’s say you have a site selling women’s underwear and you want to create a navigation structure that matches the users’ mental model. Do you organize the site by type of underwear at the top level, then styles and colors; or do you organize the site navigation by lifestyle (athletic, everyday, intimate) and then by type of article, color, and price? Respondents are invited to an online study via email. When they agree, they encounter a screen with a list of labels or terms in one column and are asked to sort the terms into groups they find organizationally relevant. When they are finished you can give them another card sort or just finish the study. When the required number of respondents have finished the card sort you can view the data. Card sorting data is analyzed through the application of cluster analysis (not that easy to understand but very useful).
Business question: Do any of you have a new development team that has minimal UX/usability experience? Is your team employing best practices, and are they aware of the key UX and usability performance issues that an effective solution must meet? Description: a highly experienced usability/UI design expert conducts a structured audit of your system or product and rates the system on best practices and estimated performance. Interview and select an expert who has direct experience in your product category and sector. The expert gathers information from your development team and conducts a structured audit based on predetermined best practices. The expert presents findings to your team (sometimes not a happy experience for UX design teams without knowledge of formal UCD methods). Very effective early in development, and can be repeated with updates at less cost.
Pattern Name: A/B Testing. Classification: Continuous Improvement. Intent: Can be valuable in refining elements on a web page. Altering the size, placement, or color of a single element, or the wording of a single phrase can have dramatic effects. A/B Testing measures the results of these changes.
Pattern Name: Kano Analysis. Classification: Business Requirements Management. Intent: Allows quantitative analysis of feature priority to guide development efforts and specifications. Ensures that the organization understands what is valued by users. Less effective for new product categories. Also Known As: Kano Model. Motivation (Forces): You need to categorize features by which are basic must-haves, which create user satisfaction, and which delight. Applicability: You have a list of business requirements, but you know that in the current phase of the project you will not be able to get everything done. You are going to use a cycle methodology, and you need to know which features users will want as basic must-haves, which features will excite them, and which are low-impact features. In any given release you will want to include at least one delightful/exciting feature. Additionally, on your first release you will probably want to include as many basic/must-have features as possible. Use Kano Analysis to identify which features are which. Participants: Potential Users, Surveyor. Consequences: This tool tells you about user perceptions. Remember this limitation; you might want to measure something else. Implementation: Survey method that determines how people value features and attributes in a known product domain. Shows what features are basic must-haves, which features create user satisfaction, and which features delight.
Pattern Name: Six Thinking Hats. Classification: Business Requirements Management. Intent: Can enable better decisions by encouraging individuals or teams to abandon old habits and think in new or unfamiliar ways. Can provide insight into the full complexity of a decision, and highlight issues or opportunities which might otherwise go unnoticed.
Note: give an example here
Usability tests are really not such a big deal. Here’s a quick overview of the steps: Come up with a set of 3-5 different tasks that you’ll ask users to perform. Round up some 5-10 volunteers who will act as test participants, and then bring them one at a time into a testing area where you’ll observe them as they perform the predetermined tasks. After you’ve observed all the test participants, you’ll have a pretty good idea of some things that need to be fixed and what things seem to be working OK. After you make the easiest 2-3 fixes, go back and do another round of testing and tweaking, etc.
OK, so you now have an idea about what service or resource you’re going to test; next you’ll want to think about what actual tasks you want your test participants to do. You’ll want to pick tasks that are going to reveal some useful information to you. One obvious place to go looking for tasks is those pages or services that you and your colleagues already know need work, such as your interlibrary loan form or the way that library hours are displayed. Another strategy is to think about the most common activities among patrons in your library. Take a look at your site statistics to see which pages are the most popular. Maybe that’s where you want to do your testing. Or maybe you’re about to launch a new page or service. Those are great opportunities for testing.
OK. So the gear you need is not too complicated. You’ll need a computer; a desktop or a laptop will do. Last year, I had test participants use my smartphone when I was testing a mobile web site. If you really want to get serious about user-centered design, you may want to do usability testing on paper sketches that precede any actual website coding. This is perfectly acceptable and commonly done. It’s a great way to run tests that will help you catch basic page layout and site architecture problems. You’ll also want to install some screen recording software on the computer that your test participants use. That way, you can capture as a movie all the mouse movements, page clicks, and characters typed; this is really rich data to return to when the tests are done and you are trying to write up your report. I’ll talk in a minute about software options. Another option that has worked for me is to simply have a second person on hand helping you with the test. That person’s sole responsibility is to closely observe the test participant and take detailed notes. Finally, if you have screen recording software, you might as well get a USB microphone that can capture the conversation between the test participant and the test facilitator. You’ll want to encourage the participant to think aloud as much as possible as they perform tasks.
Here are five options for screen recording software. I’ve used CamStudio a lot, mostly because it is free and can be installed on any machine. With the others, you’ll get a much richer feature set but will be limited by the number of machines you can install it on.
OK, so if you are doing the tests all by your lonesome (not the best situation but certainly still doable), you’ll be in charge of recruiting test participants, running the test, recording the test (you’ll definitely need screen recording software and a mic), and for prepping the test environment.
If you can get another person to help you out with the testing, you can break up the tasks in rational ways.
It’s essential that you ask the participant to speak aloud so you can hear them express any frustrations or surprises they’ve had.
Saves time: very fast, thousands on panels. Saves money: the essence of quick and dirty. Techniques for dealing with noise; unrealistic to be in the lab that long. Combines both qual/quant, and attitudes and behavior.
All the flexibility you need to set up a study and analyze the data. Significant support in designing the study and the analysis. Pricing is all project-based and typically very expensive; a good choice for a large benchmark study.
- Sort into groups
OPEN SORT: good for getting ideas on groups of content. CLOSED: useful to see where people would put the content.
Card sorting as a method in HCI largely took off during the internet boom of the late 1990s with the proliferation of website navigation.
Today it’s one of the most popular methods UX professionals use. In fact, practitioners report using Card Sorting as frequently as task oriented lab-based usability testing.
This effect can be used to direct attention, for example on an ad. Here two different versions of an ad were eye-tracked. In this case, the model is looking directly at the viewer.
And in this version, the model looks at the product, forming a straight line between her eye and the product name on the package.
Using the cards post-task or post-test: the participant walks the table, chooses, and returns to discuss meaning. Log comments for later analysis.