The Completed Quests page clearly shows the total points earned for each completed assignment as well as the total points earned for the course so far. Participants found this breakdown of points intuitive and helpful for understanding their progress in the class.
2. Executive Summary
Queso is a learning management system. Queso provides students and instructors with a platform to share content, class materials, instructor feedback, and point-of-semester grade level.
The usability test type is a problem-identification usability test. We conducted this test with the objective of identifying usability problems in the desktop version of Queso from the student profile. The 6 participants (4 expert users and 2 novices) completed 5 tasks.
Moderators found and categorized 22 problems after testing primary functionality, along with 5 positive attributes of Queso to keep.
The usability metrics measured were satisfaction scores, efficiency, and effectiveness rates. The mean satisfaction score is 47.5, the mean completion rate across tasks is 67.8%, and the mean task time is 126.2 seconds. As a group, the team aggregated the findings and made redesign suggestions to improve the usability of the site. The main issues Queso faces are as follows:
● The language used in the Queso system was not understandable to users
● Poor information architecture impacts navigation and content organization
● Missing affordances to accomplish goals
● Visual design is unclear and does not show where the user is in the system
The next steps are to give the results to the developers and improve usability through further research and a redesign of the site.
3. Introduction
Full Product Description
Queso is a learning management system (LMS). Queso's main competitor in the market space is Blackboard, one of the most widely used LMSs available.
Queso is used in universities and colleges, in both undergraduate- and graduate-level classrooms. The intended users of this learning management system are both instructors and students.
Queso provides students and instructors with a platform to share content, class materials, instructor feedback, and, for students, point-of-semester grade level. The goal of Queso's design and user interface is to provide a game-like, playful experience for its intended audience.
4. Introduction
Terminology and Language Used in Queso
Quest: An assignment with guidelines and a deadline, created by instructors for students to submit.
Projected grade: A visualization of the current grade in the course out of the total points available; the grade the student will likely earn if the current percentage of points earned continues.
Current Level: The grade earned in the class so far during the semester, out of the total assignments available.
Feedback: Comments from the instructor about submitted assignments.
Course content: This function allows instructors to post the syllabus, slides, videos, and articles on a feed for all students to view and access.
Available assignment (Available Quest): This feature allows assignments posted by instructors to be submitted online by students. Students can see the due date and the points each assignment is worth.
Completed assignment (Completed Quest): This function shows feedback from instructors on graded assignments. Students see their assignment grade and optional feedback, and are given the option to revise and resubmit.
Current Grade (Progress): This function allows students to see their current grade earned from all assignments in the course. They can look up the current grade and the total grade for individual assignments.
5. Introduction
Test Objectives
Gather data about the overall usability of Queso. The goals of this study are to:
● Assess the overall usability of Queso for target users performing the most frequently used and most troublesome tasks, and identify obstacles and usability problems with:
○ Assignment submission
○ Revising a submitted assignment
○ Reviewing an assignment grade
○ Viewing course progress
○ Reviewing instructors' feedback on assignments
● Provide recommendations to redesign the site
7. Methods
Participants
Characteristic: Number of participants
Total: 6
Product experience
  Expert (has used Queso for more than 6 months): 4
  Novice (has never used Queso): 2
Key characteristics
  UM student: 6
Representative of groups with special needs (young, old, or physically disabled): 0
Paid/compensated: No
8. Methods
Participant Demographic Data
Age range:
Employment status: 6 (100%) participants are full-time students and employed
Language: 4 (66.7%) native English speakers; 2 (33.3%) non-native English speakers
Education status: 3 (50%) participants completed an undergraduate degree; 3 (50%) completed a graduate degree
Gender: 4 (66.7%) female; 2 (33.3%) male
9. Methods
Participant Demographic Data (continued)
Internet access: 6 (100%) participants access the internet multiple times a day
Current computer skills (scale from 1 = poor to 7 = excellent): 3 (50%) participants chose 5; 1 (16.7%) chose 6; 2 (33.3%) chose 7
Learning management system experience: all 6 participants (100%) have experience using a learning management system
Types of learning management systems used:
10. Methods
Participant Demographic Data (continued)
Familiarity with Queso (scale from 1 = unfamiliar to 7 = very familiar): 2 (33.3%) participants chose 1; 2 (33.3%) chose 5; 1 (16.7%) chose 6; 1 (16.7%) chose 7
Devices used to access Queso: 4 (66.7%) participants use a desktop/laptop to access Queso; 2 (33.3%) participants have not used Queso
11. Methods
Participant Demographic Questionnaire Data
Participants' comments about Queso before the usability test (4 participants had used Queso before):
+ "It was a very useful tool with some opportunities for improvement. I already used blackboard and Queso is more friendly user."
+/- "I found it good but slightly confusing sometimes. I never liked that the comments and reviews disappeared once you watched them. It needs more interaction options like Blackboard. I think it's too simple yet confusing sometimes. As far as I know Queso aims to emulate the flexibility and fun of video games and bring that to the classroom, but it needs to be more fun, exciting and visually appealing to reach that."
+/- "Good platform, needs to be reorganized."
- "It's confusing and not sure how to edit an assignment. The navigation is in several different places which is confusing."
"+" = positive comment; "+/-" = both positive and negative comment; "-" = negative comment
12. Methods
Tasks
Objective: To find usability problems
Reason chosen: The most frequently used and most troublesome tasks to be performed
Source of tasks: The team explored the website and identified primary and secondary tasks to be tested
● Task 1: Submit an assignment (Quest)
● Task 2: Look up class grade (Progress)
● Task 3: Look up points from an assignment (Quest)
● Task 4: Revise a submitted assignment (Quest)
● Task 5: Access material posted by the professor
*All task scenarios and success criteria can be found in the appendix
13. Methods
Packets before the usability test
● Moderator Guide
● Participant Guide
● Observer Guide
● Test Plan
Facility
Classroom setting
Test Tools & Equipment
Participant's computing environment
● Display devices
○ Wireless connection to a projector with Crestron AirMedia, with an HDMI cable as backup
○ Dedicated test desk with a 13-inch Retina Display Macintosh Powerbook (2013) running OS X El Capitan, open in Google Chrome and signed into Queso for each participant
● No audio devices were used
● Manual input devices
○ Keyboard on the Macintosh Powerbook
○ Printed guides on standard 8.5 x 11 inch paper, plus pens
○ Apple Magic Bluetooth wireless mouse connected to the computer
Test administrator tools
● Devices: two iPhones for time on task
- Timing for overall test duration
- Timing for task completion duration
14. Methods
Moderator Procedure
Before Test
1. Created the usability test plan and the moderator, participant, and observer guides
2. Created individual accounts for each participant to perform the usability test
3. Ran 2 pilot studies
4. Revised scenarios based on the pilot studies
5. Set up the testing environment
15. Methods
Moderator Procedure
During Test
1. Greeted the participant; read the introduction and informed consent with the participant
2. Instructed the participant to think aloud and to state the start and end time of each task
3. Kept each test session to no longer than 25 minutes
4. Recorded the start time
5. Moderated the tasks: took notes of observations and recorded problem identification
6. Recorded the total task time for each task
7. Determined completion of each task (pass / gray pass / fail)
8. Asked the participant to answer the Single Ease Question (SEQ)
9. Asked follow-up questions
After Test
1. Asked participants to complete the System Usability Scale (SUS) questionnaire
2. Asked follow-up questions after each task
16. Methods
Participant General Instructions
1. Arrived at the location at the confirmed time
2. Listened to the introduction
3. Read and signed the informed consent
4. Filled out the demographic survey
5. Stated the start time
6. Performed the 5 tasks, following the think-aloud protocol during each task
7. Stated when each task was finished
8. Answered the Single Ease Question (SEQ)
9. Answered the System Usability Scale (SUS) questions
10. Answered follow-up questions after each task
11. Debriefed
*One moderator interacted with one participant during the test, and 5 observers observed the process.
*Participant Task Instructions are in the appendix section.
17. Methods
Usability Metrics
● Effectiveness Metrics
○ Completion rate - the completion rate is calculated by assigning a '1' if the test participant manages to complete a task and a '0' if he/she does not. A '0.5' is assigned to a task completed with an assist, or "gray pass".
○ Errors - unintended actions, slips, mistakes, or omissions that a user makes while attempting a task.
○ Assists - the moderator steps in to help users avoid or correct mistakes.
● Efficiency Metrics
○ Task time - the total time a participant spent attempting to complete a task.
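As a rough illustration (not part of the original report), the completion-rate scoring described above can be sketched in Python; the outcome labels "pass", "gray pass", and "fail" follow the pass / gray pass / fail judgment recorded by the moderator:

# Illustrative sketch only: completion-rate scoring as described above
# (pass = 1, assisted "gray pass" = 0.5, fail = 0).
SCORES = {"pass": 1.0, "gray pass": 0.5, "fail": 0.0}

def completion_rate(outcomes):
    """Mean completion score for one task across all participants."""
    return sum(SCORES[o] for o in outcomes) / len(outcomes)

# Example with six hypothetical participants on one task:
print(completion_rate(["pass", "pass", "gray pass", "fail", "pass", "gray pass"]))  # about 0.67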
18. Methods
Usability Metrics
● Satisfaction Metrics
○ SEQ - the Single Ease Question, with a 1-7 rating scale, assesses how difficult users find a task. It is administered immediately after a user attempts a task.
○ SUS - the System Usability Scale measures usability with a 10-item questionnaire; each item has five response options, from Strongly agree to Strongly disagree (a 1-5 scale).
● Usability Keepers: aspects and functions participants identified as positive attributes.
● Improvement Opportunities: targets for redesign.
● Problem Severity Metrics: Severity Score = Impact + Frequency
○ Impact is the effect on the user experience, measured on a 1-4 scale
○ Frequency of problem occurrence is converted into a 1-4 ranking
○ The Severity Score is therefore on a 2-8 scale. Problems with the greatest severity score are the highest priority.
*Measuring standards for the Impact and Frequency rankings are in the appendix
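For illustration only (not from the deck), the two scores defined above can be sketched in Python. This assumes the standard SUS scoring rule, with odd items positively worded and even items negatively worded, each item contributing 0-4 points and the total scaled by 2.5, plus the impact-plus-frequency severity rule from this slide:

def sus_score(responses):
    """SUS score (0-100) from ten 1-5 responses; odd items are positively worded."""
    assert len(responses) == 10
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

def severity_score(impact, frequency):
    """Severity = impact (1-4) + frequency ranking (1-4), giving a 2-8 scale."""
    return impact + frequency

# Hypothetical example values:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
print(severity_score(3, 4))                        # 7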
21. Results
Performance and Satisfaction Results
Efficiency Metrics (task time, in seconds)
The chart shows the geometric mean of task time for each task, with 95% confidence intervals. Participants spent more time on Tasks 1 and 2 than on the other three. Task 1 is "Submit an Assignment" and Task 2 is "Look up class grade".
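The following sketch (illustrative only, with made-up task times) shows one common way to compute a geometric mean and an approximate 95% confidence interval like those plotted in the chart: log-transform the times, take a t-based interval on the log scale, and back-transform:

import math
from statistics import mean, stdev

def geometric_mean_ci(times_s, t_crit=2.571):
    """Geometric mean of task times (seconds) with an approximate 95% CI.
    t_crit = 2.571 is the two-sided 95% t value for df = 5 (six participants)."""
    logs = [math.log(t) for t in times_s]
    center, spread = mean(logs), stdev(logs)
    half_width = t_crit * spread / math.sqrt(len(times_s))
    return math.exp(center), (math.exp(center - half_width), math.exp(center + half_width))

# Made-up times for six participants on one task:
gm, (low, high) = geometric_mean_ci([95, 140, 210, 80, 160, 120])
print(round(gm, 1), round(low, 1), round(high, 1))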
27. Results
Performance and Satisfaction Results
Satisfaction Metrics (Single Ease Question, 1-7 scale; a larger number means easier)
Per-task SEQ ratings were presented as a chart for the five tasks:
● Task 1: Submit an assignment (Quest)
● Task 2: Look up class grade (Progress)
● Task 3: Look up points from an assignment (Quest)
● Task 4: Revise a submitted assignment (Quest)
● Task 5: Access material posted by the professor
28. Results
Performance and Satisfaction Results
Satisfaction Metrics (System Usability Scale, items rated on a 1-5 scale)
Participant 1 (Expert): 37.5
Participant 2 (Expert): 52.5
Participant 3 (Expert): 72.5
Participant 4 (Novice): 45
Participant 5 (Expert): 55
Participant 6 (Novice): 20
Average: 47.9
*Based on research, a SUS score above 68 is considered above average and anything below 68 is below average.
29. Results
Issue-Based Results
Issues by Task
The number of issues found for each task was presented as a chart for the five tasks:
● Task 1: Submit an assignment (Quest)
● Task 2: Look up class grade (Progress)
● Task 3: Look up points from an assignment (Quest)
● Task 4: Revise a submitted assignment (Quest)
● Task 5: Access material posted by the professor
33. Task 1: Submit a Quest
Keeper: Proper use of language and location of important information
Location: Sidebar on the Available Quests page, the Completed Quests page, and the class homepage
Description: The "Available" and "Completed" labels are self-explanatory for assignments, and their location in the side navigation is constant.
Voices: "Was helped by the 'available' in Available Quests; knew to assume that meant something still pending."
34. Task 1: Submit a Quest
Keeper: Intuitive blue affordance for links
Location: Revise page, Attempt page
Description: When creating a hyperlink for a submission, the URL turns blue once it has successfully been made a hyperlink. During usability testing this was noted as a positive affordance that communicated an updated system status.
35. Task 2: Look Up Your Class Progress
Keeper: Finding the total points earned on the Completed page was intuitive
Location: Completed Quests page
Description: Having the total points for all assignments on the Completed page of a course was an intuitive flow for users to find their graded information. Keeping similar content together made finding information and accomplishing the task efficient during usability testing.
36. Task 4: Revise a Submitted Quest
Keeper: Naming of the 'Revise' button
Location: Completed Quests page
Description: The name of the button matched the function it performed, so users knew that clicking the 'Revise' button would take them to the Revise page.
Voices: "Naming of 'Revise' button makes sense to me."
37. Task 4: Revise a Submitted Quest
Keeper: The flow to revise an assignment matched users' mental model
Location: Revise page
Description: During usability testing, a user noted that finding previous assignments was easy and that the properly labeled Revise button made completing the task "easy".
40. Task 1: Submit an Assignment (Quest)
Problem: Making a link clickable is inefficient and not intuitive
Location: Attempt page
Description: To add a clickable link, users need to type some text first, then highlight the text, click the "Link" icon, and enter the URL before the hyperlink will work. These required steps proved difficult for users to complete during testing.
Problem Type: Interaction
Element Persistence: Local
Recommendation: Automatically convert URLs to clickable links by detecting web-address format, or at least do so once the hyperlink button is pressed; remove the need to retype the link after pressing the hyperlink button.
Severity: rated on the 2-8 scale (rating shown graphically)
41. Task 1: Submit an Assignment (Quest)
Problem: Due date disappears after submission of the assignment
Location: View page
Description: The deadline date disappears from all Queso assignment descriptions after a user submits an assignment, while all other content concerning the assignment remains. This inconsistency did not match user expectations.
Problem Type: Content
Element Persistence: Local
Recommendation: Create conventions and standards for the Queso platform. No information concerning an assignment should become invisible at any point in the grading process.
Severity: rated on the 2-8 scale (rating shown graphically)
42. Task 1: Submit an Assignment (Quest)
Problem: Inefficient flow for uploading a file
Location: Attempt page
Description: Multiple steps to upload a file make the process inefficient: the user opens "My Computer", but then has to press the "Choose a File" button to access any content.
Problem Type: Interaction
Element Persistence: Local
Recommendation: When the user clicks "My Computer", show the file-chooser window directly and remove the "Choose File" button.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice:
"I don't see grades. It would be easier if it was all in one place."
"I thought when I clicked 'My computer' it would give me a list of my local files. I assumed it would be clickable."
43. Task 1: Submit an Assignment (Quest)
Problem: Unnatural language and terminology
Location: Class homepage
Description: For novices, there is no explanation that "Quest" means "assignment." "Attempt" and "Expires" did not match users' mental models during testing.
Problem Type: Naming conventions
Element Persistence: Global
Recommendation: Change "Quest" to "Assignment", "Attempt" to "Submit", and "Expires" to "Deadline" to match the expectations users voiced during testing, and create a dedicated About page that introduces the course and the philosophy behind gameful learning.
Severity: rated on the 2-8 scale (rating shown graphically)
44. Task 1: Submit an Assignment (Quest)
Problem: Lack of confirmation that a file upload has completed
Location: Class homepage
Description: There is no clear notification when a file has already been uploaded. The lack of system status confused users, who were unsure whether they had actually triggered the upload of the file they wanted to turn in, and it proved inefficient as users sat waiting for a file that had already finished uploading.
Problem Type: Interaction
Element Persistence: Local
Recommendation: Queso should keep users informed about what is going on in the system, with consistent indication of their place in the site and feedback from their actions. Create an alert once a file is uploaded so the user can submit the assignment.
Severity: rated on the 2-8 scale (rating shown graphically)
46. Task 2: Look Up Class Progress
Problem: Unintuitive naming for "grade" and "points"
Location: Progress page
Description: In Queso, "grade" is called "level" and "points" are called "skills", which is not intuitive. Users did not understand how a level corresponds to a points range during the usability test. The way points are shown, as "number + skill name" (for example, "5 Physical Computing"), made it difficult to accomplish task goals.
Problem Type: Naming conventions
Recommendation: Change "Current level" to "Grade" and "skills" to "points". Drop the skill names and show points in the format "points earned / total points", for example "8/10". Naming should match user expectations, and standard conventions should be created for all courses in Queso.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "I don't see grades. It would be easier if it was all in one place."
47. Task 2: Look Up Class Progress
Problem: Uninformative "Totals" card on the Completed Assignments page
Location: Completed Quests page
Description: On the Completed Quests page, the card showing total points is labeled "Totals", which is not self-explanatory. Users can see nothing other than points earned, making it inefficient to find information about their grade and points.
Problem Type: Visual design and naming conventions
Element Persistence: Local
Recommendation: Rename "Totals" based on an open card sort conducted with additional users, and provide information such as the current grade. Display all information concerning grades in one location that is always visible and accessible to users.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "There is no bar so I don't know if it's 40 out of 40."
48. Task 2: Look Up Class Progress
Problem: "Totals" on the Completed Assignments page is not clickable
Location: Completed Quests page
Description: The container that provides the totals information is not clickable, although users in the usability test expected it to be, since no grade information is visible from the Completed Assignments page itself.
Problem Type: Interaction and visual design
Element Persistence: Local
Recommendation: Rather than making "Totals" clickable, redesign the completed-grade display, removing the "Totals" section so that all information concerning a graded assignment is visible.
Severity: rated on the 2-8 scale (rating shown graphically)
49. Task 2: Look Up Class Progress
Problem: Progress bars don't match the grade earned
Location: Completed Quests page
Description: The lack of a title for the progress bar in the side navigation that depicts the current grade in a course confused users attempting to find their progress. Without explanatory text saying which progress they are viewing, users' memory load increases.
Problem Type: Visual design
Element Persistence: Local
Recommendation: Make clear to users what information they are seeing by labeling features and creating standards for the content shown. Every progress bar should show points earned out of total points and have a title stating what the points refer to.
Severity: rated on the 2-8 scale (rating shown graphically)
50. Task 2: Look Up Class Progress
Problem: Unlabeled progress bar in the left-hand navigation
Location: Class homepage
Description: The lack of a title for the progress bar in the side navigation that depicts the current grade in a course confused users attempting to find their progress. Without explanatory text saying which progress they are viewing, users' memory load increases.
Problem Type: Visual design and naming conventions
Element Persistence: Local
Recommendation: Make clear to users what information they are seeing by labeling features and creating standards for the content shown. Every progress bar should show points earned out of total points and have a title stating what the points refer to.
Severity: rated on the 2-8 scale (rating shown graphically)
51. Task 2: Look Up Class Progress
Problem: Progress bars are not clickable for more information
Location: Class homepage
Description: When trying to view grade information, two participants could not find what they were looking for and started clicking the progress bar. Because the information was not visible, they assumed the progress bar would display more content.
Problem Type: Visual design and naming conventions
Element Persistence: Local
Recommendation: Move the primary content concerning grading to one place and display it without requiring clicks; alternatively, make progress bars clickable and show additional grading information.
Severity: rated on the 2-8 scale (rating shown graphically)
53. Task 3: Look Up the Grade on an Assignment (Quest)
Problem: Notifications of feedback disappear after being viewed once
Location: Completed Quests page
Description: Feedback from the professor appears on the main page but disappears as soon as the user leaves the page; after that it is only visible on the Revise page. There is no way to look back at previous notifications, and there is a lack of consistency and standards in how material concerning grading is presented.
Problem Type: Content
Element Persistence: Local
Recommendation: Create a new completed-assignment page that displays all information concerning graded work without removing any content. Standards for the documentation available to the user should place feedback next to the grade and keep it always visible.
Severity: rated on the 2-8 scale (rating shown graphically)
54. Task 3: Look Up the Grade on an Assignment (Quest)
Problem: Professor feedback is presented on the Revise page
Location: Revise Quest page
Description: When looking for the professor's feedback on a graded assignment during the usability test, participants clicked through all possible options before finally finding the information on the "Revise" page; this organization does not match their mental models and expectations.
Problem Type: Content
Element Persistence: Local
Recommendation: Put feedback with the graded information so that similar content is organized at one point of use.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "I was able to click around and find the information, but it was not clear where professor notes were."
55. Task 3: Look Up the Grade on an Assignment (Quest)
Problem: Unintuitive visualization of the grade earned on a completed assignment
Location: Completed Quests page
Description: The visualization of the grade (points) earned on a completed assignment is not intuitive and lacks a title saying what content it displays. Users could not see how many points they earned out of the total points: the bar that displays the earned points is blank and untitled, which caused users to misread points earned versus total points.
Problem Type: Visual design
Element Persistence: Local
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "I see '5' but the progress bar is empty. It seems I haven't submitted this assignment."
56. Task 3: Look Up the Grade on an Assignment (Quest)
Problem: The "View" button does not match users' expectations for feedback
Location: Completed Quests page
Description: Users did not understand the meaning of the "View" button and did not feel its functionality was related to getting feedback; feedback from instructors appears on both the "Revise" page and the "View" page, so the content overlaps.
Problem Type: Icon intuitiveness
Recommendation: Rename "View" to "Feedback" and make the feedback appear only on the "Feedback" page.
Severity: rated on the 2-8 scale (rating shown graphically)
57. Task 3: Look Up the Grade on an Assignment (Quest)
Problem: Dropdown menu hides grade information
Location: Completed Quests page
Description: The current page for graded assignments does not show any of the points users earned unless they click a dropdown button in the top left corner of each completed assignment.
Problem Type: Layout, hidden functionality
Recommendation: Remove the dropdown button from assignment information. This content is important and should be immediately visible on the page. Redesign the visual layout to use the available space to provide value to the user by consistently showing grade information and feedback.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "Points should appear without need to click."
59. Task 4: Revise a Submitted Assignment (Quest)
Problem: The Revise page has redundant content
Location: Revise page
Description: The Revise page contains redundant content that is also included on the original submission page, such as the assignment to-do list, assignment information, and feedback from instructors. This confused users about the main functionality of the page.
Problem Type: Interaction
Element Persistence: Local
Recommendation: Create an alert with a message confirming for the user that the file has been uploaded.
Severity: rated on the 2-8 scale (rating shown graphically)
60. Task 4: Revise a Submitted Assignment (Quest)
Problem: Lack of feedback that a file was successfully uploaded
Location: Revise page
Description: When a user finished uploading a file, no feedback signaled that the assignment had been successfully submitted. This left users unsure whether they had successfully submitted their file.
Problem Type: Interaction
Element Persistence: Local
Recommendation: Create an alert with a message confirming for the user that the file has been uploaded.
Severity: rated on the 2-8 scale (rating shown graphically)
61. Task 4: Revise a Submitted Assignment (Quest)
Problem: Previous submission information remains in the working area of the "Revise" page
Location: Revise page
Description: There is no separation between previously submitted content and the work area for submitting a revision. This confused users about what would be submitted as part of the revision and what was part of previous submissions.
Problem Type: Visual design
Element Persistence: Local
Recommendation: Move content that is part of a previous submission to the original submission page, which can be accessed through the dropdown menu on the Revise page; the work area for submitting a revision should be blank.
Severity: rated on the 2-8 scale (rating shown graphically)
63. Task 5: Access Material Posted by the Professor
Problem: Posts on the class homepage are not well organized
Location: Class homepage
Description: Posts are displayed chronologically, so users became frustrated that important information is not organized by content category.
Problem Type: Navigation
Element Persistence: Local
Recommendation: Organize content on the class homepage and create tabs for posts of different categories or importance. This will foster recognition over recollection by displaying sections of posts such as readings, schedules, and more.
Severity: rated on the 2-8 scale (rating shown graphically)
64. Task 5: Access Material Posted by the Professor
Problem: No static location for contact information
Location: Class homepage
Description: The professor's contact information was difficult for users to find because it is classified as a post rather than as static content. Its location is not consistent: users have to scroll down and find its new placement whenever new content is posted.
Problem Type: Layout
Element Persistence: Local
Recommendation: Set a dedicated, consistent, static area for contact information, and organize posts so their placement is consistent and easy for users to recognize.
Severity: rated on the 2-8 scale (rating shown graphically)
65. Task 5: Access Material Posted by the Professor
Problem: Unclear title for the professor's contact information
Location: Class homepage
Description: Users had difficulty finding contact information for the professor because of unclear naming standards in the Queso system.
Problem Type: Content
Element Persistence: Local
Recommendation: Create naming standards and documentation for general course posts, and organize the posts into categories matching users' mental models.
Severity: rated on the 2-8 scale (rating shown graphically)
Voice: "It needs to be more obvious."
67. Redesign Recommendations: Main Points
● Improve the website's visual style with affordances and visual hierarchy
○ The design focus needs to change from a "game-like" style to a more focused, easy-to-use style, with similar functions grouped together at the point of use.
● Clarify the language used by creating consistent and understandable naming conventions
○ Change the site's language to terms more familiar to users. For example, "Quests" should be "assignments", "levels" should be "grades", and "Expires on [date]" should be "Deadline".
● Change navigation by adding secondary navigation focused on serving the primary functionality
○ Navigation throughout the site needs to be reconceived. Build in a way to move forward and backward; this traditional navigation is not available in the current site.
● Improve the information structure
○ Better placement (sidebar) for contact info, syllabus, class, professor feedback alerts, and assignments due.
69. Redesign Recommendation: User's Home Page
Description of problems
1. The current homepage is unorganized and doesn't clearly label the different segments and added functionality of Queso.
Recommendations from test results
1. Show all of a student's enrolled classes together on the home page, labeled "Current".
2. Provide access to "Past Classes".
3. Other home-page content, such as "Settings" and "About", should be located here.
70. Redesign Recommendation: Available Assignments Page
Description of problems
1. The current course homepage houses unorganized posts without categorization or clear navigation for interacting with the information.
2. The sidebar navigation wastes space with an unexplained progress bar and only two main functions of the class page.
3. The primary uses of the site are submitting an assignment, checking the current grade, and viewing assignments that are still due; the current information architecture is inefficient for these.
4. The file-upload button is too small and doesn't provide system-status feedback once a file has been uploaded.
5. The lack of a back button limits user freedom.
Recommendations from test results
1. Have everything at the point of use for available and completed assignments.
2. Add a drag-and-drop submission feature: a one-stop directive to upload assignments rather than the current user interface, which provides three options and causes confusion. Red direction type would change from red to blue so the user
71. Redesign Recommendation: Completed Assignments Page
Description of problems
1. Assignment information is segmented across the current website, causing users to recall placement instead of reading organized navigation.
2. When viewing a graded assignment, the grade earned and the possible grade for the assignment are unclear due to a lack of descriptive text and poor naming conventions.
3. Graded assignments have a progress bar that doesn't show any information, and points are shown without saying whether they are the grade earned on a graded assignment or the possible grade.
Recommendations from test results
1. Color coding visually shows the user that "Assignment 01" has been completed and uploaded.
2. The revise-assignment upload is found here.
3. The assignment grade shows progress and success rate, replacing the current progress-bar function.
4. A line graph shows progress and week through
72. Redesign Recommendation - Reading Page
Description of problems
1. ‘Posts’ on the current homepage of individual courses house important information, but during usability testing the lack of clear naming conventions and organization led to confusion and uncertainty.
2. During usability testing, participants asked for contact information for professors and other important information, like the syllabus, to be kept on separate, organized pages.
Recommendations from test results
1. A Readings page provides up-to-date materials, such as articles, for the user. Student additions would also be available here, for example class projects or inspiration to share with fellow classmates.
73. Redesign Recommendation - Notifications Page
Description of problems
1. Feedback from the professor on a graded assignment is found on the "Revise" page, which does not match users' mental models and expectations.
2. Feedback from the professor appears on the main page but then disappears; there is no way to look back at previous work or notifications to help guide users.
3. There is a lack of consistency and standards in the presentation of all material concerning grading.
Recommendations from test results
1. Notifications are not visible within the current Queso system. In the redesign, notifications are seen as soon as the user logs onto the homepage.
74. Conclusion
Through the usability test, the six moderators identified five positive attributes of the current site to leave unchanged. The parts to keep include the naming of certain functionality, like the ‘Revise’ button on the completed assignments page; the correct blue, underlined affordance for hyperlinks; and the flow for accessing both the total points for a course and the option to revise an assignment.
However, 22 usability problems were identified, 8 of them with a severity score over 6. Overall, opportunities for improvement include restructuring the information architecture of Queso's environment and clarifying the visual design to help users know their place in the system and find the content they're looking for. Improving naming conventions and standards would also significantly reduce the confusion voiced by participants as they thought aloud during testing.
The next steps include running an open card sort to clarify the language and terminology used in Queso. The redesign introduced new language but remains untested; an open card sort would identify any further problems or better naming opportunities. Furthermore, Queso could continue incorporating visual design improvements to site functionality based on user testing of the redesign.
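To illustrate how open card sort results could be turned into naming and grouping decisions, the sketch below clusters cards by how often participants placed them in the same pile. This is only a minimal, hypothetical analysis sketch in Python; the card names, pile labels, and the 0.7 distance cutoff are illustrative assumptions, not data from this study.

import numpy as np
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical open card sort results: each participant sorts Queso terms
# into piles they name themselves (all values below are made up).
sorts = [
    {"Quests": "assignments", "Expires on": "deadlines", "Level": "grades", "Points": "grades"},
    {"Quests": "homework", "Expires on": "due dates", "Level": "progress", "Points": "grades"},
    {"Quests": "assignments", "Expires on": "due dates", "Level": "grades", "Points": "progress"},
]

cards = sorted({card for sort in sorts for card in sort})
n = len(cards)

# Similarity = share of participants who placed the two cards in the same pile.
sim = np.zeros((n, n))
for sort in sorts:
    for i, a in enumerate(cards):
        for j, b in enumerate(cards):
            if sort[a] == sort[b]:
                sim[i, j] += 1
sim /= len(sorts)

# Agglomerative clustering on distance = 1 - similarity suggests groupings;
# the participants' own pile labels then suggest candidate names.
condensed = squareform(1 - sim, checks=False)
labels = fcluster(linkage(condensed, method="average"), t=0.7, criterion="distance")
for cluster in sorted(set(labels)):
    print("Suggested group:", [cards[i] for i in range(n) if labels[i] == cluster])

The resulting groups, together with the pile names participants chose, would directly inform the renaming and categorization recommendations above.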
75. APPENDIX
1. Demographic Questionnaire
2. Participant Task Instructions
3. Post-task and Post-study Questionnaire
4. Rubin’s Problem Impact Ranking
5. Rubin’s Frequency Ranking
6. Raw Data
7. Task Flows Outliers By Participants
76. Demographic Questionnaire
How old are you?
◯ Under 18 ◯ 19-24 ◯ 25-30 ◯ 31-36 ◯ 37-42 ◯ Over 42
What gender do you identify with?
◯ Male ◯ Female
Employment and enrollment status?
▢ Part-time student and unemployed ▢ Part-time student and employed
▢ Full-time student and employed ▢ Full-time student and unemployed
▢ Non-student and employed ▢ Non-student and unemployed
77. Demographic Questionnaire Continued
Are you a native English speaker?
◯ Native English speaker ◯ Non-native English speaker
Other: ________________________
What is your highest level of education earned?
◯ High school diploma ◯ Undergraduate degree
◯ Graduate degree ◯ Post-graduate degree
◯ Other: ________________________
Are you a student from the University of Miami?
◯ Yes ◯ No
78. Demographic Questionnaire Continued
How often do you access the internet?
◯ Multiple times a day ◯ Several Times a Week
◯ Several Times a Month ◯ About once a month ◯ Less than once a month
How would you rate your current computer skills?
Have you ever used a learning management system for classrooms ?
◯ Yes ◯ No
79. Demographic Questionnaire Continued
What type of management system for classrooms have you used? Please write their names:
▢ Blackboard ▢ Schoology ▢ Queso
▢ Axis ▢ Moodle ▢ Absorb
How familiar are you with the Management system for classrooms "Queso"?
If you've used Queso, what device do you use to access it?
◯ Mobile device ◯ Desktop or laptop ◯ Tablet ◯ Haven't used Queso ◯ Other: __________
80. Task 1: Submit an Assignment
Scenario
You are signed onto Queso in a class called “Test Physical Computing.” You’ve completed the
assignment and it’s saved on your desktop.
1. Find the deadline for the assignment 2.1 More in Serial Processing.
2. Turn in the assignment with a file "sketch.ino"
3. Add a link in a clickable format with this URL: "http://bit.ly"
Success Criteria
Find the assignment 2.1 More in Serial Processing and write down the deadline: April 12th. Put the video link in the text editor, make it clickable, and click “Submit.”
Optimal Task Flow
81. Task 2: Look up your class progress
Scenario
You are signed onto Queso in a class called “Test Physical Computing.”
1. Find the current grade you’ve earned in the class thus far.
2. Find the current points you’ve earned in the class thus far.
Success Criteria
Find that the current grade is Level 1 and the total points are 40.
Optimal Task Flow
82. Task 3: Look up Points from a Quest
Scenario
You are signed onto Queso in a class called “Test Physical Computing.” You receive an email from
your professor that an assignment, 1.1 Blink LED, has been graded.
1. Find the grade you earned on the assignment
2. See if the professor gave feedback
Success Criteria
Find that the grade is 5 points and the note is “Very Good.”
Optimal Task Flow
83. Task 4: Revise a Submitted Quest
Scenario
You are signed onto Queso in a class called “Test Physical Computing.” 1.1 Blink LED is an
assignment you’ve already submitted.
1. Find the “newSketch.ino” file that is on the desktop of this computer
2. Resubmit your assignment with the ”newSketch.ino” file
Success Criteria
Find assignment 1.1 Blink LED, upload the revised file “newSketch.ino” from the desktop, and submit.
Optimal Task Flow
84. Task 5: Access material posted from professor
Scenario
You are signed onto Queso in a class called “Test Physical Computing.”
1. Find the contact information for this course’s instructor
2. Check the “Inspiration Presentation Schedule” to see when Lauren Kett is presenting.
Objectives
To see whether the participant could easily find information from instructor’s posts.
Success Criteria
Find that the instructor’s email address is dpdickinson@yahoo.com and the mobile phone number is 757-810-1913, and that Lauren Kett’s presentation date is Mar 31.
Optimal Task Flow
85. After Task and Post Study Questionnaires
Post-task question: Overall, how difficult or easy was it for you to complete this task?
System Usability Scale
1. I think that I would like to use this system frequently
2. I found the system unnecessarily complex
3. I thought the system was easy to use
4. I think that I would need the support of a technical person to be able to use this system
5. I found the various functions in this system were well integrated
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very cumbersome to use
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system
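For reference, SUS responses are conventionally converted to a 0-100 score: odd-numbered (positively worded) items contribute (response - 1), even-numbered (negatively worded) items contribute (5 - response), and the sum is multiplied by 2.5. Below is a minimal Python sketch of that standard calculation; the example responses are hypothetical, not participant data.

def sus_score(responses):
    # responses: the ten 1-5 ratings, in the order of the items above.
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings on a 1-5 scale")
    total = 0
    for item, rating in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (rating - 1) if item % 2 == 1 else (5 - rating)
    return total * 2.5  # 0 (worst) to 100 (best)

print(sus_score([4, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # hypothetical participant -> 80.0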
86. Severity Score Metrics
Rubin’s Problem Impact Ranking (Impact Ranking - Impact Description: Impact Definition)
4 - Unusable: The user is not able to or will not want to use a particular part of the product because of the way that the product has been designed and implemented.
3 - Severe: The user will probably use or attempt to use the product here, but will be severely limited in his or her ability to do so. The user will have great difficulty in circumventing the problem.
2 - Moderate: The user will be able to use the product in most cases, but will have to undertake some moderate effort in getting around the problem.
1 - Irritant: The problem occurs only intermittently, can be circumvented easily, or is dependent on a standard that is outside the product's boundaries. Could also be a cosmetic problem.
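The severity scores used throughout this report (a 2-8 range, with 8 problems scoring above 6) are consistent with summing Rubin's impact ranking above and a 1-4 frequency ranking (Appendix item 5, not reproduced in this excerpt). The Python sketch below illustrates that scoring; the frequency cutoffs are assumptions for illustration, not the study's exact thresholds.

IMPACT = {"unusable": 4, "severe": 3, "moderate": 2, "irritant": 1}

def frequency_ranking(share_of_participants):
    # Map the share of participants who hit the problem to a 1-4 ranking
    # (illustrative cutoffs; the report's frequency table is in the appendix).
    if share_of_participants >= 0.9:
        return 4
    if share_of_participants >= 0.5:
        return 3
    if share_of_participants >= 0.1:
        return 2
    return 1

def severity(impact_label, share_of_participants):
    # Severity = impact ranking + frequency ranking, giving the 2-8 scale.
    return IMPACT[impact_label] + frequency_ranking(share_of_participants)

print(severity("severe", 5 / 6))  # e.g., a severe problem hit by 5 of 6 participants -> 6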
92. Raw Data
Task Flows Outliers By Participant 2
Task 1
Took a detour; didn't understand the instruction on how to create a link.
Task 2
She got to the progress page, but she was very confused by the labeling and started looking somewhere else.
Task 3
She didn’t notice the points in the quest drop down area.
Task 4
Optimal task flow.
Task 5
Participant didn't scroll down right away to find the information needed.
93. Raw Data
Task Flows Outliers By Participant 3
Task 1
Optimal for submitting file; failed to also add clickable link.
Task 2
Alternate flow (only saw total points).
Task 3
Optimal task flow.
Task 4
Optimal task flow.
Task 5
Optimal task flow, but initially attempted the task by going off site.
94. Raw Data
Task Flows By Participant 4
Task 1
Start - Queso Homepage - Test Physical Computing Page - Go to "Available" - Find "2.1 More in Serial Processing" - Click "Attempt" - Upload the file (Add file) - Add video link in the text editor - Click "Submit" - Stop
Task 2
Start - Queso Homepage - Test Physical Computing Page - Click username menu in Navi
Bar - Go to "Completed" Quests - Check "Total Points" box - Click down arrow inside each
assignment - Go back to Test Physical Computing Page - Click username menu in Navi Bar
- Click "Progress" - Scroll down - Check total points and current grade - Stop
Task 3
Start - Queso Homepage - Test Physical Computing Page - Go to "Completed" Quests - Find "1.1 Blink LED" - Click down arrow - Click "View" - Go back to "Completed" Quests - Click "Revise" - Check "Notes on Quest" - Stop
Task 4
Start - Queso Homepage - Test Physical Computing Page - Go to "Completed" Quests - Find "1.1 Blink LED" - Click "Revise" - Upload files - Click "Submit" - Stop
95. Raw Data
Task Flows Outliers By Participant 5
Task 1
Optimal task flow.
Task 2
Took the alternate flow to see total points; viewed an alternate flow to see the level but was unsure of the answer and chose the wrong content.
Task 3
Optimal task flow.
Task 4
Optimal task flow.
Task 5
Optimal task flow.
96. Raw Data
Task Flows Outliers By Participant 6
Task 1
Optimal task flow - But
1. Clicked dropdown icon before "Attempt"
2. Clicked the link icon twice.
3. Uploaded file first.
4. Clicked "My Computer" several times.
5. Went back to "Available" to look for deadline.
Task 2
Optimal task flow - But went back to "Completed" after "Progress" to look for the current grade.
Task 3
Optimal task flow - But continued to Class Homepage - Queso - Progress - View.