The document describes a study that investigated the effectiveness of tactile feedback for finger-based text entry on mobile touchscreen devices. The study compared performance on a device with a physical keyboard, a touchscreen without tactile feedback, and a touchscreen with tactile feedback added. Tactile feedback was provided through vibration and improved text entry speed and accuracy compared with the standard touchscreen. The results suggest that tactile feedback can restore some of the feeling lost without physical buttons and improve the touchscreen text entry experience.
Investigating the effectiveness of tactile feedback for mobile touchscreens
Conference Paper · April 2008
DOI: 10.1145/1357054.1357300 · Source: DBLP
Investigating the Effectiveness of Tactile Feedback for Mobile Touchscreens
Eve Hoggan, Stephen A. Brewster and Jody Johnston
Glasgow Interactive Systems Group, Department of Computing Science
University of Glasgow, Glasgow, G12 8QQ, UK
{eve, stephen}@dcs.gla.ac.uk www.tactons.org
ABSTRACT
This paper presents a study of finger-based text entry for mobile devices with touchscreens. Many devices are now coming to market that have no physical keyboards (the Apple iPhone being a very popular example). Touchscreen keyboards lack any tactile feedback and this may cause problems for entering text and phone numbers. We ran an experiment to compare devices with a physical keyboard, a standard touchscreen and a touchscreen with tactile feedback added. We tested this in both static and mobile environments. The results showed that the addition of tactile feedback to the touchscreen significantly improved finger-based text entry, bringing it close to the performance of a real physical keyboard. A second experiment showed that higher specification tactile actuators could improve performance even further. The results suggest that manufacturers should use tactile feedback in their touchscreen devices to regain some of the feeling lost when interacting on a touchscreen with a finger.
Author Keywords
Tactile feedback, multimodal interaction, touchscreens, mobile interaction, fingertip interaction.
ACM Classification Keywords
H.5.2. [User Interfaces]: Haptic I/O.
INTRODUCTION
Touchscreen mobile devices are becoming ever more popular with both manufacturers and users. As there is no need for a physical keyboard to take up space on the device, they can have larger screens which can be used more flexibly, meaning a better display of videos, web pages or games, or reconfiguring the display as required, for example rotating from portrait to landscape. A soft keyboard can be displayed when text must be entered. The most popular such device at the present time is the Apple iPhone (Figure 1), but many other manufacturers have also removed the physical keyboards from devices such as PDAs, digital cameras and music players. The use of a touchscreen also allows novel forms of interaction, for example using gestures [14] on the screen to control a device, or more flexible forms of text entry and navigation.
Although the keyboards used on touchscreen devices are based on the original physical mobile keyboards, one important feature is lost: the buttons cannot provide the tactile response that physical buttons do when touched or clicked. Without the tactile feedback, users can only rely on audio and visual cues which can be ineffective in mobile applications due to small screen size, outside noise, social restrictions and the demands of other real world tasks [9]. Studies of touchscreen text entry have shown the difficulties of typing with a stylus on the touchscreen using the standard QWERTY keyboard and have proposed various different keyboard layouts to combat this [11, 16]. Despite these results, the primary method for data entry is still the standard QWERTY keyboard. In an initial, small study [1] we showed that entering text on a touchscreen when on the move can be problematic and that adding artificial tactile feedback can reduce error rates.
Figure 1: The Apple iPhone, a finger-operated touchscreen
phone.
A mobile device user’s context can be extremely varied, for
example, travelling on the train, at a party or in the gym.
The user expects to be able to interact effectively with the
device in all of these situations. For example, sending
emails and browsing the Web on a touchscreen mobile de-
vice whilst on the train to work is a common activity. A
solution is needed that makes the entry of text efficient and
simple in these situations.
One other important issue is that devices like the iPhone
have done away with the stylus and just use the finger for
input (and in fact multiple fingers for certain interactions).
The previous generation of touchscreen phones used a small
stylus for interaction. This is advantageous from the de-
vice’s point of view as the interaction point is very clear
and easy for the device to recognise. Styli are, however,
less convenient for users who have to get the stylus out
each time they want to interact with the device and the sty-
lus is easy to lose. There has been little study of the effects
of using a finger on the screen for interaction. Given the
necessary limited size of these devices, widgets are often
too small [13] or positioned in an awkward way preventing
them from being selected easily with a finger. This makes it
even more difficult for users to interact with their devices
when, for example, travelling to work on a bumpy train.
In an effort to address these issues, this paper presents a
study into the use of tactile feedback for a touchscreen mo-
bile phone QWERTY keyboard where a fingertip is used to
press the keys. Our aim was to quantify the effects of tactile
feedback in mobile and static settings by comparing a de-
vice with a physical keyboard to a touchscreen phone and
then to the touchscreen phone with added tactile feedback.
BACKGROUND
The amount of research in mobile tactile feedback for text
entry is limited as, up until this point, there have always
been physical buttons on the devices providing their own
natural tactile feedback. The research has focused on more
complex feedback instead such as calendar alerts and navi-
gation cues [2, 4].
Previous research into tactile interaction on mobile devices
has often focused on new hardware, for example, a tactile
stylus [8] for use on touchscreens and also a piezo-electric
display [7, 15] for use with a stylus. These projects have all
provided tactile feedback by attaching a physical actuator to
the back of the device, or to the stylus, or by building cus-
tom device displays. The experiments discussed here in-
volve the use of the standard, built-in mobile phone vibra-
tion actuator which can be found in almost all commercially
available devices and compare it to specialised external
vibrotactile actuators. By using the actuator already in the
device, the tactile feedback is not restricted by expensive or
rare technology and does not require any hardware to be
added to the device, which could increase its size or weight;
this may be inappropriate for mobile devices.
Brewster et al. [1] reported a small study investigating the
effects of adding tactile feedback to keyboard interactions
on a PDA operated with a stylus. They compared a standard
PDA soft keyboard to one with artificial tactile feedback
presented when the keys were pressed. The artificial tactile
feedback in this case was created using a small vibrotactile
actuator attached to the back of the device (the same as
Figure 10). Results showed that, in a static lab setting, with
tactile feedback users entered significantly more text, made
fewer errors and corrected more of the errors they did
make. In mobile use (on a subway train) they found many
of the tactile benefits reduced, but users still corrected sig-
nificantly more errors. This study was small, but provided
the motivation for our work. To fully understand the effects
of tactile feedback we need to evaluate both a real physical
keyboard and one with artificial tactile feedback to see how
they compare. If we can achieve the same level of perform-
ance as a real keyboard then we can combine the benefits of
touchscreen displays with physical buttons. Our previous
study only used six participants in the mobile condition so it
is hard to draw any strong conclusions. Also, fingertip use
was not tested, only stylus-based interactions. The work
reported here builds on this previous study to validate it and
to extend its range.
There have been several studies into mobile interaction with
touchscreens using styli that do not involve tactile feed-
back. MacKenzie [11] has conducted various experiments
investigating the use of different keyboard layouts to assist
in text entry using a touchscreen. However, these different
layouts have not become commercially available and are
unlikely to, just as the Dvorak [3] keyboard layout has been
proven to be superior in numerous cases but has never been
widely adopted. For this reason, we looked at augmenting
the current QWERTY keyboard layout with tactile feed-
back.
This paper presents experiments conducted in both a lab
setting and mobile environment investigating text entry on a
touchscreen device with and without tactile feedback. The
aim is to explore the effects of tactile feedback from key-
board events (confirming that the fingertip is touching a
button, confirming that the button has been pressed and
highlighting whether the fingertip has slipped off the button
or not) to see if performance can be improved.
EXPERIMENTAL DESIGN
After initial investigations using currently available mobile
touchscreen devices, the Palm Treo 750 and a Samsung
i718 (Figure 2) were chosen for the experiment. The Palm
Treo was chosen for the control condition as it is a popular
device and has a physical keyboard, allowing us to compare
typing performance between a touchscreen and real, phys-
ical buttons. The Samsung i718 was chosen as it is a new
device and has a large touchscreen display, ideal for pre-
senting a full QWERTY virtual keyboard. The i718 phone
contains a Samsung Electro-Mechanics Linear Resonant
Actuator and Immersion VibeTonz technology to control
the actuator and produce sophisticated tactile effects
(www.immersion.com). This actuator consists of a moving
magnetic mass, an electromagnet and a spring. The system
is underdamped with a very high Q (i.e. sharply peaked
resonance). The resonant frequency is ~175Hz. The small
size of the mass and the strength of the resonance make
this actuator ideal for short, sharp effects (such as are found
in mechanical buttons) because it reaches maximum accel-
eration in 2 to 3 wavelengths (10 to 20ms). This is one of
the best vibration actuators available in a standard phone
and is much more controllable than a standard motor.
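As a quick sanity check on these figures (our arithmetic, not part of the original text): one cycle of the ~175Hz resonance lasts T = 1/f ≈ 1/175 s ≈ 5.7ms, so 2 to 3 cycles span roughly 11 to 17ms, consistent with the 10 to 20ms quoted above.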
Figure 2: A Palm Treo 750 and a Samsung i718.
Feedback Design
A standard QWERTY touchscreen keyboard was created
for the i718 that matched the one on the Treo in terms of
button size and keyboard layout. We could then add tactile
effects as required using the built in actuator and Immer-
sion’s Vibetonz studio (their tactile authoring tool). The
exact size and spacing of the physical keys on the Treo
were copied when designing the touchscreen buttons on the
i718. The Treo keys were 5.0 x 3.5mm with a gap of 3mm
between each. The i718 has a 2.8 inch touchscreen with
240x320 pixels. The touchscreen buttons we designed for
the i718 were slightly larger than its standard soft keyboard,
because they were based on the physical Treo keys and
were designed for use with the fingertip not a stylus, but in
all other respects were exactly the same as standard Win-
dows Mobile buttons: they highlight when pressed.
In this study, a set of simple Tactons [2] was created to rep-
resent the different keyboard events and keys that exist on a
touchscreen keyboard. A fingertip-over event (when the
fingertip has touched a button) used a 1-beat smooth Tac-
ton, a fingertip-click event used a 1-beat sharp Tacton,
while a fingertip-slip event (when the fingertip moved over
the edge of a button) used a 3-beat rough Tacton. All of the
tactile feedback was created using the standard internal
vibration actuator in the i718 device.
Fingertip-Over Event
One of the key features lost in a touchscreen keyboard is
feeling the edges of the keys. We created a tactile equiva-
lent to this so that users could feel around the display and
know when they were on a key or moving between one key
and the next. In other work, such as Brewster et al. [1] this
was not included, but is a key feature of a real keyboard and
is important for working out when the fingers are touching
the keys and orienting in the display.
In a standard interface, a mouse over event is fired when the
mouse pointer is moved over a GUI element such as a but-
ton. This was adapted to create a fingertip-over event which
is fired when the finger moves over any button in the inter-
face. When the fingertip-over event is triggered, a 1-beat
smooth 300ms Tacton is presented. The cue uses a 250Hz
sine wave with increasing intensity during the ramp up time
and decreasing intensity during the ramp down time to cre-
ate a smooth rounded feeling button (Figure 3).
Figure 3: Fingertip-Over Tacton waveform with ramp up and
down to make it feel smooth and rounded.
Home Keys
On traditional physical keyboards it is common to find
raised ridges on the ‘F’ and ‘J’ keys used for orientation. To
recreate this on the touchscreen keyboard, when the ‘F’ or ‘J’
key triggers the fingertip-over event a different, textured
Tacton is presented. This Tacton is a 1-beat, 300ms amplitude-
modulated 250Hz sine wave, which feels rough.
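The cues in the study were authored in Immersion's VibeTonz Studio rather than in code, but the two fingertip-over waveforms described above can be sketched from the stated parameters. The Python snippet below is only an illustrative reconstruction: the sample rate, ramp length and modulation rate are our assumptions, not values from the study.

import numpy as np

FS = 8000  # sample rate in Hz (assumption; the paper does not state one)

def smooth_over_tacton(duration=0.3, carrier=250.0, ramp=0.1):
    """1-beat smooth Tacton: 250Hz sine with intensity ramped up then down."""
    t = np.arange(int(duration * FS)) / FS
    envelope = np.minimum(1.0, np.minimum(t / ramp, (duration - t) / ramp))
    return envelope * np.sin(2 * np.pi * carrier * t)

def rough_home_key_tacton(duration=0.3, carrier=250.0, mod=30.0):
    """Textured 'F'/'J' Tacton: amplitude-modulated 250Hz sine, feels rough."""
    t = np.arange(int(duration * FS)) / FS
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod * t))  # modulation rate assumed
    return envelope * np.sin(2 * np.pi * carrier * t)

In practice each array would be sent to the device's vibration actuator; on the i718 the equivalent effects were produced through VibeTonz rather than generated sample by sample like this.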
Fingertip-click Event
Tactile feedback was used to confirm that a button had been
pressed. When the fingertip-click event was triggered, a 1-
beat sharp 30ms Tacton was presented. The cue used a
175Hz square wave and no ramp up or ramp down time to
create a very short and quick ‘click’ resembling the ‘click’
felt when depressing a physical button.
Fingertip-Slip Event
An event was triggered whenever the fingertip moved over
the edge of any button on the screen, indicating a transition
or slip from one to the next (fingertip slips can be trouble-
some for users and can cause errors that are often unde-
tected). This allowed users to run their fingertips over the
buttons feeling all of the edges. When the fingertip-slip
event is triggered, a 3-beat rough 500ms Tacton is pre-
sented. The rough texture is created using an amplitude
modulated 175Hz sine wave. This Tacton was designed to
be attention grabbing and to feel very different to the other
cues allowing easy identification of a slip by the user.
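To make the event-to-Tacton mapping concrete, the sketch below (Python, our own illustration rather than the authors' implementation) tabulates the three cues with the parameters given above and fires the appropriate one for each keyboard event. The event names and the play callback are hypothetical.

from dataclasses import dataclass

@dataclass
class Tacton:
    beats: int        # number of pulses
    duration_ms: int  # total length of the cue
    texture: str      # "smooth", "sharp" or "rough"
    carrier_hz: int   # carrier frequency

# Parameters taken from the descriptions above; the table structure is ours.
TACTONS = {
    "fingertip_over":  Tacton(1, 300, "smooth", 250),
    "fingertip_click": Tacton(1, 30,  "sharp",  175),
    "fingertip_slip":  Tacton(3, 500, "rough",  175),
}

def on_keyboard_event(event, play):
    """Present the Tacton associated with a keyboard event, if one exists."""
    tacton = TACTONS.get(event)
    if tacton is not None:
        play(tacton)

# Example: the touch handler would call this when a fingertip crosses a button edge.
on_keyboard_event("fingertip_slip", play=print)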
Methodology
We designed an experiment to investigate the effects of
incorporating tactile feedback into mobile touchscreen but-
tons. The experiment compared user performance on a typi-
cal mobile device featuring a real, physical keyboard (Palm
Treo), to a touchscreen keyboard with added tactile feed-
back (Samsung i718) and to the same device with no tactile
feedback. The experiment hypotheses were as follows:
1. Participants will be able to enter text with the least er-
rors and greatest speed on the physical keyboard;
2. Tactile feedback will improve speed and accuracy of
text entry on touchscreen keyboards;
3. Touchscreens with tactile feedback will achieve com-
parable accuracy and speed levels to physical key-
boards;
4. Tactile feedback will improve the speed and accuracy
of text entry on touchscreens when mobile.
We needed participants who had some expertise in text en-
try on mobile devices, so before beginning the experiment
participants were required to complete a questionnaire on
their text entry habits. We chose participants who send, on
average, 1 - 10 text messages on a QWERTY mobile device
per day as they can be considered moderate users who have
experience of using physical keyboards. It was not possible
to get enough participants who had experience of
touchscreen keyboards as they are not yet common. Users
were given training with the keyboards as discussed later.
We used a within-subjects design where the conditions
were:
1. Standard mobile device with physical keyboard (the
control Physical condition);
2. Touchscreen mobile device with tactile feedback added
to soft keyboard (the Tactile condition);
3. Touchscreen mobile device with soft keyboard (the
Standard condition).
The phrase set used for the text in the experiment was from
MacKenzie [10] and has been used successfully in several
studies [11, 17]. It is a 500-phrase set with no punctuation
symbols and no upper case letters. Due to time constraints
from our experimental design, the full set of 500 phrases
could not be used so a random set of 30 phrases was selec-
ted for each run of the experiment. This resulted in each
condition (Standard, Tactile and Physical) lasting approxi-
mately 20 minutes. All conditions were tested in a static lab
environment and also on the move on a subway train.
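The random selection step can be illustrated in a few lines of Python; this is just a sketch under the assumption that the 500 phrases are stored one per line in a text file (the file name is hypothetical).

import random

def sample_phrases(path="mackenzie_phrase_set.txt", n=30, seed=None):
    """Draw n distinct phrases from the phrase set for one run of the experiment."""
    with open(path) as f:
        phrases = [line.strip() for line in f if line.strip()]
    return random.Random(seed).sample(phrases, n)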
EXPERIMENT 1 – LAB AND MOBILE STUDY
Twelve participants took part in this experiment. All par-
ticipants were students or staff at the University with an age
range of 18 to 38 years. There were 3 female participants
and 9 male participants. Two participants were left-handed.
All participants were seated during the experiment and
asked to hold the device in their hands at all times.
Participants were shown a phrase and asked to memorise it,
then type it in using the keyboard for each condition. They
were asked to enter it as quickly and as accurately as pos-
sible. Each phrase was shown at the top of the screen until
the participant began typing at which point the phrase dis-
appeared. The interface used on the i718 is shown in Figure
4. The Treo had the same display, except the onscreen key-
board was not shown; participants hit the physical ‘Enter’
key to submit a phrase. This method sits in between the text
creation method and the text copy method. Text creation
(where users come up with their own messages), although
most realistic, is difficult to use as errors cannot easily be
detected. Text copy (users copy messages on the screen) is
not very realistic as most users do not copy their text mes-
sages or emails, for example, from a piece of paper onto
their device. The method used in this experiment was not
text creation, but the participants were not copying text
directly onto the device either, making it a slightly more
realistic scenario (Brewster et al. used the copying method
in their earlier study). Timing began when the participants
hit the first key and stopped when they hit ‘Submit’ (or En-
ter on the Treo). They moved on to the next task whether or
not the phrase was correct.
Figure 4: Screenshot of the experiment interface.
The mobile part of the study could have been tested in
many different ways, for example, with users walking [14].
This has been shown to be an effective way to generate
some of the workload of using a device whilst on the move
[1]. We chose to investigate the interaction on a subway
train (Figure 5) on the Glasgow Underground. People use
PDAs and phones on trains and buses every day whilst
commuting. The underground is a good platform for testing
as noise levels are very dynamic, being quiet when stopped
at a station, but very noisy when the train is in motion.
Light levels again vary dramatically. Vibration and move-
ment are also very changeable. When the train is stopped
there is little vibration. However, when it accelerates and
decelerates people are subjected to lots of forces and vibra-
tion from the engine and general movement. Another im-
portant factor for this experiment is that the within-subjects
design used meant that participants had to use three differ-
ent keyboards which took around one hour. This would be
too far for some of the participants to walk. The subway
allowed us to test in a realistic usage situation but without
fatiguing the users too much.
Conditions in this experiment were fully counterbalanced.
Half of the participants completed the lab-based experiment
first while the other half took part in the mobile subway
train session first. For both the lab and mobile parts of the
experiment, the keyboard conditions were also counterbal-
anced. The first set of conditions was completed on one day
and the second set was completed at least one day later, to
avoid participant fatigue. A training period was given be-
fore each trial (with five phrases for each keyboard type) to
familiarise each user with the interface to be used. Tactile
feedback was described and users were given the chance to
physically feel the feedback with their fingertips. The de-
pendent variables measured in the experiment were speed,
accuracy, keystrokes per character and subjective workload
(using the NASA TLX workload assessment [5]). We add-
ed an extra factor, annoyance, to the workload analysis to
specifically focus on any issues of irritation that the tactile
feedback might cause the participants.
Figure 5: The mobile condition of the experiment on the sub-
way train. The experimenter (on the left) takes notes whilst the
participant (on the right) enters text.
Results
Accuracy
The Physical keyboard condition had accuracy levels of
88.25% in the lab and 89.6% in the mobile setting. The
Tactile condition achieved scores of 82.7% in the lab and
80% on the train, while the Standard touchscreen keyboard
produced scores of 69.6% in the lab and 65.8% when mo-
bile (see Figure 6).
Figure 6: Average percentage of phrases entered correctly
(with standard deviations).
A two-way ANOVA was performed on the mean number of
correct phrases entered, comparing the effects of mobility
(static and mobile) and the three keyboard types (Physical,
Tactile and Standard). A correct response in this case was
when the entered phrase (when the ‘submit’/Enter button
was selected) matched the given phrase completely (regard-
less of whether corrections were made along the way).
The ANOVA showed a significant main effect for keyboard
type (F(2,33) = 96.9, p < 0.001). Using post hoc Tukey's
Pairwise Comparisons, it can be seen that a significantly
higher number of phrases was entered correctly on both the
physical keyboard and the tactile touchscreen than on the
standard touchscreen (p=0.05). There were no significant
differences in the number of correct phrases entered on the
physical keyboard and on the tactile touchscreen. The
scores are, on average, 5.5% lower on the tactile
touchscreen than the physical keyboard in the lab and 9.6%
lower when mobile. Only between 1.6 and 2.8 more phrases
were entered incorrectly in the tactile condition, suggesting
that the performance is comparable with the real keyboard.
There was no main effect for mobility (F(1,33)=1.79,
p=0.18) and no interaction between keyboard type and mo-
bility (F(2,33)=1.58, p=0.21). This suggests that users did
not type less accurately when on the move.
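The paper does not say which statistics package was used; as a hedged sketch, an equivalent two-way ANOVA (keyboard type x mobility) on per-participant accuracy scores could be run in Python with statsmodels roughly as follows. The data file and column names are assumptions for illustration only.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# One row per participant per condition: keyboard in {Physical, Tactile, Standard},
# mobility in {static, mobile}, correct_phrases out of 30.
df = pd.read_csv("experiment1_accuracy.csv")  # hypothetical file

model = smf.ols("correct_phrases ~ C(keyboard) * C(mobility)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects and the interaction term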
These results show that participants were still able to enter
text accurately when on the move, even with the distur-
bances caused by the train. It also shows the addition of
tactile feedback can overcome some of the problems caused
by the Standard touchscreen keyboard and enable people to
notice and recover from errors they make (which confirms
the result of Brewster et al. [1]).
Keystrokes Per Character (KSPC)
The number of keystrokes per character was recorded for
each keyboard type. KSPC is the number of keystrokes
required, on average, to generate a character of text for a
given text entry technique in a given language with the
ideal being one per character [17]. Given that accuracy
scores were based on whether or not the submitted phrase
matched the given phrase exactly and did not include cor-
rections as errors, KSPC was recorded in order to examine
how many corrections users had to make before submitting
a correct phrase. The average number of KSPC for each
condition is shown in Figure 7.
Figure 7: Average KSPC for each setting and keyboard type
(with standard deviations).
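Both measures can be computed directly from the keystroke logs. The following sketch (ours, not the study software) marks a phrase as correct only when the submitted text matches the target exactly, and computes KSPC as total keystrokes divided by the length of the transcribed text.

def phrase_correct(target, submitted):
    """A phrase counts as correct only on an exact match at submit time."""
    return submitted == target

def kspc(total_keystrokes, transcribed):
    """Keystrokes per character; 1.0 is the ideal (no corrections needed)."""
    return total_keystrokes / max(len(transcribed), 1)

# Example: an 11-character phrase entered with one error plus its backspace
# takes 13 keystrokes, giving a KSPC of about 1.18.
print(phrase_correct("hello world", "hello world"), kspc(13, "hello world"))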
A two-way ANOVA was performed on the KSPC data
comparing the effects of mobility and the keyboard type. A
significant main effect for KSPC for keyboard type was
found (F(2,33) = 6.58, p<0.0001). Tukey tests showed that
there were significantly more KSPC when typing on the
tactile touchscreen than the physical or standard keyboards
(p=0.05). There was no main effect for mobility
(F(1,33)=2.53, p=0.11) and no interaction between key-
board type and mobility (F(2,33)=0.04, p=0.95).
The standard touchscreen keyboard had a lower KSPC than
the tactile one. The reason for this is that participants cor-
rected fewer of the errors they made (as can be seen in Fig-
ure 6). In an experiment like this one this is a reasonable
tradeoff for participants as they are not penalised for errors
(they can continue to the next phrase even if the current one
is incorrect). In a real life setting, this would result in many
mis-typed email addresses or URLs. The physical keyboard
was still the best, with the lowest KSPC value. This sug-
gests that the tactile feedback added helps some aspects of
typing but it is not quite at the level of a real, physical key-
board. Further research into the tactile cues would be
needed to see if we could improve the results here.
Time to Enter Phrases
Figure 8 shows the average time taken to enter a phrase for
each keyboard condition in the lab and mobile settings.
Participants using the physical keyboard entered the phrases
with means of between 13 and 17 seconds (lab and mobile).
The tactile touchscreen allowed participants to enter a
phrase of text in 20 seconds (lab) and 22 seconds (mobile)
while text entry on the standard touchscreen took longer
with rates of between 25 and 27 seconds.
Figure 8: Average time per phrase entered in seconds (with
standard deviations).
A two-way ANOVA on time to enter phrases for the key-
board types and mobility showed a significant main effect
for keyboard type (F(2,33) = 69.78, p< 0.001). Tukey HSD
tests showed that the time taken to enter phrases on the
physical keyboard and tactile touchscreen were signifi-
cantly lower than on the standard touchscreen keyboard
(p=0.001). The physical keyboard was significantly faster
than the tactile one (p=0.05), indicating that the physical
keyboard allows faster typing speeds than the touchscreen
even with tactile feedback. This result means that tactile
additions to the standard keyboard again had a significant
positive effect on the usability of the device. Combining
this with the accuracy results suggests that tactile feedback
can offer some significant advantages for touchscreen de-
vices.
This time there was a significant main effect for mobility
(F(1,33)=9.48, p=0.003), with the mobile condition increas-
ing the time taken to enter phrases over the lab (there was
no interaction, F(2,33)=2.65, p=0.077). This shows that
being mobile does slow down text entry rates due to the
movements in the environment even though it did not affect
accuracy. This may be because participants chose to main-
tain accuracy at the expense of input speed.
Subjective Workload
The results of the NASA TLX [5] questionnaires are shown
in Figure 9. A two-way ANOVA on overall workload
showed a significant main effect (for the standard six fac-
tors) for keyboard type (F(2,498) = 111.35, p<0.001). There
was no significant main effect for mobility (F(2,498) =
0.19, p = 0.66) and there was no interaction (F(2,498) = 0.7,
p = 0.49). Tukey HSD tests showed that overall workload
when using the standard touchscreen keyboard is signifi-
cantly higher than when using the physical keyboard or the
tactile touchscreen keyboard (p=0.05). There was no sig-
nificant difference between the Physical and Tactile condi-
tions.
Figure 9: Average scores from NASA TLX questionnaires.
Further analysis using a single factor ANOVA on each of
the workload factors showed a significant difference in all
seven factors of the workload analysis with p<0.001. A
Tukey HSD test confirmed that the mental demand, phys-
ical demand, frustration and annoyance levels when using
the standard touchscreen are significantly higher than when
using the other keyboard types. It also showed that per-
ceived performance levels are significantly lower on the
standard touchscreen.
The analysis of each workload factor also showed that tem-
poral demand and effort are significantly higher when using
the standard or tactile touchscreen than when using the
physical keyboard.
Some participants commented that the standard touchscreen
was frustrating as they did not receive feedback when their
fingertip had moved off the edge of the button. Any visual
feedback was masked by the fingertip over the button. A
few also commented that they resorted to using their
fingernails to tap the button in order to be more accurate but
that this was uncomfortable.
Discussion
Hypothesis 1 can be accepted as it has been shown that
using the physical keyboard produces significantly fewer
errors and the greatest input speed with phrases being en-
tered up to 10 seconds faster than on the standard
touchscreen. The greatest input speed results apply in both
the lab and mobile settings.
Hypotheses 2 and 4 can also be accepted as the results show
that touchscreen keyboards with tactile feedback produce
fewer errors and greater speeds of text entry compared to
standard touchscreen keyboards without tactile feedback
both in the lab setting and in the mobile setting. This can be
seen clearly in the mobile setting where phrases were en-
tered up to 6 seconds faster with tactile feedback and accu-
racy scores were as high as 74% compared to the poor ac-
curacy scores on the standard touchscreen keyboard. These
results indicate that when in a mobile situation on a bumpy
noisy train, it becomes even more difficult to use a standard
touchscreen keyboard but tactile touchscreens still perform
significantly better despite the dynamic environment.
Given that text entry on the tactile touchscreen only took 4
seconds longer on average than the physical keyboard and
the accuracy results between both keyboards are compa-
rable, hypothesis 3 can be partially accepted. These results
suggest that tactile feedback should be added to touchscreen
phones as it can significantly improve performance over a
phone without such feedback.
EXPERIMENT 2 - TACTILE TOUCHSCREEN VERSION 2
Given the promising results obtained in the first experiment
suggesting that tactile feedback could significantly improve
the usability of fingertip touchscreen text entry, it was de-
cided that this should be further investigated using an alter-
native, higher specification actuator incorporating spatial
location to see if performance could be improved even fur-
ther. Therefore, the same experiment was run again in the
lab and mobile environments, but this time using a Dell
Axim PDA. The Dell PDA was used as the new vibrotactile
actuators could not be connected to the proprietary audio
connection in the i718. The aim of this experiment was to
investigate whether an alternative, more expensive, spe-
cialised actuator providing localisation could increase per-
formance and get closer to that of the real physical key-
board.
Figure 10: A C2 Tactor from Engineering Acoustics Inc.
Hardware
The C2 Tactor from EAI (Figure 10) was used for this
study. It is a small wearable linear vibrotactile actuator,
which was designed specifically to provide a lightweight
equivalent to large laboratory-based linear actuators [12].
The contactor in the C2 is the moving mass itself, which is
mounted above the housing and pre-loaded against the skin.
This provides localised feedback as only the contact point
(the silver dot in the centre of Figure 10) vibrates instead
of the whole casing (in a typical mobile phone like the
i718 the whole device shakes). The
C2 is resonant at 250Hz but is also designed to be able to
produce a wide range of frequencies unlike many basic mo-
bile phone actuators which have a limited frequency range.
In the experiment, a standard Dell Axim PDA was aug-
mented with two C2 actuators attached to the back (Figure
11) so that they rested under the user’s hand when it was
held. One pressed against the palm of the hand, the other
the middle of the fingers.
Figure 11: The Dell Axim PDA with 2 C2 actuators on the
back.
Feedback Design
The tactile feedback used in this experiment was identical
to that provided in Experiment 1 in that the manipulated
tactile parameters are the same. However, the feedback felt
very different due to the higher quality of the actuators:
they ramp up faster and modulation is clearer. The only
other difference was that the actuators provide localised
feedback to the hand holding the device as opposed to shak-
ing the whole device. By placing the two actuators on the
left and right sides of the device, spatial location could be
incorporated into the feedback to give some indication of
which button was giving the feedback. Using multiple ac-
tuators to provide spatial cues has been used successfully in
a previous study investigating vibrotactile progress bars on
a PDA [6]. Three actuators were pulsed in turn to give the
feeling of movement around a circle to indicate the progress
of a download. The design we used here is similar but uses
only two actuators and applies it to text entry.
Whenever a fingertip-over, fingertip-click, or fingertip-slip
event was triggered, the actuator placed nearest the button
would be used to present the feedback. For instance, if the
button ‘A’ was pressed, the actuator on the left was acti-
vated, if the button ‘G’ was pressed, both actuators were
activated and if the button ‘L’ was pressed, the actuator on
the right was activated.
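A minimal sketch of this left/both/right mapping, assuming the actuator is chosen from the key's horizontal position in its QWERTY row (the row strings and thresholds below are our assumptions, not the authors' code):

ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def actuators_for_key(key):
    """Return which of the two C2 actuators to pulse for a given key."""
    for row in ROWS:
        if key in row:
            pos = row.index(key) / (len(row) - 1)  # 0.0 = far left, 1.0 = far right
            if pos < 0.4:
                return {"left"}
            if pos > 0.6:
                return {"right"}
            return {"left", "right"}  # keys near the middle, e.g. 'G', pulse both
    return set()

# Matches the examples in the text: 'a' -> left, 'g' -> both, 'l' -> right.
print(actuators_for_key("a"), actuators_for_key("g"), actuators_for_key("l"))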
Methodology
The aims and methodology of this experiment were the
same as the previous experiment with one additional hy-
pothesis:
5. Vibrotactile feedback from the C2 actuators will pro-
vide better results than the built-in actuator in the mo-
bile device.
A new set of 12 participants were recruited from the Uni-
versity. They were presented with 30 random phrases from
the MacKenzie phrase set and asked to enter them as
quickly and as accurately as possible on the PDA (with and
without tactile feedback). Participants used the device in the
lab and then on the subway. They only used this device for
the tactile touchscreen and standard touchscreen conditions
as we could compare back to Experiment 1 for performance
on the physical keyboard condition. Once again, speed,
accuracy and KSPC were measured.
Results From Experiment 2
Accuracy
The average number of phrases entered correctly is shown
in Figure 12 alongside the results from Experiment 1 for
comparison. The PDA with actuators scored an average of
23.8 correct phrases in the lab condition and 24.5 when mobile.
Figure 12: Average percentage of phrases entered correctly
(with standard deviations).
A two-way ANOVA with replication was performed on the
mean number of correct phrases entered, comparing the
effects of mobility (static and mobile) and the four key-
board types (Physical, Tactile and Standard from Experi-
ment 1 and the PDA with 2 C2 actuators). Although par-
ticipants in experiment 2 had less practice (they only did
PDA with no tactile and PDA with tactile in lab and mo-
bile), the first study was fully counterbalanced so that a
valid comparison can be made between both sets of data
from each experiment. The ANOVA showed there was a
significant main effect for keyboard type (F(3,88) = 84.6,
p<0.0001). Post hoc Tukey tests showed that significantly
more phrases were entered correctly on the physical key-
board, PDA with 2 actuators and on the tactile touchscreen
compared to the standard touchscreen (p=0.05). There were
no significant differences between the tactile touchscreen,
PDA and the physical keyboard. The results show that the
average number of correct phrases on the physical keyboard
(26.4 in the lab, 26.9 when mobile) and the PDA (25.3 in
the lab, 24.4 when mobile) were very similar. There was
again no main effect for mobility (F(1,88)=2.1, p=0.144)
and no interaction (F(3,88)=1.39, p=0.24).
Keystrokes per Character (KSPC)
The KSPC data for Experiment 1 and the PDA with the C2
actuators is shown in Figure 13. The PDA scored a mean of
1.20 in the lab and 1.24 when mobile.
Figure 13: Average keystrokes per character (with standard
deviations).
A two-way ANOVA showed a significant main effect in
KSPC for keyboard type (F(3,88)=4.82, p=0.003). Using
post hoc Tukey's Pairwise Comparison, it can be seen that a
significantly higher number of KSPC occurred when using
the tactile touchscreen compared to all three other types of
keyboard including the PDA with C2 actuators (p=0.05).
There were no significant differences in the KSPC on the
physical keyboard and on the PDA with C2 actuators or the
standard touchscreen. This suggests that the PDA has an
advantage over the tactile touchscreen in that KSPC is re-
duced, meaning that it is closer to the performance of the
real keyboard. On the other hand, the KSPC results could
be seen as an indication that participants corrected more of
their errors on the tactile touchscreen, therefore increasing
KSPC. This suggests that the tactile touchscreen helps users
to identify errors more easily by providing fingertip-slip
feedback, while errors could go unnoticed on the standard
touchscreen keyboard with no tactile feedback.
There was no main effect for mobility (F(1,88)=3.42,
p=0.07) and no interaction between mobility and keyboard
type (F(3,88)=0.03, p=0.98).
Time to Enter Phrases
The average time to enter a phrase is shown in Figure 14. A
two-way ANOVA on mobility and keyboard type showed a
significant main effect for keyboard type (F(3,88)=70.41,
p<0.001), for mobility (F(1,88) = 10.24, p=0.001), with a
significant interaction between the two (F(3, 88)=2.92,
p=0.03).
As before, the average time to enter a phrase was signifi-
cantly affected by keyboard type. Tukey tests showed that
the time per phrase on the PDA was significantly lower
than on the tactile touchscreen and the standard touchscreen
(p=0.05). There was no significant difference between the
physical keyboard and the PDA. These results suggest that
the more specialised C2 actuator can improve results over
the actuator in the Samsung i718 and get closer to the real
keyboard. The C2's are too large to be incorporated into a
current mobile device, but it does suggest that manufactur-
ers could improve the capabilities of some of the more basic
actuators used and gain further performance benefits.
As in Experiment 1 we found that mobility significantly
increased the time taken to enter phrases, but this time there
was a significant interaction between keyboard type and
mobility. The interaction occurred as there was no change
in performance in the PDA condition when static and mo-
bile. All other keyboard types performed worse when mo-
bile, but performance with the PDA went from 17.5 to 17.9
seconds per phrase. It is not clear why this occurred and
further investigation is needed to see if there is a real effect.
It does suggest, however, that the performance with that
virtual tactile keyboard is robust, with performance in the
mobile condition very close to the real physical keyboard.
Figure 14: Average time per phrase in seconds (with standard
deviations).
Subjective Workload
The results of the NASA TLX questionnaires are shown in
Figure 15. A two-way ANOVA on overall workload
showed a significant main effect for keyboard type
(F(3,498) = 88.62, p<0.001) but no effect for mobility (F =
0.12, p = 0.72) and no interaction between them. A one-way
ANOVA performed on each of the seven workload factors
followed by Tukey tests showed that mental demand, phys-
ical demand, temporal demand, effort and frustration were sig-
nificantly increased and perceived performance was signifi-
cantly decreased when using the standard touchscreen.
Unlike Experiment 1, the ANOVA and Tukey tests showed a
significantly higher level of annoyance for the PDA with
C2 actuators than with the physical keyboard or with the
original tactile touchscreen (F(2,33) = 35.4, p < .0001). It is
not clear why there should be more annoyance in this case,
particularly as performance overall was improved. It may
be due to the stronger forces that the C2 actuators can ap-
ply. We did not allow the users to change the force of the
actuators, but that could easily be done in the same way as
the volume of the audio can be changed.
Figure 15: NASA TLX Scores Compared with Experiment 1.
Observed Methods of Finger Input
One interesting observation from these studies was the dif-
ferent ways in which participants chose to use their fingers
to press the buttons. All but one of the participants used
both thumbs when entering text on the physical keyboard.
On the touchscreen, both with and without tactile feedback,
participants used several different techniques. Some would
go from using their forefinger to the middle finger to the
fingernail and back to the forefinger for instance. Two of
the participants even attempted using their little fingers to
enter the text. At no time did the participants use their
thumbs or more than one finger at once on the
touchscreens. It appeared as though they were uncomfort-
able and could not find an easy way to use their fingers with
the displays. One participant commented that he felt like he
was ‘learning to type all over again’. This could simply be
due to the relative novelty of touchscreen mobile devices
and participants’ lack of experience with them or, perhaps,
could indicate that a different keyboard layout is required.
DISCUSSION AND CONCLUSIONS
The two studies reported here have shown that tactile feed-
back can significantly improve fingertip interaction and
performance with soft keyboards on touchscreen mobile
devices. With the addition of this extra tactile feedback the
performance of touchscreen keyboards can be brought close
to the level of real, physical keyboards. This means that the
benefits of touchscreen displays (such as the flexibility to
use the whole screen for games, web pages or photos) do
not come at the cost of poorer text or number entry. It has
been demonstrated that tactile feedback can benefit
touchscreen interaction in both stationary situations and
more varying, realistic mobile situations.
Furthermore, a comparison of two different types of tactile
actuator showed that text entry can be further improved
by using multiple, specialised actuators which can provide
localised feedback as opposed to a single standard actuator
which vibrates the whole device. However, the results for
both types of tactile touchscreen show that user perform-
ance is significantly better than when using a touchscreen
with no tactile feedback. Therefore, given that the C2 actua-
tors used are expensive and not currently found in standard
devices, it would appear to still be beneficial and easier to
augment touchscreens with tactile feedback provided by the
actuator already present in the phone.
Given these promising results, it is probable that tactile
feedback will aid in all interactions with touchscreen but-
tons, not just text entry, and future studies will examine
fingertip interaction with other types of touchscreen widg-
ets such as progress bars, icons, menus and sliders.
The results of our studies suggest that manufacturers should
include tactile feedback in new touchscreen devices. There
were no drawbacks from including it, only benefits. Fur-
thermore, such feedback is likely to be of benefit to stylus
interaction as well as fingertip interaction (shown also in
[1]). Our results strongly suggest that using either the built-
in vibrotactile actuator already present in most mobile de-
vices or more specialised actuators to produce tactile feed-
back can improve the usability of touchscreen keyboards.
ACKNOWLEDGEMENTS
This work was supported by Immersion Corp, California
and EPSRC Advanced Research Fellowship GR/S53244.
Hoggan is joint funded by Nokia and EPSRC. Many thanks
to Chris Ullrich, Director of Applied Research, Immersion
Corp and Strathclyde Passenger Transport for allowing us
to test our software on their trains.
REFERENCES
1. Brewster, S., Chohan, F. and Brown, L. Tactile Feed-
back for Mobile Interactions. In Proc CHI '07, ACM
Press (2007), 159 - 162.
2. Brown, L. M. and Brewster, S. A. Multidimensional
Tactons for Non-Visual Information Display in Mobile
Devices. In Proc MobileHCI '06, ACM Press (2006),
231 - 238.
3. Dvorak, A. There Is a Better Typewriter Keyboard.
National Business Education Quarterly 11, (1943), 51 -
58.
4. Erp, J. B. Tactile Navigation Display. In Proc First
international Workshop on Haptic Human-Computer
Interaction, LNCS (2001), 165 - 173.
5. Hart, S. G. and Staveland, L. E. Development of Nasa-
Tlx (Task Load Index): Results of Empirical and Theo-
retical Research. Human Mental Workload (1988), 139 -
183.
6. Hoggan, E., Brewster, S. and Anwar, S. Mobile Multi-
Actuator Tactile Displays. In Proc 2nd International
Workshop on Haptic and Audio Interaction Design,
Springer LNCS (2007), 22 - 33.
7. Kaaresoja, T., Brown, L. M. and Linjama, J. Snap-
Crackle-Pop: Tactile Feedback for Mobile Touch
Screens. In Proc Eurohaptics '06, (2006), 565 - 566.
8. Lee, J. C., Dietz, P. H., Leigh, D., Yerazunis, W. S. and
Hudson, S. E. Haptic Pen: A Tactile Feedback Stylus for
Touch Screens In Proc 17th annual ACM symposium on
User interface software and technology ACM Press
(2004), 291-294.
9. Luk, J., Pasquero, J., Little, S., MacLean, K., Levesque,
V. and Hayward, V. A Role for Haptics in Mobile Inter-
action: Initial Design Using a Handheld Tactile Display
Prototype In Proc CHI '06, ACM Press (2006), 171-180
10. MacKenzie, I. S. and Soukoreff, R. W. Phrase Sets for
Evaluating Text Entry Techniques. In Proc CHI '03,
ACM (2003), 754 - 755.
11. MacKenzie, I. S., Zhang, S. and Soukoreff, R. W. Text
Entry Using Soft Keyboards. Behaviour and Informa-
tion Technology 18, 4 (1999), 235 - 244.
12. Mortimer, B., Zets, G. and Cholewiak, R. W. Vibrotac-
tile Transduction and Transducers. The Journal of the
Acoustical Society of America 121, (2007), 2970 - 2977.
13. Parhi, P., Karlson, A. K. and Bederson, B. B. Target
Size Study for One-Handed Thumb Use on Small
Touchscreen Devices. In Proc Mobile HCI '06, ACM
Press (2006), 203 - 210.
14. Pirhonen, A., Brewster, S. and Holguin, C. Gestural and
Audio Metaphors as a Means of Control for Mobile De-
vices. In Proc CHI '02, ACM Press (2002), 291 - 298.
15. Poupyrev, I. and Maruyama, S. Tactile Interfaces for
Small Touch Screens. In Proc 16th Annual ACM Sym-
posium on User Interface Software and Technology,
ACM Press (2003), 217 - 220.
16. Sears, A., Revis, D., Swatski, J., Crittenden, R. and
Shneiderman, B. Investigating Touchscreen Typing: The
Effect of Keyboard Size on Typing Speed. Behaviour
and Information Technology 12, 1 (1993), 17 - 22.
17. Soukoreff, R. W. and MacKenzie, I. S. Metrics for Text
Entry Research: An Evaluation of MSD and KSPC, and
a New Unified Error Metric. In Proc CHI '03, ACM
(2003), 113 - 120.