Dasher: Efficient Text Entry By Navigation
Alan Lawrence, Patrick Welche
Department of Physics
University of Cambridge
New Method Of Text Entry

Unlike keyboards or handwriting techniques, Dasher lists all possible words and sentences in alphabetical order, dividing up space between all letters A-Z. Text entry is then a matter of navigating or steering towards the desired sentence; this can be controlled by a wide variety of gestures.

Control by Continuous Gestures

● Mouse, touchscreen/pad, joystick: 1 or 2 dimensions (39wpm desktop+mouse / 20wpm on mobile device)
● Tilting by accelerometer (iPhone, iPad & Android)
● Head mouse, gaze & head tracking – opengazer [1] (29wpm with gaze only!)
● Breath mouse – or any continuous gesture (muscle!) (15wpm by breathing!)
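The navigation idea can be sketched in a few lines. This is a hypothetical illustration, not the Dasher implementation: each letter owns a slice of the unit interval proportional to its probability, and steering toward a point repeatedly zooms into whichever slice contains it, emitting one letter per zoom.

```python
# Minimal sketch of Dasher-style navigation (illustrative only):
# letters divide [0, 1) in alphabetical order, sized by probability.
ALPHABET = "abc"  # hypothetical 3-letter alphabet for brevity

def slices(probs):
    """Map each letter to its [low, high) interval within [0, 1)."""
    out, low = {}, 0.0
    for ch in ALPHABET:
        out[ch] = (low, low + probs[ch])
        low += probs[ch]
    return out

def steer(target, probs, depth):
    """Zoom `depth` times toward `target` in [0, 1); emit one letter per zoom."""
    text, low, high = "", 0.0, 1.0
    for _ in range(depth):
        for ch, (s_low, s_high) in slices(probs).items():
            # rescale the letter's slice into the current view
            c_low = low + (high - low) * s_low
            c_high = low + (high - low) * s_high
            if c_low <= target < c_high:
                text += ch
                low, high = c_low, c_high
                break
    return text

probs = {"a": 0.5, "b": 0.3, "c": 0.2}
print(steer(0.6, probs, 3))  # steering at one point spells out "bab"
```

Every sentence corresponds to some point in the interval, so holding the pointer on a target keeps zooming and keeps writing.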
Figure 1. Steer towards any point by mouse or touch
Figure 2. Click or tap to zoom in on an area
Figure 3. A 1-dimensional input is mapped onto a 2d curve
Figure 4. Performing actions within Dasher
Text Prediction & Arithmetic Encoding

A language model predicts what you may write next:
● Probable things are big, so quick and easy to write;
● Improbable things are smaller and so harder (but always possible: Dasher is mode-free)
● The model is trained on a sample of text in any alphabet, language, or writing style
● Learns as the user writes.

Control by Switches

Timing-insensitive: divide into boxes
● Direct mode: one button per box
● Menu mode: buttons for scan-to-next & select
Timing-sensitive: one- or two-button dynamic modes
● Greater accuracy → fewer presses
● One press can enter many letters
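The adaptive part of the language model can be sketched as follows. This is a simplified, hypothetical illustration (Dasher's actual model is a context model such as PPM): letter probabilities start from counts over a training sample and keep updating as the user writes, with smoothing so every letter remains possible.

```python
from collections import Counter

# Hypothetical sketch of an adaptive letter model (not Dasher's PPM model):
# counts come from a training sample and grow as the user writes.
class AdaptiveLetterModel:
    def __init__(self, sample, alphabet):
        self.alphabet = alphabet
        # add-one smoothing: every letter stays possible (mode-free)
        self.counts = Counter({ch: 1 for ch in alphabet})
        self.counts.update(ch for ch in sample if ch in self.counts)

    def prob(self, ch):
        """Probability of `ch`; this sets the size of its box on screen."""
        return self.counts[ch] / sum(self.counts.values())

    def learn(self, ch):
        """Update counts as the user writes."""
        self.counts[ch] += 1

model = AdaptiveLetterModel("banana", alphabet="abn")
before = model.prob("a")
model.learn("a")                  # user writes another 'a'
assert model.prob("a") > before   # probable letters get bigger boxes
```

The same mechanism works for any alphabet or writing style, since nothing here depends on the sample being English.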
Figure 5. Scanning: switch zooms into highlighted box
Figures 6 & 7. Two/one buttons, with timing: press when sentence passes a line

Nomon: Switch Interface

● Select arbitrary locations on screen
● Uses timing of presses of a single switch
● Faster than scanning
● Clockface for each item → click at noon
● Learns how accurately you click
● Uses information theory to guess target
● Combines guesses to high accuracy
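The evidence-combining step can be sketched as below. This is a hypothetical simplification of Nomon's method: each item's clock hand passes noon at a different phase, a click is modelled as a noisy (Gaussian) observation of that phase, and log-likelihoods from several presses are summed to pick the most probable target.

```python
import math

# Hypothetical sketch of Nomon-style evidence combination (simplified):
# sum log-likelihoods of noisy click times across presses per item.
def log_gauss(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def guess(phases, clicks, period=1.0, sigma=0.05):
    """Return the item whose noon-passing phase best explains all clicks."""
    scores = {}
    for item, phase in phases.items():
        total = 0.0
        for t in clicks:
            # offset of this click from the item's nearest noon-passing
            offset = (t - phase + period / 2) % period - period / 2
            total += log_gauss(offset, 0.0, sigma)
        scores[item] = total
    return max(scores, key=scores.get)

phases = {"yes": 0.0, "no": 0.33, "maybe": 0.66}
# three slightly-off clicks aimed at "no": combined they identify the target
print(guess(phases, clicks=[0.35, 0.31, 0.34]))
```

No single press need be precise; accuracy comes from accumulating evidence, and a learned `sigma` lets the interface adapt to how accurately each user clicks.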
Acknowledgments

We would like to thank David MacKay for his guidance, the many previous authors of Dasher, and the Gatsby Foundation.

References

[1] http://www.inference.phy.cam.ac.uk/opengazer/
[2] http://www.inference.phy.cam.ac.uk/nomon/
www.aegis-project.eu