MSc thesis
“Augmented freehand pointing on Wall Size Displays”
Abstract
There is a clear industry trend towards larger displays. Wall Size Displays (WSDs) can be used in
many settings, from conference rooms and large public displays to private and public
entertainment systems. To enhance user experience and performance when interacting with large
displays, emerging technologies such as Augmented Reality (AR) can be embedded in the
interaction. The present project investigates whether a better level of both user experience and
performance can be achieved by using Augmented Reality in interaction with Wall Size Displays.
A WSD composed of 12 HD projectors was used, with a total resolution of 7680 × 3240 pixels at
roughly 68 pixels per inch. Distant freehand gesture interaction with marker-based tracking from
the OptiTrack motion capture system was employed, simulating the basic point-and-click scenario.
The Epson Moverio BT-200 smart glasses, an optical see-through head-mounted display, served as
the AR device, and the augmentation feedback was a laser beam. Three user interfaces were
implemented and compared: the Simple, Combined and Augmented Interfaces. The Simple Interface
acted as the control, where users could point and click only with the assistance of the cursor.
The Combined Interface differed from the Simple by incorporating AR, so users interacted with the
assistance of both the cursor and the laser beam. The third, the Augmented Interface, hid the
cursor entirely: interaction was performed only through the AR glasses. To evaluate and compare
the performance of the interfaces, 12 participants performed a multidirectional pointing task as
described in the ISO 9241-9 standard. The task for each interface was divided into 9 conditions
with different Indices of Difficulty.
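In the ISO 9241-9 methodology, each condition's Index of Difficulty (ID) is commonly computed with the Shannon formulation of Fitts' law, ID = log2(D/W + 1), where D is the movement distance and W the target width. As a minimal sketch of how a 9-condition grid could arise, the snippet below crosses three hypothetical amplitudes with three hypothetical target widths; the specific values are illustrative assumptions, not the thesis's actual condition set.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

# Hypothetical 3 x 3 grid -> 9 conditions with distinct IDs
amplitudes = [500, 1000, 1500]   # movement distances in mm (assumed)
widths = [40, 80, 160]           # target widths in mm (assumed)

for d in amplitudes:
    for w in widths:
        print(f"D={d} mm, W={w} mm -> ID={index_of_difficulty(d, w):.2f} bits")
```

Larger distances and smaller targets yield higher IDs, which is why the small-target conditions discussed below stress the interfaces the most.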
The glasses' capabilities and performance imposed several limitations on the experiment.
Nonetheless, a positive effect of the augmentation was observed in the Combined Interface's
performance and subjective assessment: its mean movement times and mean error rates tended to be
better overall than the Simple Interface's, especially in conditions with large and medium
targets. The Augmented Interface's performance was not considerably poorer than that of the other
two. Excepting the small targets, which greatly hampered interaction in this interface, its
performance in conditions with large and medium targets was comparable to that of the other
interfaces. The Augmented Interface thus behaved satisfactorily for pointing and clicking on
large and medium targets while adequately eliminating the cursor, showing potential for solving
multi-user interaction problems.