[1] Fit Your Hand: Personalized User Interface Considering Physical Attributes of Mobile Device Users
    Authors: Hosub Lee and Young Sang Choi, Samsung Electronics Co., Ltd.
    Source: ACM UIST '11 (October 16–19, 2011), Santa Barbara, California, USA.

[2] BLUI: Low-cost Localized Blowable User Interfaces
    Authors: Shwetak N. Patel and Gregory D. Abowd, Georgia Institute of Technology
    Source: ACM UIST '07 (October 7–10, 2007), Newport, Rhode Island, USA.
    http://www.youtube.com/watch?v=34SEmMxbNkQ
[3] WUW - Wear Ur World - A Wearable Gestural Interface
    Authors: Pranav Mistry, Pattie Maes, and Liyan Chang, MIT Media Lab
    Source: ACM CHI 2009 (April 4–9, 2009), Boston, MA, USA.
    http://www.pranavmistry.com/projects/sixthsense/

[4] Light-tech Interaction Used in Ubiquitous Interface Design
    Author: Feng Xian, Tsinghua University
    Source: ACM RDURP '11 (September 18, 2011), Beijing, China.

Adviser: Geeng Neng You
Speaker: 張靜蓉
Date: 2012/01/
Comparison of the four user interfaces

Name of interface
  WUW: Wear Ur World
  BLUI: Localized Blowable User Interfaces
  FUH: Fit Your Hand
  Light-tech: Light-tech Interaction

Scenario
  WUW: Augmented reality
  BLUI: Entertainment, assistive technology
  FUH: Easy-to-touch positions
  Light-tech: Ubiquitous

Usage
  WUW: Freehand gestures
  BLUI: Hands-free interaction
  FUH: Handedness
  Light-tech: Face / visual recognition

Operation
  WUW: 1. Gestures supported by multi-touch systems; 2. Freehand gestures; 3. Iconic gestures
  BLUI: Blowable selection, scrolling, dragging, and the physical blow metaphor
  FUH: Handedness (left-hander / right-hander), finger length, usage habits
  Light-tech: Emotional lamp of Chameleon

Expression
  WUW: Intuitive experience of full-sized gestural systems
  BLUI: Intuitive; directly controls interactive elements through localized blowing
  FUH: Intuitive user interaction; dynamically reformulates the layout
  Light-tech: Intuitive emotional recognition

Equipment
  WUW: Laptop, projector, camera, color markers
  BLUI: Laptop, computer screen, microphone
  FUH: Smartphone, tablet
  Light-tech: Laptop, computer screen

Application
  WUW: Microsoft Windows platform using C#, WPF, and OpenCV
  BLUI: C++ application; Java application connected via a TCP connection; FFT; machine learning (KNN); interface divided into 3x3, 4x4, 5x5, and 6x6 regions
  FUH: Android; machine learning algorithm; interface divided into 4x5 = 20 cells
  Light-tech: MCUs (Microprocessor Control Units), as used in Arduino, a very popular open-source interactive hardware platform
[1] Fit Your Hand: Personalized User
    Interface Considering Physical
    Attributes of Mobile Device Users
• A mobile user interface.
• Dynamically reformulates the layout.
• Integrates a machine learning algorithm.
• Infers users' physical characteristics: handedness, finger length, and usage habits.
• Calculates the optimal touch area for the user (a sketch follows below).
Example per-cell calculations (value × 1.3):
A03 = 11 × 1.3 = 14.3    A12 = 13 × 1.3 = 16.9    A13 = 16 × 1.3 = 20.8    A21 = 22 × 1.3 = 28.6
A22 = 18 × 1.3 = 23.4    A30 = 20 × 1.3 = 26.0    A40 = 13 × 1.3 = 16.9    A41 = 15 × 1.3 = 19.5
Figures: workflow of the proposed system, the user interface architecture, and the implementation.
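To make the idea concrete, here is a minimal sketch of how per-cell scores could be weighted and the optimal touch area selected. The cell names and raw values come from the example calculations above; the scoring function, the fixed 1.3 weight, and the function names are assumptions for illustration, not the authors' implementation (which runs on Android).

```python
# Hypothetical sketch: weight raw per-cell touch values for the 4x5 grid and
# pick the easiest-to-reach cells (assumption, not the paper's actual code).
RAW_COUNTS = {  # raw per-cell values from the slide's example
    "A03": 11, "A12": 13, "A13": 16, "A21": 22,
    "A22": 18, "A30": 20, "A40": 13, "A41": 15,
}
WEIGHT = 1.3    # weighting factor shown in the slide's example

def score_cells(raw_counts, weight=WEIGHT):
    """Weighted score for each cell."""
    return {cell: value * weight for cell, value in raw_counts.items()}

def optimal_cells(raw_counts, top_n=4):
    """Cells ranked by weighted score; the top ones would host frequently used controls."""
    scores = score_cells(raw_counts)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

if __name__ == "__main__":
    print(score_cells(RAW_COUNTS))    # A21 -> 28.6, A30 -> 26.0, ...
    print(optimal_cells(RAW_COUNTS))  # ['A21', 'A30', 'A22', 'A13']
```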
[2] BLUI: Low-cost Localized Blowable
   User Interfaces
• A unique form of hands-free interaction.
• Blowing at a laptop or computer screen to directly control it.
• Localizes where on the screen the person is blowing.
• Relies solely on a single microphone.
• A Java application performs the machine learning.
  - Uses k-Nearest Neighbor (KNN) classification to infer which screen region is targeted (a sketch follows below).
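To picture the KNN step, here is a minimal sketch assuming FFT magnitude vectors of short microphone windows as features and scikit-learn's KNeighborsClassifier. The feature extraction, window handling, and training-data layout are assumptions for illustration; the actual system is a C++ application feeding a Java application over a TCP connection.

```python
# Minimal sketch (assumption): infer which screen region is being blown at from
# FFT "fingerprints" of short microphone windows, using k-Nearest Neighbor.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fft_fingerprint(window: np.ndarray) -> np.ndarray:
    """FFT magnitude spectrum of one fixed-length audio window (the feature vector)."""
    return np.abs(np.fft.rfft(window))

def train_localizer(train_windows, train_regions, k=3):
    """train_windows: equal-length sample arrays recorded while blowing at each
    known region; train_regions: their labels (e.g. 0..8 for a 3x3 grid)."""
    features = np.array([fft_fingerprint(w) for w in train_windows])
    clf = KNeighborsClassifier(n_neighbors=k)
    clf.fit(features, train_regions)
    return clf

def localize_blow(clf, window):
    """Infer the grid region an incoming blow targets."""
    return int(clf.predict(fft_fingerprint(window).reshape(1, -1))[0])
```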
• Selection
  - A hands-free way to directly select objects.
• Scrolling
  - Blowing at the top or the bottom of the (vertical) scroll bar causes the page to move.
• Dragging
  - Panning a document or moving the cursor.
• Physical Blow Metaphor
  - For gaming or entertainment applications.
  - Uses blowing to richly convey a physical phenomenon: blowing out birthday candles, blowing away dust, and whack-a-mole.
• BLUI uses real-time audio analysis and fingerprinting on the incoming microphone data.
• Wind noise produces a broadband frequency response through the microphone, which is very high in amplitude for low-frequency components.
• Event-based system (see the callback sketch below):
  - The BLUI engine recognizes blowing activity.
  - Application programmers can register callbacks for these blowing events.
  - Special-purpose widgets can be created that respond appropriately.
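The event model can be pictured as a small publish/subscribe layer. The following is a hypothetical sketch of such an API; the class and method names (BlowEvent, BlowEngine, register, dispatch) are assumptions for illustration, not BLUI's actual interface.

```python
# Hypothetical event layer (assumption): widgets register callbacks for blow
# events and the engine dispatches the localized region and intensity to them.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class BlowEvent:
    region: int        # grid cell the localizer attributed the blow to
    intensity: float   # rough blow strength derived from the audio amplitude

class BlowEngine:
    def __init__(self):
        self._callbacks: List[Callable[[BlowEvent], None]] = []

    def register(self, callback: Callable[[BlowEvent], None]) -> None:
        """Application code (e.g. a widget) subscribes to blow events."""
        self._callbacks.append(callback)

    def dispatch(self, event: BlowEvent) -> None:
        """Called by the audio pipeline whenever a blow is recognized."""
        for cb in self._callbacks:
            cb(event)

# Usage: a candle widget that "blows out" when its region receives a strong blow.
engine = BlowEngine()
engine.register(lambda e: print("candle out!") if e.region == 4 and e.intensity > 0.8 else None)
engine.dispatch(BlowEvent(region=4, intensity=0.9))
```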
Performance of the BLUI localizer for various resolutions (% of correctly identified regions), over 25-50 blows.
[3] WUW - Wear Ur World - A
    Wearable Gestural Interface
• Brings information out into the tangible world.
• The projector and the camera are mounted on a hat.
• They can also be coupled in a pendant-like wearable device.
• Connected to a laptop.
• Intuitive experience of full-sized gestural systems.
• Stylus-based interface.
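The comparison table lists OpenCV and colored fingertip markers as WUW's tracking approach. Purely as an illustration of how such markers might be tracked (the HSV range, function name, and thresholding approach are assumptions, not the SixthSense implementation, which is built on C#/WPF), a minimal sketch:

```python
# Hypothetical sketch (assumption): locate colored fingertip markers in a camera
# frame by HSV thresholding and contour centroids (OpenCV 4 API).
import cv2
import numpy as np

LOWER_RED = np.array([0, 120, 120])    # placeholder HSV range for a red marker cap
UPPER_RED = np.array([10, 255, 255])

def find_marker_centers(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:  # skip degenerate contours
            centers.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centers  # gesture recognition would track these points over time
```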
[4] Light-tech Interaction Used in
    Ubiquitous Interface Design
• Light-tech interaction is a methodology used to realize ubiquitous interfaces.
• Light-tech interaction modules should have low power consumption and be embedded.
• A Microprocessor Control Unit (MCU) is usually the core of a product.
• Light-tech interaction means using light-tech approaches to help users interact with products more
• The Emotional lamp uses visual recognition technology to detect people and transmits signals via embedded MCU modules, or directly over the network, to local devices or the far end.
• The receiver, controlled by an Arduino-based module locally or remotely, turns on the lamp when signals are received (a minimal sketch of this mapping follows below).
• The lamp can recognize people's expressions and emotions.
• When a person is happy, the lamp turns to a warm tone, whereas it turns to a cold tone to calm the person down whenever a
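As an illustration of the signal flow only, here is a minimal Python sketch of mapping a detected emotion to a lamp tone and forwarding it to the receiver. The emotion labels, color values, and the stubbed transport are assumptions; the paper's implementation runs on an Arduino-based MCU module.

```python
# Hypothetical sketch (assumption): map a detected emotion to a lamp tone and
# forward it to the Arduino-based receiver; the transport here is a stub.
WARM_TONE = (255, 180, 100)   # warm RGB tone for positive emotions
COLD_TONE = (120, 170, 255)   # cold, calming RGB tone otherwise

def tone_for_emotion(emotion: str) -> tuple:
    return WARM_TONE if emotion == "happy" else COLD_TONE

def send_to_lamp(rgb: tuple) -> None:
    # Stand-in for the real transport (e.g. serial or network to the MCU module).
    print(f"lamp <- RGB{rgb}")

send_to_lamp(tone_for_emotion("happy"))   # warm tone
send_to_lamp(tone_for_emotion("angry"))   # cold tone
```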
Toy design prototype: the main structure is made by handiwork, with a micro RFID module embedded as the sensor, and
• When children put different cards or models on the game board, the computer connected to the game zone receives their IDs (a lookup sketch follows below).
• The computer then plays different interactive content according to the ID. This toy benefits kids' social-emotional development when they play with each other or with their parents.
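The ID-to-content step can be pictured as a simple lookup on the game-zone computer. This is a hypothetical sketch; the tag IDs, content names, and handler function are placeholders, not the prototype's implementation.

```python
# Hypothetical sketch (assumption): the game-zone computer maps each RFID tag ID
# read from the board to a piece of interactive content and plays it.
CONTENT_BY_TAG = {               # placeholder IDs and content names
    "tag-001": "farm_animals_story",
    "tag-002": "counting_game",
}

def on_tag_detected(tag_id: str) -> None:
    content = CONTENT_BY_TAG.get(tag_id)
    if content is None:
        print(f"unknown tag: {tag_id}")
    else:
        print(f"playing: {content}")   # stand-in for the real media player call

on_tag_detected("tag-001")   # -> playing: farm_animals_story
```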
