Kinetic Mashups: augmenting physical places with motion-aware services

Paper presented at DART2008 in Cagliari

Transcript

  • 1. Kinetic Mashups: augmenting physical places with motion-aware services
       Vincenzo Pallotta
       Pervasive and Artificial Intelligence Research Group
       Department of Computer Science, University of Fribourg, Switzerland
  • 2. Outline
       – Motivation
       – Kinetic User Interfaces
       – uMove framework
       – Mobile Collaborative Workflow
       – Conclusions
       – Future work
  • 3. Key success factors in Location-based Systems
       – [Ricci et al. 2008] identified four key success factors for new-generation LBSs:
       – Pro-activeness: old LBSs used location to adapt services, but services had to be explicitly invoked; new-generation LBSs can take decisions based on changes in the location context.
       – Cross-referencing: old LBSs were basically single-referencing, with user and service spatio-temporally co-located.
       – Multiple targets: old LBSs only allowed a single target.
       – Interaction-oriented: traditional LBSs were typically content-oriented (e.g. context-aware information retrieval).
  • 4. Kinetic User Interfaces
       – Motion is a natural human behaviour: intentions can be recognized from motion patterns (self-motion, gestures, moving objects, coordinated motion).
       – Beyond classical location-awareness in UbiComp:
         • Motion is a dimension of the user context, used to adapt the application's behaviour (e.g. enable TTS while driving).
         • Motion is an input modality that triggers contextualized events (e.g. drive-through in electronic toll payment systems).
       – Applications "contextually" react to context changes.
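A minimal Java sketch of the two roles of motion described on this slide; all class and method names here are invented for illustration and are not part of the actual uMove API:

```java
// Hypothetical sketch, not the uMove API: motion as a context dimension
// (adapting the output modality) and motion as an input modality
// (entering a zone is itself the interaction).
public class MotionRoles {

    // Motion as context: adapt output to the user's kinetic state.
    static String renderMessage(String text, double speedKmh) {
        boolean driving = speedKmh > 20.0;   // assumed "driving" threshold
        return driving ? "TTS: " + text : "SCREEN: " + text;
    }

    // Motion as input: crossing a toll-gate zone triggers the payment event.
    static void onZoneEntered(String zone, String vehicleId) {
        if (zone.equals("toll-gate")) {
            System.out.println("Charging toll to " + vehicleId);
        }
    }

    public static void main(String[] args) {
        System.out.println(renderMessage("Turn left in 200 m", 80.0)); // driving -> spoken
        System.out.println(renderMessage("Turn left in 200 m", 0.0));  // parked -> on screen
        onZoneEntered("toll-gate", "CH-FR-12345");
    }
}
```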
  • 5. (Some) Goals of KUI
       – Extend classical GUI/TUI interaction patterns to physical spaces: hovering/pointing, focus, drag & drop, pop-ups & pull-downs, pick & drop, (mouse) gestures, ...
       – Enable coordination and collaboration between users:
         • Asynchronous, by means of geo-located and mobile artifacts (e.g. leaving "traces" of passage).
         • Situated action (e.g. just-in-time/place workflow).
       – Incidental interaction: geo-located service activation through motion (e.g. ActiveBadge's "Follow me" applications).
       – Pro-active personal assistance: activity monitoring and problem detection (e.g. unobtrusive monitoring of Alzheimer patients).
  • 6. The KUI Ontology
       – GeoTop: the physical space viewed as a "desktop", populated by moving entities and structured into "zones".
       – Kuidgets: software representations of geo-localised (moving) entities.
       – Widgets: "providers" of motion properties for Kuidgets (i.e. sensor wrappers).
       – Spatio-temporal relations: dynamically created between Kuidgets (e.g. enter, exit, joint move, approaching, ...).
       – Activities: motion patterns aggregated into higher-level semantic events.
       – Situations: contexts of use triggered by motion patterns and activities.
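One way to picture this ontology is as plain data types plus a relation derived from successive observations. The sketch below is hypothetical: the real uMove classes may be structured quite differently.

```java
// Hypothetical rendering of the KUI ontology; names mirror the slide,
// not the actual uMove classes.
public class KuiOntologyDemo {

    /** Kuidget: software representation of a geo-localised (moving) entity. */
    record Kuidget(String id, double x, double y) {}

    /** Zone: a rectangular region structuring the GeoTop. */
    record Zone(String name, double minX, double minY, double maxX, double maxY) {
        boolean contains(Kuidget k) {
            return k.x() >= minX && k.x() <= maxX && k.y() >= minY && k.y() <= maxY;
        }
    }

    /** Derive a spatio-temporal relation (enter/exit) from two successive observations. */
    static String relation(Zone z, Kuidget before, Kuidget after) {
        boolean was = z.contains(before), is = z.contains(after);
        if (!was && is) return after.id() + " ENTER " + z.name();
        if (was && !is) return after.id() + " EXIT " + z.name();
        return after.id() + (is ? " IN " : " OUTSIDE ") + z.name();
    }

    public static void main(String[] args) {
        Zone shop = new Zone("shop", 0, 0, 10, 10);
        Kuidget t0 = new Kuidget("alice", -1, 5);   // observation at time t
        Kuidget t1 = new Kuidget("alice", 2, 5);    // observation at time t+1
        System.out.println(relation(shop, t0, t1)); // prints: alice ENTER shop
    }
}
```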
  • 7. KUI Interaction Patterns
       – Incidental interaction: users perform actions by moving themselves or objects in the physical space.
         • Unobtrusiveness is achieved by hiding the effects of the actions until something relevant happens in the system according to the current context.
         • Only a minimal amount of feedback is provided, just to let users know that the input has been captured.
       – Continuous interaction: users perform an activity that is monitored by the system.
         • The system silently observes the users' activity and triggers a more attention-demanding interaction (e.g. a GUI-based dialog on a handheld device) only when an abnormal behaviour is detected or contextually relevant information is available.
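The continuous-interaction pattern reduces to "observe silently, escalate only on anomaly". A minimal sketch, assuming a simple speed threshold as the abnormality test (the class and threshold are invented for illustration):

```java
// Hypothetical sketch of continuous interaction: the monitor stays silent
// while behaviour is normal and escalates to an explicit dialog otherwise.
public class ContinuousMonitor {
    private final double maxSpeedKmh;

    ContinuousMonitor(double maxSpeedKmh) { this.maxSpeedKmh = maxSpeedKmh; }

    void observe(double speedKmh) {
        if (speedKmh > maxSpeedKmh) {
            // Abnormal behaviour detected: switch to attention-demanding interaction.
            System.out.println("ALERT dialog: speed limit exceeded (" + speedKmh + " km/h)");
        }
        // Otherwise: no feedback at all -- unobtrusiveness by design.
    }

    public static void main(String[] args) {
        ContinuousMonitor m = new ContinuousMonitor(120.0);
        for (double v : new double[] {80, 95, 110, 135}) m.observe(v); // only 135 alerts
    }
}
```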
  • 8. KUI and GUI
       – Manipulation in GUI: direct manipulation of the domain object; need for (visual) feedback on the instruments; undoable actions.
       – Manipulation in KUI: indirect manipulation of the domain object; reduced/different feedback on the instrument; some actions may only be (contextually) "reversible" (e.g. entering, exiting).
       – Interaction patterns: KUI's drag & drop, pop-ups.
  • 9. KUI-based Scenarios
       – UbiDrive: driving behaviour triggers system reactions (e.g. exceeded speed limits, deceleration in proximity of a gas station, fleet integrity).
       – UbiGlide (showcased at UbiComp'07): motion-aware flight assistant (e.g. no-fly zones, collision and storm avoidance, dangerous manoeuvres).
       – Ubi@Work: motion-aware work assistance (e.g. risky situations in nuclear/chemical plants, sudden reaction to abnormal behaviours such as escaping).
       – UbiShop (showcased at NGMAST'07): motion-aware mobile collaboration (shared tasks performed just-in-time/place, motion-based interaction such as accept, refuse, confirm).
  • 10. KUI Middleware
       [Diagram: three-layer architecture. Applications (App1..App5) sit on top of an Activity Layer (activities A1..A4, with sub-activities such as A1.1, A1.2). Below it, the KUI Space Layer holds the KUI Manager, Kuidgets (k), zones (z), relations (r) between Kuidgets, and a GeoDB. At the bottom, the Observation Layer wraps sensors (S1..S6) with widgets (W1..W7). Legend: S = sensor, W = widget, Z = zone, K = Kuidget, A = activity.]
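The diagram suggests a pipeline in which widgets feed observations upward through the KUI space layer to the activity layer. Below is a hypothetical sketch of that flow using the observer pattern; `Widget`, `KuiManager`, and the event names are invented for illustration and are not the actual uMove API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical sketch of the three-layer pipeline: a widget wraps a raw
// sensor (observation layer) and pushes positions to a manager (KUI space
// layer), which emits kinetic events consumed by the activity layer.
public class KuiPipeline {

    /** Observation layer: wraps a concrete sensor behind a uniform interface. */
    interface Widget { void poll(Consumer<double[]> sink); }

    /** KUI space layer: keeps entity state and publishes kinetic events. */
    static class KuiManager {
        private final List<Consumer<String>> activityLayer = new ArrayList<>();
        private double[] last = {0, 0};

        void subscribe(Consumer<String> activity) { activityLayer.add(activity); }

        void update(double[] pos) {
            double dx = pos[0] - last[0], dy = pos[1] - last[1];
            String event = (dx * dx + dy * dy > 0) ? "MOVING" : "STILL";
            last = pos;
            activityLayer.forEach(a -> a.accept(event));
        }
    }

    public static void main(String[] args) {
        KuiManager manager = new KuiManager();
        manager.subscribe(e -> System.out.println("activity layer got: " + e));
        Widget fakeGps = sink -> sink.accept(new double[] {1.0, 2.0}); // stub sensor
        fakeGps.poll(manager::update); // prints: activity layer got: MOVING
    }
}
```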
  • 11. Mobile Collaborative Workflow
       – Extends the classic master-worker coordination model.
       – Opportunistic task assignment: a task is assigned when the worker's context is "right".
       – Dynamic team formation: if people are close to each other in a zone where work is needed, they are asked to collaborate.
       – Scales from simple scenarios (e.g. a shopping list) to very complex workflows (e.g. logistics, military operations, emergency operations).
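The opportunistic-assignment idea can be sketched in a few lines: the master holds pending tasks and dispatches one only when a worker's context (here, the zone they just entered) matches the task's target. `OpportunisticMaster`, `Task`, and `onWorkerEnters` are hypothetical names, not part of uMove:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch of opportunistic task assignment in the
// master-worker model, keyed on zone-entry events.
public class OpportunisticMaster {

    record Task(String description, String targetZone) {}

    private final Deque<Task> pending = new ArrayDeque<>();

    void post(Task t) { pending.add(t); }

    /** Called when a worker enters a zone; assigns any task whose context is "right". */
    void onWorkerEnters(String worker, String zone) {
        pending.removeIf(t -> {
            if (t.targetZone().equals(zone)) {
                System.out.println(worker + " <- " + t.description() + " @ " + zone);
                return true; // assigned, drop from the pending queue
            }
            return false;
        });
    }

    public static void main(String[] args) {
        OpportunisticMaster master = new OpportunisticMaster();
        master.post(new Task("buy milk", "supermarket"));
        master.post(new Task("pick up parcel", "post-office"));
        master.onWorkerEnters("bob", "supermarket"); // bob <- buy milk @ supermarket
    }
}
```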
  • 12. [Figure-only slide]
  • 13. UbiShop Architecture
       [Diagram: UbiShop architecture, built with RFID Phidgets.]
  • 14. UbiWeb [figure-only slide]
  • 15. UbiBuilder [figure-only slide]
  • 16. UbiMonitor [figure-only slide]
  • 17. Conclusion
       – uMove enables new-generation location-based systems, by means of pro-active, multi-target, cross-referenced and interactive interfaces.
       – Kinetic User Interfaces enable unobtrusive interaction design and motion-aware computing.
       – A new mobile collaboration model, based on just-in-time/place task assignment and team formation.
       – The uMove framework supports rapid prototyping of LBSs.
  • 18. Future Work
       – uMove framework: to be made publicly available as an open-source project.
       – Library of common KUI patterns; test new KUI interaction patterns.
       – KUI scenarios with new prototypes:
         • SmartHeating (energy saving in houses)
         • ActiMeet (tangible interaction in meetings)
         • NAMASTE (multimodal interactive storytelling)
  • 19. [Closing slide: contact Vincenzo.Pallotta@unifr.ch]