Reality As A Knowledge Medium

Introduction to the WEKIT project on wearable experience for knowledge intensive training.

  1. Reality as a Knowledge Medium. Fridolin Wild (1,2); 1) Oxford Brookes University, 2) Open University, UK
  2. A note on methodical order. Conversation is the root of all information exchange. Narratives convert tacit knowledge to explicit knowledge. Just language? No, all embodied experience!
  3. Narrate | Experience: information sharing, sense making
  4. The 4 Realms of Experience (Pine & Gilmore, 1998)
  5. Performance Augmentation with wearables. Sensor-sensory loop: super-real experience. The guest enacts experiences; the stager produces narratives.
  6. Grand Challenges? Capture | Re-Enact (Fominykh, Wild, Smith, Alvarez, & Morozov, 2014). Affordances: which cultures of use to support? Viewpoints: from what perspective? Abstraction: what matters? what to pay attention to? Editing: how to enrich or reduce? Social Scope: for individuals, teams, or more? Sensing: which senses and what sensors?
  7. Research questions (p. 14). RQ01. How to enrich the capture of activities and experiences by means of wearable sensors? RQ02. How to experience more of the captured activities and experiences via AR and WT or a remote simulation? RQ03. To what extent can the knowledge of an expert be captured as wearable experience? RQ04. To what extent can a trainee experience the phenomenology of the expert through applying wearable experience?
  8. Wearable experience: examples of narratives with the new interface.
  9. Scenario. Phase 1: capturing expert experience; Phase 2: wearing expert experience by trainees; Phase 3: analysis and post-processing.
  10. Example 1: Live Guidance
  11. Example 2 + 3: Re-enactment
  12. Sensors & Senses: glasses, wristbands, BCIs, smart objects, …
  13. Capturing (p. 10, p. 18): wearable ambient and biofeedback sensors; tracking the person's position and orientation in space/environment; gaze direction; narration and ambient sounds; video of the wearer's field of view; 360 video; audio; gestures; along with affect data and physiological data (a minimal capture-recording sketch follows the slide list).
  14. Re-enactment. First, all gathered sensor data will be stored and synchronized as a single experience recording; this tangible artifact will be available to trainees. Second, the re-enactment of the captured experience will be achieved by augmenting the trainee's experience with contextualized expert data in real time. For example, by 3D scanning the environment the system will know the position of both the expert and the trainee in space, so the relative position of the expert can be displayed to the trainee as an AR element (see the relative-pose sketch after the slide list). Data from other sensors will make the trainee aware of where the expert is looking, what the expert is seeing, what the expert is saying, how the expert is handling the tools with their hands, and more. In this manner, the trainee will be able to experience the presence of the expert while working on a task.
  15. Hardware + SDKs (p. 19f). A timestamp-alignment sketch for correlating these sensor streams follows the slide list.
      Type of product/service | Examples | Use in WEKIT
      Eye tracking sensors | Head-mounted eye trackers from vendors such as SMI, Arrington Research, Tobii, ASL, MindMetriks | Measuring where the user is looking in a scene and at what depth, to correlate with other metrics
      3D scanners | Occipital's structure.io, Microsoft Kinect, Intel RealSense | Quick object scanning; body posture / limb tracking in real time; depth sensing for awareness of the workplace
      Hand/finger posture and gesture tracking wearable sensors | MYO, LEAP Motion, Vuzix | LEAP Motion will be used for hands-on teletutoring (aka 'GhostHands' instructions) or can be integrated with Vuzix smart glasses
      Cameras | Machine vision cameras from vendors such as Basler, ImagingSource, Point Grey, PixeLink; Crowdemotion | For capturing the face region (facial expressions) of the user
      360 cameras (optional) | V.360, Theta360, PanoPro, GeoNaute, Bubble | Capturing a 360-degree view
      EEG sensors | MyndPlay EEG headbands, Interaxon Muse, Shimmer EEG Raw and Bioexplorer | Worker well-being: stress, mental effort, complacency, agitation
      Heart rate wearable sensors | e-Health Sensor Platform, ShimmerSense, Heartmath | Worker well-being and stress, to correlate with GSR and EEG
      Skin conductivity wearable sensors | e-Health Sensor Platform, ShimmerSense | Worker well-being (stress level)
      Smart glasses | Google Glass, Epson Moverio BT200, Vuzix M100, Meta 1 (more examples in the Smart Glasses Market Report) | For capturing the user's point of view, gestures, voice recording and prompts
      AR platforms | Unity, Vuforia, Metaio, Oculus, Samsung Gear, HTC Vive | To recreate scenarios and a simulated training environment
  16. Dream, record, narrate, repeat. WEKIT.
  17. The WEKIT project is funded by the European Commission (EU disclaimer). http://wekit.eu/ That's all. fridolin.wild@gmail.com, skype: fridolin.wild, whatsapp: +447751239881
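
The capture list on slide 13 boils down to many streams that must land in one recording on a shared clock. The minimal sketch below shows one plausible shape for such a recording; `SensorSample`, `ExperienceRecording` and the field names are illustrative assumptions, not the project's actual data model.

```python
# A minimal sketch (not the WEKIT implementation) of how the multimodal
# streams listed on slide 13 could be stored as one timestamped recording.
from dataclasses import dataclass, field
from typing import Any, Dict, List


@dataclass
class SensorSample:
    """One reading from one modality, stamped on a shared clock."""
    timestamp: float          # seconds since the start of the recording
    modality: str             # e.g. "gaze", "position", "audio", "eeg"
    payload: Dict[str, Any]   # modality-specific values


@dataclass
class ExperienceRecording:
    """All captured streams of one expert session, kept in a single artifact."""
    expert_id: str
    samples: List[SensorSample] = field(default_factory=list)

    def add(self, sample: SensorSample) -> None:
        self.samples.append(sample)

    def stream(self, modality: str) -> List[SensorSample]:
        """Return one modality, sorted on the shared clock."""
        return sorted(
            (s for s in self.samples if s.modality == modality),
            key=lambda s: s.timestamp,
        )


# Example: two modalities sharing one time base.
rec = ExperienceRecording(expert_id="expert-01")
rec.add(SensorSample(0.04, "gaze", {"x": 0.31, "y": 0.55, "depth_m": 1.2}))
rec.add(SensorSample(0.05, "position", {"x": 2.1, "y": 0.0, "z": 4.3, "yaw": 90.0}))
print(len(rec.stream("gaze")))  # -> 1
```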
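
Slide 14 notes that a 3D scan of the environment gives the positions of both expert and trainee, so the expert can be overlaid as an AR element. The sketch below, assuming a y-up world frame and a yaw-only trainee heading, shows the core transform: expressing the recorded expert position in the trainee's local frame. It illustrates the idea, not the WEKIT implementation.

```python
# A minimal sketch, under assumed conventions (y-up world frame, yaw-only
# trainee heading), of placing the recorded expert position in the trainee's
# local frame so an AR overlay can render it.
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]


def expert_in_trainee_frame(expert_world: Vec3,
                            trainee_world: Vec3,
                            trainee_yaw_rad: float) -> Vec3:
    """Express the expert's world position relative to the trainee.

    Both positions come from tracking against the same 3D scan of the
    workplace; the result is what an AR layer would use to draw the
    expert "ghost" at the right spot in the trainee's view.
    """
    dx = expert_world[0] - trainee_world[0]
    dy = expert_world[1] - trainee_world[1]
    dz = expert_world[2] - trainee_world[2]
    # Rotate the offset by the inverse of the trainee's heading (yaw about y).
    cos_y, sin_y = math.cos(-trainee_yaw_rad), math.sin(-trainee_yaw_rad)
    local_x = cos_y * dx + sin_y * dz
    local_z = -sin_y * dx + cos_y * dz
    return (local_x, dy, local_z)


# Example: with zero yaw, the local offset equals the world offset.
print(expert_in_trainee_frame((3.0, 0.0, 6.0), (2.0, 0.0, 4.0), 0.0))
# -> (1.0, 0.0, 2.0)
```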
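
Several rows of the slide 15 table speak of correlating one stream with another (gaze depth with other metrics, heart rate with GSR and EEG). Before any correlation, samples have to be paired on the shared clock. The following minimal sketch assumes simple nearest-timestamp matching and illustrative stream names; it is not tied to any particular vendor SDK.

```python
# A minimal sketch of pairing samples from two sensors (e.g. the eye tracker
# and a heart-rate wearable from the slide 15 table) on a shared clock before
# computing any correlation. Stream names and the tolerance are assumptions.
from bisect import bisect_left
from typing import List, Optional, Tuple

Sample = Tuple[float, float]  # (timestamp in seconds, value)


def nearest(samples: List[Sample], t: float) -> Optional[Sample]:
    """Return the sample whose timestamp is closest to t (samples sorted)."""
    if not samples:
        return None
    i = bisect_left([s[0] for s in samples], t)
    candidates = samples[max(0, i - 1):i + 1]
    return min(candidates, key=lambda s: abs(s[0] - t))


def pair_streams(gaze: List[Sample], heart_rate: List[Sample],
                 tolerance_s: float = 0.5) -> List[Tuple[Sample, Sample]]:
    """Pair each gaze sample with the closest heart-rate sample in time."""
    pairs = []
    for g in gaze:
        h = nearest(heart_rate, g[0])
        if h is not None and abs(h[0] - g[0]) <= tolerance_s:
            pairs.append((g, h))
    return pairs


# Example: 30 Hz gaze depth vs. 1 Hz heart rate over about three seconds.
gaze = [(t / 30.0, 1.0 + 0.01 * t) for t in range(90)]
hr = [(float(t), 72.0 + t) for t in range(3)]
print(len(pair_streams(gaze, hr)))
# -> 76; gaze samples more than 0.5 s after the last heart-rate sample are dropped
```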
