Immersive Community Analytics for Wearable Enhanced Learning (HCI International 2019)

Nowadays, we can use immersive interaction and display technologies in collaborative analytical reasoning and decision-making scenarios. To support heterogeneous professional communities of practice in their digital transformation, it is necessary not only to provide the technologies but also to understand the work practices under transformation as well as the security, privacy, and other concerns of the communities. Our approach is a comprehensive and evolutionary socio-technological learning analytics and design process leading to a flexible infrastructure in which professional communities can co-create their wearable enhanced learning solutions. At its core, we present a multi-sensory fusion recorder and player that allows the recording of multi-actor activity sequences through human activity recognition and provides computational support for immersive learning analytics in training scenarios. Our approach enables cross-domain collaboration by fusing, aggregating, and visualizing sensor data coming from wearables and modern production systems. The software is open source and based on the outcomes of several nationally and internationally funded projects.
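
As a rough illustration of the fusing and aggregating step mentioned in the abstract, the sketch below aligns two hypothetical wearable sensor streams on a shared time axis and summarizes them in fixed windows. It is a minimal sketch only: the column names, sampling rates, and the use of pandas are assumptions for illustration and not the project's actual recorder implementation.

    # Minimal sketch (illustration only): fuse two wearable sensor streams by
    # timestamp and aggregate the result into 1-second windows.
    import pandas as pd

    # Hypothetical recordings: IMU acceleration from a sensor vest and heart rate
    # from a wristband (column names are assumptions, not the real data model).
    imu = pd.DataFrame({
        "timestamp": pd.to_datetime(["2019-07-30 10:00:00.00",
                                     "2019-07-30 10:00:00.05",
                                     "2019-07-30 10:00:00.10"]),
        "accel_x": [0.02, 0.13, 0.41],
    })
    hr = pd.DataFrame({
        "timestamp": pd.to_datetime(["2019-07-30 10:00:00.02",
                                     "2019-07-30 10:00:00.09"]),
        "heart_rate": [72, 74],
    })

    # Fuse: match each IMU sample with the nearest heart-rate sample within 100 ms.
    fused = pd.merge_asof(imu.sort_values("timestamp"),
                          hr.sort_values("timestamp"),
                          on="timestamp", direction="nearest",
                          tolerance=pd.Timedelta("100ms"))

    # Aggregate: resample the fused stream into 1-second windows for downstream
    # analytics and visualization.
    windows = fused.set_index("timestamp").resample("1s").agg(
        {"accel_x": ["mean", "std"], "heart_rate": "mean"})
    print(windows)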


  1. Immersive Community Analytics for Wearable Enhanced Learning
     Ralf Klamma, Rizwan Ali, and István Koren
     Advanced Community Information Systems (ACIS), RWTH Aachen University, Germany
     koren@dbis.rwth-aachen.de
     HCI International 2019, July 30, 2019, Orlando, Florida, USA
  2. Agenda
     • Motivation
     • Immersive Analytics
     • Research Context
     • Demo Video
     • Conclusion
  3. From Learning Analytics to Immersive Community Analytics
     • The foci of Learning Analytics (LA) are learning processes on the cognitive level
     • Informal learning with manual activities poses new challenges
     • Turn from the mind to the body: integration of declarative and procedural knowledge [Ullman, 2004]; example: bakers kneading bread [Nonaka et al., 1995]
     • Need to transform research practices as well
     • Concerns about privacy and data security are high in domains like learning
     ⇒ Immersive wearable enhanced learning enables in-place community learning analytics
  4. Wearable Enhanced Community Analytics Life Cycle
  5. Background
     • Immersive Analytics as a subset of Visual Analytics, defined as “the use of engaging analysis tools to support data understanding and decision making” [Marriott et al., 2018]
     • Human Activity Recognition: recognizing activities with the help of many sensors and machine learning
     • Experience API (xAPI): a specification for exchanging learning data with other researchers and practitioners (see the xAPI sketch after the slide listing)
  6. Research Context
     WEKIT
     • “Wearable Experience for Knowledge Intensive Training”, funded by the EU
     • Body-worn vest with sensors
     • Augmented Reality head-up display
     ARLEM
     • IEEE standard supported by the AR-FOR-EU project
     • activityML: activities of agents (human/non-human)
     • workplaceML: physical environment and learning context
  7. Video of Sensor Fusion Framework
  8. Conclusion
     • LA at the workplace is conceptually different from traditional LA
     • Feedback is best delivered in an immersive manner while doing the training
     • The necessary collaborative processes are best situated in a Community of Practice
     • Need to transform research practices as well
     • Human-robot collaboration poses new challenges, but above all new threats
     • SWEVA: Social Web Environment for Visual Analytics, used to create and share processing and visualization pipelines (see the pipeline sketch after the slide listing)
  9. fin
     • Thank you for your attention!
     • Do you have any questions?
     https://github.com/rwth-acis
     klamma@dbis.rwth-aachen.de  @klamma
     koren@dbis.rwth-aachen.de  @istinhere
  10. Human Activity Recognition • Workplace • Artificial Intelligence • Data Integration and Aggregation • Training
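
Slide 5 names the Experience API (xAPI) as the specification used for exchanging learning data. As a minimal sketch under stated assumptions, the snippet below builds one xAPI statement (actor, verb, object) and posts it to a Learning Record Store; the LRS URL, credentials, actor, and activity identifiers are placeholders, not the project's.

    # Minimal xAPI sketch: record that a trainee completed a training activity.
    # The LRS endpoint, credentials, and identifiers below are placeholders.
    import requests

    statement = {
        "actor": {"name": "Trainee", "mbox": "mailto:trainee@example.org"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                 "display": {"en-US": "completed"}},
        "object": {"id": "http://example.org/activities/wearable-training-task",
                   "definition": {"name": {"en-US": "Wearable training task"}}},
    }

    # POST the statement to the Learning Record Store's /statements resource,
    # with the version header required by the xAPI specification.
    resp = requests.post("https://lrs.example.org/xapi/statements",
                         json=statement,
                         headers={"X-Experience-API-Version": "1.0.3"},
                         auth=("lrs_user", "lrs_password"))
    resp.raise_for_status()
    print(resp.json())  # the LRS returns the stored statement id(s)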
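
Slide 8 mentions SWEVA, the Social Web Environment for Visual Analytics, for creating and sharing processing and visualization pipelines. The sketch below only illustrates the general idea of composing processing steps into a reusable pipeline; it does not use SWEVA's actual API, and the step names are hypothetical.

    # Illustration of pipeline composition (not SWEVA's real API): chain processing
    # steps into one callable that turns fused sensor windows into chart data.
    from typing import Callable, Dict, List

    Step = Callable[[List[Dict]], List[Dict]]

    def pipeline(*steps: Step) -> Step:
        """Compose processing steps left to right into a single callable."""
        def run(rows: List[Dict]) -> List[Dict]:
            for step in steps:
                rows = step(rows)
            return rows
        return run

    def drop_incomplete(rows: List[Dict]) -> List[Dict]:
        # Discard windows with missing sensor values.
        return [r for r in rows if None not in r.values()]

    def to_chart_series(rows: List[Dict]) -> List[Dict]:
        # Map each window to an x/y point for a simple visualization front end.
        return [{"x": r["t"], "y": r["accel_mean"]} for r in rows]

    analytics = pipeline(drop_incomplete, to_chart_series)
    print(analytics([{"t": 0, "accel_mean": 0.10},
                     {"t": 1, "accel_mean": None}]))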
