
IoT in E-Learning - Hardware sensor integration for mobile learning

Presented at mLearnCon 2015.


  1. Hardware Sensor Integration for Mobile Learning
     Prasoon Nigam, CTO / Vice President – Technology, Stratbeans Consulting
     www.StratBeans.com | prasoon@stratbeans.com | @prasoonnigam
  2. Coverage
     • What are the sensors and other sensory elements?
     • Why sensor integration has revolutionary potential in a learning context – ILT vs CBT.
     • When is the right time for sensor integration – are we late?
     • How to integrate basic to complex sensors to enhance learning content – demo, if time permits.
  3. WHAT – What are the sensors and other sensory elements?
  4. SENSED: THE IMAGE. PROCESSED: IDENTIFIED IT TO BE RELATED TO X-MEN. INTEGRATED: WHY IS HE SHOWING ME THIS, AND HOW IS IT CONNECTED TO HIS TALK?
  5. What is a Sensor? A sensor "receives" information which is "transmitted" to it.
  6. What is a Sensor? A sensor "receives" information which is "transmitted" to it – e.g. light is transmitted, and the eye (or a webcam) "sees" it.
  7. Sensory Universe: Sensor, Transmitter, Processor, Integrator (diagram)
  8. Sensory Universe: Sensor, Transmitter, Processor, Integrator (diagram, continued)
  9. WHY – Why sensor integration has revolutionary potential in a learning context – ILT vs CBT.
  10. Let's spot the sensory-universe elements involved in an ILT situation. ILT: Instructor + Learner. MORE SENSORY ELEMENTS => MORE INTELLIGENCE
  11. Do you know the name of the X-Men instructor with…
      • NO EARS, NO EYES
      • A MUCH SMALLER BRAIN
      • BRANCHING, SIMULATION, SCORES, ASSESSMENT
      • THE MUTANT POWER OF BEING PRESENT EVERYWHERE AT THE SAME TIME?
      The answer: Madame CBT – MAGNETO STOLE HER SENSORS
  12. Moment of Truth… the instructor in question is not an ILT instructor but CBT, self-paced: no ears, no eyes and a much smaller brain, yet with branching, simulation, scores and assessment, and the mutant power of being present everywhere at the same time. The answer: Madame CBT – MAGNETO STOLE HER SENSORS.
  13. From "OR" to "AND": today it is ILT (Instructor) OR CBT (Self-Paced) for the Learner –
      • ILT: personalized, intelligent, adapts to the learner's needs
      • CBT: scalable, self-paced, geography-independent
      Think: if we are able to give Madame CBT her sensors back, we get content that is "intelligent AND scalable".
  14. Best of Both Worlds: The Smart CBT / mLearning
      • Adapts content to match the learner's state of mind
      • Adapts interactivity
      • Adapts engagement
      • Advises the learner to take a break
      • Interprets brain waves to determine the cognitive state of mind
      • Hands-free navigation
      • Speech detection
      • Movement detection
      • Customized gestures
      • Detects the learner's confidence level through speech analysis
      • Location-based learning
  15. WHEN – When is the right time for sensor integration – are we late?
  16. [Chart: number of sensors growing over time – the Internet of Things curve] TIME IS "NOW" FOR mLEARNING TO LEVERAGE THE SENSORS
  17. The Irony! LEARNING CONTENT in IRON WALLS – the IRON WALL between mLearning and the sensors on the device.
  18. HOW – Low-hanging fruits + advanced ideas
  19. Simple to Complex Sensor Integration Ideas
      1. Brain wave detection
      2. Hands-free navigation of content using hand gestures
      3. Augmentation of content using AR apps
      4. Voice-based response
      5. Confidence level detection
      6. Distraction detection
  20. 1: Detect Brain Attention Level (SENSOR: EEG – DEMO AVAILABLE)
      Application area: depending on the brain attention level, the user is prompted to BREAK, ENGAGE or INTERACT.
      Brain wave type | Implication
      Gamma           | Problem solving, concentration
      Beta            | Active mind, busy
      Alpha           | Restful, relaxed
      Theta           | Drowsiness
  21. 1: Detect Brain Attention Level (SENSOR: EEG – DEMO AVAILABLE)
      • Operating condition for learning
      • Adaptive content
      • Grade content before release
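
A minimal TypeScript sketch of the adaptation rule described on slides 20-21: map an EEG attention reading to one of the three reactions. The 0-100 attention scale, the thresholds and the player hook are illustrative assumptions; the deck does not name a specific EEG headset or SDK.

    // Map an EEG attention reading (assumed 0-100 scale) to one of the three
    // reactions named on the slide. Thresholds are illustrative, not from the deck.
    type Reaction = 'BREAK' | 'ENGAGE' | 'INTERACT';

    function reactToAttention(attention: number): Reaction {
      if (attention < 30) return 'BREAK';    // drowsy, theta-like state: suggest a pause
      if (attention < 60) return 'ENGAGE';   // relaxed, alpha-like state: switch to richer media
      return 'INTERACT';                     // focused, beta/gamma-like state: pose a question
    }

    // Hypothetical hook into the course player, called for each headset sample.
    function onAttentionSample(attention: number, player: { show(msg: string): void }): void {
      switch (reactToAttention(attention)) {
        case 'BREAK':    player.show('Attention is dropping - take a short break?'); break;
        case 'ENGAGE':   player.show('Watch a video to understand this better.');    break;
        case 'INTERACT': player.show('Quick check-your-understanding question...');  break;
      }
    }
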
  22. 2: Hands-Free Navigation of Content Using Hand Gestures (SENSOR: CAMERA – DEMO AVAILABLE)
      Application area: a performance support system in a restaurant – staff can refer to recipe steps while preparing the food.
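
A hands-free navigation sketch in the same spirit: a camera-based gesture recognizer (not shown here, and assumed to emit 'swipe-left', 'swipe-right' and 'palm-hold' events) drives a course player. The gesture names and the CoursePlayer interface are hypothetical.

    // Map camera-detected hand gestures to course-player actions, e.g. for the
    // restaurant performance-support case: step through recipe screens hands-free.
    type Gesture = 'swipe-left' | 'swipe-right' | 'palm-hold';

    interface CoursePlayer {
      nextStep(): void;
      previousStep(): void;
      togglePause(): void;
    }

    function handleGesture(gesture: Gesture, player: CoursePlayer): void {
      switch (gesture) {
        case 'swipe-left':  player.nextStep();     break;  // advance to the next recipe step
        case 'swipe-right': player.previousStep(); break;  // go back one step
        case 'palm-hold':   player.togglePause();  break;  // hold a palm up to pause narration
      }
    }
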
  23. 3: Augmentation of Content Using Augmented Reality Apps (SENSOR: CAMERA – DEMO AVAILABLE)
      Application areas:
      • A complex process flow comes alive with videos that explain each step in depth
      • While a user attempts a check-your-knowledge question, a clue appears for the answer options
      • A school child reading about the planetary system sees the screen come alive with a video explaining the planet he clicked
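
A sketch of the AR idea above, assuming some AR layer reports which printed marker the camera is pointing at. The marker IDs, video URLs and callback name are placeholders; the actual AR/marker-tracking library is out of scope.

    // When the AR layer detects a known marker (a step in a printed process flow,
    // a planet in a textbook), play the matching explainer video over the page.
    const markerVideos: Record<string, string> = {
      'process-step-3': 'https://example.com/videos/step-3-deep-dive.mp4',  // placeholder URL
      'planet-mars':    'https://example.com/videos/mars-overview.mp4',     // placeholder URL
    };

    function onMarkerDetected(markerId: string, overlay: HTMLVideoElement): void {
      const src = markerVideos[markerId];
      if (!src) return;        // unknown marker: do nothing
      overlay.src = src;
      void overlay.play();     // the screen "comes alive" with the in-depth video
    }
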
  24. 4: Voice-Based Response (SENSOR: MIC – DEMO AVAILABLE)
      Application area: a response to an assessment can be spoken out instead of typed.
      Sample questions: "What does this symbol mean?" (options: DO NOT WASH, DO NOT DRY, WASH IN COLD WATER, GIVE IT TO YOUR MOM), "What is the color of an apple?", "What is gravity?", "How do you determine time?"
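
A voice-based response sketch using the browser's Web Speech API (exposed as webkitSpeechRecognition in Chromium-based browsers; its typings are not in the standard DOM library, hence the 'any' casts). The keyword-matching grading rule is a deliberately naive placeholder.

    // Let the learner speak an answer instead of typing it, then grade it
    // with a naive keyword match against the expected answer.
    function askSpokenQuestion(expectedAnswer: string, onGraded: (correct: boolean) => void): void {
      const Recognition =
        (window as any).SpeechRecognition || (window as any).webkitSpeechRecognition;
      if (!Recognition) {
        return;                            // no speech support: fall back to typed input
      }
      const recognition = new Recognition();
      recognition.lang = 'en-US';
      recognition.onresult = (event: any) => {
        const spoken: string = event.results[0][0].transcript.toLowerCase().trim();
        onGraded(spoken.includes(expectedAnswer.toLowerCase()));
      };
      recognition.start();                 // starts listening on the device microphone
    }

    // Example, using the laundry-symbol question from the slide:
    // askSpokenQuestion('wash in cold water', ok => console.log(ok ? 'Correct' : 'Try again'));
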
  25. 5: Confidence Level Detection (SENSOR: MIC)
      Application area: a customer support executive (voice process) speaks out an answer and, using speech analytics, is tested on performance parameters like "speed of response", "tone" and "confidence".
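
An illustrative confidence heuristic for the voice-process scenario above. Real speech analytics would rely on prosody features and trained models; the metrics, weights and thresholds here are assumptions made up for the sketch.

    // Score a spoken answer on a 0-100 "confidence" scale from a few simple
    // transcript-level metrics. All numbers below are placeholder heuristics.
    interface SpeechMetrics {
      secondsToFirstWord: number;  // "speed of response"
      wordsPerMinute: number;      // speaking pace
      fillerWords: number;         // count of "um", "uh", "like" in the transcript
    }

    function confidenceScore(m: SpeechMetrics): number {
      let score = 100;
      if (m.secondsToFirstWord > 3) score -= 25;   // slow to start answering
      if (m.wordsPerMinute < 100)   score -= 20;   // hesitant pace
      score -= Math.min(30, m.fillerWords * 5);    // fillers suggest uncertainty
      return Math.max(0, score);                   // clamp to 0..100
    }
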
  26. 6: Distraction Detection (SENSOR: BIO-WEARABLE)
      Application area: a learner shows patterns like "shaking of legs", "no significant movement", or unusual "body temperature" and "heart rate", and the content pops up with:
      • BREAK: suggest the user take a break
      • ENGAGE: switch to a more engaging form of the same content – "Watch a video to understand better"
      • INTERACT: pose a CYU (check-your-understanding) question related to the topic
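
A distraction-detection sketch combining the wearable signals listed on the slide into one of the three reactions. The sample fields and thresholds are assumptions; no particular wearable API is implied.

    // Decide whether to interrupt the learner based on bio-wearable readings.
    interface WearableSample {
      heartRateBpm: number;
      movementIndex: number;     // 0 = perfectly still, 1 = heavy fidgeting (e.g. leg shaking)
      minutesSinceBreak: number;
    }

    function reactToWearable(s: WearableSample): 'BREAK' | 'ENGAGE' | 'INTERACT' | null {
      if (s.minutesSinceBreak > 45 || s.heartRateBpm > 110) return 'BREAK';  // suggest a pause
      if (s.movementIndex > 0.7)  return 'ENGAGE';   // restless: switch to a video
      if (s.movementIndex < 0.05) return 'INTERACT'; // no movement at all: pop a CYU question
      return null;                                   // learner seems fine: change nothing
    }
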
  27. Final Note
      • Sensor integration is NOW a reality – wearables, cameras and brain-wave headsets are the top picks
      • Demand sensor-smart content from your content development / technology integration teams
      • Take baby steps
      • Yes, SCORM / xAPI compliant sensor-integrated content is easily doable (sketched below)
      • Discuss what other applications of smart content are possible
      Let's pledge to give Madame CBT her sensors back.
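
The xAPI sketch referenced in the final note: record a sensor reading alongside a learning event by posting a statement to an LRS. The LRS endpoint, credentials, activity ID and extension URI are placeholders; only the statement shape and the X-Experience-API-Version header follow the xAPI specification.

    // Send one xAPI statement that carries an EEG attention reading as a result extension.
    async function reportAttention(attention: number): Promise<void> {
      const statement = {
        actor:  { mbox: 'mailto:learner@example.com', name: 'Learner' },
        verb:   { id: 'http://adlnet.gov/expapi/verbs/experienced', display: { 'en-US': 'experienced' } },
        object: { id: 'https://example.com/courses/sensor-module/lesson-1' },
        result: { extensions: { 'https://example.com/xapi/extensions/eeg-attention': attention } },
      };
      await fetch('https://lrs.example.com/xapi/statements', {    // placeholder LRS endpoint
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'X-Experience-API-Version': '1.0.3',
          'Authorization': 'Basic ' + btoa('lrs-key:lrs-secret'), // placeholder credentials
        },
        body: JSON.stringify(statement),
      });
    }
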
  28. Future Possibilities – Questions? prasoon@StratBeans.com | @prasoonnigam
  29. END
