Virtual World simulations to support Robot-Mediated Interaction

Invited keynote presentation given at Virtual Worlds Best Practices in Education conference in July 2013.
Website: http://www.vwbpe.org/ai1ec_event/keynote-speaker-michael-vallance-sl-dafydd-beresford?instance_id=413

Transcript

  • 1. Virtual World simulations to support Robot-Mediated Interaction. Dr. Michael Vallance, Future University Hakodate, Japan. http://www.mvallance.net
  • 2. Research aim (long term): to design an evidence-based framework of learning when undertaking tasks of measurable complexity in a 3D virtual world. How? (i) procedural processes, (ii) learning reflections, (iii) collate data of students collaborating in-world when programming a robot. # A successful task consists of a robot and program solution that solves specified circuit challenges. In this presentation, the focus is upon the development of measuring the complexity of tasks involving robot-mediated interactions (RMI).
  • 3. March 11, 2011: Fukushima, Japan nuclear plant disaster. The earthquake and tsunami damaged the cooling systems of the reactors. Four reactors exploded and radioactivity was released into the atmosphere. Currently: evacuees cannot return home and depression is becoming prevalent among the strained residents [1]; the Japanese government has changed its criteria for dangerous levels of radioactivity, leaving residents confused [2]; workers are struggling to maintain the safety of the plant [3]; deformities have been discovered in local wildlife [4]. Why? Our motivation for context.
  • 4. There was a lack of robots in Japan to assist with the recovery operations. Within a week, iRobot (USA) had donated two PackBot 510 robots and two Warrior 710 robots, and iRobot engineers trained Japanese operators. It then took three weeks for TEPCO to authorize their use [5].
  • 5. As educators, what can we learn from this disaster? 1. People need to be better informed and equipped to make sense of information. Give students learning opportunities: reflecting, organizing, negotiating and creating. A challenging project like programming robots also provides opportunities for learning content in the Science, Technology, Engineering and Maths (STEM) subjects. 2. International collaboration is essential, now and in the future. A virtual world offers a future 3D space: a safe medium for communication and experiential learning. The tasks in this research aim to support (1) and (2).
  • 6. About the research ... The students' aim is to communicate solutions to problems which involve programming a LEGO robot to follow specific circuits. This is undertaken by 1. designing circuits, with robot maneuvers and sensors; 2. experiencing collaboration, between students in Japan and the UK within a 3D space. These experiences lead to personal strategies for teamwork, planning, organizing, applying, analyzing, creating and reflecting. # Measured as Essential Skills for the Wales Baccalaureate Qualification, UK: evidence required by the Education Authority for the post-16 qualification.
  • 7. Literature review of task complexity involving robots
  • 8. Circuit Task Complexity. There is no consensus in the discipline of Robotics or Human-Robot Interaction for accurately measuring task complexity [6]. Given the specific purposes of the robot in our research, task complexity was calculated according to the number of sections that make up a given maze [7] [8]. Circuit Task Complexity (CTC) = number of directions + number of maneuvers + number of sensors + number of obstacles: CTC = Σ(d + m + s + o). For example: CTC = Σ(4 + 3 + 2 + 2) = 11.
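For illustration only (the project itself uses the graphical NXT-G and LabVIEW environments rather than textual code), the CTC metric is a simple sum. A minimal Python sketch, with the function name ctc chosen here for convenience:

```python
def ctc(directions, maneuvers, sensors, obstacles):
    """Circuit Task Complexity (slide 8): CTC = d + m + s + o."""
    return directions + maneuvers + sensors + obstacles

# Worked example from the slide: CTC = 4 + 3 + 2 + 2 = 11
print(ctc(directions=4, maneuvers=3, sensors=2, obstacles=2))  # 11
```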
  • 9. We found that this logic of assigning task complexity to circuits was inadequate. For instance, complexity values were assigned to distinct maneuvers such as forward - turn - back. Over the course of our previous research, as circuits became more challenging, the NXT programming became more complex, especially when adding sensors to maneuver around and over obstacles. Simply counting the number of obstacles in the Circuit Task Complexity, CTC = Σ(d + m + s + o), was flawed because the programming required to maneuver over a bridge using touch sensors, for instance, was far more complex than that required to maneuver around a box using touch sensors.
  • 10. Robot Task Complexity. In the NXT Mindstorms software, the Move block controls the LEGO robot's direction and turns. The Move block contains six variables: NXT 'brick' port link, direction, steering, power, duration, and next action. In other words, the students have to make six specific decisions about the values which make up the programmable block. Therefore, we assign v1 a value of 6. This was repeated for the sensor, switch and loop blocks. RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3.
  • 11. Robot Task Complexity: RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3, where M = number of moves (direction and turn), S = number of sensors, SW = number of switches, L = number of loops, and v = the number of decisions required by the user for each programmable block: v1 = 6, v2 = 5, v3 = 2. For example: RTC = (8 × 6) + (3 × 5) + 0 + 3 = 66. We acknowledge that, at present, our modified Robot Task Complexity metric applies only to the Mindstorms NXT software and LEGO robot, but it does provide a useful indicator in our attempts to analyze the experiential learning during the collaborative tasks. The CTC problem can now be evaluated against the RTC solution.
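Again as an illustrative sketch rather than the authors' implementation: because the slide does not state a per-block decision count for the Switch block, and its worked example supplies a loop term of 3 directly, this sketch takes the switch and loop terms as pre-summed decision totals:

```python
def rtc(moves, sensors, switch_decisions=0, loop_decisions=0, v1=6, v2=5):
    """Robot Task Complexity (slides 10-11): RTC = M*v1 + S*v2 + SW + L terms.
    Each Move block requires v1 = 6 user decisions and each Sensor block
    v2 = 5; switch and loop terms are supplied here as decision totals."""
    return moves * v1 + sensors * v2 + switch_decisions + loop_decisions

# Worked example from the slide: (8 x 6) + (3 x 5) + 0 + 3 = 66
print(rtc(moves=8, sensors=3, switch_decisions=0, loop_decisions=3))  # 66
```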
  • 12. Task implementation. Students in one country: 1. are provided with a task specification; 2. work on a solution to the task; 3. construct their circuit in the virtual world and in their real-world lab; 4. develop an NXT program to maneuver the physical LEGO robot appropriately. The problem and the proposed solution are then communicated in real time to students in the other country via the 3D virtual world.
  • 13. Task specification examples. Each task specifies: the robot actions; a target CTC (the objective is to iteratively increase CTC); the collaboration direction; anticipated STEM content; and anticipated Essential Skills (Wales Baccalaureate). RTC is a post-task calculation based upon the students' solution.

    T1. Robot actions: Movement: follow the line. Sensors: light and touch. CTC = Σ(d + m + s + o) = 1 + 2 + 2 + 1 = 7. Collaboration: Japan teach UK. STEM: S: recognition of light sensor values; what happens when the trigger point is increased/decreased? T: learn how to organise NXT program blocks logically. E: construct a robot; connect software to hardware. M: recognise spatial movements and the problem of friction; change the surface to see if the robot works the same; calculate the coefficient of friction. Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect.

    T2. Robot actions: Movement: follow the line. Sensors: colour and action. CTC = 1 + 2 + 2 + 2 = 8. Collaboration: UK teach Japan. STEM: S: recognition of light sensor values; what happens when the trigger point is increased/decreased? How does the NXT sensor recognise colour R, G or B? Try different colour variations and observe the subsequent robot actions. T: learn how to organise NXT program blocks logically. E: construct a robot; connect software to hardware. M: —. Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect.

    T3. Robot actions: Movement: square. Sensors: touch and sound. CTC = 4 + 3 + 1 + 1 = 9. Collaboration: Japan teach UK. STEM: S: —. T: learn how to organise NXT program blocks logically. E: construct a robot; connect software to hardware. M: calculate distance, speed and force (touch). Essential Skills: Identify; Plan/manage; Explore/Analyse (organize); Evaluate (checking); Reflect.
  • 14. Resources. • LEGO Mindstorms NXT software version 2.1 • LabVIEW 2010 with NXT module • LEGO robot 8527 kit • LEGO blocks and similar workspaces/labs in the Japanese university and two UK schools • All sites use the same Apple technologies (MacBook Pro + OS X 10.7)
  • 15. [Slide image of the virtual world build: reactor, off switch, radioactive bins, control station]
  • 16. Resources. • Aurora-Sim hosted by Firesabre (www.firesabre.com)
  • 17. Virtual Fukushima in JIBE, hosted by Reaction Grid (reactiongrid.net)
  • 18. Virtual Fukushima in JIBE, hosted by Reaction Grid (reactiongrid.net)
  • 19. Inspired by: telerobotics "for the rest of us"
  • 20. Task flow chart for simulation
  • 21. Tasks.

    T1: Assemble LEGO robots. JPN + UK student introductions.
    T2: NXT program + circuit. JPN teaching UK.
    T3: NXT program + circuit (90-degree turns + measured length). UK teaching JPN.
    T4: Circuit + NXT program. Move. Touch sensor. Turn 90 degrees. JPN teaching JPN.
    T5: Circuit + NXT program. Around obstacles. JPN teaching JPN.
    T6: Circuit + NXT program. Around obstacles. JPN teaching JPN.
    T7: NXT program + touch sensors + circuit. Locate and press switch off. JPN teaching JPN.
    T8: Over an obstacle. NXT program + sensors + bridge building (cardboard). JPN teaching JPN.
    T9: Over an obstacle. NXT program + sensors + bridge building (wood). JPN teaching JPN.
    T10: Robot arm + scoop. UK teaching JPN.
    T11: Robot arm + NXT program. JPN preparation.
    T12: Robot arm + scoop + NXT program. Streaming video. JPN teaching UK.
    T13: Programming LabVIEW for remote control.
    T14: Programming LabVIEW for remote control.
    T15: Programming LabVIEW for remote control.
    T16: Robot construction + NXT program + stop and swing arm to hit ball. UK teaching Japan.
    T17: Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 1.
    T18: Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 2.
    T19: Suika robot. Rotate + follow line + sensor + chop down. Japan preparation 3.
    T20: Robot construction + NXT program + obstacles + sensors.
    T21: Suika robot. Rotate + follow line + sensor + chop down. Japan teach UK.
    T22: Programming LabVIEW for remote control.
    T23: Programming LabVIEW for remote control.
    T24: Remote control for search & rescue, circuit A.
    T25: Remote control for search & rescue, circuit B.
    T26: Remote control for search & rescue, circuit C.
    T27: Remote control for search & rescue, circuit D.
    T28: Move to black line, stop and throw ball to hit over obstacle. UK teaching Japan.
  • 22. Graph of Task ~ Task Complexity, where Circuit Task Complexity CTC = Σ(d + m + s + o) and Robot Task Complexity RTC = Σ Mv1 + Σ Sv2 + Σ SW + Σ Lv3:

    Task | CTC | RTC
    T2 | 0.56 | 0.22
    T3 | 0.50 | 0.42
    T4 | 0.81 | 0.22
    T5 | 0.81 | 0.57
    T6 | 1.00 | 0.85
    T7 | 0.69 | 1.00
    T8 | 0.25 | 0.39
    T9 | 0.31 | 0.33
    T10 | 0.19 | 0.20
    T11 | 0.63 | 0.76
    T12 | 0.63 | 0.84
    T16 | 0.56 | 0.83
    T17 | 0.25 | 0.22
    T18 | 0.31 | 0.65
    T19 | 0.31 | 0.65
    T20 | 0.69 | 0.48
    T21 | 0.31 | 0.65
    T28 | 0.25 | 0.17
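The slides do not state how the raw CTC and RTC scores were scaled to the 0-1 range shown above; since each column contains a 1.00 entry, dividing each raw score by the column maximum is one normalization consistent with the table. A hypothetical Python sketch, with illustrative raw values only:

```python
def normalize(raw_scores):
    """Scale raw complexity scores to [0, 1] by dividing by the maximum."""
    top = max(raw_scores.values())
    return {task: round(score / top, 2) for task, score in raw_scores.items()}

# Illustrative raw RTC values only: the slides give a raw RTC of 66 for one
# worked example but do not list raw per-task scores.
print(normalize({"T7": 66, "T10": 13}))  # {'T7': 1.0, 'T10': 0.2}
```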
  • 23. Graph of Task ~ Task Fidelity:

    Task | TF
    T2 | 0.34
    T3 | 0.08
    T4 | 0.59
    T5 | 0.24
    T6 | 0.15
    T7 | -0.31
    T8 | -0.14
    T9 | -0.02
    T10 | -0.01
    T11 | -0.13
    T12 | -0.21
    T16 | -0.27
    T17 | 0.03
    T18 | -0.34
    T19 | -0.34
    T20 | 0.21
    T21 | -0.34
    T28 | 0.08
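The slides do not define Task Fidelity algebraically, but every tabulated TF value matches TF = CTC − RTC computed on the normalized scores of the previous slide. A hedged reconstruction in Python (subset of tasks shown):

```python
# Normalized scores copied from the previous slide (subset).
ctc_norm = {"T2": 0.56, "T4": 0.81, "T7": 0.69, "T10": 0.19, "T28": 0.25}
rtc_norm = {"T2": 0.22, "T4": 0.22, "T7": 1.00, "T10": 0.20, "T28": 0.17}

# Assumed relation, inferred from the tables: TF = CTC - RTC.
tf = {t: round(ctc_norm[t] - rtc_norm[t], 2) for t in ctc_norm}
print(tf)  # {'T2': 0.34, 'T4': 0.59, 'T7': -0.31, 'T10': -0.01, 'T28': 0.08}
```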
  • 24. Immersion (flow): how immersed students become within the process of each task. To record immersion (or flow), a virtual FlowPad appears in front of the virtual world avatars. At regular intervals during the task procedures each avatar has to answer two questions, each with four options: Q1. How challenging is the activity? • Difficult (score 4) • Demanding (score 3) • Manageable (score 2) • Easy (score 1). Q2. How skilled are you at the activity? • Hopeless (score 1) • Reasonable (score 2) • Competent (score 3) • Masterful (score 4). These questions were chosen based upon research into flow by Pearce et al. [9].
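As a sketch of how FlowPad responses could yield the per-task challenge/skill values on the next slide, assuming (our assumption, not stated on the slide) that responses are averaged over a task's prompts and divided by the maximum score of 4:

```python
# Score tables taken from the slide's two FlowPad questions.
CHALLENGE = {"Easy": 1, "Manageable": 2, "Demanding": 3, "Difficult": 4}
SKILL = {"Hopeless": 1, "Reasonable": 2, "Competent": 3, "Masterful": 4}

def flow_scores(responses):
    """Average normalized (challenge, skill) over a task's FlowPad prompts.
    `responses` is a list of (challenge_answer, skill_answer) pairs."""
    n = len(responses)
    challenge = sum(CHALLENGE[c] for c, _ in responses) / (4 * n)
    skill = sum(SKILL[s] for _, s in responses) / (4 * n)
    return round(challenge, 2), round(skill, 2)

print(flow_scores([("Manageable", "Masterful"), ("Easy", "Masterful")]))  # (0.38, 1.0)
```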
  • 25. Graph of task immersivity (flow):

    Task | Challenge | Skill
    T2 | 0.5 | 1
    T3 | 0.75 | 0.5
    T4 | 0.5 | 0.75
    T5 | 1 | 0.67
    T6 | 0.8 | 0.67
    T7 | 0.67 | 0.8
    T8 | 0.67 | 0.5
    T9 | 0.42 | 0.92
    T10 | 0.42 | 0.5
    T11 | 0.8 | 0.5
    T12 | 0.58 | 0.58
    T16 | 0.8 | 0.45
    T17 | 0.25 | 1
    T18 | 0.7 | 0.7
    T19 | 0.25 | 1
    T20 | 0.94 | 0.5
    T21 | 0.75 | 0.75
    T28 | 0.75 | 0.58
  • 26. If we look at the data for Task Fidelity and immersivity, we suggest that T10 and T28 would be considered the most successful tasks for students engaged in robot-mediated interactions. The TF value for T28 was only +0.08, slightly above the optimal Task Fidelity line, and T28 was slightly below the optimal path of immersivity. Similarly for T10, with immersivity slightly above the optimal path and Task Fidelity at −0.01, just below the optimal line. The challenge for instructors is to seek tasks similar to T28 and T10, where immersivity is close to or on the optimal path of immersivity and task complexity is close to or on the optimal line of Task Fidelity. The challenge for researchers is to seek ways to transfer these observations to further tasks with different participants, in order to develop more reliable optimal learning tasks for robot-mediated interactions in a virtual space [10].
  • 27. Conclusion. This applied research is developing metrics for learning when conducting virtual world tasks. The motivation to implement this research was the nuclear disaster of 3-11. A virtual Fukushima nuclear plant and an OpenSim training space have been iteratively designed and built. International collaboration by students as non-experts has highlighted the benefits and challenges posed when engaged in constructing robot-mediated interactions (RMI) within the context of distance-based communication in 3D spaces. Students' immersion (or flow), Circuit Task Complexity, and Robot Task Complexity have been calculated. Optimal learning tasks have been highlighted. A new metric is suggested for measuring tasks involving robots, which we term Task Fidelity [10]. Next question: how can a better taxonomy be designed to identify specific learning when students are engaged in mixed reality (real and 3D virtual world) Robot-Mediated Interactions? Acknowledgements: many thanks to the UK collaborators and students at the University of South Wales and Cynon Valley schools, my students at Future University, Japan, and the metaverse designers at Firesabre and Reaction Grid.
  • 28. References
    (1) T. Morris-Suzuki, D. Boilley, D. McNeill and A. Gundersen. Lessons from Fukushima. Netherlands: Greenpeace International, February 2012.
    (2) J. Watts. "Fukushima parents dish the dirt in protest over radiation levels." The Guardian, May 2, 2011. [Online]. Available: http://www.guardian.co.uk/world/2011/may/02/parents-revolt-radiation-levels [Accessed August 20, 2012].
    (3) L. W. Hixson. "Japan's nuclear safety agency fights to stay relevant." Japan Today. [Online]. Available: http://www.japantoday.com/category/opinions/view/japans-nuclear-safety-agency-fights-to-stay-relevant [Accessed August 20, 2012].
    (4) N. Crumpton. "Severe abnormalities found in Fukushima butterflies." BBC Science & Environment. [Online]. Available: http://www.bbc.co.uk/news/science-environment-19245818 [Accessed August 20, 2012].
    (5) E. Guizzo. "Fukushima Robot Operator Writes Tell-All Blog." IEEE Spectrum, August 23, 2011. [Online]. Available: http://spectrum.ieee.org/automaton/robotics/industrial-robots/fukushima-robot-operator-diaries [Accessed August 20, 2012].
    (6) M. Vallance and S. Martin. "Assessment and Learning in the Virtual World: Tasks, Taxonomies and Teaching for Real." Journal of Virtual Worlds Research, Vol. 5, No. 2, 2012.
    (7) S. B. Barker and J. Ansorge. "Robotics as means to increase achievement scores in an informal learning environment." Journal of Research on Technology in Education, Vol. 39, No. 3, pp. 229-243, 2007.
    (8) D. R. Olsen and M. A. Goodrich. "Metrics for evaluating human-robot interactions." [Online]. Available: http://icie.cs.byu.edu/Papers/RAD.pdf [Accessed March 14, 2009].
    (9) M. Pearce, M. Ainley and S. Howard. "The ebb and flow of online learning." Computers in Human Behavior, Vol. 21, pp. 745-771, 2005.
    (10) M. Vallance, C. Naamani, M. Thomas and J. Thomas. "Applied Information Science Research in a Virtual World Simulation to Support Robot Mediated Interaction Following the Fukushima Nuclear Disaster." Communications in Information Science and Management Engineering (CISME), Vol. 3, Issue 5, pp. 222-232.