Engineering active learning: LEGO robots & 3D virtual worlds

  1. Engineering active learning: LEGO robots and 3D virtual worlds Dr. Michael Vallance Future University Hakodate, Japan
  2. about FUN
  3. research
  4. Robot-mediated Interaction (RMI) Research: Design an evidence-based framework of learning when undertaking tasks of measurable complexity in a 3D virtual world. The students' aim is to communicate solutions to problems which involve the programming of a robot to navigate specific circuits.
 • Experiences lead to personal strategies for teamwork, planning, organizing, applying, analyzing, creating and reflecting.
 • Measured as Essential Skills for the Wales Baccalaureate Qualification, UK. Evidence is required by the UK Education Authority for the post-16 qualification.
  5. “The acquisition of knowledge and skills does not necessarily constitute learning. The latter occurs when the learner connects the knowledge or skill to previous experience, integrates it fully in terms of value, and is able to actively use it in meaningful and even novel ways” (Hase, 2011). * self-determined learning * student-centred learning
  6. Heutagogy (hjuːtəˈɡɒdʒi): heutagogical characteristics for active learning
 • Learner involvement in the environment of learning.
 • Learner generates contextually relevant content.
 • Spontaneous and organic (structured, organized, coherent, integrated) learning experiences.
 • True collaboration between teacher and learner, and learner and learner.
 • Flexible curricula.
 • Flexible assessment.
 Hase, S. (2011). Learner defined curriculum: heutagogy and action learning in vocational training. Southern Institute of Technology Journal of Applied Research, Special Edition: Action research and action learning in vocational education and training.
  7. context
  8. Context motivated by 3/11: http://spectrum.ieee.org/energy/nuclear/24-hours-at-fukushima
  9. People … especially university-age … need to be better informed and equipped to make sense of information and make subsequent independent decisions. International collaboration and communication are essential now and in the future. Simulations can be used to prepare for disaster and recovery. As educators, what can we learn from this disaster?
  10. Why robots?
 • Provide closed, highly defined tasks.
 • Task complexity can be quantified.
 • Tasks can be replicated (same level of complexity but different maneuvers).
 • Provoke behaviors and communicative exchanges which can be located on a framework for analysis.
 Robot Task Complexity: RTC = ΣMv₁ + ΣSv₂ + ΣSW + ΣLv₃
 Circuit Task Complexity: CTC = Σ(d + m + s + o)
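 As a rough illustration of how these tallies could be computed, here is a minimal Python sketch. The meanings of the components (circuit distance segments, maneuvers, sensors, obstacles; robot-program movements, sensors, software operators, levels) and the weights v1, v2, v3 are assumptions made for illustration only; they are not defined on the slide.

```python
# Minimal sketch of the two complexity tallies named on this slide.
# Component meanings and the weights v1..v3 are illustrative assumptions.

def circuit_task_complexity(d, m, s, o):
    """CTC = Σ(d + m + s + o): sum of the circuit's component counts."""
    return d + m + s + o

def robot_task_complexity(movements, sensors, software_ops, levels,
                          v1=1.0, v2=1.0, v3=1.0):
    """RTC = ΣMv1 + ΣSv2 + ΣSW + ΣLv3, with v1..v3 as weighting factors."""
    return movements * v1 + sensors * v2 + software_ops + levels * v3

# Hypothetical circuit: 2 distance segments, 3 maneuvers, 2 sensors, 1 obstacle.
print(circuit_task_complexity(d=2, m=3, s=2, o=1))   # -> 8
```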
  11. Why virtual spaces?
 • Active 3D communication space.
 • Simulated context (cf. NASA, Los Alamos, USA DoD).
 • Immersion .. flow .. impact on learning.
 • Future of online communication (cf. Rift, Glasses, avatars, AR).
 • Remote control of virtual & real robots.
 • Determine the pedagogy & learning (or the heutagogy): ‘how’ & ‘why’ & ‘what’.
 • Students can design & manipulate the learning environment.
  12. implementation (virtual spaces)
  13. implementation (robots)
  14. implementation (real world circuits)
  15. Labels from the circuit image: reactor, off switch, radioactive bins, control station.
  16. implementation (students)
  17. movie demo time http://tinyurl.com/m34wpr9
  18. data
  19. Task flow chart for simulation
  20. Lesson outline for UK & USA teachers
  21. Learning objectives. Columns: Task | Robot actions | CTC (target CTC only; the objective is to iteratively increase CTC) | Collaboration | STEM (anticipated) | Essential Skills, Wales Baccalaureate (anticipated) | RTC (post-task calculation based upon the students' solution).
 T1. Robot actions: follow the line (movement); light and touch (sensors). CTC = Σ(d + m + s + o); CTC = 1+2+2+1 = 7. Collaboration: Japan teach UK. STEM: (S) recognition of light sensor values; what happens when the trigger point is increased/decreased? (T) learn how to organise NXT program blocks logically. (E) construct a robot; connect software to hardware. (M) recognise spatial movements and the problem of friction; change the surface to see if the robot works the same; calculate the coefficient of friction. Essential Skills: identify; plan/manage; explore/analyse (organize); evaluate (checking); reflect.
 T2. Robot actions: follow the line (movement); colour and action (sensors). CTC = 1+2+2+2 = 8. Collaboration: UK teach Japan. STEM: (S) recognition of light sensor values; what happens when the trigger point is increased/decreased? How does the NXT sensor recognise colour R, G or B? Try different colour variations and observe the subsequent robot actions. (T) learn how to organise NXT program blocks logically. (E) construct a robot; connect software to hardware. (M) not given. Essential Skills: identify; plan/manage; explore/analyse (organize); evaluate (checking); reflect.
 T3. Robot actions: square (movement); touch and sound (sensors). CTC = 4+3+1+1 = 9. Collaboration: Japan teach UK. STEM: (S) not given. (T) learn how to organise NXT program blocks logically. (E) construct a robot; connect software to hardware. (M) calculate distance, speed and force (touch). Essential Skills: identify; plan/manage; explore/analyse (organize); evaluate (checking); reflect.
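 Tasks T1 and T2 both hinge on a light-sensor trigger point for following a line. On the NXT the students build this from NXT-G program blocks; purely to illustrate the underlying logic, here is a small Python sketch. The read_light(), drive() and touch_pressed() helpers are hypothetical stand-ins for whatever robot API or simulator is used, not real NXT functions.

```python
# Illustrative line-follower logic only: a simple threshold ("trigger point")
# on a light sensor steers the robot along the edge of a dark line.
# read_light(), touch_pressed() and drive() are hypothetical helpers.

TRIGGER = 45  # light-sensor trigger point; raising/lowering it changes behaviour

def follow_line_step(read_light, drive):
    """One control step: steer toward the line over the bright surface,
    away from it over the dark line."""
    if read_light() > TRIGGER:       # bright surface: steer back toward the line
        drive(left=20, right=60)
    else:                            # dark line: steer back toward the edge
        drive(left=60, right=20)

def run(read_light, drive, touch_pressed):
    """Loop until the touch sensor is pressed (end of the circuit)."""
    while not touch_pressed():
        follow_line_step(read_light, drive)

if __name__ == "__main__":
    # Smoke test with stand-in callables (no robot required).
    readings = iter([70, 30])
    fake_read = lambda: next(readings)
    fake_drive = lambda left, right: print(f"drive left={left} right={right}")
    follow_line_step(fake_read, fake_drive)
    follow_line_step(fake_read, fake_drive)
```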
  22. Bloom's revised taxonomy of learning objectives: a grid crossing the knowledge dimension (factual, conceptual, procedural, metacognitive) with the cognitive process dimension (remember, understand, apply, analyze, evaluate, create).
  23. Data is captured and coded using neo-Bloomian descriptors: virtual screen capture + real-world video capture. Transana software; Google Drive; 16 tasks; 60 hours of data.
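 To show how coded occurrences might be tallied against the taxonomy grid above, here is a small Python sketch. The sample coded events are invented for illustration, and the tally structure is an assumption about how the Transana codes could be aggregated, not the project's actual pipeline.

```python
from collections import Counter

# The two dimensions of the revised (neo-Bloomian) taxonomy used for coding.
KNOWLEDGE = ["factual", "conceptual", "procedural", "metacognitive"]
PROCESS = ["remember", "understand", "apply", "analyze", "evaluate", "create"]

# Hypothetical coded events: (knowledge dimension, cognitive process) per utterance.
coded_events = [
    ("procedural", "apply"),
    ("procedural", "evaluate"),
    ("conceptual", "analyze"),
    ("procedural", "apply"),
]

# Tally occurrences per cell of the taxonomy grid and print the non-empty cells.
grid = Counter(coded_events)
for k in KNOWLEDGE:
    for p in PROCESS:
        if grid[(k, p)]:
            print(f"{k:13s} x {p:10s}: {grid[(k, p)]}")
```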
  24. Circuit Task Complexity: CTC = Σ(d + m + s + o). Robot Task Complexity: RTC = ΣMv₁ + ΣSv₂ + ΣSW + ΣLv₃. Task complexity per task (chart axes: task vs. task complexity):
 Task  CTC   RTC
 T2    0.56  0.22
 T3    0.5   0.42
 T4    0.81  0.22
 T5    0.81  0.57
 T6    1     0.85
 T7    0.69  1
 T8    0.25  0.39
 T9    0.31  0.33
 T10   0.19  0.2
 T11   0.63  0.76
 T12   0.63  0.84
 T16   0.56  0.83
 T17   0.25  0.22
 T18   0.31  0.65
 T19   0.31  0.65
 T20   0.69  0.48
 T21   0.31  0.65
 T28   0.25  0.17
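 One way to inspect how circuit complexity and the complexity of the students' solutions track each other is to compute a simple correlation over these pairs. Here is a minimal sketch using only the values in the table above; the choice of Pearson correlation is an illustrative assumption, not the study's stated analysis.

```python
from statistics import correlation  # Python 3.10+

# (task, CTC, RTC) values copied from the table above.
tasks = [
    ("T2", 0.56, 0.22), ("T3", 0.5, 0.42), ("T4", 0.81, 0.22),
    ("T5", 0.81, 0.57), ("T6", 1.0, 0.85), ("T7", 0.69, 1.0),
    ("T8", 0.25, 0.39), ("T9", 0.31, 0.33), ("T10", 0.19, 0.2),
    ("T11", 0.63, 0.76), ("T12", 0.63, 0.84), ("T16", 0.56, 0.83),
    ("T17", 0.25, 0.22), ("T18", 0.31, 0.65), ("T19", 0.31, 0.65),
    ("T20", 0.69, 0.48), ("T21", 0.31, 0.65), ("T28", 0.25, 0.17),
]

ctc = [c for _, c, _ in tasks]
rtc = [r for _, _, r in tasks]
print(f"Pearson correlation, CTC vs RTC: {correlation(ctc, rtc):.2f}")
```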
  25. Immersivity (or flow), based on Pearce et al., 2005.
  26. The ‘optimal zone’ of immersivity
  27. observations
  28. Our RMI data has revealed:
 • Procedural knowledge required little remembering but more applying and evaluating. Active learning.
 • With increased task complexity, the amount of analyzing, evaluating and creating also increased. Active learning. BUT NOT ALWAYS!!
 • Later tasks revealed that making tasks more complex does not necessarily lead to more occurrences of the same components of the cognitive process. Why the difference? Immersion.
 • Analyzing, evaluating and creating increased when students engaged in tasks within the zone of optimal immersivity.
 • Learning is not linear (we already know that, don't we!!), as might be assumed by university metrics for undergraduate and postgraduate education.
  29. Robot tasks involving sensors lead to a more immersed experience. Let students iteratively design, build and utilize modes of communication in 3D virtual spaces; they will use them. UK students used mostly procedural language (general) with confirmation questions. Japanese students offered mostly instructional language (specific) but with few instances of checking for understanding. Active learning can be implemented through student-determined design of learning environments and tasks, leading to particular types of thinking that are sensitive to heutagogy. (Forthcoming paper with Dr. P.A. Towndrow.) Diana Laurillard (2012) calls this the design of learning as practice. Laurillard, D. (2012). Teaching as a design science. New York: Routledge.
  30. Robots or not … what can YOU take away from this talk?
  31. Student-directed active learning (Hase, S., 2011):
 • Learner involvement in the environment of learning.
 • Learner generates contextually relevant content.
 • Spontaneous and organic (structured, organized, coherent, integrated) learning experiences.
 • True collaboration between teacher and learner, and learner and learner.
 • Flexible curricula.
 • Flexible assessment.
  32. References
 Anderson, L.W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J. & Wittrock, M.C. (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
 Battro, A. M., Fischer, K. W. & Lena, P. J. (2011). The educated brain: Essays in neuroscience. UK: Cambridge University Press.
 Bloom, B.S. (Ed.) (1956). Taxonomy of educational objectives, the classification of educational goals. Handbook 1: Cognitive domain. New York: McKay.
 Dewey, J. (1938). Experience & education. New York: Touchstone.
 Hase, S. (2011). Learner defined curriculum: Heutagogy and action learning in vocational training. Southern Institute of Technology Journal of Applied Research, Special Edition: Action research and action learning in vocational education and training. Available from http://sitjar.sit.ac.nz/SITJAR/ Special. Accessed August 16, 2014.
 Laurillard, D. (2012). Teaching as a design science. New York: Routledge.
 Tarricone, P. (2011). The taxonomy of metacognition. New York: Psychology Press.
  33. Additional resources
 (1) T. Morris-Suzuki, D. Boilley, D. McNeill and A. Gundersen. Lessons from Fukushima. Netherlands: Greenpeace International, February 2012.
 (2) J. Watts. “Fukushima parents dish the dirt in protest over radiation levels.” The Guardian, May 2, 2011. [Online]. Available: http://www.guardian.co.uk/world/2011/may/02/parents-revolt-radiation-levels [Accessed August 20, 2012].
 (3) L. W. Hixson. “Japan’s nuclear safety agency fights to stay relevant.” Japan Today. [Online]. Available: http://www.japantoday.com/category/opinions/view/japans-nuclear-safety-agency-fights-to-stay-relevant [Accessed August 20, 2012].
 (4) N. Crumpton. “Severe abnormalities found in Fukushima butterflies.” BBC Science & Environment. [Online]. Available: http://www.bbc.co.uk/news/science-environment-19245818 [Accessed August 20, 2012].
 (5) E. Guizzo. “Fukushima Robot Operator Writes Tell-All Blog.” IEEE Spectrum, August 23, 2011. [Online]. Available: http://spectrum.ieee.org/automaton/robotics/industrial-robots/fukushima-robot-operator-diaries [Accessed August 20, 2012].
 (6) M. Vallance and S. Martin. “Assessment and Learning in the Virtual World: Tasks, Taxonomies and Teaching For Real.” Journal of Virtual Worlds Research, Vol. 5, No. 2, 2012.
 (7) S. B. Barker and J. Ansorge. “Robotics as means to increase achievement scores in an informal learning environment.” Journal of Research in Technology and Education, Vol. 39, No. 3, pp. 229-243, 2007.
 (8) D.R. Olsen and M.A. Goodrich. “Metrics for evaluating human-robot interactions.” [Online]. Available: http://icie.cs.byu.edu/Papers/RAD.pdf [Accessed March 14, 2009].
 (9) M. Pearce, M. Ainley and S. Howard. “The ebb and flow of online learning.” Computers in Human Behavior, Vol. 21, pp. 745–771, 2005.
 (10) M. Vallance, C. Naamani, M. Thomas and J. Thomas. “Applied Information Science Research in a Virtual World Simulation to Support Robot Mediated Interaction Following the Fukushima Nuclear Disaster.” Communications in Information Science and Management Engineering (CISME), Vol. 3, Issue 5, 2013, pp. 222-232.
  34. Engineering active learning: LEGO robots and 3D virtual worlds Dr. Michael Vallance Future University Hakodate, Japan http://www.mvallance.net
 
 This PDF is at http://tinyurl.com/mnmx3kx