
Web-Based Self- and Peer-Assessment of Teachers’ Educational Technology Competencies


Presentation at the ICWL 2011 conference, 9 December 2011, Hong Kong.



  1. Web-based Self- and Peer-assessment of Teachers’ Educational Technology Competencies. Hans Põldoja, Terje Väljataga, Kairit Tammets, Mart Laanpere. Tallinn University, Estonia
  2. This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.
  3. Outline
     • Problem statement
     • Related works: existing competency frameworks
     • Design challenges
     • Design methodology
     • Current prototypes
     • Conclusions and future work
  4. Research context
     • Importance of educational technology competencies
     • Generic ICT competency frameworks (e.g. ICDL) do not cover all the competencies needed for educational use of ICT
     • Educational Technology Competency Model (ETCM) for Estonian teachers
     • DigiMina project for assessing teachers’ educational technology competencies
  5. Research problem: To what extent and how could teachers’ educational technology competencies be assessed using a Web-based tool?
  6. Existing competency frameworks
     • International Computer Driving License
     • UNESCO ICT Competency Framework for Teachers
     • ISTE National Educational Technology Standards for Teachers (NETS-T)
  7. ECDL / ICDL
     • Used in 148 countries
     • Focused on ICT usage
     • Standardized testing
  8. UNESCO ICT Competency Framework for Teachers
     • Launched 2008, revised 2011
     • Guidelines for creating national competency models
     • 6 subdomains
  9. ISTE NETS-T 2008
     • 20 competencies in 5 competency groups
     • Used in Norway, the Netherlands, Finland, etc.
  10. Design challenges
  11. Design challenges
     • How to define performance indicators for all competencies in the ETCM?
     • How to select appropriate methods and instruments for assessing competencies?
     • How to implement the selected assessment methods in a Web-based tool?
  12. Educational Technology Competency Model for Estonian Teachers
     • Based on ISTE NETS-T 2008
     • 5-level assessment rubric developed by a local expert group
  13. Assessment rubric example. 3.1. Demonstrate fluency in technology systems and the transfer of current knowledge to new technologies and situations:
     1. Creates a user account in a web-based system and creates/uploads resources; uses common software/web environments/hardware with the help of a user manual; uses presentation tools and a printer; saves/copies files to an external drive.
     2. Manages access rights to the resources published on the web.
     3. Solves independently the problems that occur during the use of ICT tools (using help, manual, FAQ or forums when needed); combines different tools; changes the settings of a web-based system.
     4. Transfers working methods from a known web environment/software to an unknown environment.
     5. Chooses (compares, evaluates) the most suitable tool for a given task.
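A rubric like the one above maps each competency to five leveled performance indicators. A minimal sketch of such a data structure (the class name, API, and abbreviated descriptors are illustrative, not DigiMina's actual implementation):

```python
# Hypothetical representation of one ETCM competency with its 5-level rubric.
from dataclasses import dataclass

@dataclass
class Competency:
    code: str
    title: str
    levels: dict[int, str]  # level (1-5) -> performance indicator

    def indicator(self, level: int) -> str:
        """Return the performance indicator for a given level."""
        if level not in self.levels:
            raise ValueError(f"level must be one of {sorted(self.levels)}")
        return self.levels[level]

fluency = Competency(
    code="3.1",
    title="Demonstrate fluency in technology systems",
    levels={
        1: "Creates a user account in a web-based system; uses common software with the help of a user manual.",
        2: "Manages access rights to resources published on the web.",
        3: "Solves ICT problems independently; combines different tools; changes system settings.",
        4: "Transfers working methods from a known environment to an unknown one.",
        5: "Chooses (compares, evaluates) the most suitable tool for a given task.",
    },
)
```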
  14. Measuring Educational Technology Competencies
     • Assessment methodology and instruments must be reliable, valid, flexible, but also affordable with respect to time and costs.
     • Four levels of measuring competencies (Miller, 1990):
        • knows
        • knows how
        • shows how
        • does
  15. Web-based Assessment of Competencies
     • Five-dimensional framework for authentic assessment (Gulikers et al., 2004):
        • tasks: meaningful, relevant, typical, complex, ownership of problem and solution space;
        • physical context: similar to professional work space and time frame, professional tools;
        • social context: similar to the social context of professional practice (incl. decision making);
        • form: demonstration and presentation of professionally relevant results, multiple indicators;
        • criteria: used in professional practice, related to realistic process/product, explicit.
  16. Design methodology
  17. Research-based design methodology: Contextual Inquiry → Participatory Design → Product Design → Software Prototype As Hypothesis. Methods used across the phases: concept mapping, user stories, personas, participatory design sessions, paper prototyping, scenario-based design, information architecture, high-fidelity prototyping, agile sprints. Adapted from (Leinonen et al., 2008).
  18. Personas (Cooper et al., 2007)
     • Teacher training master student
     • Novice teacher
     • Experienced teacher
     • Educational technologist of a school
     • Training manager (in a national organization)
  19. Scenarios (Carroll, 2000)
     • Master student is evaluating her educational technology competencies
     • Peer assessment of problem solving tasks
     • Educational technologist of a school is getting an overview of teachers’ educational technology competencies
     • Training manager is compiling a training group with a sufficient level of competencies
  20. Participatory design sessions
     • 2 sessions
     • Discussing the scenarios
     • Drawing the sketches
  21. Main concepts
  22. Competency Test
     • Competency test can be taken several times to measure the advancement
     • Usability issue: large number of tasks (20 competencies, 5 levels)
     • Solutions:
        • Can be saved and continued later
        • Setting the starting level with self-evaluation
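The second solution, setting the starting level from a self-evaluation, could be sketched as follows. The one-level-below safety margin is an assumption for illustration, not DigiMina's documented rule:

```python
# Hypothetical sketch: use a self-evaluated level (1-5) to pick where the
# competency test starts, so users skip tasks far below their ability.
def starting_level(self_evaluated: int, min_level: int = 1, max_level: int = 5) -> int:
    # Clamp the claim into the valid range of the 5-level rubric.
    claimed = max(min_level, min(max_level, self_evaluated))
    # Start one level below the claim so the claimed level is verified
    # before the test advances past it (assumed policy).
    return max(min_level, claimed - 1)
```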
  23. Tasks
     • Three types of tasks:
        • automatically assessed self-test items (29)
        • peer assessment tasks (23)
        • self-reflection tasks (41)
     • Need to increase the number of competencies that can be assessed with a self-test
     • Peer assessment requires blind review from 2 users at the same or a higher competency level
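The blind-review constraint, two reviewers at the same or a higher competency level than the author, could be sketched like this. The data shapes and function name are hypothetical:

```python
# Hypothetical sketch of selecting blind reviewers for a peer-assessed task.
# Each user is a dict with an "id" and a "levels" mapping (competency -> 1-5).
def eligible_reviewers(author, users, competency, needed=2):
    author_level = author["levels"].get(competency, 0)
    pool = [
        u for u in users
        if u["id"] != author["id"]
        and u["levels"].get(competency, 0) >= author_level
    ]
    # Prefer the highest-level reviewers; a real system might randomize
    # within the eligible pool instead to balance reviewer load.
    pool.sort(key=lambda u: u["levels"].get(competency, 0), reverse=True)
    return pool[:needed]
```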
  24. Tasks
     • Tasks are authored using the IMS QTI compatible test authoring tool TATS (Tomberg & Laanpere, 2011)
     • 5 IMS QTI question types are used:
        • choiceInteraction (multi-choice)
        • choiceInteraction (multi-response)
        • orderInteraction
        • associateInteraction
        • extendedTextInteraction
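For context, a minimal IMS QTI 2.1 item body using the first interaction type, choiceInteraction with a single allowed choice, looks roughly like the fragment below. This is a hand-written illustration, not output of the TATS tool, and the question text is invented:

```python
# A minimal IMS QTI 2.1 choiceInteraction fragment, parsed to show its
# structure. Only the core elements are included; a complete item would
# also declare response processing.
import xml.etree.ElementTree as ET

QTI_ITEM = """
<assessmentItem xmlns="http://www.imsglobal.org/xsd/imsqti_v2p1"
                identifier="sample-item" title="Sample item"
                adaptive="false" timeDependent="false">
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which action publishes a resource on the web?</prompt>
      <simpleChoice identifier="A">Uploading it to a web-based system</simpleChoice>
      <simpleChoice identifier="B">Saving it to an external drive</simpleChoice>
    </choiceInteraction>
  </itemBody>
</assessmentItem>
"""

NS = {"qti": "http://www.imsglobal.org/xsd/imsqti_v2p1"}
root = ET.fromstring(QTI_ITEM)
interaction = root.find(".//qti:choiceInteraction", NS)
choices = root.findall(".//qti:simpleChoice", NS)
```

A multi-response item would differ only in its `maxChoices` attribute (and in declaring a multiple-cardinality response).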
  25. Competency Profile
     • Level of competencies is displayed as a diagram
     • User can compare her competency level with the average level of various groups
     • Privacy settings (private, group members, public)
     • Can be linked or embedded into external web pages
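The comparison with group averages amounts to a per-competency aggregation; a minimal sketch, with hypothetical names and data shapes:

```python
# Hypothetical sketch of comparing one user's profile against a group.
# user_levels: competency -> level (1-5); group_profiles: list of such dicts.
from statistics import mean

def compare_profile(user_levels, group_profiles):
    comparison = {}
    for competency, level in user_levels.items():
        # Members with no assessed level count as 0 here (an assumption;
        # a real tool might exclude them from the average instead).
        group_avg = mean(p.get(competency, 0) for p in group_profiles)
        comparison[competency] = {"user": level, "group_avg": round(group_avg, 1)}
    return comparison
```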
  26. Group
     • Typically created for a school or a group of teacher training students
     • Group owner can see the competency profiles of all members
     • Anybody can create a group
     • Groups can be set up as private or public
  27. Competency Requirements
     • A large number of competency profiles would make DigiMina a valuable planning tool
     • Competency requirements can be created by the training manager, teacher trainer and group owner
     • Will be implemented in a later phase
  28. Current prototype
  29. Conclusions and future work
     • DigiMina as a component of a larger digital ecosystem
     • Involving expert teachers in creating and evaluating assessment tasks
     • Integrating DigiMina with the national teachers’ portal
     • DigiMina as a framework for assessing various competency models
  30. References
     • Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A Five-Dimensional Framework for Authentic Assessment. Educational Technology Research & Development, 52(3), 67–86.
     • Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63–S67.
     • Leinonen, T., Toikkanen, T., & Silvfast, K. (2008). Software as Hypothesis: Research-Based Design Methodology. In Proceedings of the Tenth Anniversary Conference on Participatory Design 2008 (pp. 61–70). Indianapolis, IN: Indiana University.
     • Cooper, A., Reimann, R., & Cronin, D. (2007). About Face 3: The Essentials of Interaction Design. Indianapolis, IN: Wiley Publishing, Inc.
     • Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge, MA: The MIT Press.
     • Tomberg, V., & Laanpere, M. (2011). Implementing Distributed Architecture of Online Assessment Tools Based on IMS QTI ver. 2. In F. Lazarinis, S. Green, & E. Pearson (Eds.), Handbook of Research on E-Learning Standards and Interoperability: Frameworks and Issues (pp. 41–58). IGI Global.
  31. Acknowledgements. This research was supported by:
     • EDUKO program of the Archimedes Foundation
     • Estonian Ministry of Education and Research targeted research grant No. 0130159s08
     • Tiger University Program of the Estonian Information Technology Foundation
  32. Thank You!
     Hans Põldoja
     Tallinn University, Estonia
     hans.poldoja@tlu.ee
     http://www.hanspoldoja.net