Web-Based Self- and Peer-Assessment of Teachers’ Educational Technology Competencies
Hans Põldoja, Terje Väljataga, Kairit Tammets, Mart Laanpere
Tallinn University, Estonia
This work is licensed under the Creative Commons Attribution-ShareAlike 3.0 Unported License. To view a copy of this license, visit http://creativecommons.org/licenses/by-sa/3.0/ or send a letter to Creative Commons, 444 Castro Street, Suite 900, Mountain View, California, 94041, USA.
Outline
• Problem statement
• Related work: existing competency frameworks
• Design challenges
• Design methodology
• Current prototypes
• Conclusions and future work
Research context
• Importance of educational technology competencies
• Generic ICT competency frameworks (e.g. ICDL) do not cover all the competencies needed for educational use of ICT
• Educational Technology Competency Model (ETCM) for Estonian teachers
• DigiMina project for assessing teachers’ educational technology competencies
Research problem
To what extent and how could teachers’ educational technology competencies be assessed using a Web-based tool?
Existing competency frameworks
• International Computer Driving License (ICDL)
• UNESCO ICT Competency Framework for Teachers
• ISTE National Educational Technology Standards for Teachers (NETS-T)
ECDL / ICDL
• Used in 148 countries
• Focused on ICT usage
• Standardized testing
UNESCO ICT Competency Framework for Teachers
• Launched 2008, revised 2011
• Guidelines for creating national competency models
• 6 subdomains
ISTE NETS-T 2008
• 20 competencies in 5 competency groups
• Used in Norway, Netherlands, Finland, etc.
Design challenges
• How to define performance indicators for all competencies in the ETCM?
• How to select appropriate methods and instruments for assessing competencies?
• How to implement the selected assessment methods in a Web-based tool?
Educational Technology Competency Model for Estonian Teachers
• Based on ISTE NETS-T 2008
• 5-level assessment rubric developed by a local expert group
Assessment rubric example
3.1. Demonstrate fluency in technology systems and the transfer of current knowledge to new technologies and situations
• Level 1: Creates a user account in a web-based system and creates/uploads resources; uses common software/web environments/hardware with the help of a user manual; uses presentation tools and a printer; saves/copies files to an external drive.
• Level 2: Manages access rights to the resources published in the web.
• Level 3: Solves independently the problems that occur during the use of ICT tools (using help, manual, FAQ or forums when needed); combines different tools; changes the settings of a web-based system.
• Level 4: Transfers working methods from a known web environment/software to an unknown environment.
• Level 5: Chooses (compares, evaluates) the most suitable tool for a given task.
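Such a rubric maps naturally onto a simple data structure. The sketch below is an illustration only (the structure and names are assumptions, not the DigiMina implementation); the descriptor texts follow the example above, abbreviated where long.

```python
# Hypothetical representation of one ETCM competency rubric; not the
# actual DigiMina data model.
RUBRIC_3_1 = {
    "id": "3.1",
    "title": ("Demonstrate fluency in technology systems and the transfer "
              "of current knowledge to new technologies and situations"),
    "levels": {
        1: "Creates a user account in a web-based system ...",
        2: "Manages access rights to the resources published in the web.",
        3: "Solves independently the problems that occur during the use of ICT tools ...",
        4: "Transfers working methods from a known web environment/software to an unknown environment.",
        5: "Chooses (compares, evaluates) the most suitable tool for a given task.",
    },
}

def descriptor(rubric: dict, level: int) -> str:
    """Return the performance descriptor for a given level (1-5)."""
    if level not in rubric["levels"]:
        raise ValueError("level must be between 1 and 5")
    return rubric["levels"][level]
```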
Measuring Educational Technology Competencies
• Assessment methodology and instruments must be reliable, valid, flexible, but also affordable with respect to time and costs
• Four levels of measuring competencies (Miller, 1990):
  • knows
  • knows how
  • shows how
  • does
Web-based Assessment of Competencies
• Five-dimensional framework for authentic assessment (Gulikers et al., 2004):
  • tasks: meaningful, relevant, typical, complex, ownership of problem and solution space
  • physical context: similar to professional work space and time frame, professional tools
  • social context: similar to the social context of professional practice (incl. decision making)
  • form: demonstration and presentation of professionally relevant results, multiple indicators
  • criteria: used in professional practice, related to realistic process/product, explicit
Research-based design methodology
[Diagram: four iterative phases — contextual inquiry, participatory design, product design, and software prototype as hypothesis — together with the activities used in each: concept mapping, personas, participatory design sessions, scenario-based design, user stories, information architecture, paper prototyping, high-fidelity prototyping, agile sprints. Adapted from (Leinonen et al., 2008)]
Personas
• Teacher training master’s student
• Novice teacher
• Experienced teacher
• Educational technologist of a school
• Training manager (in a national organization)
(Cooper et al., 2007)
Scenarios
• A master’s student is evaluating her educational technology competencies
• Peer assessment of problem-solving tasks
• An educational technologist of a school is getting an overview of teachers’ educational technology competencies
• A training manager is compiling a training group with a sufficient level of competencies
(Carroll, 2000)
Participatory design sessions
• 2 sessions
• Discussing the scenarios
• Drawing sketches
Competency Test
• The competency test can be taken several times to measure advancement
• Usability issue: large number of tasks (20 competencies × 5 levels)
• Solutions:
  • A test can be saved and continued later
  • The starting level is set with a self-evaluation
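The self-evaluation step can be sketched as a small function: each self-rated level is clamped into the 1–5 rubric range and used as the level at which the test begins for that competency, so not all five tasks need to be presented. The function name and data shape are illustrative assumptions, not the DigiMina code.

```python
# Assumed logic: derive per-competency starting levels from a self-evaluation,
# so the test can skip tasks below the self-rated level.
def starting_task_levels(self_evaluation: dict[str, int]) -> dict[str, int]:
    """Clamp each self-rated level into the 1-5 rubric range; testing for
    that competency then begins at this level."""
    return {competency: max(1, min(rating, 5))
            for competency, rating in self_evaluation.items()}
```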
Tasks
• Three types of tasks:
  • automatically assessed self-test items (29)
  • peer assessment tasks (23)
  • self-reflection tasks (41)
• Need to increase the number of competencies that can be assessed with a self-test
• Peer assessment requires blind review by 2 users at the same or a higher competency level
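The reviewer constraint above can be sketched as a selection function: two blind reviewers whose own level for the competency is the same as or higher than the submitter’s. The function and data shapes are assumptions for illustration, not the actual DigiMina implementation.

```python
import random

# Sketch of the peer-assessment constraint: pick two eligible blind
# reviewers at the same or a higher competency level than the submitter.
def pick_reviewers(submitter: str, level: int,
                   user_levels: dict[str, int], seed: int = 0) -> list[str]:
    """Pick two eligible reviewers at random; raise if fewer than two exist."""
    eligible = [user for user, lvl in user_levels.items()
                if user != submitter and lvl >= level]
    if len(eligible) < 2:
        raise ValueError("not enough eligible reviewers")
    return random.Random(seed).sample(eligible, 2)
```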
Tasks
• Tasks are authored using TATS, an IMS QTI compatible test authoring tool (Tomberg & Laanpere, 2011)
• 5 IMS QTI question types are used:
  • choiceInteraction (multiple choice)
  • choiceInteraction (multiple response)
  • orderInteraction
  • associateInteraction
  • extendedTextInteraction
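To give a feel for the QTI structure, the sketch below builds a stripped-down multiple-choice item around a `choiceInteraction` element. Response declarations, response processing, and XML namespaces are omitted for brevity, so this illustrates the element structure only; it is not a schema-valid QTI item nor the TATS output format.

```python
import xml.etree.ElementTree as ET

# Simplified skeleton of a QTI 2.x multiple-choice item (illustrative only).
def multi_choice_item(item_id: str, prompt: str, choices: dict[str, str]) -> str:
    item = ET.Element("assessmentItem", identifier=item_id)
    body = ET.SubElement(item, "itemBody")
    interaction = ET.SubElement(body, "choiceInteraction",
                                responseIdentifier="RESPONSE", maxChoices="1")
    ET.SubElement(interaction, "prompt").text = prompt
    for choice_id, text in choices.items():
        ET.SubElement(interaction, "simpleChoice", identifier=choice_id).text = text
    return ET.tostring(item, encoding="unicode")

xml = multi_choice_item("q1", "Which tool best suits the task?",
                        {"A": "Blog", "B": "Wiki"})
```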
Competency Profile
• The level of competencies is displayed as a diagram
• A user can compare her competency level with the average level of various groups
• Privacy settings (private, group members, public)
• Can be linked or embedded into external web pages
Group
• Typically created for a school or a group of teacher training students
• The group owner can see the competency profiles of all members
• Anybody can create a group
• Groups can be set up as private or public
Competency Requirements
• A large number of competency profiles would make DigiMina a valuable planning tool
• Competency requirements can be created by the training manager, teacher trainer and group owner
• Will be implemented in a later phase
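Since this feature is planned rather than implemented, the following is only a hedged sketch of how requirement matching might work: a requirement profile states a minimum level per competency, and teachers whose profiles meet every minimum are selected (e.g. when a training manager compiles a group). All names and data shapes are assumptions.

```python
# Assumed matching logic for the planned competency-requirements feature.
def meets_requirements(profile: dict[str, int], required: dict[str, int]) -> bool:
    """True if the profile reaches the required minimum level for every
    competency in the requirement; missing competencies count as level 0."""
    return all(profile.get(comp, 0) >= lvl for comp, lvl in required.items())

def compile_group(profiles: dict[str, dict[str, int]],
                  required: dict[str, int]) -> list[str]:
    """Return the names of teachers whose profiles satisfy the requirement."""
    return [name for name, prof in profiles.items()
            if meets_requirements(prof, required)]
```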
Conclusions and future work
• DigiMina as a component of a larger digital ecosystem
• Involving expert teachers in creating and evaluating assessment tasks
• Integrating DigiMina with the national teachers’ portal
• DigiMina as a framework for assessing various competency models
References
• Carroll, J. M. (2000). Making Use: Scenario-Based Design of Human-Computer Interactions. Cambridge, MA: The MIT Press.
• Cooper, A., Reimann, R., & Cronin, D. (2007). About Face 3: The Essentials of Interaction Design. Indianapolis, IN: Wiley Publishing, Inc.
• Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A Five-Dimensional Framework for Authentic Assessment. Educational Technology Research & Development, 52(3), 67–86.
• Leinonen, T., Toikkanen, T., & Silvfast, K. (2008). Software as Hypothesis: Research-Based Design Methodology. In Proceedings of the Tenth Anniversary Conference on Participatory Design 2008 (pp. 61–70). Indianapolis, IN: Indiana University.
• Miller, G. E. (1990). The assessment of clinical skills/competence/performance. Academic Medicine, 65(9), S63–S67.
• Tomberg, V., & Laanpere, M. (2011). Implementing Distributed Architecture of Online Assessment Tools Based on IMS QTI ver.2. In F. Lazarinis, S. Green, & E. Pearson (Eds.), Handbook of Research on E-Learning Standards and Interoperability: Frameworks and Issues (pp. 41–58). IGI Global.
Acknowledgements
This research was supported by:
• EDUKO program of Archimedes Foundation
• Estonian Ministry of Education and Research targeted research grant No. 0130159s08
• Tiger University Program of the Estonian Information Technology Foundation