2013-03-27 SITE TPACK symposium

Symposium at the 24th Annual International Conference of the Society for Information Technology & Teacher Education, 27 March 2013.
This symposium discusses several ways in which (pre-service) teachers' TPACK can be measured. The first two studies unravel the TPACK survey (Schmidt et al., 2009), a self-report instrument to determine TPACK, and try to revalidate the survey in two different pre-service teacher education contexts: the US and the Netherlands. The third study triangulates findings from the TPACK survey with other instruments to better understand teachers' development of TPACK that resulted from teachers' collaborative design of technology-integrated lessons. The last contribution focuses on measuring transfer of TPACK, as it studies how beginning teachers, who had TPACK training during their pre-service education, demonstrated TPACK in their practice. Similarities and differences in the ways TPACK was measured, and their implications, will be discussed.


Transcript

  • 1. MEASURING TPACK: International Symposium on TPACK. Joke Voogt, Petra Fisser, Ayoub Kafyulilo, Douglas Agyei (University of Twente); Johan van Braak, Jo Tondeur, Natalie Pareja Roblin (Ghent University); Denise Schmidt-Crawford, Dale Niederhauser, Wei Wang (Iowa State University). SITE 2013, 27 March 2013, New Orleans
  • 2. Invited international symposium on TPACK
      2010: Strategies for teacher professional development on TPACK
      2011: Teachers' assessment of TPACK: Where are we and what is needed?
      2012: Developing TPACK around the world: Probing the framework even as we apply it
      2013: Measuring TPACK
  • 3. Conceptualizing TPACK, strategies to acquire TPACK, measuring effects
  • 4. The Netherlands, Belgium, Iowa State, Ghana, Tanzania
  • 5. Part 1
      Introduction to the symposium – Joke Voogt
      Measuring TPACK: Further Validation of the Technological Pedagogical Content Knowledge (TPACK) Survey for Preservice Teachers – Denise Schmidt-Crawford, Wei Wang, Dale Niederhauser, Iowa State University
      Unraveling the TPACK model: finding TPACK-Core – Petra Fisser & Joke Voogt, University of Twente, The Netherlands; Johan van Braak & Jo Tondeur, Ghent University, Belgium
      Discussion with the Audience
  • 6. MEASURING TPACK: Further Validation of the TPACK Survey for Preservice Teachers. Denise A. Schmidt-Crawford, Dale Niederhauser, Wei Wang. Center for Technology in Learning and Teaching, School of Education, Iowa State University
  • 7. Validation of TPACK Survey
      Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2009-10). Technological Pedagogical Content Knowledge (TPACK): The Development and Validation of an Assessment Instrument for Preservice Teachers. Journal of Research on Technology in Education, 42, 123-149.
      Characteristics:
      • 47 Likert-item survey
      • Seven constructs
      • Preservice teachers (elementary & early childhood education)
  • 8. Sampling of Requests…
  • 9. Sampling of Requests…
  • 10. The Problem
      • Other researchers using the survey were finding different patterns of results:
        • Factors were not stable
        • Items were loading on different factors
        • Factors were not aligning with the conceptual model
  • 11. Further Analysis
      • Research Context:
        • 3-credit introduction-to-technology course (15 weeks)
        • Required for elementary education and early childhood education majors
        • Attend two 1-hour lectures and one 2-hour laboratory session every week
      • Participants:
        • 534 preservice teachers
        • 82% elementary education majors, 16% early childhood education majors, 2% other
        • 88% female, 12% male
        • 23% freshmen, 40% sophomores, 30% juniors, 7% seniors
        • 72% had not yet completed a practicum experience
      • Research Procedures:
        • Online survey administered at the end of the course (15-25 minutes to complete)
  • 12. Data Analysis
      • Principal components factor analysis (Varimax with Kaiser normalization)
      • Internal consistency (Cronbach's alpha)
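Cronbach's alpha has a simple closed form, so the internal-consistency check named on this slide can be sketched directly with pandas; the data file and item names below are hypothetical, not the study's data (a sketch of the factor-analysis step appears after slide 27).

```python
# Minimal sketch: Cronbach's alpha for one scale of a Likert survey.
# alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: one row per respondent, one column per item of a single scale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

survey = pd.read_csv("tpack_survey.csv")                  # hypothetical file
tk = survey[["TK1", "TK2", "TK3", "TK4", "TK5", "TK6"]]   # hypothetical item names
print(f"TK alpha: {cronbach_alpha(tk):.3f}")
```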
  • 13. TPACK as an Exploded Abstraction (diagram of the separate T, P and C components)
  • 14. Results
      1. TK, PK, TPK, TCK factors remained the same.
         TPACK Construct | Total Items | Eigen Values
         TK  | 6 | .877
         PK  | 7 | .921
         TPK | 9 | .902
         TCK | 5 | .879
  • 15. Results
      2. CK is messy!
         TPACK Construct | Total Items | Combined Items | Eigen Value
         CK | 12 | 3 | .854
      Comment:
         I can use a ____________ way of thinking.
         I have various ways and strategies for developing my understanding of __________.
         I have sufficient knowledge about _____________.
  • 16. Results
      3. PCK: Math item dropped out.
         TPACK Construct | Total Items | Eigen Value
         PCK | 3 | .865
      Comment: Indicated the participants were not answering the “math” question in ways that aligned with the other content areas.
  • 17. Results
      4. TPACK: Two factors emerged (content, general).
         TPACK Construct | Total Items | Eigen Values
         Content | 4 | .885
         General | 4 | .917
  • 18. Measuring TPACK
      • Collecting information about preservice teachers' perception of what they know
        • Direct measure of self-perception
        • Indirect measure of knowledge
      • Start using direct measures for some TPACK constructs
        • e.g., CK: content-specific measures; PK: Praxis test
      • Program evaluation: provides metrics at key places in a teacher education program
        • What is working? What is not? (Interventions)
      • Start looking at TPACK as a dynamic model: What kinds of experiences can we provide to build “overlap”?
  • 19. Returning to the Problem
      • Using the survey with 'other' populations (i.e., in-service teachers)
      • Using the survey with a focus on a specific content area (i.e., math, science)
      • Using the survey in different countries
      • Validity & reliability are affected by population and content area
  • 20. QUESTIONS? Denise A. Schmidt-Crawford (dschmidt@iastate.edu), Dale Niederhauser (dsn@iastate.edu), Wei Wang (weiyui72@iastate.edu). Center for Technology in Learning and Teaching, School of Education, Iowa State University
  • 21. Unraveling the TPACK model: finding TPACK-Core. Joke Voogt & Petra Fisser, University of Twente; Johan van Braak & Jo Tondeur, Ghent University, Belgium. SITE, New Orleans, 27 March 2013
  • 22. Aim of the study: Empirical exploration of the TPACK model
      • Can we reproduce the distinguished constructs of the TPACK conceptual framework, as represented in the Venn diagram, with our data?
      • If not:
        • can we unravel the model?
        • can we find new constructs?
        • can we develop a new instrument that measures the self-perception of (pre-service) teachers?
  • 23. Why this study? We became fascinated by
      • the attractiveness of the model
      • the acceptance of the model by teachers
      • but also the complexity of the model (and what's behind it)
      We worked on
      • a survey for pre-service teachers
      • professional development for in-service teachers
      • a literature review (JCAL, 2012)
      • discussions/debates/presentations
  • 24. We all know the TPACK model: “At the heart of good teaching with technology are three core components: content, pedagogy, and technology, plus the relationships among and between them.” (Koehler & Mishra, 2006)
  • 25. The context of the study
      Aspect | The Netherlands | Flanders, Belgium
      Population | Pre-service teachers | Teacher educators
      Focus | Use of technology in the science domain | Use of technology in different domains
      Sample | 287 pre-service teachers; age 16-24; 24% male, 76% female; distributed over 4 years of study | 146 teacher educators; age 26-61; 29% male, 71% female; 1-38 years experience as teacher educator
      Instrument | TPACK Survey (Schmidt et al., 2009), all items | TPACK Survey (Schmidt et al., 2009), T-related items
  • 26. Results (NL), reliability
      Reliability of all TPACK items together: Cronbach's α = 0.92
      Reliability for all categories within the TPACK Survey:
      Domain | Cronbach's α
      TK | .90
      PK | .75
      CK | .74
      PCK | .63
      TCK | .85
      TPK | .72
      TPCK | .83
      Models | .73
  • 27. Results (NL), factor analysis
      Factor analysis:
      • Can we measure TPACK by asking questions for each of the 7 TPACK domains?
      • Are we measuring the 7 TPACK domains?
      Exploratory factor analysis (PC, Varimax) revealed 11 factors, 68% of variance explained.
      Further analysis of the factors led to forcing a 7-factor solution, 58% of variance explained (a sketch of this two-step analysis follows below).
      But… are these 7 factors the same as the 7 TPACK domains?
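The two-step exploratory procedure on this slide (free extraction, then forcing 7 factors) can be sketched as follows; the factor_analyzer package, the data file, and the exact options are assumptions, not the authors' code.

```python
# Sketch of the exploratory factor analysis (principal components
# extraction, varimax rotation), assuming the factor_analyzer package
# (pip install factor-analyzer) and a hypothetical survey file.
import pandas as pd
from factor_analyzer import FactorAnalyzer

survey = pd.read_csv("nl_tpack_survey.csv")  # hypothetical file

# Step 1: free extraction; count factors with eigenvalue > 1 (Kaiser criterion)
fa = FactorAnalyzer(rotation="varimax", method="principal")
fa.fit(survey)
eigenvalues, _ = fa.get_eigenvalues()
print("Factors with eigenvalue > 1:", int((eigenvalues > 1).sum()))  # slide reports 11

# Step 2: force a 7-factor solution; inspect variance explained and loadings
fa7 = FactorAnalyzer(n_factors=7, rotation="varimax", method="principal")
fa7.fit(survey)
_, _, cumulative = fa7.get_factor_variance()
print("Cumulative variance explained:", round(float(cumulative[-1]), 2))  # slide reports 58%
print(pd.DataFrame(fa7.loadings_, index=survey.columns).round(2))
```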
  • 28. Results (NL), factor analysis
      Factor | Items in factor | Name of factor | Reliability (Cronbach's α)
      1 | TK1 TK2 TK3 TK4 TK5 TK6 TK7 | Technological Knowledge | .90
      2 | PK1 PK2 PK3 PK4 PK5 PK6 PK7 | Pedagogical Knowledge | .75
      3 | CK1 CK2 CK3 PCK1 PCK2 | Pedagogical Science Content Knowledge | .80
      4 | TCK1 TCK2 TCK3 TCK4 TCK5 TCK6 TPK1 TPK2 TPCK2 TPCK3 TPCK4 TPCK6 | Technological & Pedagogical enhanced Science Content Knowledge | .88
      5 | TPK3 TPK4 TPK5 TPCK1 TPCK5 | Critically applying learned TPACK | .73
      6 | Model1 Model2 Model3 Model4 | Role models of TPACK | .73
      7 | TPCK7 TPCK8 TPCK9 TPCK10 | TPACK Leadership | .89
  • 29. Results (NL), first findings
      Yes: TK and PK (and “role models”)
      No: CK, PCK, TCK, TPK and TPCK
      • CK and PCK are combined
      • TCK is combined with some of the TPK and some of the TPCK items, forming a “Core TPACK” dimension
      • The other TPK and TPCK items are combined and form a scale of “critically thinking about what you learned in your study before applying it”
      • Except for 4 TPCK items that form a “TPACK Leadership” scale
  • 30. Results (NL), focusing on the T-related items
      Factor | Items in factor | Name of factor | Reliability (Cronbach's α)
      1 | TK1 TK2 TK3 TK4 TK5 TK6 TK7 | Technological Knowledge | .90
      2 | PK1 PK2 PK3 PK4 PK5 PK6 PK7 | Pedagogical Knowledge | .75
      3 | CK1 CK2 CK3 PCK1 PCK2 | Pedagogical Science Content Knowledge | .80
      4 | TCK1 TCK2 TCK3 TCK4 TCK5 TCK6 TPK1 TPK2 TPCK2 TPCK3 TPCK4 TPCK6 | Technological & Pedagogical enhanced Science Content Knowledge | .88
      5 | TPK3 TPK4 TPK5 TPCK1 TPCK5 | Critically applying learned TPACK | .73
      6 | Model1 Model2 Model3 Model4 | Role models of TPACK | .73
      7 | TPCK7 TPCK8 TPCK9 TPCK10 | TPACK Leadership | .89
  • 31. Using the NL results in the Flanders study
      Survey for teacher educators: only the T-related items from the TPACK Survey
      Specific science-related items were left out; all items were transformed to “your content area”
      Reliability of all TPACK items together: Cronbach's α = 0.97
      Reliability for all categories within the TPACK Survey:
      Domain | Cronbach's α
      TK | .95
      TCK | .92
      TPK | .83
      TPCK | .96
  • 32. Results (FL)
      Goal: Confirmatory Factor Analysis on the NL data.
      First: redoing the factor analysis on the NL data with only the TPACK Survey items that were included in the FL survey:
      Factor | Items in factor | Name of factor | Reliability (Cronbach's α)
      1 | TK1 TK2 TK3 TK4 TK5 TK6 TK7 | TK | .90
      2 | TCK1 TCK2 TCK3 TCK4 TPK1 TPK2 (TPCK1) | TCK & TPK | .85
      3 | TPCK1 TPCK2 TPCK3 TPCK4 TPCK5 TPCK6 (TPCK1) | TPCK | .77
  • 33. Results (FL)
      Next, the Confirmatory Factor Analysis:
      • Yes, there is a good fit
      • But: the correlations between the factors are also very high; a 1- or 2-factor solution might be possible
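A hedged sketch of such a confirmatory check, assuming the semopy SEM package for Python (the slides do not say which software was used); the model syntax mirrors the factor table on slide 32, and the file and item names are hypothetical.

```python
# Sketch of a confirmatory factor analysis, assuming semopy
# (pip install semopy); lavaan-style model description.
import pandas as pd
import semopy

model_desc = """
TK      =~ TK1 + TK2 + TK3 + TK4 + TK5 + TK6 + TK7
TCK_TPK =~ TCK1 + TCK2 + TCK3 + TCK4 + TPK1 + TPK2
TPCK    =~ TPCK1 + TPCK2 + TPCK3 + TPCK4 + TPCK5 + TPCK6
"""

data = pd.read_csv("fl_tpack_survey.csv")  # hypothetical file
model = semopy.Model(model_desc)
model.fit(data)
print(semopy.calc_stats(model))  # fit indices (chi-square, CFI, RMSEA, ...)
print(model.inspect())           # parameter estimates, incl. latent covariances;
                                 # very high factor correlations would motivate
                                 # re-fitting 1- and 2-factor models for comparison
```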
  • 34. Unraveling the TPACK model
      When it comes to technology integration… Factors:
      • TK, TPK/TCK, & TPCK
      • or… TK & TPK/TCK/TPCK?
      • or… TK/TPK/TCK/TPCK?
      The integration of the domains as described by Koehler & Mishra goes beyond the 3 circles and the overlapping areas! But what does that mean?
  • 35. TK, TPK/TCK, & TPCK
      TK items are very general: “I know how to solve my own technical problems”, “I can learn technology easily”, “I keep up with important new technologies”
      TPK and TCK items are much more related to (the preparation of) lessons: “I can choose technologies that enhance the teaching approaches for a lesson” and “I can choose technologies that enhance the content for a lesson”
      TPCK items are related to lessons and activities in the classroom: “I can teach lessons that appropriately combine science, technologies, and teaching approaches”, “I can select technologies to use in my classroom that enhance what I teach, how I teach, and what students learn”
  • 36. Getting closer to TPACK Core
      Propositions:
      1. TK is conditional for TCK, TPK and TPCK (Voogt, Fisser, Gibson, Knezek & Tondeur, 2012) (& recent regression analysis)
      2. The combination of TPK, TCK and TPCK is the heart (or the core) of the model (TPACK Core)
      And if you take a close look… it has been there the whole time! “At the heart of good teaching with technology are three core components: content, pedagogy, and technology, plus the relationships among and between them.” (Koehler & Mishra, 2006)
  • 37. What does this mean for measuring TPACK? Can we keep the survey items for TK, TCK, TPK and TPCK? Or do we need to develop a new set of items that measures TPACK Core? We don't have the answer (yet)…
  • 38. What does this mean for measuring TPACK? What we do know:
      • Developing an instrument that is suitable for every situation is impossible
      • It is the specific context that matters most, and T, P and C are always context-dependent!
      • Measuring TPACK by a self-report survey is not enough
        • More measurement moments are needed
        • More instruments (observation, lesson plan rubric, etc.) are needed
  • 39. More information? Ideas about (measuring) TPACK Core? Please contact us!
      • Petra Fisser: p.h.g.fisser@utwente.nl
      And for the Dutch & Flemish people: http://www.tpack.nl
  • 40. Part 2
      Welcome back!
      TPACK development in teacher design teams: assessing teachers' perceived and observed knowledge - Ayoub Kafyulilo, Dar es Salaam University College of Education, Tanzania; Petra Fisser & Joke Voogt, University of Twente, The Netherlands
      Long term impact of TPACK: From pre-service teacher learning to professional and teaching practices - Douglas Agyei, University of Cape Coast, Ghana; Joke Voogt, University of Twente, The Netherlands
      Discussant: Natalie Pareja Roblin, University of Ghent, Belgium
      Discussion with Audience
  • 41. TPACK development in teacher design teams: Assessing the teachers' perceived and observed knowledge. Ayoub Kafyulilo, Dar es Salaam University College of Education; Petra Fisser and Joke Voogt, University of Twente
  • 42. Introduction
      This study was conducted with in-service science teachers in Tanzania. It adopted design teams as a professional development arrangement to develop teachers' technology integration knowledge and skills. TPACK was used as a framework for describing the teachers' knowledge requirements for integrating technology in science teaching.
  • 43. The Intervention
      The study comprised four intervention activities:
      • A workshop
      • Lesson design in design teams
      • Lesson implementation in the classroom (mostly a projector and a laptop were used in teaching)
      • Reflection with peers (peer appraisal)
  • 44. Lesson design in design teams
  • 45. An example of a classroom setup with a projector, laptop and a projection screen
  • 46. Research questions
      • What is the in-service science teachers' perceived TPACK before and after the intervention?
      • What is the in-service science teachers' observed TPACK before and after the intervention?
  • 47. Participants
      The study adopted a case study design:
      • Design teams were the study cases
      • Individual teachers were the units of analysis
      12 in-service science teachers participated in the study. The 12 teachers formed three design teams (each with 4 teachers).
  • 48. Instruments
      Six data collection instruments were used in this study to collect self-reported and observed data.
      Self-reported data were collected through:
      • TPACK survey
      • Reflection survey
      • Focus group discussion
      • Interview
      Observation data were collected through:
      • Classroom observation checklist
      • Researcher's logbook
  • 49. TPACK Survey (pre- and post-intervention)
      The TPACK survey was used before and after the intervention. The instrument was adapted from Schmidt et al. (2009) and Graham et al. (2009) and used a 5-point Likert scale. Reliability was Cronbach's α = 0.93.
  • 50. Observation checklist
      The observation checklist was administered before and during the intervention. The items had a 3-point scale: “No” = absence, “No/Yes” = partial existence, and “Yes” = presence of the behavior. Two people rated the observation checklist; the inter-rater reliability was Cohen's κ = 0.87.
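Inter-rater agreement of this kind is straightforward to compute; a minimal sketch with scikit-learn's cohen_kappa_score, where the ratings below are made up for illustration, not the study's data:

```python
# Minimal sketch: Cohen's kappa for two raters on the 3-point checklist.
from sklearn.metrics import cohen_kappa_score

rater1 = ["No", "No/Yes", "Yes", "Yes", "No",     "Yes", "No/Yes", "Yes"]
rater2 = ["No", "Yes",    "Yes", "Yes", "No/Yes", "Yes", "No/Yes", "Yes"]
kappa = cohen_kappa_score(rater1, rater2)
print(f"Cohen's kappa: {kappa:.2f}")  # the study reports kappa = 0.87
```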
  • 51. The reflection survey
      The reflection survey was administered at the end of the intervention to assess the teachers' opinions about learning technology in design teams. The overall reliability for items related to TPACK was Cronbach's α = 0.68.
  • 52. Researcher's logbook
      The researcher's logbook was used to maintain a record of activities and events occurring during the intervention process. It was used during peer appraisal, TPACK training and lesson design. Data collected through the researcher's logbook were important in describing the intervention processes.
  • 53. Teachers' interview
      The interview was administered at the end of the intervention to assess the effectiveness of design teams in teachers' development of TPACK. An example interview question:
      • What technology integration knowledge and skills did you develop from design teams?
      Four randomly selected interviews out of 12 interviewees were coded by a second person. The inter-coder reliability was Cohen's κ = 0.83.
  • 54. Focus group discussion
      A focus group discussion was administered at the end of the intervention. An example question asked in the FGD:
      • How do you evaluate the results of your discussion in design teams, in terms of the products you made, decisions in the team, new ideas and innovations?
      Two randomly selected FGDs were coded by a second person. The inter-coder reliability was Cohen's κ = 0.92.
  • 55. Results: Teachers' perceived TPACK before and after the intervention
      Before the intervention, teachers perceived their CK, PK and PCK as high, and TK, TCK, TPK and TPCK as low. After the intervention, all TPACK components were perceived as high. A Wilcoxon signed-ranks test for two related samples showed that TK, PK, TCK, TPK and TPACK were significant at p ≤ 0.01, whereas CK and PCK were significant at p ≤ 0.05. Results from the reflection survey showed that teachers developed TPACK through their participation in design teams.
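The paired pre/post comparison reported here maps onto scipy's Wilcoxon signed-rank test; a sketch with made-up scores for the 12 teachers (not the study's data):

```python
# Sketch: Wilcoxon signed-rank test for paired (pre/post) samples.
from scipy.stats import wilcoxon

# Hypothetical mean TK scores (5-point Likert) for the 12 teachers
tk_pre  = [2.1, 1.8, 2.4, 2.0, 1.5, 2.2, 1.9, 2.3, 2.0, 1.7, 2.5, 1.6]
tk_post = [4.0, 3.8, 4.2, 3.9, 3.5, 4.1, 3.7, 4.3, 4.0, 3.6, 4.4, 3.4]

statistic, p_value = wilcoxon(tk_pre, tk_post)
print(statistic, p_value)  # the study reports p <= 0.01 for TK
```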
  • 56. Results: Teachers' observed TPACK
      Findings from teacher observation showed a significant difference between pre- and post-intervention results. Pre-intervention results showed low TK, TCK, TPK, and TPACK (M < 1.5, SD ≤ 0.17) on the three-point scale. In the post-intervention results, however, all TPACK components were high (p ≤ 0.05).
  • 57. Conclusions
      The triangulation of the findings from self-reported and observed data showed:
      • Limited TK, TPK, TCK and TPACK before the intervention
      • After the intervention, all the TPACK components were high
      In this study, the self-reported data were consistent with the observed data. This differs from the findings of Alayyar (2011) and Kafyulilo et al. (2011), which showed a difference between observed and perceived TPACK.
  • 58. Conclusions
      This probably has something to do with:
      • The instrument
      • The culture
      • The level of the teachers
      Findings from both observed and self-reported data indicate that teachers' PK, CK and PCK were high before and after the intervention. This may suggest that, in the context of Tanzania, technology integration efforts need to focus more on the technology-related components of TPACK rather than the whole of TPACK.
  • 59. Thanks for your attention kafyulilo@duce.ac.tz
  • 60. Long term impact of TPACK: From pre-service teacher learning to professional and teaching practices. Douglas Agyei & Joke Voogt
  • 61. Motivation
      Poor student achievement (in mathematics):
      • High failure rate (more than 86% fail to enter tertiary levels)
      • TIMSS 2003 & 2007 (43rd out of 44 & 46th out of 48)
      • Poor attitudes
      Mathematics teaching:
      • Teacher-centred approach (hardly any hands-on activities, whole-class teaching, lots of notes being copied)
      • Low cognitive learning (concept formation at a more abstract level, heavy emphasis on assessment)
  • 62. Intervention studies in 2009-2011
      A longitudinal study to integrate technology in teaching mathematics (Ghana):
      • Two case studies of Professional Development (PD) in 2009 and 2010
      • Integration of the PD arrangement into a regular mathematics-specific IT course (2011)
      • TPACK framework
      • ICT (spreadsheet) to promote in-depth maths concept formation
      • Activity-Based Learning (ABL) to make lessons less teacher-centred
  • 63. TPACK Conceptualization (Intervention Studies 2)
      1. Make use of existing ICT tools (spreadsheet-specific)
      2. Active involvement of learners (Activity-Based Learning, ABL)
      3. Explore the connection between spreadsheet, ABL pedagogy and mathematical concepts
      TPACK framework: interconnection of content, pedagogy & technology (Mishra & Koehler, 2006)
  • 64. Outcome of the Intervention Studies
      Developed TPACK of participants:
      • Self-assessment of TPACK
      • Lesson artefacts
      • Lesson observations
      Three years into the project:
      • Mathematics teachers pursuing careers in different institutions
      • Various Senior High Schools/Junior High Schools in Ghana
  • 65. Challenge and Data Collection
      Measure the impact of the intervention studies:
      • Explore whether and how the beginning teachers integrate ICT (demonstrate TPACK) in their teaching practices
      • Gain insight into factors promoting (or hindering) the teachers' ICT integration (TPACK demonstration)
      Instruments:
      • Questionnaire (100)
      • Interview (20)
      • Observation (8)
      • Researchers' logbook
  • 66. Results (1): Self Report
      Table 1: Mean score of factors that influence teachers' TPACK use (N=100)
      Condition | Mean | Std Dev
      Skills and knowledge | 4.57 | .355
      Dissatisfaction with status quo | 4.48 | .283
      Commitment | 4.21 | .287
      Availability of time | 3.75 | .562
      Rewards and incentives | 3.17 | .237
      Participation (decision-making involvement) | 3.02 | .503
      School culture | 2.05 | .292
      Resources (ICT facilities) | 1.71 | .311
  • 67. Results (2): Lesson Observation
      Table 2: Teacher lesson implementation (n=8)
      Teachers | Subject Taught | ICT Availability | Strategy
      Two (2) | Mathematics | Personal laptop and projector | Spreadsheet techniques (interactive demonstration)
      Two (2) | ICT | Personal laptop and projector | Resources from Internet (interactive demonstration)
      Two (2) | Mathematics | Personal laptop (rotating groups of students) | Spreadsheet techniques / resources from Internet
      Two (2) | Mathematics | No ICT facility | Worksheet to support teamwork
  • 68. Snapshot of a lesson on Linear Equations (linear functions in slope-intercept form)
  • 69. Snapshot of a lesson on Enlargement (diagram: eye, water, coin and image of coin)
  • 70. Snapshot of a lesson on Enlargement (2) (diagram: a boy, pin-hole camera, and image of the boy)
  • 71. Snapshot of a lesson on Introduction to computer networks (1)
  • 72. Snapshot of a lesson on Introduction to computer networks (2)
  • 73. Summary of Results & Conclusions
      • Teachers developed strong positive views about TPACK in the long term (a result of the pre-service preparation intervention studies)
      • The specific focus on ABL (“P”) and spreadsheets (“T”) in Mathematics (“C”) helped to develop deep connections between subject matter, instructional strategy and the ICT application, fostering TPACK in the long term (closer to the original conception of Shulman's (1986) ideas of Pedagogical Content Knowledge)
      • Develop TPACK in similar initiatives using other ICT applications and/or different subject matter
      • Develop and extend pedagogical reasoning to support students' learning
      • Using multiple data sources is a good way to assess TPACK in the long run
      • Teachers' acquired “knowledge and skill” and “dissatisfaction with the status quo” are key in promoting long-term TPACK
      • Lack of access to ICT infrastructure and unenthusiastic school cultures hinder TPACK in the long run
  • 74. Thank you
      Douglas D. Agyei: ddagyei@yahoo.com
      Joke M. Voogt: j.m.voogt@utwente.nl
  • 75. Symposium: Measuring TPACK. SITE Conference, New Orleans, 2013. Natalie Pareja Roblin, discussant
  • 76. TPACK: A growing concept
  • 77. Main themes in these studies
      Review of studies about TPACK published between 2005 and 2011 (n=55) (Voogt et al., 2012):
      • Development of the TPACK concept (14 studies)
      • Measuring (student-)teachers' TPACK (14 studies)
      • Development of TPACK in specific subject domains (7 studies)
      • Strategies to support the development of (student-)teachers' TPACK (36 studies)
      • TPACK and teacher beliefs (6 studies)
  • 78. This symposium: Measuring TPACK (diagram: integrative vs. transformative views of T, P and C; populations ranging over pre-service teachers, student teachers, in-service teachers and teacher trainers)
  • 79. Towards a comprehensive approach for measuring TPACK
  • 80. Integrating multiple instruments to measure TPACK
      1. Perceived TPACK
        • Self-assessment survey (from integrative to transformative views on TPACK)
        • Interviews
        • Teacher reflections
        • …
      2. Enacted TPACK
        • Observation checklist
        • Lesson plans
        • Researcher logbook
        • …
  • 81. TPACK as a complex and “fuzzy” concept
      • How can TPACK (and its constituting knowledge domains) be operationalized? Is it possible (and desirable) to pull apart the knowledge domains that constitute TPACK?
      • If TPACK is considered a “sliding framework”, as suggested by Cox and Graham (2009), is it possible to develop standardized instruments to measure it?
      • How does qualitative data contribute to the understanding of (pre-service) teachers' TPACK development? What does it add to survey data?
      • How best to combine self-reported and observed TPACK measurements?
  • 82. Examining the development of TPACK across time (pre-service teachers, beginning teachers, in-service teachers)
  • 83. TPACK development as a dynamic and context-bound process
      • How does TPACK develop as student teachers step into the teaching profession and become experienced teachers?
      • What factors (personal, institutional, systemic) facilitate and/or hinder TPACK development?
      • How does the context (school characteristics, learner characteristics, access to technology, ICT policies, etc.) influence the ways in which teachers integrate technology (i.e., how TPACK is put into action)?
  • 84. Towards a comprehensive approach for measuring TPACK: Moving forward...
  • 85. Integrating multiple instruments: Recent initiatives
      Assessing teachers' pedagogical ICT competences (The Netherlands)
  • 86. Assessing teachers' pedagogical ICT competences: Self-perceived + Observed
  • 87. Format of the video vignette
      Segment | Duration | Content
      Introduction | +/- 2 minutes | Subject; goal; nature of ICT use; perceived advantages/contributions of ICT
      Practice | +/- 4 to 8 minutes | ICT applications; goals of ICT use; attractive/efficient/effective uses; pedagogical use of ICT (TPACK); teacher role; student role
      Reflection | +/- 2 minutes | Why this lesson? Why this combination of T, P and C? Would this lesson be different without ICT? How do you know your (ICT) goals have been accomplished?
  • 88. Examining TPACK development across time and contexts: Recent initiatives
      From pre-service to practice: Understanding beginning teachers' uses of technology (Belgium, Flanders)
  • 89. Understanding beginning teachers' uses of technology
      Longitudinal qualitative study in Flanders, with a focus on (institutional) factors supporting TPACK development
  • 90. Study 1: Approaches to support TPACK development
      Moving from stand-alone technology courses to integrated approaches that aim to support TPACK development
      TE1: From TK to...  TE2: From TK to TPK  TE3: From TK to TCK
      Tondeur, J., Pareja Roblin, N., van Braak, J., Fisser, P., & Voogt, J. (2012). Technological pedagogical content knowledge in teacher education: in search of a new curriculum. Educational Studies. DOI:10.1080/03055698.2012.713548
  • 91. Study 2: Technology integration by BT (beginning teachers)
      Pre-service education influences how BT integrate technology in their teaching practice. It contributes to:
      • Developing TK: “The basic skills we did learn them”
      • Getting to know various technology tools that could be used for educational purposes: “[To learn about things] such as Klascement or Hot Potatoes was useful”; “If I had not learned it in my pre-service education, I think I would have never used it here”
      • Learning how to teach with technology, though with little opportunity for this (!): “[We should learn] not only the application itself, but [also] how to use it and how to integrate it [in your teaching]”
  • 92. Study 2: Technology integration by BT
      However, (the extent of) this influence depends on school characteristics:
      • Access to technology: “It is not possible to sit behind 1 computer with 19 children”
      • Clear ICT policies: “Everybody has one hour in the computer room. It is not compulsory, but the school principal has strongly recommended it to us”
      • Workload: “Making and trying out new things is difficult, especially at the start [of your career] because you are busy with preparing your lessons”
      • Support and mentoring systems
  • 93. Measuring TPACK: Mission impossible? (diagram relating four open questions to contrasting poles: integrative vs. transformative views on TPACK; generic vs. context- and content-specific instruments; self-perceived vs. observed TPACK measures)
      • How can TPACK (and its constituting knowledge domains) be operationalized?
      • Is it possible to develop standardized instruments to measure TPACK?
      • How does the context influence the ways in which teachers integrate technology?
      • How best to combine self-reported and observed TPACK measurements?
  • 94. Thank you!
      natalie.parejaroblin@ugent.be