Contents lists available at ScienceDirect
Computers in Human Behavior
journal homepage: www.elsevier.com/locate/comphumbeh
Full length article
Effects of digital video-based feedback environments on pre-service
teachers’ feedback competence
Christopher Neil Prilop⁎, Kira Elena Weber, Marc Kleinknecht
Leuphana University Lüneburg, Institute of Educational Science, Universitätsallee 1, 21335, Lüneburg, Germany
ARTICLE INFO
Keywords:
Pre-service teacher education
Digital learning environments
Video
Intervention
Feedback
Practicum
ABSTRACT
The present study investigates the added value of blended digital video-based feedback environments in fos-
tering pre-service teachers' feedback competence on teaching during a practicum. Pre-service teachers practised
providing their peers with feedback on their classroom management in traditional face-to-face feedback sessions
(control group, n = 65) or blended digital video-based environments with an expert present (V-Feedback+,
n = 22) or without (V-Feedback, n = 16). Before and after the practicum, a video-based tool was applied that
required pre-service teachers to provide written feedback to a teacher concerning fictitious classroom events.
The written feedback was analysed using quantitative content analysis. Feedback competence was assessed
with six categories: assessment criteria; specificity; suggestions; questions; first person; and positive/negative
emphasis. This study demonstrated that digital video-based environments can elicit stronger effects than tra-
ditional face-to-face settings, especially when combined with expert feedback. Results from the pre- and post-
tests revealed that V-Feedback and V-Feedback+ participants provided more specific feedback than the control
group. V-Feedback+ participants provided feedback containing more high-quality suggestions than the V-Feedback
group. This study illustrates how pre-service teachers’ feedback competence can be fostered in teaching
practicums using digital video-based feedback environments.
1. Introduction
Actively seeking peer feedback on one's own teaching is considered
essential to acquiring teaching expertise (Hammerness et al., 2005).
Feedback can be obtained by inviting colleagues into one's classroom to
observe and reflect on one's teaching practice. These feedback and re-
flection sessions help teachers “learn, grow and change” (Joyce &
Showers, 1996, p. 12) and, thus, become expert teachers. Hammerness
et al. (2005) emphasise that feedback needs to become a continuous
activity in the teaching profession. However, research in other domains
has shown that feedback also can elicit detrimental effects on perfor-
mance (Kluger & DeNisi, 1996). Therefore, the ability to provide
feedback on classroom practices needs to be taught during teacher
education. Pre-service teachers need to become competent providers of
high-quality feedback that fosters reflection on classroom practice
(Tripp & Rich, 2012a).
In pre-service teacher education, feedback and reflection sessions
during practical school experiences offer an ecologically valid learning
setting in which to acquire feedback competence. And yet, face-to-face
feedback and reflection sessions rarely are realised (Valencia, Martin,
Place, & Grossman, 2009), especially when experts are supposed to be
present (Lee & Wu, 2006). However, digital video-based feedback and
reflection environments can offer a solution. They increase opportunities
for sessions because they make feedback and reflection sessions time-
and location-independent (So, Pow, & Hung, 2009; Wu & Kao, 2008).
The educational research community has yet to focus on fostering
the competence of providing feedback on classroom practice, so we
investigated whether pre-service teachers' feedback competence in-
creased more by participating in one of three different feedback and
reflection environments in our practicum: a traditional face-to-face
feedback and reflection format; a blended-learning setting that com-
prised face-to-face feedback and a digital video-based environment
with peer and expert feedback; and a blended-learning setting with
face-to-face feedback and a digital video-based environment with only
peer feedback. With this approach, the present study broadens the
perspective on the use of digital video-based environments in ecologi-
cally valid settings and provides a foundation for future research on
fostering pre-service teachers’ feedback competence on teaching prac-
tice.
https://doi.org/10.1016/j.chb.2019.08.011
Received 1 October 2018; Received in revised form 12 August 2019; Accepted 13 August 2019
⁎ Corresponding author. Institute of Educational Science, Universitätsallee 1, C1.207, 21335, Lüneburg, Germany.
E-mail addresses: prilop@leuphana.de (C.N. Prilop), kweber@leuphana.de (K.E. Weber), marc.kleinknecht@leuphana.de (M. Kleinknecht).
Computers in Human Behavior 102 (2020) 120–131
Available online 17 August 2019
0747-5632/ © 2019 Elsevier Ltd. All rights reserved.
1.1. Feedback and feedback sessions
Feedback is considered one of the most powerful factors in pro-
moting achievement in a variety of contexts (Hattie & Timperley, 2007;
Narciss, 2013). It provides individuals with information about their
current performance to help them improve and reach desired standards
(Narciss, 2013). Studies concerning expertise show that feedback is
essential to improve performance. Ericsson, Krampe, and Tesch-Römer
(1993, p. 367) assert that in the “absence of adequate feedback efficient
learning is impossible and improvement only minimal even for highly
motivated subjects”. For pre- and in-service teachers, this means that merely practising teaching is not
sufficient to become an expert educator. They need to evaluate their own teaching in coordination with
colleagues to learn from each other. Consequently,
Hammerness et al. (2005) assert that teachers actively need to seek
feedback to develop teaching expertise.
Such feedback occasions increasingly are being incorporated into
pre- and in-service teacher education (Kleinknecht & Gröschner, 2016;
Joyce & Showers, 2002; Kraft, Blazar, & Hogan, 2018). Feedback ses-
sions take place after observing a teacher's lesson or specific skills
training. They can involve either experts who possess more advanced
knowledge than the teacher or peers who share a similar level of
teaching expertise (Lu, 2010). Feedback sessions “stimulate reflection”
(Hammerness et al., 2005, p. 380) and, thus, cause “a self-critical, in-
vestigative process wherein teachers consider the effect of their peda-
gogical decisions on their situated practice with the aim of improving
those practices” (Tripp & Rich, 2012a, p. 678). A growing body of
research (e.g., Allen, Hafen, Gregory, Mikami, & Pianta, 2015; Weber,
Gold, Prilop, & Kleinknecht, 2018; Fisher, Frey, & Lapp, 2011;
Matsumura, Garnier, & Spybrook, 2013; Sailors & Price, 2015;
Tschannen-Moran & McMaster, 2009; Vogt & Rogalla, 2009) has con-
firmed the substantial effects from feedback sessions on teacher
knowledge, practice, beliefs and, consequently, student achievement.
However, in different domains, Kluger and DeNisi (1996) established
that receiving feedback does not necessarily lead to improved perfor-
mance, i.e., fostering expertise requires high-quality feedback (Ericsson,
2004). Thus, pre-service teachers need to acquire the competence to
provide high-quality feedback concerning teaching situations.
Feedback competence in teacher assessment can be defined as the
skill to convey critical assessments of a teacher's classroom practice to
initiate reflection (Hammerness et al., 2005). After providing a criteria-
based evaluation of a teaching performance and identifying possible
strengths and weaknesses, the feedback provider needs to communicate
this information to her or his fellow teacher constructively (Sluijsmans,
Brand-Gruwel, Van Merriënboer, & Bastiaens, 2003). Prins, Sluijsmans,
and Kirschner (2006) analysed what distinguishes expert feedback from
novice feedback. They established that experts make more use of cri-
teria, provide more situation-specific comments, and more frequently
use a first-person perspective style. Additionally, they found that no-
vices prefer being provided with feedback that contains many reflective
questions, examples and suggestions for improvement.
1.2. Feedback quality
Competence in providing feedback commonly is assessed by ana-
lysing feedback quality (e.g., Prins et al., 2006). However, extant re-
search on fostering peer feedback quality has focussed on school stu-
dents (e.g., Gan & Hattie, 2014; Rotsaert, Panadero, Schellens, & Raes,
2018) or content-related tasks in higher education (e.g., M. Gielen & De
Wever, 2015; Peters, Körndle, & Narciss, 2018). Only a few have dealt
with the effects of pre-service teachers' peer feedback on mathematical
or writing tasks (e.g., Alqassab, Strijbos, & Ufer, 2018; Sluijsmans,
Brand-Gruwel, Van Merriënboer, & Martens, 2004; Sluijsmans et al.,
2003). Furthermore, to our knowledge, no extant studies have in-
vestigated how to promote pre-service teachers’ feedback quality
concerning teaching practice.
Generally, feedback's efficacy is determined by three facets: content,
function and presentation (Narciss, 2013). Following Hattie and
Timperley (2007), feedback needs to answer three questions: Where am I
going? (Feed Up); How am I going? (Feed Back); and Where to next?
(Feed Forward). No clear consensus exists as to how feedback quality
can be measured accurately (S. Gielen, Peeters, Dochy, Onghena, &
Struyven, 2010). While some studies have examined feedback ac-
cording to validity and reliability (Hafner & Hafner, 2003; Van
Steendam, Rijlaarsdam, Sercu, & Van den Bergh, 2010), most have applied
abstract classifications to measure feedback quality. Abstract
classifications offer the advantage that generic knowledge can be measured.
Such measures then can be applied in a multitude of feedback situations
without being limited by domain- or task-specific aspects (S. Gielen
et al., 2010). As teaching situations do not entail clear-cut solutions, an
assessment in terms of validity and reliability would seem nearly im-
possible. Therefore, pre-service teachers' feedback quality concerning
teaching practice was evaluated in terms of content and/or style criteria
in this study.
Extant studies that have measured feedback quality (Prins et al.,
2006; S. Gielen et al., 2010; M. Gielen & De Wever, 2015) largely have
been based on a set of criteria originally suggested by Sluijsmans,
Brand-Gruwel, and Van Merriënboer (2002). First, feedback comments
need to be appropriate for the specific context, i.e., the assessor needs to
be able to evaluate a performance based on defined criteria (Feed Up).
Second, the assessor must be able to explain his or her judgements and
highlight specific examples (S. Gielen et al., 2010). Third, feedback
must contain constructive suggestions, which are part of feedback's
tutoring component. They provide learners with additional information
besides evaluative aspects. This can include knowledge about task
constraints, concepts, mistakes, how to proceed or teaching strategies
(Narciss, 2013). Explaining one's judgements can be viewed as Hattie
and Timperley’s (2007) concept of Feed Back, whereas suggestions can
be compared with Feed Forward. Fourth, feedback messages should
contain ‘thought-provoking questions’ (S. Gielen et al., 2010, p. 307)
that aim to enhance individuals' active engagement (Nicol &
Macfarlane-Dick, 2006). Fifth, M. Gielen and De Wever (2015) have
determined that feedback messages should contain positive and nega-
tive comments. Both can enhance future performances (Bandura &
Cervone, 1986; Kluger & DeNisi, 1996). Finally, high-quality feedback
should be written in the first person, with a clear structure and for-
mulations (Prins et al., 2006).
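As a rough, self-contained illustration of how criteria such as these can be operationalised for quantitative content analysis, the toy Python sketch below tallies three surface indicators (thought-provoking questions, first-person formulations, suggestion cues) in a written feedback message. This is not the study's instrument: the cue lists and the example message are invented, and any real coding scheme would rely on trained human raters rather than keyword matching.

```python
# Toy illustration only (not the study's coding instrument): quantitative
# content analysis reduces a feedback message to category counts. This crude
# keyword tally mimics three of the quality criteria discussed above; the cue
# lists below are invented for demonstration.

def tally_indicators(feedback: str) -> dict:
    """Count simple surface indicators of feedback quality in a message."""
    text = feedback.lower()
    first_person_cues = ("i ", "i'", "my ")          # first-person perspective
    suggestion_cues = ("you could", "try ", "i suggest")  # constructive suggestions
    return {
        "questions": feedback.count("?"),            # thought-provoking questions
        "first_person": sum(text.count(c) for c in first_person_cues),
        "suggestions": sum(text.count(c) for c in suggestion_cues),
    }

msg = ("I liked your clear instructions. "
       "You could circulate more during group work. "
       "How did you plan the transitions?")
print(tally_indicators(msg))
```

In practice, each written feedback message would be segmented into idea units and each unit rated by two independent coders, with keyword counts at most serving as a plausibility check.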
1.3. Feedback environments
A growing demand exists for ecologically valid practice environ-
ments in the field of teacher education, with research showing that pre-
service teachers need to develop situation-specific skills through which
to apply their professional knowledge effectively (Blömeke, Gustafsson,
& Shavelson, 2015; Grossman, Hammerness, & McDonald, 2009).
Teaching practicums offer such an authentic setting in which to practise
feedback on teaching. Pre-service teachers need to be able to practise
skills repeatedly to develop fluidity (Grossman & McDonald, 2008).
This can be achieved during practicums through pre-service teachers
observing their peers’ classroom interactions, then providing them with
feedback. Feedback sessions during the practicum, as well as the
practicum itself, can be viewed as “approximations of practice”
(Grossman & McDonald, 2008, p. 190). However, although they are
highly authentic approximations, feedback sessions still “rely on de-
composition” (Grossman et al., 2009, p. 2091), as mentors or university
supervisors set a specific focus on feedback. Simultaneously, these ex-
perts take up the role of “modelers of practice” (Clarke, Triggs, &
Nielsen, 2014, p. 177). As pre-service teachers tend to imitate their
practice in the classroom (Clarke et al., 2014, p. 177), this also can be
expected from feedback sessions. Although feedback can be viewed as
essential in developing expertise, feedback sessions and, thus, the
option to practise providing feedback are limited in teaching practi-
cums (Lee & Wu, 2006; Valencia et al., 2009).
This is largely due to time and location constraints, especially when
experts are supposed to be part of the process (Lee & Wu, 2006). Pre-
service teachers usually are assigned to different schools or different
classes, making classroom observations and feedback sessions difficult
to implement. Furthermore, feedback sessions with pre-service teachers
at remote schools might not be provided due to a lack of resources
(Hixon & So, 2009). Digital video-based feedback environments can
resolve such logistical issues because student teachers can participate
online, regardless of time or location (Wu & Kao, 2008, p. 45). Con-
sequently, recent studies have applied digital video-based environments
to feedback sessions in teacher education (e.g., Kleinknecht &
Gröschner, 2016; Weber et al., 2018; Gregory et al., 2017; So et al.,
2009; Wu & Kao, 2008). On balance, the advantages of these
environments outweigh the disadvantages (Hixon & So, 2009). Being able
to interact remotely, and at different times, leads to higher interaction
frequency between participants and experts, allowing for “opportu-
nities to learn and to emulate each other” (Wu & Kao, 2008, p. 54).
Moreover, digital video-based environments enable pre-service teachers
to observe a variety of effective and ineffective teaching practices,
eliciting a more comprehensive understanding of classroom reality (So
et al., 2009, p. 783). Wu and Kao (2008) also showed that by being able
to select particular teaching situations, discussions and reflections were
more focussed. However, selecting specific teaching situations also has
been criticised. One study's (Sharpe et al., 2003) participants found
video clips to be artificial because pre-service teachers seemed to share
only positive events. Furthermore, when pre-service teachers and
experts interact solely through a digital video-based environment,
feedback sessions can become impersonal. For this reason, Malewski,
Phillion, and Lehman (2005) added traditional classroom visits along
with digital environments.
Ultimately, digital video-based feedback environments can be
viewed as approximations of practice that carry a slightly lower degree
of authenticity compared with face-to-face settings. They are not con-
ducted in real time and do not contain a complete representation of the
lesson. Consequently, the use of video provides pre-service teachers
with additional time for reflection and the opportunity to prepare their
feedback (Grossman et al., 2009, pp. 2079–2083).
1.4. Video as a tool in (pre-service) teacher education
In their recent review of 250 studies, Gaudin and Chaliès (2015)
determined that classroom videos increasingly are being used to sup-
port teacher education and professional development worldwide.
Video-based feedback and reflection have become a standard compo-
nent of digital practicum environments (e.g., Kleinknecht & Gröschner,
2016; Hixon & So, 2009; Lee & Wu, 2006; So et al., 2009), as well as
settings that entail face-to-face in- and pre-service teacher education
(e.g., Dobie & Anderson, 2015; Fukkink & Tavecchio, 2010;
Hollingsworth & Clarke, 2017; Rich & Hannafin, 2008; Tripp & Rich,
2012a, 2012b).
Incorporating classroom videos offers many advantages, as well as
potential disadvantages. Classroom videos are authentic representa-
tions of teaching events that capture the complexity of teaching pro-
cesses to a high degree (Borko, Whitcomb, & Liston, 2009). Sequences
can be watched repeatedly, making it possible to revisit and examine
specific situations with different foci (Sherin, 2007). Videos of class-
room practice act as situated stimuli for eliciting knowledge about
teaching and learning (Kersting, 2008; Seidel & Stürmer, 2014). Fur-
thermore, analysis of classroom videos has been shown to lead to high
activation, immersion, resonance and motivation (Kleinknecht &
Schneider, 2013; Seidel, Stürmer, Blomberg, Kobarg, & Schwindt,
2011). However, classroom videos also contain potential constraints.
Although classroom videos can be considered rich representations of
teaching interactions, they offer less contextual information than live
observations (Sherin, 2007). Körkkö, Morales Rios, and Kyrö-Ämmäla
(2019) emphasise that video sequences require contextualisation to
convey the classroom's culture, atmosphere and environment. Fur-
thermore, videos can lead to “attentional biases”, i.e., only noticing
limited aspects of classroom reality, or “cognitive overload”, i.e., being
overwhelmed by the density of information (Derry, Sherin, & Sherin,
2014, p. 787). To counteract these limitations, various researchers have
formulated design principles for learning environments using video in
teacher education (for a discussion of existing frameworks, see Kang &
Van Es, 2018). To develop classroom videos' full potential, they need to
be embedded in contextual information, such as a description of the
class or the lesson's learning goals (Blomberg, Renkl, Sherin, Borko, &
Seidel, 2013). This provides the first guiding scaffold for learners and
adds information that is not transferred through the video. Further-
more, (pre-service) teachers have the opportunity to watch classroom
videos repeatedly, and instructors simultaneously can direct (pre-ser-
vice) teachers' attention to important aspects of the videotaped se-
quence, e.g., by setting a specific observation target (Derry et al., 2014,
pp. 785–812). Moreover, setting such a target also can be part of a
highly scaffolded learning environment that reduces the risk of cogni-
tive overload (Kang & Van Es, 2018).
When designing video-based learning environments, the video ma-
terial's origins also need to be considered (Blomberg et al., 2013). Ex-
tant studies either have applied classroom videos of (pre-service) tea-
chers' own practice (own videos) or classroom videos of peers or
unknown teachers (other videos) (Major & Watson, 2018). Studies (e.g.,
Kleinknecht & Schneider, 2013; Seidel et al., 2011) have analysed in
depth teachers' motivational and cognitive processes when working with
own videos or other videos, showing that videos of own teaching
led to a higher degree of activation than classroom videos of other
teaching. Higher activation is characterised by a deeper engagement
and involvement (immersion), being able to place oneself in the si-
tuation more easily (resonance) and continuous motivation. However,
these studies also established that teachers using other materials im-
proved their knowledge-based reasoning skills more. They analysed
more critically and deduced more consequences and alternatives than
the own video group. Concerning pre-service teacher education,
Santagata and Guarino (2011) emphasise that analysing videos of peers
can increase motivation because pre-service teachers identify with their
peers and find their classroom control more achievable. Furthermore,
Krammer et al. (2016) compared the effects of own vs. other videos on pre-
service teachers' professional vision. They were able to show that both
groups increased their professional vision significantly. In a slightly
different study, Hellermann, Gold, and Holodynski (2015) analysed the
effect of training pre-service teachers' professional vision skills with
own videos only or with both own and other videos. Though both groups
showed learning effects, the group with own and other videos improved
their professional vision more. The authors attribute the larger
improvement to deeper learning elicited by being confronted with both an
inner (own video) and an outer (other video) perspective.
1.5. Video-based feedback
Tripp and Rich (2012a) specifically investigated the impact of
video-based reflection and feedback on in-service teacher change. Their
study indicates that feedback becomes more focussed when using video.
Participants reported that the feedback they received was more specific,
and suggestions were more relevant. They assessed that the feedback
they previously had been provided with in learning environments
without video support was too general. Furthermore, feedback sessions
tended to be more dialogic. Video sequences elicited questions from the
teachers providing feedback, as they wanted to understand the entire
context. Hollingsworth and Clarke (2017) also found this effect to be
present in video-based reflection and feedback for in-service teachers.
Their study indicated that teachers perceived feedback sessions “as an
opportunity for the teachers and researchers to discuss observations,
analyses and reflections” (p. 471). This led to conversations that em-
phasised support, rather than a one-way transmission of information.
The findings by Tripp and Rich (2012a) and Hollingsworth and Clarke
(2017) are confirmed in a review of 63 video-based studies (Tripp &
Rich, 2012b). Additionally, the authors found that mentors or super-
visors play a significant role with pre-service teachers in reflecting on
classroom videos. Participants said they trusted their supervisors’ opi-
nions more than their own. However, video support also seems to in-
fluence the feedback that inexperienced mentors provide. In a study
by Rich and Hannafin (2008), some pre-service teachers ascertained
that the face-to-face feedback that their mentors provided lacked
structure, but when provided with video support, it was more specific
and in-depth.
Digital environments that apply video for feedback and reflection
yielded similar findings. Lee and Wu (2006) and Wu and Kao (2008)
created a digital environment in which pre-service teachers could
watch videos of their own and peers’ teaching. They were able to dis-
cuss their teaching practice with their peers and experienced teachers.
Wu and Kao (2008) found that the digital video-based environment
allowed “for more accurate and more probing reviews of teaching in-
stances” (p. 54). Lee and Wu (2006) also emphasised that pre-service
teachers received “more concrete feedback” (p. 379) in the digital
video-based environment. In both studies, the authors ascertained that
feedback concreteness was enhanced because pre-service teachers were
able to mark specific teaching situations on the videos that related to
their feedback. This led So et al. (2009) to evaluate digital video-based
environments as an ideal feedback and mentoring resource during
teaching practicums.
Although these modern, high-tech environments have produced
encouraging findings, it is unclear whether digital video-based feedback
environments can foster pre-service teachers’ feedback competence on
teaching practice more effectively than face-to-face sessions.
1.6. Research questions
As high-quality feedback significantly can foster (pre-service) tea-
chers' professional knowledge, teacher education needs to develop
teachers’ feedback competence early in their careers. Currently,
practising feedback provision in authentic situations mostly is limited
to face-to-face sessions, but such sessions often are not a
standard component or are limited in number during teaching practi-
cums due to location and time constraints. Blended digital video-based
feedback environments can be a solution to this. Therefore, we in-
vestigated whether pre-service teachers enhanced their feedback com-
petence more by practicing feedback in a traditional face-to-face setting
(control group, CG) or in two blended settings that were combined with
face-to-face feedback: a digital video-based environment with expert
feedback (V-Feedback+) and a digital video-based environment without
expert feedback (V-Feedback) during a teaching practicum.
The following research questions were investigated:
1) To what extent does feedback practice during a practicum improve
pre-service teachers' feedback competence (CG, V-Feedback, V-
Feedback+)?
2) What impact do blended digital video-based environments (V-
Feedback, V-Feedback+) have on pre-service teachers' feedback
competence compared with the face-to-face condition (CG)?
3) How does expert feedback (V-Feedback+) influence pre-service
teachers' feedback competence compared with the condition
without expert feedback (V-Feedback)?
We assumed that pre-service teachers in all conditions (CG, V-
Feedback, V-Feedback+) would show positive development in their
feedback competence during the practicum, as they are provided with
multiple opportunities to practise in an authentic environment.
Moreover, we expected participants of the blended digital video-based
feedback environments (V-Feedback, V-Feedback+) to provide more
specific feedback containing more suggestions than the traditional face-
to-face condition (CG). It can be presumed that participants would
profit from practising in the sheltered online environment without real-
time pressures. Finally, we hypothesised that pre-service teachers in the
blended V-Feedback+ condition (expert feedback included) would in-
crease their feedback competence more than members of the V-
Feedback condition (without expert feedback) because the latter lacked
a modeler of practice.
2. Methods
2.1. Design
Pre-service teachers participated in a quasi-experimental pre-post
design (see Fig. 1). The intervention was conducted during a four-week
practicum. Pre-service teachers participated in traditional face-to-face
feedback sessions (CG), digital video-based feedback sessions (V-Feedback)
or digital video-based feedback sessions with expert input (V-Feedback+).
The pre- and post-tests comprised a video-based measure of pre-service
teachers’ feedback competence.
2.2. Participants
The study¹ was conducted with fourth-semester bachelor's students
at a regional German university. Student teachers participated in a four-
week teaching practicum at local schools, with 120 student teachers in
practical placement. Only participants who completed the pre- and
post-tests were included in the sample. Consequently, a limited number
had to be excluded, resulting in 103 participants in the final sample. In
all, 65 student teachers were assigned to the traditional face-to-face
condition (CG; 92.3% female; Mage = 23.35, SDage = 4.61), 16 to the V-
Feedback condition (87.5% female; Mage = 24.29, SDage = 6.82) and 22
to the V-Feedback+ condition (95.5% female; Mage = 22.86,
Fig. 1. Timetable of Quasi-Experimental Study (CG = control group, VF = V-Feedback condition, VF+ = V-Feedback+ condition).
¹ The sample used in this study also was subject to analyses concerning other dependent variables (Weber et al., 2018).
SDage = 4.25). Originally, we planned for equally sized digital video-
based feedback groups (V-Feedback, V-Feedback+) and a larger tradi-
tional face-to-face group (CG). Video recordings in German schools
must abide by strict data-privacy policies. This meant that student
teachers had to volunteer to be part of our video-based intervention;
therefore, they could not be assigned randomly to the control group or
video-based groups. To film in classrooms, written consent had to be
acquired from schools and students' parents. Based on an advance in-
quiry with schools, we assigned student teachers to schools. Un-
fortunately, several parents did not sign their consent forms, so four
students whom we had placed in the V-Feedback condition had to join the
CG. However, the remaining student teachers who volunteered for the
video groups were assigned randomly to the V-Feedback and V-Feedback+
conditions.
Applying one-way analyses of variance (ANOVAs), no statistically
significant difference was found for participants in the three conditions
concerning age, F(2,100) = 0.55, p = .58 (CG: MCG = 23.06,
SDCG = 4.70 vs. V-Feedback: MVF = 24.29, SDVF = 6.82 vs. V-Feedback
+: MVF+ = 22.86, SDVF+ = 4.25), previous teaching experience, F
(2,100) = 1.39, p = .26 (MCG = 0.27, SDCG = 0.76 vs. MVF = 0.15,
SDVF = 0.38 vs. MVF+ = 0.30, SDVF+ = 0.81), self-estimated prior
knowledge of the dimensions of classroom management, F
(2,100) = 2.34, p = .10 (MCG = 2.89, SDCG = 0.71 vs. MVF = 3.14,
SDVF = 5.34 vs. MVF+ = 2.59, SDVF+ = 0.87), or self-efficacy of class-
room management, F(2,100) = 0.22, p = .80 (MCG = 35.29,
SDCG = 3.62 vs. MVF = 35.17, SDVF = 3.86 vs. MVF+ = 34.38,
SDVF+ = 7.80). After the practicum, we asked student teachers how
many informal feedback sessions they participated in with mentors or
peers, and sought their assessment of peer and mentor feedback quality
because this could affect their feedback competence. However, the
participants in the conditions did not show any statistically significant
differences: number of feedback occasions, F(2,100) = 0.02, p = .99
(MCG = 2.78, SDCG = 1.18 vs. MVF = 2.71, SDVF = 1.07 vs. MVF+ = 2.77,
SDVF+ = 1.10); quality of mentor feedback, F(2,100) = 1.18, p = .31
(MCG = 3.55, SDCG = 1.10 vs. MVF = 3.29, SDVF = 1.54 vs. MVF+ = 3.86,
SDVF+ = 0.89); and quality of peer feedback, F(2,100) = 0.21, p = .82
(MCG = 3.58, SDCG = 0.75 vs. MVF = 3.71, SDVF = 0.73 vs. MVF+ = 3.64,
SDVF+ = 0.58).
2.3. Teacher education in Germany
In Germany, teacher education is divided into two phases: a five-
year university phase and a one-and-a-half-year induction phase in
schools. Only after completing the induction phase can student teachers
become fully qualified teachers. During the university phase, student
teachers must complete bachelor's and master's degrees. They usually
study two teaching subjects and must enrol in courses on psychology,
pedagogy and sociology. Additionally, they must participate in several
practicums.² Our study was conducted during the student teachers'
second practicum during the bachelor's phase. Student teachers already
had completed their first observational practicum, lasting three weeks,
during the second semester. The second practicum during the fourth
semester lasted four weeks and required that students teach by them-
selves for the first time.
Educational research indicates that when practical experiences are
aligned to coursework, students can connect theory to practice more
readily (Hammerness et al., 2005). Consequently, student teachers had
to complete a lecture on didactics and methods, as well as a seminar, as
preparation for the practicum. While the lecture provided an overview
of theoretical concepts and teaching methods, the seminar focussed on
ensuring that student teachers can plan a lesson. Theory and methods
from the lecture were elaborated on, then used by student teachers to
plan a fictitious lesson in detail. They had to hand in the lesson as
coursework by the end of the semester. On two occasions, the lecture
and seminar focussed on classroom management (Kounin, 1970) for the
entire length of sessions. As part of the seminar sessions, student tea-
chers had to act out parts of their fictitious lessons, with the rest of the
group acting as school students. They subsequently received feedback
on their classroom management skills from the group.
2.4. Intervention procedure
Teaching practicums can lead to limited interaction between stu-
dent teachers when they are all placed at separate schools, and feed-
back and reflection sessions in digital environments can become par-
ticularly impersonal (Malewski et al., 2005). To foster interaction and
allow for a learning community to develop, we placed students in teams
at the schools. Team partners were supposed to observe each other
teaching and provide feedback. Furthermore, each student teacher had
a tandem partner at a different school. When visiting tandem partners,
student teachers also had to participate in feedback and reflection
sessions with the university supervisor. University supervisors visited
student teachers in the V-Feedback and V-Feedback+ conditions once
and in the face-to-face condition (CG) twice. Instead of a second face-to-
face session, students in the V-Feedback and V-Feedback+ groups par-
ticipated in two feedback and reflection sessions in the digital video-
based environment (see Fig. 1). University supervisors provided expert
feedback in the V-Feedback+ digital video-based environment.
2.5. Video-based feedback environments
As feedback and reflection sessions often lack a “substantive focus”
(Valencia et al., 2009, p. 314), pre-service teachers followed a highly
structured reflection and feedback cycle in the digital video-based
environments. This cycle is based on a previous study by Kleinknecht and
Gröschner (2016).
Classroom management was set as the feedback and reflection
focus. On one hand, classroom management is considered a necessary
prerequisite for successful teaching and has proven to be hard for pre-
service teachers to accomplish (Wolff, Van den Bogert, Jarodzka, &
Boshuizen, 2015). On the other hand, a specific focus limits complexity,
making it possible for students to work on one set of skills in depth
(Tschannen-Moran, Woolfolk Hoy, & Hoy, 1998). Consequently, pre-
service teachers had to choose instances of classroom management
based on the dimensions of monitoring, managing momentum and
establishing rules and routines (Gold & Holodynski, 2017). Individual
lecture and course sessions on didactics and methods before the prac-
ticum were aligned to this focus. Additionally, specific criteria were
presented and clarified during the practicum's introductory event.
Before reflecting and receiving feedback in the digital video-based
environment, pre-service teachers filmed themselves. They used cam-
eras mounted on tripods provided by the university. Team partners
were responsible for handling the cameras. Subsequently, pre-service
teachers chose a five- to 10-min video sequence of their teaching
according to the classroom management focus. Sequences were supposed
to show an instructional phase and the following transitional phase to
individual, peer, or group work. Each sequence should contain at least
one critical classroom management event so that participants did not
solely choose positive events (Sharpe et al., 2003). The sequences were
uploaded to a Moodle forum with vShare software enhancement
(Huppertz, Massler, & Plötzner, 2005). Students composed a self-re-
flection of the video sequence following a three-step approach
(Kleinknecht & Gröschner, 2016; Seidel et al., 2011). They were asked
first to describe the classroom-management situation, evaluate and
explain evaluations, and finally consider possible alternative teaching
strategies. Additionally, they were able to annotate specific situations in
their video using the vShare tool. This course of reflection was presented
and explained in the introductory lecture and seminar one week before
2 The length and timing of practicums can vary depending on the individual German state.
C.N. Prilop, et al. Computers in Human Behavior 102 (2020) 120–131
the practicum started. On one hand, the three-step approach was sup-
posed to foster a more in-depth and structured reflection of the lesson.
On the other hand, the structured reflection embedded the video with
additional contextual information and facilitated comprehensive un-
derstanding (Blomberg et al., 2013; Körkkö et al., 2019). After posting
their self-reflections online, they received feedback from two peers re-
garding their classroom video and self-reflection. Methodological sup-
port also was provided for this step through a feedback example and
rules at the introductory event (e.g., 'Base your feedback on specific
classroom situations. Suggest alternatives. Ask thought-provoking
questions.'). Furthermore, a fictitious example of the feedback and re-
flection cycle was presented during the practicum's introductory event
and on the online platform. After having received feedback from their
peers, pre-service teachers in the V-Feedback condition completed the
reflection and feedback cycle by composing a feedback balance. They
were supposed to reflect on the feedback that their peers provided and
decide which classroom management alternatives they would try to use
in future situations. In the V-Feedback+ condition, this step was pre-
ceded by expert feedback from the university supervisors. Fig. 2 shows
the interface of the V-Feedback+ digital video-based environment and a
fictitious example of self-reflection, peer feedbacks, expert feedback
and feedback balance. Face-to-face sessions (CG) followed the same
reflection and feedback cycle, but without video. Feedback and re-
flection sessions were based on a lesson observed directly before, in the
CG.
At the end of the practicum, V-Feedback and V-Feedback+ partici-
pants self-reflected and received feedback in one face-to-face and two
digital video-based feedback and reflection sessions on their own
classroom practice. Student teachers in the CG participated in two face-
to-face feedback and reflection sessions. As face-to-face feedback and
reflection sessions dealt with an entire lesson, two feedback and re-
flection occasions in the digital video-based environments equalled one
face-to-face session. Furthermore, every student provided feedback on
their peers’ teaching the same number of times.
2.6. Instruments
A quasi-experimental, repeated-measures, pre-test-treatment-post-
test design was adopted for this study. To assess pre-service teachers’
feedback competence, a video- and text-based measure was developed.
Instruments using video are used to enhance authenticity and
complexity (Borko et al., 2009; Sherin, 2007) and have become a
“prominent tool for studying teacher learning and the activating of
teacher knowledge” (Seidel & Stürmer, 2014, p. 740). Recent studies
indicate that they can measure situation-specific skills more efficiently
than text-based cases (Barth, Thiel, & Ophardt, 2018; Gold &
Holodynski, 2017). Our tool comprises a feedback situation that focuses
on classroom management. Student teachers were presented with a
classroom and feedback situation and had to provide the depicted
teacher with feedback. Feedback comments were limited to 200 words
or fewer, and test time was approximately 30 min. Student teachers
were presented with the feedback situation through the online platform
Unipark (Questback, 2017). Six weeks after the pre-test, an identical
post-test was administered (see Fig. 1).
The teaching and feedback scenario was constructed around a 1-min
video sequence (see Fig. 3). During the video clip, a second-grade pri-
mary-school class is shown working at different learning stations. The
students are trying to determine what happens with water when an
object is immersed in it. The teacher, for example, checks results, asks
students to return to their work stations, or scolds them for disrupting
the lesson. The video sequence shows both successful and unsuccessful
classroom-management scenarios. Apart from the video vignette, the
feedback situation contained information on the teacher and a self-re-
flection and utterance from the teacher. These were presented in text
form. The content of the self-reflection and utterance focussed on the
facet of withitness in classroom management. Withitness can be defined
as a teacher's competence in monitoring classroom events continuously
and reacting appropriately when needed. It was employed as it is
considered a focal constituent of successful classroom management
Fig. 2. Interface of the V-Feedback+ digital video-based environment (V-Feedback = without expert feedback).
Fig. 3. Elements of the peer feedback measure.
(Kounin, 1970; Wolff, 2015).
To determine whether the scenario is authentic, the content was
validated. Twelve feedback experts (feedback occasions per year:
M = 77.67, SD = 45.61) were invited to assess authenticity. The ex-
perts comprised teacher educators who are responsible for training
teachers during the German induction phase (1.5 years). One of their
principal tasks is to observe trainee teachers’ teaching and provide them
with feedback. We asked the experts to assess the authenticity, interest
and typicality of the feedback situation on a four-point Likert scale
(from 1 = does not apply to 4 = applies). The experts perceived the
feedback situation as authentic (M = 3.25, SD = 1.04), interesting
(M = 2.90, SD = 1.10) and typical (M = 3.00, SD = 0.85). As a result,
an acceptable degree of content validity can be assumed.
2.7. Coding and analysis
The written feedback that pre-service teachers provided for the pre-
and post-tests was coded by applying a coding scheme that Prins et al.
(2006) developed. It is based on a prior study in which Sluijsmans et al.
(2002) extracted characteristics of constructive feedback from expert
feedback. Although Prins et al.'s (2006) coding scheme was developed to
assess general practitioners' feedback-report quality, versions of it have
been applied in a variety of domains (M. Gielen & De Wever, 2015; S.
Gielen et al., 2010). We adapted the coding scheme to feedback
concerning classroom management.
We analysed the feedback, following guidelines for quantifying
qualitative data from Chi (1997). Instead of "counting the number of
criteria used and the number of comments or certain words present",
Prins et al.'s coding scheme "evaluates the presence of a set of
necessary ingredients" (S. Gielen et al., 2010, p. 307). Consequently, we
decided to treat each feedback as a unit of analysis because it captures
the “semantics of the inference at a more appropriate level” (Chi, 1997,
p. 10).
Our coding scheme comprises six categories (see Table 1). Sub-op-
timal feedback was coded as ‘0’, average feedback as ‘1’ and good
feedback as ‘2’. An example for code 2 of the specificity category is ‘I
had the impression that you did not notice Max and Anna playing with
the water when you were explaining the task to Charlotte’. In the
suggestions category, feedback such as ‘Maybe it would be an idea to
stand in front of the class during such station learning so that you have
everything in view’ would have been coded ‘2’. Consequently, partici-
pants could achieve a maximum score of two points in each category.
As an estimation of overall feedback competence, we combined in-
dividual categories into an aggregated score, resulting in a possible
maximum of 12 points.
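As a minimal sketch, this aggregation can be expressed in a few lines of code. The category names and the example codes below are ours for illustration; only the six categories, the 0–2 codes and the 12-point maximum come from the coding scheme described above.

```python
# Sketch of the aggregation step: six categories, each coded 0-2,
# summed into an overall feedback competence score of 0-12.
# The example codes below are invented for illustration.

CATEGORIES = (
    "assessment_criteria", "specificity", "suggestions",
    "questions", "first_person", "positive_negative",
)

def aggregate_score(codes: dict) -> int:
    """Sum the six category codes (each 0, 1 or 2) into a 0-12 total."""
    assert set(codes) == set(CATEGORIES), "all six categories must be coded"
    assert all(value in (0, 1, 2) for value in codes.values())
    return sum(codes.values())

example = {
    "assessment_criteria": 2, "specificity": 2, "suggestions": 1,
    "questions": 0, "first_person": 2, "positive_negative": 1,
}
print(aggregate_score(example))  # -> 8
```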
Three coders carried out the coding of feedback comments. Coders
were student workers trained by a member of the research team. Before
coding the entire sample, 10 randomly chosen feedback comments were
coded for practice. Differences between coders were discussed
(Zottmann et al., 2013). Coders were then randomly assigned feedback
comments so that each feedback comment was coded by two independent
coders to establish reliability. We calculated Cohen's kappa
(κ) (Fleiss & Cohen, 1973). Coding yielded substantial kappa values (see
Table 1). Consequently, sufficient content reliability was established.
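For illustration, Cohen's kappa for two coders can be computed as follows. The ratings are invented; only the 0–2 code scale comes from the coding scheme, and the formula is the standard unweighted kappa.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders: (p_o - p_e) / (1 - p_e)."""
    assert coder_a and len(coder_a) == len(coder_b)
    n = len(coder_a)
    # observed agreement: share of items both coders coded identically
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # chance agreement from each coder's marginal code frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[code] * freq_b[code] for code in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# invented ratings of eight feedback comments on the 0-2 code scale
coder_1 = [0, 1, 2, 1, 0, 2, 1, 1]
coder_2 = [0, 1, 1, 1, 0, 2, 1, 0]
print(round(cohens_kappa(coder_1, coder_2), 3))  # -> 0.6
```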
2.8. Methods of analysis
The data were analysed by applying one-way analyses of variance
(ANOVA), paired samples t-tests, one-way repeated measures multi-
variate analyses of variance (MANOVA), a one-way ANOVA with dif-
ference scores and planned contrasts. All analyses were computed using
SPSS 25. Furthermore, the alpha level was set at p < .05 for
all statistical analyses. Normal distribution of data was slightly violated
in the CG. As parametric analysis tools such as ANOVA and MANOVA
are robust against this kind of violation (O'Brien & Kaiser, 1985;
Schmider, Ziegler, Danay, Beyer, & Bühner, 2010), we decided to apply
Table 1
Content analysis of pre-service teachers' peer feedback: category, feedback quality, scores, percentage of coder agreement and inter-coder reliability (Cohen's kappa).

Category             Good feedback (score 2)                         Average feedback (score 1)                   Sub-optimal feedback (score 0)                  Agreement   κ
Assessment criteria  Reference to classroom management,              Reference to classroom management            No reference to classroom management            87.2%       .715
                     including terminology                           without terminology
Specificity          Specific situation is elaborated on             Specific teaching phase is elaborated on     No specific situation or phase elaborated on    81.8%       .662
Suggestions          Alternatives presented with explanation         Alternatives presented without explanation   No alternatives presented                       84.9%       .707
Questions            Activating question posed                       Clarifying question posed                    No questions posed                              91.6%       .700
First person         Written in first person throughout feedback     Occasionally written in first person         Not written in first person                     84.9%       .752
Positive/negative    Equilibrium of positive and negative feedback   Mainly positive feedback                     Mainly negative feedback                        81.8%       .697
it. The assumption of sphericity was met in the ANOVAs and MANOVAs.
Sufficient statistical power was established for all analyses using
G*Power (Faul, Erdfelder, Buchner, & Lang, 2009).
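A minimal sketch of the kind of one-way ANOVA used throughout the analyses, assuming SciPy is available; the three groups of scores below are invented for illustration.

```python
import numpy as np
from scipy import stats

# invented post-test scores for the three conditions
cg  = np.array([4, 5, 5, 6, 4, 5, 6, 4], dtype=float)
vf  = np.array([5, 6, 5, 6, 6, 5], dtype=float)
vfp = np.array([6, 7, 6, 7, 6, 7], dtype=float)

# one-way ANOVA across the three independent groups
f_stat, p_value = stats.f_oneway(cg, vf, vfp)
print(f"F(2, {len(cg) + len(vf) + len(vfp) - 3}) = {f_stat:.2f}, p = {p_value:.4f}")
```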
3. Results
3.1. Assessment of pre-service teachers’ feedback competence prior to
intervention
We investigated possible pre-test differences between conditions
(CG, V-Feedback, V-Feedback+) by calculating ANOVAs because dif-
ferences before the intervention could impact the estimated effects.
One-way ANOVAs showed that all conditions were comparable
concerning overall feedback expertise, F(2,100) = 0.36, p = .70. This
also was the case for individual categories. No differences could be
established for assessment criteria, F(2,100) = 0.90, p = .41, specifi-
city, F(2,100) = 0.34, p = .72, suggestions, F(2,100) = 1.59, p = .21,
questions, F(2,100) = 0.23, p = .80, first person, F(2,100) = 1.16,
p = .32, and positive/negative, F(2,100) = 0.45, p = .64.
3.2. Effects of feedback practice during the practicum
Concerning research question 1, to what extent feedback practice
during the practicum improved pre-service teachers’ feedback compe-
tence in CG, V-Feedback and V-Feedback+, we first analysed the effect
for time on overall feedback competence and feedback competence
categories.
To establish the effect for time on overall feedback competence
(dependent variable), we conducted paired samples t-tests. Each con-
dition was analysed individually (CG, V-Feedback, V-Feedback+).
Concerning the control group, the paired samples t-test showed a sig-
nificant, small effect for overall feedback competence, t(64) = 3.046,
p = .003, d = 0.38, while the V-Feedback condition did not reach
statistical significance, t(15) = 1.499, p = .16, d = 0.43. Regarding the
V-Feedback+ condition, a large, significant effect was found, t
(21) = 3.974, p < .001, d = 1.1.
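The paired samples t-tests above can be sketched as follows, assuming SciPy; the pre- and post-test scores are invented, and the Cohen's d shown uses one common paired-samples convention (mean difference over the SD of the differences), since the text does not state which variant was used.

```python
import numpy as np
from scipy import stats

# invented pre- and post-test overall feedback competence scores
pre  = np.array([4, 3, 5, 4, 6, 4, 5, 3, 4, 5], dtype=float)
post = np.array([5, 4, 6, 4, 7, 5, 6, 4, 4, 6], dtype=float)

# paired samples t-test for the effect of time
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired samples: mean difference over SD of the differences
diff = post - pre
d = diff.mean() / diff.std(ddof=1)
print(f"t({len(diff) - 1}) = {t_stat:.2f}, p = {p_value:.3f}, d = {d:.2f}")
```

For this paired-d convention, t = d * sqrt(n) holds exactly, which is a convenient sanity check.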
To assess the effect for time on all individual feedback competence
categories (dependent variables: assessment criteria, specificity, sug-
gestions, questions, first person, positive/negative) simultaneously, we
performed one-way repeated measures MANOVAs. Conditions (CG, V-
Feedback, V-Feedback+) were analysed individually. Concerning the
control group, the multivariate analysis of variance established a
medium, significant effect for first person and a large, significant effect
for suggestions (see Table 2), while no significant effects could be found
for the V-Feedback condition as Wilks’ Lambda did not reach sig-
nificance level. Regarding the V-Feedback+ condition, large, significant
effects were revealed for specificity, suggestions, questions and first
person.
In brief, practicing feedback in face-to-face and digital video-based
feedback environments enhanced pre-service teachers’ feedback com-
petence overall and in a variety of categories. The lack of significant
results in the V-Feedback condition can cautiously be attributed to the
small sample size, especially when considering the large effect size for
specificity (see Table 2).
Regarding research questions 2 (differences between V-Feedback/V-
Feedback+ and CG) and 3 (differences between V-Feedback and V-
Feedback+), we investigated differences in development of pre-service
teachers’ overall feedback competence and feedback competence cate-
gories between conditions.
To compare changes from pre-to post-test between conditions, we
performed a one-way ANOVA with difference scores (see Table 3) and
planned contrasts. Difference scores (post-test score minus pre-test
score) were entered as the dependent variable (Huck & McLean, 1975;
Maxwell & Howard, 1981, pp. 747–756). This analytical rationale was
also applied by other researchers (e.g., Heemsoth & Kleickmann, 2018).
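The difference-score ANOVA with a planned contrast can be sketched like this, assuming SciPy. All difference scores are invented, and the contrast weights (-2, 1, 1) are one common way to pool the two video groups against the control group; the text does not specify the exact weights used.

```python
import numpy as np
from scipy import stats

# invented difference scores (post-test minus pre-test) per condition
groups = {
    "CG":  np.array([0, 0, 1, 0, -1, 0, 0, 1], dtype=float),
    "VF":  np.array([1, 0, 1, 0, 1, 0], dtype=float),
    "VF+": np.array([1, 1, 0, 1, 0, 1], dtype=float),
}

# pooled within-group (error) variance, as in a one-way ANOVA
ns = np.array([len(g) for g in groups.values()])
means = np.array([g.mean() for g in groups.values()])
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())
df_error = int(ns.sum()) - len(groups)
ms_within = ss_within / df_error

# planned contrast pooling both video groups against the control group
c = np.array([-2.0, 1.0, 1.0])
t = (c @ means) / np.sqrt(ms_within * ((c ** 2) / ns).sum())
p = 2 * stats.t.sf(abs(t), df_error)
print(f"t({df_error}) = {t:.2f}, p = {p:.3f}")
```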
The ANOVA showed that conditions differed significantly in terms
of specificity, F(2,100) = 4.42, p = .014, and suggestions, F
(2,100) = 3.37, p = .038. No significant effects were found for the
other categories or overall feedback competence, p > .05. Concerning
Table 2
Results of one-way repeated measures MANOVAs for time on all feedback competence categories for individual conditions.
                     CG                      V-Feedback              V-Feedback+
                     F(1,64)   p      ηp²    F(1,15)   p      ηp²    F(1,21)   p       ηp²
Assessment criteria  0.095     .76    .001   1.000     .06    .06    2.783     .11     .12
Specificity          0.454     .50    .01    3.151     .10    .17    5.045     .036    .19
Suggestions          6.779     .011   .10    0.238     .63    .02    21.138    < .001  .50
Questions            1.957     .17    .03    0.00      1.00   .00    4.667     .042    .18
First person         11.468    < .001 .15    1.518     .24    .09    4.433     .047    .17
Positive/Negative    0.006     .94    .00    1.364     .26    .08    0.00      1.00    .00

Note: Wilks' Lambda: CG, F(6,59) = 2.630, p = .025, ηp² = 0.21; V-Feedback, F(6,10) = 1.084, p = .433, ηp² = 0.39; V-Feedback+, F(6,16) = 4.530, p = .007, ηp² = 0.63.
Table 3
Means, standard deviations and changes from pre- to post-test.

                       Pre-test        Post-test
                       M      SD       M      SD       Δ
Feedback competence
  CG                   4.16   2.04     4.90   1.90     0.73
  VF                   4.63   2.06     5.50   2.13     0.88
  VF+                  4.27   1.54     6.07   1.79     1.80
Assessment criteria
  CG                   1.16   0.48     1.14   0.46    -0.02
  VF                   1.06   0.44     0.94   0.57    -0.13
  VF+                  1.27   0.53     1.02   0.50    -0.25
Specificity
  CG                   0.39   0.70     0.35   0.65    -0.04
  VF                   0.25   0.68     0.58   0.73     0.31
  VF+                  0.41   0.57     0.82   0.81     0.41
Suggestions
  CG                   1.03   0.90     1.36   0.75     0.33
  VF                   1.38   0.96     1.50   0.73     0.13
  VF+                  0.86   0.77     1.66   0.62     0.80
Questions
  CG                   0.23   0.56     0.37   0.76     0.14
  VF                   0.31   0.70     0.31   0.60     0.00
  VF+                  0.18   0.59     0.55   0.91     0.36
First person
  CG                   0.88   0.94     1.32   0.85     0.34
  VF                   1.38   0.96     1.67   0.70     0.31
  VF+                  1.09   0.93     1.57   0.62     0.48
Positive/Negative
  CG                   0.37   0.67     0.38   0.62     0.00
  VF                   0.25   0.58     0.50   0.73     0.25
  VF+                  0.45   0.69     0.45   0.72     0.00

Note: CG = control group, VF = V-Feedback, VF+ = V-Feedback+; nCG = 65, nVF = 16 and nVF+ = 22; feedback competence Min = 0, Max = 12; categories Min = 0 and Max = 2.
research question 2, we analysed what impact blended digital video-
based environments (V-Feedback, V-Feedback+) have on pre-service
teachers' feedback competence in comparison to the face-to-face con-
dition (CG). Planned contrasts revealed that V-Feedback and V-Feedback
+ together differed significantly from the CG in specificity, with a
medium effect size, t(100) = 2.87, p = .005, d = 0.55, but not in sug-
gestions, t(100) = 0.85, p = .40, d = 0.16. Thus, practicing feedback in
blended digital video-based environments increased the specificity of
feedback. For research question 3, we investigated how expert feedback
(V-Feedback+) influenced pre-service teachers' feedback competence
compared with the condition without expert feedback (V-Feedback).
The comparison of V-Feedback+ with V-Feedback revealed a significant
difference between conditions for suggestions with a medium effect
size, t(100) = 2.33, p = .022, d = 0.77, but not specificity, t
(100) = 0.42, p = .68, d = 0.14. Hence, having experts present en-
hanced the quality of pre-service teachers’ suggestions.
4. Discussion
Our study investigated the effects of two blended digital video-
based feedback environments during a teaching practicum on pre-ser-
vice teachers' feedback competence on teaching practice compared with
a control group. The control group participated in a conventional
practicum setting with face-to-face feedback. Besides face-to-face
feedback sessions, one of the experimental groups practised feedback in
a digital video-based environment without expert feedback (V-
Feedback), and the other in the same setting included expert feedback
(V-Feedback+). We assumed that pre-service teachers’ feedback com-
petence would increase in all conditions, but with stronger improve-
ments in sheltered digital video-based environments. Furthermore, we
expected participants in the V-Feedback+ group to profit from expert
modelling compared with the V-Feedback group.
Contrary to our first assumption, overall feedback competence did
not increase significantly in all conditions. Improvements were sig-
nificant only for the V-Feedback+ condition and control group.
Consequently, these results (CG, V-Feedback+) align with Grossman
and McDonald’s (2008) assertion that novices require multiple practice
occasions to start and develop a skill. However, analyses of individual
feedback categories revealed a more comprehensive picture that al-
ready hints at research questions 2 and 3. In the V-Feedback+ condi-
tion, significant developments were established in four of the six cate-
gories (specificity, suggestions, questions, first person), two in the CG
(suggestions, first person) and none in the V-Feedback group. However,
the lack of improvement in all groups in the positive/negative category
is of particular interest. This might be the result of the complexity of
classroom management for pre-service teachers. Various extant studies
(Van den Bogert, Van Bruggen, Kostons, & Jochems, 2014; Wolff,
Jarodzka, & Boshuizen, 2017) were able to establish that novices,
contrary to experienced teachers, predominantly concentrate on nega-
tive events when analysing classroom situations. Thus, it might require
more classroom observation to foster their professional vision (Gold &
Holodynski, 2017) and enable pre-service teachers to spot more posi-
tive events, then incorporate them into their peer feedback. Although
we provided methodological support in the lecture and seminar before
the practicum by clarifying classroom management criteria and setting
a specific focus for observation, additional assistance might be neces-
sary. An observation sheet might be a viable solution, providing pre-
service teachers with “a particular lens” (Santagata & Angelici, 2010, p.
339).
Concerning research question 2, we assumed that the digital video-
based environments fostered feedback competence more than face-to-
face feedback sessions. On one hand, these approximations of practice
limit complexity by possessing no real-time pressure. On the other
hand, the use of video has proven beneficial regarding feedback quality.
The digital video-based feedback environments produced better effects
than the control group concerning specificity. Thus, the comparison of
V-Feedback+ and V-Feedback conditions with the CG confirms extant
research (Lee & Wu, 2006; Wu & Kao, 2008; Tripp & Rich, 2012a,
2012b). Practising with classroom videos seems to make feedback more
specific, accurate and concrete. This is probably also a result of the
higher degree of decomposition in the digital video-based environment
as an approximation of practice. Pre-service teachers were provided
with a specific sequence focusing on classroom management and did
not practise giving feedback after having observed an entire lesson. On
one hand, observing an entire lesson without video support could yield
feedback that is too general, as Tripp and Rich (2012a) established. On
the other hand, using their own and others' video sequences in the V-Feedback
and V-Feedback+ environments also trained their professional vision of
classroom management, making their feedback more specific (Weber
et al., 2018; Hellermann et al., 2015; Krammer et al., 2016; Wolff et al.,
2015).
Regarding research question 3, expert modelling seems to be ne-
cessary to facilitate feedback competence. The V-Feedback+ group
performed better than the V-Feedback group concerning the suggestions
category. Thus, to foster feedback competence in digital video-based
environments, experts need to participate. This aligns with previous
findings of students having “more confidence in their supervisors’
opinions than their own” (Tripp & Rich, 2012b, p. 683). This also il-
lustrates that pre-service teachers need “modelers of practice” (Clarke
et al., 2014, p. 177), whom they can mimic, as experts provide “more
sophisticated feedback” (Weber et al., 2018, p. 46).
4.1. Future directions and implications for teacher education
Some issues should be considered for future research, as well as for
implementing digital video-based environments in university or teacher
training courses.
Regarding the construction of digital video-based environments,
future studies should focus on effects from scaffolding elements.
Although it is assumed that video-based interventions for novices in
particular require highly structured settings (e.g., Kleinknecht &
Gröschner, 2016; Moreno & Valdez, 2007), Peters et al. (2018) suggest
that scaffolding does not always elicit positive effects with respect to
feedback. Actually, it can be harmful to students’ motivation (S. Gielen
et al., 2010) and lower peer feedback beliefs (Alqassab et al., 2018).
Nevertheless, other studies (Gan, 2011; M. Gielen & De Wever, 2015)
found positive effects from structuring feedback elements.
In terms of future feedback research on teaching, fostering pre-
service teachers' professional vision of classroom practice needs to be
addressed. Extant studies showed that novices pay more attention to
negative teaching events (Van den Bogert et al., 2014). Consequently,
professional vision and its connection to feedback should be analysed
more comprehensively. The relationship among these components
could be assessed by testing student teachers’ professional vision during
the teaching practicum (Gold & Holodynski, 2017). Furthermore, pre-
service teachers possess a limited amount of teaching experience, but
also of opportunities to improve their feedback skills. Alqassab et al.
(2018) assessed that “peer feedback is usually a new practice to most
students, including pre-service teachers” (p. 15). However, although
feedback might be less applicable when provided by novices (Carless,
Chan, To, Lo, & Barrett, 2018), an expert level can be reached with
training (Topping, 2017). Therefore, more comprehensive training op-
portunities should be devised to foster feedback competence before the
teaching practicum.
Concerning possible implementations, incorporating classroom
video into digital environments requires a high degree of technical and
legal preparation and assistance. First, students must be provided with
cameras and trained in how to use them. Second, technical assistance is
needed for uploading videos or editing video sequences. Third, written
consent needs to be acquired from schools, students’ parents and uni-
versity students being filmed due to data-privacy laws. The latter factor
in particular can hinder effective research and, thus, the development
of effective interventions. And yet, we consider applying classroom
videos in teacher education to be a bottom-up process. The more uni-
versity students participate in video-based seminars, lectures, or inter-
ventions, the more open future teachers and parents will be to allowing
this technology into classrooms.
4.2. Limitations
Our study faced some limitations that encourage future research in
this field. First, as noted above, data-privacy laws only allow for partial
randomisation. Pre-service teachers had to volunteer to participate in
one of the video-based environments. Consequently, assignment of
study participants to the CG and the digital video-based conditions (V-
Feedback, V-Feedback+) was not random; therefore, pre-service tea-
chers' performances in video-based conditions could have been based
on higher motivation. However, we were able to assign participants,
who had volunteered, randomly to the V-Feedback and V-Feedback
+ groups; thus, this limitation only relates to the CG. Second, our
sample size is relatively small for the digital video-based conditions. As
noted in the discussion, effects or the lack thereof in the V-Feedback
condition in particular need to be viewed with caution. Generally, re-
search comprising video recordings rarely involves large samples (e.g.,
Gröschner, Schindler, Holzberger, Alles, & Seidel, 2018; Körkkö et al.,
2019; Tripp & Rich, 2012a). In a recent review, Major and Watson
(2018) established that three quarters of video-based studies involved
19 or fewer participants. Yet, we expect video-based interventions to be
conducted more easily in the future when the application of video turns
into a more common feature of teacher education. Third, we used a
short video sequence during the pre- and post-tests. Considering that
participants in the V-Feedback and V-Feedback+ conditions worked
with video recordings throughout the practicum, this could have
provided them with an advantage over the face-to-face CG. How-
ever, pre-service teachers were accustomed to working with video se-
quences of classroom situations during the preparatory lecture and
seminar. Furthermore, this limitation only pertains to the comparison of
face-to-face and video groups, but not V-Feedback and V-Feedback
+ groups. Finally, conducting our intervention in an ecologically valid
setting also entails a high degree of contextual factors that need to be
controlled. Although we tried to assess possible influences, such as the
number of feedback occasions, the feedback competence of mentors and
peers, especially that of mentors, is likely to influence pre-service
teachers (Kraft et al., 2018).
5. Conclusions
Our study deepens understanding of the effects of blended digital video-based feedback environments during teaching practicums and contributes to the field of feedback research. It is the first to assess pre-service teachers' feedback competence on teaching practice during a practicum, particularly in blended digital video-based environments. Results can inform teacher educators in both fields. Our results indicate that digital video-based feedback environments need to be combined with expert feedback to tap their full potential. Digital video-based environments make feedback time- and location-independent, thereby offering a viable substitute for face-to-face sessions, which often are not a standard component of teaching practicums or are limited in scope because of a lack of resources (Lee & Wu, 2006; Valencia et al., 2009). Providing pre-service teachers with the opportunity to practise feedback is an important prerequisite for facilitating future lifelong learning and, thus, for fostering (pre-service) teachers' professional knowledge (Hammerness et al., 2005).
Acknowledgements
We would like to thank the pre-service teachers who participated in
this study and the school directors and teachers who provided them
with the opportunity of a practical placement. Furthermore, we are
grateful to our student assistants Karoline Glimm, Johanna Meyn and
Kristina Lindstedt for helping to plan the practicum and coding data.
Appendix A. Supplementary data
Supplementary data to this article can be found online at https://
doi.org/10.1016/j.chb.2019.08.011.
References
Allen, J. P., Hafen, C. A., Gregory, A. C., Mikami, A. Y., & Pianta, R. (2015). Enhancing
secondary school instruction and student achievement: Replication and extension of
the My Teaching Partner-Secondary intervention. Journal of Research on Educational
Effectiveness, 8(4), 475–489.
Alqassab, M., Strijbos, J.-W., & Ufer, S. (2018). Training peer-feedback skills on geometric
construction tasks: Role of domain knowledge and peer-feedback levels. European
Journal of Psychology of Education, 33(1), 11–30.
Bandura, A., & Cervone, D. (1986). Differential engagement of self-reactive influences in
cognitive motivation. Organizational Behavior and Human Decision Processes, 38(1),
92–113.
Barth, V. L., Thiel, F., & Ophardt, D. (2018). Professionelle Wahrnehmung von
Unterrichtsstörungen: Konzeption einer videobasierten Fallanalyse mit offenem
Antwortformat [Professional vision of classroom disruptions: Development of a
video-based case analysis with open questions]. In A. Krüger, F. Radisch, T. Häcker, &
M. Walm (Eds.). Empirische Bildungsforschung im Kontext von Schule und Lehrer*innenbildung [Empirical education research in the context of school and teacher education] (pp. 141–153). Bad Heilbrunn: Julius Klinkhardt.
Blomberg, G., Renkl, A., Sherin, M., Borko, H., & Seidel, T. (2013). Five research-based
heuristics for using video in pre-service teacher education. Journal of Educational
Research Online, 5(1), 90–114.
Blömeke, S., Gustafsson, J.-E., & Shavelson, R. J. (2015). Beyond dichotomies: Competence
viewed as a continuum. Zeitschrift für Psychologie, 223(1), 3–13.
Borko, H., Whitcomb, J., & Liston, D. (2009). Wicked problems and other thoughts on
issues of technology and teacher learning. Journal of Teacher Education, 60(1), 3–7.
Carless, D., Chan, K. K. H., To, J., Lo, M., & Barrett, E. (2018). Developing students'
capacities for evaluative judgement through analysing exemplars. In D. Boud, R.
Ajjawi, P. Dawson, & J. Tai (Eds.). Developing evaluative judgement in higher education:
Assessment for knowing and producing quality work. London, UK: Routledge.
Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide.
The Journal of the Learning Sciences, 6(3), 271–315.
Clarke, A., Triggs, V., & Nielsen, W. (2014). Cooperating teacher participation in teacher
education: A review of the literature. Review of Educational Research, 84(2), 163–202.
Derry, S., Sherin, M., & Sherin, B. (2014). Multimedia learning with video. In R. Mayer
(Ed.). The Cambridge handbook of multimedia learning. Cambridge, UK: Cambridge
University Press.
Dobie, T. E., & Anderson, E. R. (2015). Interaction in teacher communities: Three forms
teachers use to express contrasting ideas in video clubs. Teaching and Teacher
Education, 47, 230–240.
Ericsson, K. A. (2004). Deliberate practice and the acquisition and maintenance of expert
performance in medicine and related domains. Academic Medicine, 79(10), 70–81.
Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice
in the acquisition of expert performance. Psychological Review, 100(3), 363–406.
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41, 1149–1160.
Fisher, D., Frey, N., & Lapp, D. (2011). Coaching middle-level teachers to think aloud
improves comprehension instruction and student reading achievement. The Teacher
Educator, 46(3), 231–243.
Fleiss, J. L., & Cohen, J. (1973). The equivalence of weighted kappa and the intraclass
correlation coefficient as measures of reliability. Educational and Psychological
Measurement, 33(3), 613–619.
Fukkink, R. G., & Tavecchio, L. W. C. (2010). Effects of video interaction guidance on
early childhood teachers. Teaching and Teacher Education, 26(8), 1652–1659.
Gan, M. (2011). The effects of prompts and explicit coaching on peer feedback quality (Unpublished doctoral dissertation). New Zealand: University of Auckland.
Gan, M. J. S., & Hattie, J. (2014). Prompting secondary students‘ use of criteria, feedback
specificity and feedback levels during an investigative task. Instructional Science,
42(6), 861–878.
Gaudin, C., & Chaliès, S. (2015). Video viewing in teacher education and professional
development: A literature review. Educational Research Review, 16, 41–67.
Gielen, M., & De Wever, B. (2015). Structuring peer assessment: Comparing the impact of
the degree of structure on peer feedback content. Computers in Human Behavior, 52,
315–325.
Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the
effectiveness of peer feedback for learning. Learning and Instruction, 20(4), 304–315.
Gold, B., & Holodynski, M. (2017). Using digital video to measure the professional vision
of elementary classroom management: Test validation and methodological chal-
lenges. Computers & Education, 107, 13–30.
Gregory, A., Ruzek, E., Hafen, C. A., Mikami, A. Y., Allen, J. P., & Pianta, R. C. (2017). My
teaching partner-secondary: A video-based coaching model. Theory Into Practice,
56(1), 38–45.
C.N. Prilop, et al. Computers in Human Behavior 102 (2020) 120–131
129
Gröschner, A., Schindler, A.-K., Holzberger, D., Alles, M., & Seidel, T. (2018). How sys-
tematic video reflection in teacher professional development regarding classroom
discourse contributes to teacher and student self-efficacy. International Journal of
Educational Research, 90, 223–233.
Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching, re-imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2),
273–289.
Grossman, P., & McDonald, M. (2008). Back to the future: Directions for research in
teaching and teacher education. American Educational Research Journal, 45(1),
184–205.
Hafner, J., & Hafner, P. (2003). Quantitative analysis of the rubric as an assessment tool:
An empirical study of student peer‐group rating. International Journal of Science
Education, 25(12), 1509–1528.
Hammerness, K. M., Darling-Hammond, L., Bransford, J., Berliner, D. C., Cochran-Smith,
M., McDonald, M., et al. (2005). How teachers learn and develop. In L. Darling-
Hammond, J. Bransford, P. LePage, K. Hammerness, & H. Duffy (Eds.). Preparing
teachers for a changing world: What teachers should learn and be able to do (pp. 358–
389). San Francisco, CA: Jossey-Bass.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research,
77(1), 81–112.
Heemsoth, T., & Kleickmann, T. (2018). Learning to plan self-controlled physical education: Good vs. problematic teaching examples. Teaching and Teacher Education, 71,
168–178.
Hellermann, C., Gold, B., & Holodynski, M. (2015). Förderung von
Klassenführungsfähigkeiten im Lehramtsstudium. Die Wirkung der Analyse eigener
und fremder Unterrichtsvideos auf das strategische Wissen und die professionelle
Wahrnehmung [Fostering classroom management skills in teacher education: Effects
of analysis of one's own or other teachers‘ classroom videos on strategic knowledge
and professional vision]. Zeitschrift für Entwicklungspsychologie und Pädagogische
Psychologie, 47(2), 97–109.
Hixon, E., & So, H.-J. (2009). Technology's role in field experiences for preservice teacher
training. Educational Technology & Society, 12(4), 294–304.
Hollingsworth, H., & Clarke, D. (2017). Video as a tool for focusing teacher self-reflection:
Supporting and provoking teacher learning. Journal of Mathematics Teacher Education,
20(5), 457–475.
Huck, S. W., & McLean, R. A. (1975). Using a repeated measures ANOVA to analyze the
data from a pretest-posttest design: A potentially confusing task. Psychological
Bulletin, 82(4), 511–518.
Huppertz, P., Massler, U., & Plötzner, R. (2005). V-share: Video-based analysis and re-
flection of teaching experiences in virtual groups. Proceedings of the international
conference on computer support for collaborative learning (pp. 245–253). Mahwah, NJ:
Lawrence Erlbaum Associates.
Joyce, B., & Showers, B. (1996). Staff development as a comprehensive service organi-
sation. Journal of Staff Development, 17(1), 2–6.
Joyce, B. R., & Showers, B. (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: ASCD.
Kang, H., & Van Es, E. A. (2018). Articulating design principles for productive use of video
in preservice education. Journal of Teacher Education, 1–14.
Kersting, N. (2008). Using video clips of mathematics classroom instruction as item
prompts to measure teachers’ knowledge of teaching mathematics. Educational and
Psychological Measurement, 68(5), 845–861.
Kleinknecht, M., & Gröschner, A. (2016). Fostering preservice teachers’ noticing with
structured video feedback: Results of an online- and video-based intervention study.
Teaching and Teacher Education, 59, 45–56.
Kleinknecht, M., & Schneider, J. (2013). What do teachers think and feel when analyzing
videos of themselves and other teachers teaching? Teaching and Teacher Education, 33,
13–23.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance:
A historical review, a meta-analysis and a preliminary feedback intervention theory.
Psychological Bulletin, 119(2), 254–284.
Körkkö, M., Morales Rios, S., & Kyrö-Ämmäla, O. (2019). Using a video app as a tool for
reflective practice. Educational Research, 61(1), 22–37.
Kounin, J. S. (1970). Discipline and group management in classrooms. New York, NY: Holt,
Rinchart & Winston.
Kraft, M. A., Blazar, D., & Hogan, D. (2018). The effect of teacher coaching on instruction
and achievement: A meta-analysis of the causal evidence. Review of Educational
Research, 88(4), 547–588.
Krammer, K., Hugener, I., Biaggi, S., Frommelt, M., Fürrer Auf der Maur, G., & Stürmer, K.
(2016). Videos in der Ausbildung von Lehrkräften: Förderung der professionellen
Unterrichtswahrnehmung durch die Analyse von eigenen und fremden Videos
[Classroom videos in initial teacher education: Fostering professional vision by
analysing one's own and other teachers‘ videos]. Unterrichtswissenschaft, 44(4),
357–372.
Lee, G. C., & Wu, C.-C. (2006). Enhancing the teaching experience of pre-service teachers
through the use of videos in web-based computer-mediated communication (CMC).
Innovations in Education & Teaching International, 43(4), 369–380.
Lu, H.-L. (2010). Research on peer-coaching in preservice teacher education – a review of
literature. Teaching and Teacher Education, 26(4), 748–753.
Major, L., & Watson, S. (2018). Using video to support in-service teacher professional
development: The state of the field, limitations and possibilities. Technology, Pedagogy
and Education, 27(1), 49–68.
Malewski, E., Phillion, J., & Lehman, J. D. (2005). A Freirian framework for technology-
based virtual field experiences. Contemporary Issues in Technology and Teacher
Education, 4(4), 410–428.
Matsumura, L. C., Garnier, H. E., & Spybrook, J. (2013). Literacy coaching to improve
student reading achievement: A multi-level mediation model. Learning and Instruction,
25, 35–48.
Maxwell, S. E., & Howard, G. S. (1981). Change scores—necessarily anathema? Educational and Psychological Measurement, 41(3).
Moreno, R., & Valdez, A. (2007). Immediate and delayed effects of using a classroom case
exemplar in teacher education: The role of presentation format. Journal of Educational
Psychology, 99(1), 194–206.
Narciss, S. (2013). Designing and evaluating tutoring feedback strategies for digital
learning environments on the basis of the interactive feedback model. Digital
Education Review, 23(1), 7–26.
Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated
learning: A model and seven principles of good feedback practice. Studies in Higher
Education, 31(2), 199–218.
O'Brien, R. G., & Kaiser, M. K. (1985). MANOVA method for analyzing repeated measures
designs: An extensive primer. Psychological Bulletin, 97(2), 316–333.
Peters, O., Körndle, H., & Narciss, S. (2018). Effects of a formative assessment script on
how vocational students generate formative assessment feedback to a peer's or their
own performance. European Journal of Psychology of Education, 33(1), 117–143.
Prins, F., Sluijsmans, D., & Kirschner, P. A. (2006). Feedback for general practitioners in
training: Quality, styles and preferences. Advances in Health Sciences Education, 11,
289–303.
Questback (2017). Unipark - EFS survey, version summer 2017. Köln: Questback GmbH.
Rich, P., & Hannafin, M. J. (2008). Capturing and assessing evidence of student teacher
inquiry: A case study. Teaching and Teacher Education, 24(6), 1426–1440.
Rotsaert, T., Panadero, E., Schellens, T., & Raes, A. (2018). ‘Now you know what you’re
doing right or wrong!’ Peer feedback quality in synchronous peer assessment in
secondary education. European Journal of Psychology of Education, 33(2), 255–275.
Sailors, M., & Price, L. (2015). Support for the improvement of practices through intensive
coaching (SIPIC): A model of coaching for improving reading instruction and reading
achievement. Teaching and Teacher Education, 45, 115–127.
Santagata, R., & Angelici, G. (2010). Studying the impact of the lesson analysis framework
on preservice teachers' abilities to reflect on videos of classroom teaching. Journal of
Teacher Education, 61(4), 339–349.
Santagata, R., & Guarino, J. (2011). Using video to teach future teachers to learn from
teaching. ZDM, 43(1), 133–145.
Schmider, E., Ziegler, M., Danay, E., Beyer, L., & Bühner, M. (2010). Is it really robust?
Reinvestigating the robustness of ANOVA against violations of the normal distribu-
tion assumption. Methodology: European Journal of Research Methods for the
Behavioural and Social Sciences, 6(4), 147–151.
Seidel, T., & Stürmer, K. (2014). Modelling and measuring the structure of professional
vision in pre-service teachers. American Educational Research Journal, 51(4), 739–771.
Seidel, T., Stürmer, K., Blomberg, G., Kobarg, M., & Schwindt, K. (2011). Teacher learning
from analysis of videotaped classroom situations: Does it make a difference whether
teachers observe their own teaching or that of others? Teaching and Teacher Education,
27(2), 259–267.
Sharpe, L., Hu, C., Crawford, L., Gopinathan, S., Khine, M. S., Moo, S. N., et al. (2003).
Enhancing multipoint desktop video conferencing (MDVC) with lesson video clips:
Recent developments in pre-service teaching practice in Singapore. Teaching and
Teacher Education, 19(5), 529–541.
Sherin, M. (2007). New perspectives on the role of video in teacher education. In J.
Brophy (Ed.). Advances in research on teaching (pp. 1–27). Bingley, UK: Emerald.
Sluijsmans, D. M. A., Brand-Gruwel, S., & Van Merriënboer, J. J. G. (2002). Peer as-
sessment training in teacher education: Effects on performance and perceptions.
Assessment & Evaluation in Higher Education, 27(5), 443–454.
Sluijsmans, D. M. A., Brand-Gruwel, S., Van Merriënboer, J. J. G., & Bastiaens, T. J.
(2003). The training of peer assessment skills to promote the development of re-
flection skills in teacher education. Studies In Educational Evaluation, 29(1), 23–42.
Sluijsmans, D. M. A., Brand-Gruwel, S., Van Merriënboer, J. J. G., & Martens, R. L. (2004).
Training teachers in peer-assessment skills: Effects on performance and perceptions.
Innovations in Education & Teaching International, 41(1), 59–78.
So, W. W., Pow, J. W., & Hung, V. H. (2009). The interactive use of a video database in
teacher education: Creating a knowledge base for teaching through a learning com-
munity. Computers & Education, 53(3), 775–786.
Topping, K. J. (2017). Peer assessment: Learning by judging and discussing the work of
other learners. Interdisciplinary Education and Psychology, 1(1), 1–17.
Tripp, T. R., & Rich, P. J. (2012a). The influence of video analysis on the process of
teacher change. Teaching and Teacher Education, 28(5), 728–739.
Tripp, T. R., & Rich, P. J. (2012b). Using video to analyse one’s own teaching. British
Journal of Educational Technology, 43(4), 678–704.
Tschannen-Moran, M., & McMaster, P. (2009). Sources of self-efficacy: Four professional
development formats and their relationship to self-efficacy and implementation of a
new teaching strategy. The Elementary School Journal, 110(2), 228–245.
Tschannen-Moran, M., Woolfolk Hoy, A., & Hoy, W. K. (1998). Teacher efficacy: Its
meaning and measure. Review of Educational Research, 68(2), 202–248.
Valencia, S. W., Martin, S. D., Place, N. A., & Grossman, P. (2009). Complex interactions
in student teaching: Lost opportunities for learning. Journal of Teacher Education,
60(3), 304–322.
Van Steendam, E., Rijlaarsdam, G., Sercu, L., & Van den Bergh, H. (2010). The effect of
instruction type and dyadic or individual emulation on the quality of higher-order
peer feedback in EFL. Learning and Instruction, 20(4), 316–327.
Van den Bogert, N., Van Bruggen, J., Kostons, D., & Jochems, W. (2014). First steps into
understanding teachers‘ visual perception of classroom events. Teaching and Teacher
Education, 37, 208–216.
Vogt, F., & Rogalla, M. (2009). Developing adaptive teaching competency through coaching. Teaching and Teacher Education, 25(8), 1051–1060.
Weber, K. E., Gold, B., Prilop, C. N., & Kleinknecht, M. (2018). Promoting pre-service
teachers’ professional vision of classroom management during practical school
training: Effects of a structured online- and video-based self-reflection and feedback
intervention. Teaching and Teacher Education, 76, 39–49.
Wolff, C. E. (2015). Revisiting ‘withitness’: Differences in teachers' representations, perceptions
and interpretations of classroom management. Heerlen, Netherlands: Open University of
the Netherlands.
Wolff, C. E., Jarodzka, H., & Boshuizen, H. (2017). See and tell: Differences between
expert and novice teachers’ interpretations of problematic classroom management
events. Teaching and Teacher Education, 66(1), 295–308.
Wolff, C. E., Van den Bogert, N., Jarodzka, H., & Boshuizen, H. (2015). Keeping an eye on
learning: Differences between expert and novice teachers' representations of class-
room management events. Journal of Teacher Education, 66(1), 68–85.
Wu, C.-C., & Kao, H.-C. (2008). Streaming videos in peer assessment to support training pre-service teachers. Educational Technology & Society, 11(1), 45–55.
Zottmann, J. M., Stegmann, K., Strijbos, J.-W., Vogel, F., Wecker, C., & Fischer, F. (2013).
Computer-supported collaborative learning with digital video cases in teacher edu-
cation: The impact of teaching experience on knowledge convergence. Computers in
Human Behavior, 29(5), 2100–2108.
16Action Research Study ReportInsert Your Na
 
16 action research study reportinsert your na
16 action research study reportinsert your na16 action research study reportinsert your na
16 action research study reportinsert your na
 

More from Sisercom SAC

Wsudiantes universitarios sobre retroalimentacion
Wsudiantes universitarios sobre retroalimentacionWsudiantes universitarios sobre retroalimentacion
Wsudiantes universitarios sobre retroalimentacionSisercom SAC
 
Videos de animacion
Videos de animacionVideos de animacion
Videos de animacionSisercom SAC
 
Trabajo en pares uniersitarios
Trabajo en pares uniersitariosTrabajo en pares uniersitarios
Trabajo en pares uniersitariosSisercom SAC
 
Retroalimentacion visual sobre rendiemiento
Retroalimentacion visual sobre rendiemientoRetroalimentacion visual sobre rendiemiento
Retroalimentacion visual sobre rendiemientoSisercom SAC
 
Retroalimentacion verbal
Retroalimentacion verbalRetroalimentacion verbal
Retroalimentacion verbalSisercom SAC
 
Retroalimentacion de videos de pares
Retroalimentacion de videos de paresRetroalimentacion de videos de pares
Retroalimentacion de videos de paresSisercom SAC
 
Retroalimentacion con papas
Retroalimentacion con papasRetroalimentacion con papas
Retroalimentacion con papasSisercom SAC
 
La retroalimentacion en video
La retroalimentacion en videoLa retroalimentacion en video
La retroalimentacion en videoSisercom SAC
 
Fukkink2011 article video_feedbackineducationandtra
Fukkink2011 article video_feedbackineducationandtraFukkink2011 article video_feedbackineducationandtra
Fukkink2011 article video_feedbackineducationandtraSisercom SAC
 
Efectos de retroaluimentacion
Efectos de retroaluimentacionEfectos de retroaluimentacion
Efectos de retroaluimentacionSisercom SAC
 
Efectos de la retroalimentcion verbal
Efectos de la retroalimentcion verbalEfectos de la retroalimentcion verbal
Efectos de la retroalimentcion verbalSisercom SAC
 
Analisis de los videos y percepiones de los docentes
Analisis de los videos y percepiones de los docentesAnalisis de los videos y percepiones de los docentes
Analisis de los videos y percepiones de los docentesSisercom SAC
 
10.1080@09588221.2019.1677721
10.1080@09588221.2019.167772110.1080@09588221.2019.1677721
10.1080@09588221.2019.1677721Sisercom SAC
 

More from Sisercom SAC (16)

Videos
VideosVideos
Videos
 
Wsudiantes universitarios sobre retroalimentacion
Wsudiantes universitarios sobre retroalimentacionWsudiantes universitarios sobre retroalimentacion
Wsudiantes universitarios sobre retroalimentacion
 
Videos de animacion
Videos de animacionVideos de animacion
Videos de animacion
 
Trabajo en pares uniersitarios
Trabajo en pares uniersitariosTrabajo en pares uniersitarios
Trabajo en pares uniersitarios
 
Retroalimentacion visual sobre rendiemiento
Retroalimentacion visual sobre rendiemientoRetroalimentacion visual sobre rendiemiento
Retroalimentacion visual sobre rendiemiento
 
Retroalimentacion verbal
Retroalimentacion verbalRetroalimentacion verbal
Retroalimentacion verbal
 
Retroalimentacion de videos de pares
Retroalimentacion de videos de paresRetroalimentacion de videos de pares
Retroalimentacion de videos de pares
 
Retroalimentacion con papas
Retroalimentacion con papasRetroalimentacion con papas
Retroalimentacion con papas
 
Peek asa2019
Peek asa2019Peek asa2019
Peek asa2019
 
La retroalimentacion en video
La retroalimentacion en videoLa retroalimentacion en video
La retroalimentacion en video
 
Fukkink2011 article video_feedbackineducationandtra
Fukkink2011 article video_feedbackineducationandtraFukkink2011 article video_feedbackineducationandtra
Fukkink2011 article video_feedbackineducationandtra
 
Efectos de retroaluimentacion
Efectos de retroaluimentacionEfectos de retroaluimentacion
Efectos de retroaluimentacion
 
Efectos de la retroalimentcion verbal
Efectos de la retroalimentcion verbalEfectos de la retroalimentcion verbal
Efectos de la retroalimentcion verbal
 
Eduacion fisica
Eduacion fisicaEduacion fisica
Eduacion fisica
 
Analisis de los videos y percepiones de los docentes
Analisis de los videos y percepiones de los docentesAnalisis de los videos y percepiones de los docentes
Analisis de los videos y percepiones de los docentes
 
10.1080@09588221.2019.1677721
10.1080@09588221.2019.167772110.1080@09588221.2019.1677721
10.1080@09588221.2019.1677721
 

Recently uploaded

A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docxPoojaSen20
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...M56BOOKSTORE PRODUCT/SERVICE
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationnomboosow
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️9953056974 Low Rate Call Girls In Saket, Delhi NCR
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxNirmalaLoungPoorunde1
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Sapana Sha
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesFatimaKhan178732
 

Recently uploaded (20)

A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
mini mental status format.docx
mini    mental       status     format.docxmini    mental       status     format.docx
mini mental status format.docx
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
call girls in Kamla Market (DELHI) 🔝 >༒9953330565🔝 genuine Escort Service 🔝✔️✔️
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111Call Girls in Dwarka Mor Delhi Contact Us 9654467111
Call Girls in Dwarka Mor Delhi Contact Us 9654467111
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 

1 s2.0-s0747563219302985-main (1)

Effects of digital video-based feedback environments on pre-service teachers' feedback competence

Christopher Neil Prilop*, Kira Elena Weber, Marc Kleinknecht
Leuphana University Lüneburg, Institute of Educational Science, Universitätsallee 1, 21335 Lüneburg, Germany

Keywords: Pre-service teacher education; Digital learning environments; Video; Intervention; Feedback; Practicum

Abstract

The present study investigates the added value of blended digital video-based feedback environments in fostering pre-service teachers' feedback competence on teaching during a practicum. Pre-service teachers practised providing their peers with feedback on their classroom management in traditional face-to-face feedback sessions (control group, n = 65) or blended digital video-based environments with an expert present (V-Feedback+, n = 22) or without (V-Feedback, n = 16). Before and after the practicum, a video-based tool was applied that required pre-service teachers to provide written feedback to a teacher concerning fictitious classroom events. Written feedbacks were analysed by applying quantitative content analysis. Feedback competence was assessed with six categories: assessment criteria; specificity; suggestions; questions; first person; and positive/negative emphasis. This study demonstrated that digital video-based environments can elicit stronger effects than traditional face-to-face settings, especially when combined with expert feedback. Results from the pre- and post-tests revealed that V-Feedback and V-Feedback+ participants provided more specific feedback than the control group. V-Feedback+ participants provided feedback containing more high quality suggestions than the V-Feedback group.
This study illustrates how pre-service teachers' feedback competence can be fostered in teaching practicums using digital video-based feedback environments.

1. Introduction

Actively seeking peer feedback on one's own teaching is considered essential to acquiring teaching expertise (Hammerness et al., 2005). Feedback can be obtained by inviting colleagues into one's classroom to observe and reflect on one's teaching practice. These feedback and reflection sessions help teachers "learn, grow and change" (Joyce & Showers, 1996, p. 12) and, thus, become expert teachers. Hammerness et al. (2005) emphasise that feedback needs to become a continuous activity in the teaching profession. However, research in other domains has shown that feedback also can elicit detrimental effects on performance (Kluger & DeNisi, 1996). Therefore, the ability to provide feedback on classroom practices needs to be taught during teacher education. Pre-service teachers need to become competent providers of high-quality feedback that fosters reflection on classroom practice (Tripp & Rich, 2012a).

In pre-service teacher education, feedback and reflection sessions during practical school experiences offer an ecologically valid learning setting in which to acquire feedback competence. And yet, face-to-face feedback and reflection sessions rarely are realised (Valencia, Martin, Place, & Grossman, 2009), especially when experts are supposed to be present (Lee & Wu, 2006). However, digital video-based feedback and reflection environments can offer a solution. They increase opportunities for sessions because they make feedback and reflection sessions time- and location-independent (So, Pow, & Hung, 2009; Wu & Kao, 2008).
The educational research community has yet to focus on fostering the competence of providing feedback on classroom practice, so we investigated whether pre-service teachers' feedback competence increased more by participating in one of three different feedback and reflection environments in our practicum: a traditional face-to-face feedback and reflection format; a blended-learning setting that comprised face-to-face feedback and a digital video-based environment with peer and expert feedback; and a blended-learning setting with face-to-face feedback and a digital video-based environment with only peer feedback. With this approach, the present study broadens the perspective on the use of digital video-based environments in ecologically valid settings and provides a foundation for future research on fostering pre-service teachers' feedback competence on teaching practice.

https://doi.org/10.1016/j.chb.2019.08.011
Received 1 October 2018; Received in revised form 12 August 2019; Accepted 13 August 2019
⁎ Corresponding author. Institute of Educational Science, Universitätsallee 1, C1.207, 21335, Lüneburg, Germany.
E-mail addresses: prilop@leuphana.de (C.N. Prilop), kweber@leuphana.de (K.E. Weber), marc.kleinknecht@leuphana.de (M. Kleinknecht).
Computers in Human Behavior 102 (2020) 120–131. Available online 17 August 2019. 0747-5632/ © 2019 Elsevier Ltd. All rights reserved.
1.1. Feedback and feedback sessions

Feedback is considered one of the most powerful factors in promoting achievement in a variety of contexts (Hattie & Timperley, 2007; Narciss, 2013). It provides individuals with information about their current performance to help them improve and reach desired standards (Narciss, 2013). Studies concerning expertise show that feedback is essential to improve performance. Ericsson, Krampe, and Tesch-Römer (1993, p. 367) assert that in the "absence of adequate feedback efficient learning is impossible and improvement only minimal even for highly motivated subjects". Regarding pre- and in-service teachers, this means that merely practicing teaching is not sufficient to become an expertly skilled educator. They need to evaluate their own teaching in coordination with colleagues to learn from each other. Consequently, Hammerness et al. (2005) assert that teachers actively need to seek feedback to develop teaching expertise.

Such feedback occasions increasingly are being incorporated into pre- and in-service teacher education (Kleinknecht & Gröschner, 2016; Joyce & Showers, 2002; Kraft, Blazar, & Hogan, 2018). Feedback sessions take place after observing a teacher's lesson or specific skills training. They can involve either experts who possess more advanced knowledge than the teacher or peers who share a similar level of teaching expertise (Lu, 2010). Feedback sessions "stimulate reflection" (Hammerness et al., 2005, p. 380) and, thus, cause "a self-critical, investigative process wherein teachers consider the effect of their pedagogical decisions on their situated practice with the aim of improving those practices" (Tripp & Rich, 2012a, p. 678).
A growing body of research (e.g., Allen, Hafen, Gregory, Mikami, & Pianta, 2015; Weber, Gold, Prilop, & Kleinknecht, 2018; Fisher, Frey, & Lapp, 2011; Matsumura, Garnier, & Spybrook, 2013; Sailors & Price, 2015; Tschannen-Moran & McMaster, 2009; Vogt & Rogalla, 2009) has confirmed the substantial effects from feedback sessions on teacher knowledge, practice, beliefs and, consequently, student achievement. However, in different domains, Kluger and DeNisi (1996) established that receiving feedback does not necessarily lead to improved performance, i.e., fostering expertise requires high-quality feedback (Ericsson, 2004). Thus, to develop the ability to provide high-quality feedback concerning teaching situations productively, pre-service teachers need to acquire this competence.

Feedback competence in teacher assessment can be defined as the skill to convey critical assessments of a teacher's classroom practice to initiate reflection (Hammerness et al., 2005). After providing a criteria-based evaluation of a teaching performance and identifying possible strengths and weaknesses, the feedback provider needs to communicate this information to her or his fellow teacher constructively (Sluijsmans, Brand-Gruwel, Van Merriënboer, & Bastiaens, 2003). Prins, Sluijsmans, and Kirschner (2006) analysed what distinguishes expert feedback from novice feedback. They established that experts make more use of criteria, provide more situation-specific comments, and more frequently use a first-person perspective style. Additionally, they found that novices prefer being provided with feedback that contains many reflective questions, examples and suggestions for improvement.

1.2. Feedback quality

Competence in providing feedback commonly is assessed by analysing feedback quality (e.g., Prins et al., 2006).
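Quality measures of this kind are typically produced by human coders applying a category system to each written feedback message. As a toy illustration only — the keyword patterns below are invented for this sketch and are not the coding manual used by Prins et al. (2006) or by the present study — surface markers of feedback style such as first-person phrasing, reflective questions and suggestions for improvement could be tallied like this:

```python
import re

# Hypothetical keyword heuristics for illustration; real quantitative
# content analysis relies on trained raters and a validated codebook.
CATEGORY_PATTERNS = {
    "first_person": re.compile(r"\bI\b|\bmy\b", re.IGNORECASE),
    "question": re.compile(r"\?"),
    "suggestion": re.compile(r"\b(?:could|should|try|suggest)\b", re.IGNORECASE),
    "positive": re.compile(r"\b(?:good|well|effective)\b", re.IGNORECASE),
    "negative": re.compile(r"\b(?:unclear|problem|weak)\b", re.IGNORECASE),
}

def code_feedback(comment: str) -> dict:
    """Return per-category hit counts for one written feedback comment."""
    return {name: len(pattern.findall(comment))
            for name, pattern in CATEGORY_PATTERNS.items()}

comment = ("I liked how you stopped the lesson when the noise level rose. "
           "Could you try a silent signal next time? That worked well for me.")
counts = code_feedback(comment)
```

In the study itself, written feedback was rated manually; an automatic tally like this could at most serve as a rough screening aid before human coding.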
However, extant research on fostering peer feedback quality has focussed on school students (e.g., Gan & Hattie, 2014; Rotsaert, Panadero, Schellens, & Raes, 2018) or content-related tasks in higher education (e.g., M. Gielen & De Wever, 2015; Peters, Körndle, & Narciss, 2018). Only a few have dealt with the effects of pre-service teachers' peer feedback on mathematical or writing tasks (e.g., Alqassab, Strijbos, & Ufer, 2018; Sluijsmans, Brand-Gruwel, Van Merriënboer, & Martens, 2004; Sluijsmans et al., 2003). Furthermore, to our knowledge, no extant studies have investigated how to promote pre-service teachers' feedback quality concerning teaching practice.

Generally, feedback's efficacy is determined through three facets: content, function and presentation (Narciss, 2013). Following Hattie and Timperley (2007), feedback needs to answer three questions: Where am I going? (Feed Up); How am I going? (Feed Back); and Where to next? (Feed Forward). No clear consensus exists as to how feedback quality can be measured accurately (S. Gielen, Peeters, Dochy, Onghena, & Struyven, 2010). While some studies have examined feedback according to validity and reliability (Hafner & Hafner, 2003; Van Steendam, Rijlaarsdam, Sercu, & Van den Bergh, 2010), most applied abstract classifications to measure feedback quality. Abstract classifications offer the advantage that generic knowledge can be measured. Such measures then can be applied in a multitude of feedback situations without being limited by domain- or task-specific aspects (S. Gielen et al., 2010). As teaching situations do not entail clear-cut solutions, an assessment in terms of validity and reliability would seem nearly impossible. Therefore, pre-service teachers' feedback quality concerning teaching practice was evaluated in terms of content and/or style criteria in this study.

Extant studies that have measured feedback quality (Prins et al., 2006; S. Gielen et al., 2010; M.
Gielen & De Wever, 2015) largely have been based on a set of criteria originally suggested by Sluijsmans, Brand-Gruwel, and Van Merriënboer (2002). First, feedback comments need to be appropriate for the specific context, i.e., the assessor needs to be able to evaluate a performance based on defined criteria (Feed Up). Second, the assessor must be able to explain his or her judgements and highlight specific examples (S. Gielen et al., 2010). Third, feedback must contain constructive suggestions, which are part of feedback's tutoring component. They provide learners with additional information besides evaluative aspects. This can include knowledge about task constraints, concepts, mistakes, how to proceed or teaching strategies (Narciss, 2013). Explaining one's judgements can be viewed as Hattie and Timperley's (2007) concept of Feed Back, whereas suggestions can be compared with Feed Forward. Fourth, feedback messages should contain 'thought-provoking questions' (S. Gielen et al., 2010, p. 307) that aim to enhance individuals' active engagement (Nicol & Macfarlane-Dick, 2006). Fifth, M. Gielen and De Wever (2015) have determined that feedback messages should contain positive and negative comments. Both can enhance future performances (Bandura & Cervone, 1986; Kluger & DeNisi, 1996). Finally, high-quality feedback should be written in the first person, with a clear structure and formulations (Prins et al., 2006).

1.3. Feedback environments

A growing demand exists for ecologically valid practice environments in the field of teacher education, with research showing that pre-service teachers need to develop situation-specific skills through which to apply their professional knowledge effectively (Blömeke, Gustafsson, & Shavelson, 2015; Grossman, Hammerness, & McDonald, 2009). Teaching practicums offer such an authentic setting in which to practise feedback on teaching.
Pre-service teachers need to be able to practise skills repeatedly to develop fluidity (Grossman & McDonald, 2008). This can be achieved during practicums through pre-service teachers observing their peers' classroom interactions, then providing them with feedback. Feedback sessions during the practicum, as well as the practicum itself, can be viewed as "approximations of practice" (Grossman & McDonald, 2008, p. 190). However, although they are highly authentic approximations, feedback sessions still "rely on decomposition" (Grossman et al., 2009, p. 2091), as mentors or university supervisors set a specific focus on feedback. Simultaneously, these experts take up the role of "modelers of practice" (Clarke, Triggs, & Nielsen, 2014, p. 177). As pre-service teachers tend to imitate their practice in the classroom (Clarke et al., 2014, p. 177), this also can be expected from feedback sessions. Although feedback can be viewed as essential in developing expertise, feedback sessions and, thus, the
option to practise providing feedback are limited in teaching practicums (Lee & Wu, 2006; Valencia et al., 2009).

This is largely due to time and location constraints, especially when experts are supposed to be part of the process (Lee & Wu, 2006). Pre-service teachers usually are assigned to different schools or to different classes, making classroom observations and feedback sessions difficult to implement. Furthermore, feedback sessions with pre-service teachers at remote schools might not be provided due to a lack of resources (Hixon & So, 2009). Digital video-based feedback environments can resolve such logistical issues because student teachers can participate online, regardless of time or location (Wu & Kao, 2008, p. 45). Consequently, recent studies have applied digital video-based environments to feedback sessions in teacher education (e.g., Kleinknecht & Gröschner, 2016; Weber et al., 2018; Gregory et al., 2017; So et al., 2009; Wu & Kao, 2008). On balance, the advantages of these environments outweigh the disadvantages (Hixon & So, 2009). Being able to interact remotely, and at different times, leads to higher interaction frequency between participants and experts, allowing for "opportunities to learn and to emulate each other" (Wu & Kao, 2008, p. 54). Moreover, digital video-based environments enable pre-service teachers to observe a variety of effective and ineffective teaching practices, eliciting a more comprehensive understanding of classroom reality (So et al., 2009, p. 783). Wu and Kao (2008) also showed that by being able to select particular teaching situations, discussions and reflections were more focussed. However, selecting specific teaching situations also has been criticised. One study's (Sharpe et al., 2003) participants found video clips to be artificial because pre-service teachers seemed to share only positive events.
Furthermore, when all pre-service teachers and experts interact solely through a digital video-based environment, feedback sessions can become impersonal. For this reason, Malewski, Phillion, and Lehman (2005) added traditional classroom visits along with digital environments. Ultimately, digital video-based feedback environments can be viewed as approximations of practice that carry a slightly lower degree of authenticity compared with face-to-face settings. They are not conducted in real time and do not contain a complete representation of the lesson. Consequently, the use of video provides pre-service teachers with additional time for reflection and the opportunity to prepare their feedback (Grossman et al., 2009, pp. 2079–2083).

1.4. Video as a tool in (pre-service) teacher education

In their recent review of 250 studies, Gaudin and Chaliès (2015) determined that classroom videos increasingly are being used to support teacher education and professional development worldwide. Video-based feedback and reflection have become a standard component of digital practicum environments (e.g., Kleinknecht & Gröschner, 2016; Hixon & So, 2009; Lee & Wu, 2006; So et al., 2009), as well as settings that entail face-to-face in- and pre-service teacher education (e.g., Dobie & Anderson, 2015; Fukkink & Tavecchio, 2010; Hollingsworth & Clarke, 2017; Rich & Hannafin, 2008; Tripp & Rich, 2012a, 2012b).

Incorporating classroom videos offers many advantages, as well as potential disadvantages. Classroom videos are authentic representations of teaching events that capture the complexity of teaching processes to a high degree (Borko, Whitcomb, & Liston, 2009). Sequences can be watched repeatedly, making it possible to revisit and examine specific situations with different foci (Sherin, 2007). Videos of classroom practice act as situated stimuli for eliciting knowledge about teaching and learning (Kersting, 2008; Seidel & Stürmer, 2014).
Furthermore, analysis of classroom videos has been shown to lead to high activation, immersion, resonance and motivation (Kleinknecht & Schneider, 2013; Seidel, Stürmer, Blomberg, Kobarg, & Schwindt, 2011). However, classroom videos also entail potential constraints. Although classroom videos can be considered rich representations of teaching interactions, they offer less contextual information than live observations (Sherin, 2007). Körkkö, Morales Rios, and Kyrö-Ämmälä (2019) emphasise that video sequences require contextualisation to convey the classroom's culture, atmosphere and environment. Furthermore, videos can lead to "attentional biases", i.e., only noticing limited aspects of classroom reality, or "cognitive overload", i.e., being overwhelmed by the density of information (Derry, Sherin, & Sherin, 2014, p. 787). To counteract these limitations, various researchers have formulated design principles for learning environments using video in teacher education (for a discussion of existing frameworks, see Kang & Van Es, 2018). To realise classroom videos' full potential, they need to be embedded in contextual information, such as a description of the class or the lesson's learning goals (Blomberg, Renkl, Sherin, Borko, & Seidel, 2013). This provides a first guiding scaffold for learners and adds information that is not transferred through the video. Furthermore, (pre-service) teachers have the opportunity to watch classroom videos repeatedly, and instructors simultaneously can direct (pre-service) teachers' attention to important aspects of the videotaped sequence, e.g., by setting a specific observation target (Derry et al., 2014, pp. 785–812). Moreover, setting such a target also can be part of a highly scaffolded learning environment that reduces the risk of cognitive overload (Kang & Van Es, 2018). When designing video-based learning environments, the video material's origins also need to be considered (Blomberg et al., 2013).
Extant studies either have applied classroom videos of (pre-service) teachers' own practice (own videos) or classroom videos of peers or unknown teachers (other videos) (Major & Watson, 2018). Studies (e.g., Kleinknecht & Schneider, 2013; Seidel et al., 2011) have analysed teachers' motivational and cognitive processes when working with own videos or other videos in depth, showing that videos of own teaching led to a higher degree of activation than classroom videos of other teaching. Higher activation is characterised by deeper engagement and involvement (immersion), being able to place oneself in the situation more easily (resonance) and continuous motivation. However, these studies also established that teachers using other videos improved their knowledge-based reasoning skills more: they analysed more critically and deduced more consequences and alternatives than the own-video group. Concerning pre-service teacher education, Santagata and Guarino (2011) emphasise that analysing videos of peers can increase motivation because pre-service teachers identify with their peers and find their classroom control more achievable. Furthermore, Krammer et al. (2016) compared the effects of own vs. other videos on pre-service teachers' professional vision and were able to show that both groups increased their professional vision significantly. In a slightly different study, Hellermann, Gold, and Holodynski (2015) analysed the effect of training pre-service teachers' professional vision skills with own videos or with own and other videos. Though both groups showed learning effects, the group with own and other videos improved their professional vision more. The authors attribute the larger improvements to more in-depth learning elicited by being faced with an inner (own video) and an outer perspective (other video).

1.5. Video-based feedback

Tripp and Rich (2012a) specifically investigated the impact of video-based reflection and feedback on in-service teacher change.
Their study indicates that feedback becomes more focussed when using video. Participants reported that the feedback they received was more specific, and suggestions were more relevant. They assessed that the feedback they previously had been provided with in learning environments without video support was too general. Furthermore, feedback sessions tended to be more dialogic. Video sequences elicited questions from the teachers providing feedback, as they wanted to understand the entire context. Hollingsworth and Clarke (2017) also found this effect to be present in video-based reflection and feedback for in-service teachers. Their study indicated that teachers perceived feedback sessions "as an opportunity for the teachers and researchers to discuss observations, analyses and reflections" (p. 471). This led to conversations that emphasised support, rather than a one-way transmission of information.
C.N. Prilop, et al. Computers in Human Behavior 102 (2020) 120–131 122
The findings by Tripp and Rich (2012a) and Hollingsworth and Clarke (2017) are confirmed in a review of 63 video-based studies (Tripp & Rich, 2012b). Additionally, the authors found that mentors or supervisors play a significant role in pre-service teachers' reflection on classroom videos. Participants said they trusted their supervisors' opinions more than their own. However, video support also seems to influence the feedback that inexperienced mentors provide. In a study by Rich and Hannafin (2008), some pre-service teachers ascertained that the face-to-face feedback that their mentors provided lacked structure, but when provided with video support, it was more specific and in-depth. Digital environments that apply video for feedback and reflection yielded similar findings. Lee and Wu (2006) and Wu and Kao (2008) created a digital environment in which pre-service teachers could watch videos of their own and peers' teaching. They were able to discuss their teaching practice with their peers and experienced teachers. Wu and Kao (2008) found that the digital video-based environment allowed "for more accurate and more probing reviews of teaching instances" (p. 54). Lee and Wu (2006) also emphasised that pre-service teachers received "more concrete feedback" (p. 379) in the digital video-based environment. In both studies, the authors ascertained that feedback concreteness was enhanced because pre-service teachers were able to mark on the videos the specific teaching situations that their feedback related to. This led So et al. (2009) to evaluate digital video-based environments as an ideal feedback and mentoring resource during teaching practicums.
Although these modern, high-tech environments have produced encouraging findings, it is unclear whether digital video-based feedback environments can foster pre-service teachers' feedback competence on teaching practice more effectively than face-to-face sessions.

1.6. Research questions

As high-quality feedback can significantly foster (pre-service) teachers' professional knowledge, teacher education needs to develop teachers' feedback competence early in their careers. Currently, practising feedback provision in authentic situations mostly is limited to face-to-face sessions, but face-to-face feedback sessions often are not a standard component or are limited in number during teaching practicums due to location and time constraints. Blended digital video-based feedback environments can be a solution to this. Therefore, we investigated whether pre-service teachers enhanced their feedback competence more by practising feedback in a traditional face-to-face setting (control group, CG) or in two blended settings that combined a digital video-based environment with face-to-face feedback during a teaching practicum: one with expert feedback (V-Feedback+) and one without expert feedback (V-Feedback). The following research questions were investigated:

1) To what extent does feedback practice during a practicum improve pre-service teachers' feedback competence (CG, V-Feedback, V-Feedback+)?
2) What impact do blended digital video-based environments (V-Feedback, V-Feedback+) have on pre-service teachers' feedback competence compared with the face-to-face condition (CG)?
3) How does expert feedback (V-Feedback+) influence pre-service teachers' feedback competence compared with the condition without expert feedback (V-Feedback)?
We assumed that pre-service teachers in all conditions (CG, V-Feedback, V-Feedback+) would show positive development in their feedback competence during the practicum, as they are provided with multiple opportunities to practise in an authentic environment. Moreover, we expected participants in the blended digital video-based feedback environments (V-Feedback, V-Feedback+) to provide more specific feedback containing more suggestions than the traditional face-to-face condition (CG). It can be presumed that participants would profit from practising in the sheltered online environment without real-time pressures. Finally, we hypothesised that pre-service teachers in the blended V-Feedback+ condition (expert feedback included) would increase their feedback competence more than members of the V-Feedback condition (without expert feedback) because the latter lacked a modeler of practice.

2. Methods

2.1. Design

Pre-service teachers participated in a quasi-experimental pre-post-design (see Fig. 1). The intervention was conducted during a four-week practicum. Pre-service teachers participated in either traditional face-to-face (CG), digital video-based feedback sessions (V-Feedback) or digital video-based feedback sessions with expert input (V-Feedback+). The pre- and post-tests comprised a video-based measure of pre-service teachers' feedback competence.

2.2. Participants

The study¹ was conducted with fourth-semester bachelor's students at a regional German university. Student teachers participated in a four-week teaching practicum at local schools, with 120 student teachers in practical placement. Only participants who completed the pre- and post-tests were included in the sample. Consequently, a limited number had to be excluded, resulting in 103 participants in the final sample.
In all, 65 student teachers were assigned to the traditional face-to-face condition (CG; 92.3% female; Mage = 23.35, SDage = 4.61), 16 to the V-Feedback condition (87.5% female; Mage = 24.29, SDage = 6.82) and 22 to the V-Feedback+ condition (95.5% female; Mage = 22.86, SDage = 4.25). Originally, we planned for equally sized digital video-based feedback groups (V-Feedback, V-Feedback+) and a larger traditional face-to-face group (CG). Video recordings in German schools must abide by strict data-privacy policies. This meant that student teachers had to volunteer to be part of our video-based intervention; therefore, they could not be assigned randomly to the control group or video-based groups. To film in classrooms, written consent had to be acquired from schools and students' parents. Based on an advance inquiry with schools, we assigned student teachers to schools. Unfortunately, several parents did not sign their consent forms, so four students who we had placed in the V-Feedback condition had to join the CG. However, the remaining student teachers who volunteered for the video groups were assigned randomly to the V-Feedback and V-Feedback+ conditions.
Applying one-way analyses of variance (ANOVAs), no statistically significant difference was found for participants in the three conditions concerning age, F(2,100) = 0.55, p = .58 (CG: MCG = 23.06, SDCG = 4.70 vs. V-Feedback: MVF = 24.29, SDVF = 6.82 vs. V-Feedback+: MVF+ = 22.86, SDVF+ = 4.25), previous teaching experience, F(2,100) = 1.39, p = .26 (MCG = 0.27, SDCG = 0.76 vs. MVF = 0.15, SDVF = 0.38 vs. MVF+ = 0.30, SDVF+ = 0.81), self-estimated prior knowledge of the dimensions of classroom management, F(2,100) = 2.34, p = .10 (MCG = 2.89, SDCG = 0.71 vs. MVF = 3.14, SDVF = 5.34 vs. MVF+ = 2.59, SDVF+ = 0.87), or self-efficacy of classroom management, F(2,100) = 0.22, p = .80 (MCG = 35.29, SDCG = 3.62 vs. MVF = 35.17, SDVF = 3.86 vs. MVF+ = 34.38, SDVF+ = 7.80). After the practicum, we asked student teachers how many informal feedback sessions they had participated in with mentors or peers, and sought their assessment of peer and mentor feedback quality because this could affect their feedback competence.
Fig. 1. Timetable of Quasi-Experimental Study (CG = control group, VF = V-Feedback condition, VF+ = V-Feedback+ condition).
¹ The sample used in this study also was subject to analyses concerning other dependent variables (Weber et al., 2018).
However, the participants in the conditions did not show any statistically significant differences concerning the number of feedback occasions, F(2,100) = 0.02, p = .99 (MCG = 2.78, SDCG = 1.18 vs. MVF = 2.71, SDVF = 1.07 vs. MVF+ = 2.77, SDVF+ = 1.10), the quality of mentor feedback, F(2,100) = 1.18, p = .31 (MCG = 3.55, SDCG = 1.10 vs. MVF = 3.29, SDVF = 1.54 vs. MVF+ = 3.86, SDVF+ = 0.89), or the quality of peer feedback, F(2,100) = 0.21, p = .82 (MCG = 3.58, SDCG = 0.75 vs. MVF = 3.71, SDVF = 0.73 vs. MVF+ = 3.64, SDVF+ = 0.58).

2.3. Teacher education in Germany

In Germany, teacher education is divided into two phases: a five-year university phase and a one-and-a-half-year induction phase in schools. Only after completing the induction phase can student teachers become fully qualified teachers. During the university phase, student teachers must complete bachelor's and master's degrees. They usually study two teaching subjects and must enrol in courses on psychology, pedagogy and sociology. Additionally, they must participate in several practicums.² Our study was conducted during the student teachers' second practicum, during the bachelor's phase. Student teachers already had completed their first, observational practicum, lasting three weeks, during the second semester. The second practicum during the fourth semester lasted four weeks and required that students teach by themselves for the first time. Educational research indicates that when practical experiences are aligned to coursework, students can connect theory to practice more readily (Hammerness et al., 2005). Consequently, student teachers had to complete a lecture on didactics and methods, as well as a seminar, as preparation for the practicum. While the lecture provided an overview of theoretical concepts and teaching methods, the seminar focussed on ensuring that student teachers can plan a lesson. Theory and methods from the lecture were elaborated on, then used by student teachers to plan a fictitious lesson in detail.
They had to hand in the lesson plan as coursework by the end of the semester. On two occasions, the lecture and seminar focussed on classroom management (Kounin, 1970) for the entire length of the sessions. As part of the seminar sessions, student teachers had to act out parts of their fictitious lessons, with the rest of the group acting as school students. They subsequently received feedback on their classroom management skills from the group.

2.4. Intervention procedure

Teaching practicums can lead to limited interaction between student teachers when they are all placed at separate schools, and feedback and reflection sessions in digital environments can become particularly impersonal (Malewski et al., 2005). To foster interaction and allow a learning community to develop, we placed students in teams at the schools. Team partners were supposed to observe each other teaching and provide feedback. Furthermore, each student teacher had a tandem partner at a different school. When visiting tandem partners, student teachers also had to participate in feedback and reflection sessions with the university supervisor. University supervisors visited student teachers in the V-Feedback and V-Feedback+ conditions once and in the face-to-face condition (CG) twice. Instead of a second face-to-face session, students in the V-Feedback and V-Feedback+ groups participated in two feedback and reflection sessions in the digital video-based environment (see Fig. 1). University supervisors provided expert feedback in the V-Feedback+ digital video-based environment.

2.5. Video-based feedback environments

As feedback and reflection sessions often lack a "substantive focus" (Valencia et al., 2009, p. 314), pre-service teachers followed a highly structured reflection and feedback cycle in the digital video-based environments. This cycle is based on a previous study by Kleinknecht and Gröschner (2016). Classroom management was set as the feedback and reflection focus.
On the one hand, classroom management is considered a necessary prerequisite for successful teaching and has proven to be hard for pre-service teachers to accomplish (Wolff, Van den Bogert, Jarodzka, & Boshuizen, 2015). On the other hand, a specific focus limits complexity, making it possible for students to work on one set of skills in depth (Tschannen-Moran, Woolfolk Hoy, & Hoy, 1998). Consequently, pre-service teachers had to choose instances of classroom management based on the dimensions of monitoring, managing momentum and establishing rules and routines (Gold & Holodynski, 2017). Individual lecture and course sessions on didactics and methods before the practicum were aligned to this focus. Additionally, specific criteria were presented and clarified during the practicum's introductory event. Before reflecting and receiving feedback in the digital video-based environment, pre-service teachers filmed themselves. They used cameras mounted on tripods provided by the university. Team partners were responsible for handling the cameras. Subsequently, pre-service teachers chose a five- to 10-min video sequence of their teaching according to the classroom management focus. Sequences were supposed to show an instructional phase and the following transitional phase to individual, peer or group work. Each sequence had to contain at least one critical classroom management event so that participants did not solely choose positive events (Sharpe et al., 2003). The sequences were uploaded to a Moodle forum with the vShare software enhancement (Huppertz, Massler, & Plötzner, 2005). Students composed a self-reflection of the video sequence following a three-step approach (Kleinknecht & Gröschner, 2016; Seidel et al., 2011): they were asked first to describe the classroom-management situation, then to evaluate it and explain their evaluations, and finally to consider possible alternative teaching strategies.
Additionally, they were able to annotate specific situations in their video using the vShare tool. This course of reflection was presented and explained in the introductory lecture and seminar one week before the practicum started. On the one hand, the three-step approach was supposed to foster a more in-depth and structured reflection of the lesson. On the other hand, the structured reflection embedded the video in additional contextual information and facilitated comprehensive understanding (Blomberg et al., 2013; Körkkö et al., 2019). After posting their self-reflections online, pre-service teachers received feedback from two peers regarding their classroom video and self-reflection. Methodological support also was provided for this step through a feedback example and rules presented at the introductory event (i.e., base your feedback on specific classroom situations; suggest alternatives; ask thought-provoking questions). Furthermore, a fictitious example of the feedback and reflection cycle was presented during the practicum's introductory event and on the online platform. After having received feedback from their peers, pre-service teachers in the V-Feedback condition completed the reflection and feedback cycle by composing a feedback balance. They were supposed to reflect on the feedback that their peers provided and decide which classroom management alternatives they would try to use in future situations. In the V-Feedback+ condition, this step was preceded by expert feedback from the university supervisors. Fig. 2 shows the interface of the V-Feedback+ digital video-based environment and a fictitious example of a self-reflection, peer feedback, expert feedback and a feedback balance. Face-to-face sessions (CG) followed the same reflection and feedback cycle, but without video. In the CG, feedback and reflection sessions were based on a lesson observed directly beforehand. By the end of the practicum, V-Feedback and V-Feedback+ participants had self-reflected and received feedback in one face-to-face and two digital video-based feedback and reflection sessions on their own classroom practice.
² The length and timing of practicums can vary depending on individual German states.
Student teachers in the CG participated in two face-to-face feedback and reflection sessions. As face-to-face feedback and reflection sessions dealt with an entire lesson, two feedback and reflection occasions in the digital video-based environments equalled one face-to-face session. Furthermore, every student provided feedback on their peers' teaching the same number of times.

2.6. Instruments

A quasi-experimental, repeated-measures, pre-test-treatment-post-test design was adopted for this study. To assess pre-service teachers' feedback competence, a video- and text-based measure was developed. Instruments using video are employed to enhance authenticity and complexity (Borko et al., 2009; Sherin, 2007) and have become a "prominent tool for studying teacher learning and the activating of teacher knowledge" (Seidel & Stürmer, 2014, p. 740). Recent studies indicate that they can measure situation-specific skills more efficiently than text-based cases (Barth, Thiel, & Ophardt, 2018; Gold & Holodynski, 2017). Our tool comprises a feedback situation that focuses on classroom management. Student teachers were presented with a classroom and feedback situation and had to provide the depicted teacher with feedback. Feedback comments were limited to 200 words or fewer, and test time was approximately 30 min. Student teachers were presented with the feedback situation through the online platform Unipark (Questback, 2017). Six weeks after the pre-test, an identical post-test was administered (see Fig. 1). The teaching and feedback scenario was constructed around a 1-min video sequence (see Fig. 3). During the video clip, a second-grade primary-school class is shown working at different learning stations. The students are trying to determine what happens with water when an object is immersed in it. The teacher, for example, checks results, asks students to return to their work stations, or scolds them for disrupting the lesson.
The video sequence shows both successful and unsuccessful classroom-management scenarios. Apart from the video vignette, the feedback situation contained information on the teacher as well as a self-reflection and an utterance from the teacher. These were presented in text form. The content of the self-reflection and utterance focussed on the facet of withitness in classroom management. Withitness can be defined as a teacher's competence in monitoring classroom events continuously and reacting appropriately when needed. It was employed as it is considered a focal constituent of successful classroom management (Kounin, 1970; Wolff, 2015).
Fig. 2. Interface of the V-Feedback+ digital video-based environment (V-Feedback = without expert feedback).
Fig. 3. Elements of the peer feedback measure.
To determine whether the scenario is authentic, the content was validated. Twelve feedback experts (feedback occasions per year: M = 77.67, SD = 45.61) were invited to assess authenticity. The experts comprised teacher educators who are responsible for training teachers during the German induction phase (1.5 years). One of their principal tasks is to observe trainee teachers' teaching and provide them with feedback. We asked the experts to assess the authenticity, interest and typicality of the feedback situation on a four-point Likert scale (from 1 = does not apply to 4 = applies). The experts perceived the feedback situation as authentic (M = 3.25, SD = 1.04), interesting (M = 2.90, SD = 1.10) and typical (M = 3.00, SD = 0.85). As a result, an acceptable degree of content validity can be assumed.

2.7. Coding and analysis

The written feedback that pre-service teachers provided for the pre- and post-tests was coded by applying a coding scheme that Prins et al. (2006) developed. It is based on a prior study in which Sluijsmans et al. (2002) extracted characteristics of constructive feedback from expert feedback. Although Prins et al.'s (2006) coding scheme was developed to assess general practitioners' feedback-report quality, versions of it have been applied in a variety of domains (M. Gielen & De Wever, 2015; S. Gielen et al., 2010). We adapted the coding scheme to feedback concerning classroom management. We analysed the feedback following guidelines for quantifying qualitative data from Chi (1997). Instead of "counting the number of criteria used and the number of comments or certain words present" [as in Prins et al.'s coding scheme], our analysis "evaluates the presence of a set of necessary ingredients" (S. Gielen et al., 2010, p. 307). Consequently, we decided to treat each feedback comment as a unit of analysis because it captures the "semantics of the inference at a more appropriate level" (Chi, 1997, p. 10).
Our coding scheme comprises six categories (see Table 1). Sub-optimal feedback was coded as '0', average feedback as '1' and good feedback as '2'. An example for code 2 of the specificity category is 'I had the impression that you did not notice Max and Anna playing with the water when you were explaining the task to Charlotte'. In the suggestions category, feedback such as 'Maybe it would be an idea to stand in front of the class during such station learning so that you have everything in view' would have been coded '2'. Consequently, participants could achieve a maximum score of two points in each category. As an estimation of overall feedback competence, we combined individual categories into an aggregated score, resulting in a possible maximum of 12 points. Three coders carried out the coding of feedback comments. Coders were student workers trained by a member of the research team. Before coding the entire sample, 10 randomly chosen feedback comments were coded for practice. Differences between coders were discussed (Zottmann et al., 2013). Then, coders randomly were assigned feedback comments so that each feedback comment was coded by two independent coders to establish reliability. We calculated Cohen's kappa (κ) (Fleiss & Cohen, 1973). Coding yielded substantial kappa values (see Table 1). Consequently, sufficient inter-coder reliability was established.

2.8. Methods of analysis

The data were analysed by applying one-way analyses of variance (ANOVA), paired samples t-tests, one-way repeated measures multivariate analyses of variance (MANOVA), and a one-way ANOVA with difference scores and planned contrasts. All analyses were computed using SPSS 25 software. Furthermore, the alpha value was set at p < .05 for all statistical analyses. Normal distribution of data was slightly violated in the CG.
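For readers who wish to reproduce the coder-agreement statistics reported in Section 2.7, the percentage of agreement and Cohen's kappa can be sketched as follows. This is a minimal illustration; the ratings shown are invented and are not the study's data:

```python
from collections import Counter

def percent_agreement(a, b):
    """Share of items on which both coders assigned the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    observed = percent_agreement(a, b)
    # Chance agreement from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(a), Counter(b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codes (0/1/2) from two coders for eight feedback comments.
coder_1 = [0, 0, 1, 1, 2, 2, 0, 1]
coder_2 = [0, 0, 1, 1, 2, 0, 0, 1]
print(round(percent_agreement(coder_1, coder_2), 3))  # 0.875
print(round(cohens_kappa(coder_1, coder_2), 3))       # 0.805
```

Kappa is lower than raw agreement because it discounts the matches two coders would produce by chance given how often each uses every code.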
As parametric analysis tools such as ANOVA and MANOVA are robust against this kind of violation (O'Brien & Kaiser, 1985; Schmider, Ziegler, Danay, Beyer, & Bühner, 2010), we decided to apply them. The sphericity assumption was met in the ANOVAs and MANOVAs. Sufficient statistical power was established for all analyses using the program G*Power (Faul, Erdfelder, Buchner, & Lang, 2009).

Table 1. Content analysis of pre-service teachers' peer feedback: category, feedback quality, scores, percentage of coder agreement and inter-coder reliability (Cohen's kappa).

| Category | Good feedback (2) | Average feedback (1) | Sub-optimal feedback (0) | Coder agreement | κ |
| Assessment criteria | Reference to classroom management, including terminology | Reference to classroom management without terminology | No reference to classroom management | 87.2% | .715 |
| Specificity | Specific situation is elaborated on | Specific teaching phase is elaborated on | No specific situation or phase elaborated on | 81.8% | .662 |
| Suggestions | Alternatives presented with explanation | Alternatives presented without explanation | No alternatives presented | 84.9% | .707 |
| Questions | Activating question posed | Clarifying question posed | No questions posed | 91.6% | .700 |
| First person | Written in first person throughout feedback | Occasionally written in first person | Not written in first person | 84.9% | .752 |
| Positive/negative | Equilibrium of positive and negative feedback | Mainly positive feedback | Mainly negative feedback | 81.8% | .697 |

3. Results

3.1. Assessment of pre-service teachers' feedback competence prior to intervention

We investigated possible pre-test differences between conditions (CG, V-Feedback, V-Feedback+) by calculating ANOVAs because differences before the intervention could impact the estimated effects. One-way ANOVAs showed that all conditions were comparable concerning overall feedback competence, F(2,100) = 0.36, p = .70. This also was the case for individual categories. No differences could be established for assessment criteria, F(2,100) = 0.90, p = .41, specificity, F(2,100) = 0.34, p = .72, suggestions, F(2,100) = 1.59, p = .21, questions, F(2,100) = 0.23, p = .80, first person, F(2,100) = 1.16, p = .32, and positive/negative, F(2,100) = 0.45, p = .64.

3.2. Effects of feedback practice during the practicum

Concerning research question 1, to what extent feedback practice during the practicum improved pre-service teachers' feedback competence in CG, V-Feedback and V-Feedback+, we first analysed the effect of time on overall feedback competence and the feedback competence categories. To establish the effect of time on overall feedback competence (dependent variable), we conducted paired samples t-tests. Each condition was analysed individually (CG, V-Feedback, V-Feedback+). Concerning the control group, the paired samples t-test showed a significant, small effect for overall feedback competence, t(64) = 3.046, p = .003, d = 0.38, while the V-Feedback condition failed to approach statistical significance, t(15) = 1.499, p = .16, d = 0.43. Regarding the V-Feedback+ condition, a large, significant effect was found, t(21) = 3.974, p < .001, d = 1.1.
To assess the effect of time on all individual feedback competence categories (dependent variables: assessment criteria, specificity, suggestions, questions, first person, positive/negative) simultaneously, we performed one-way repeated measures MANOVAs. Conditions (CG, V-Feedback, V-Feedback+) were analysed individually. Concerning the control group, the multivariate analysis of variance established a medium, significant effect for first person and a large, significant effect for suggestions (see Table 2), while no significant effects could be found for the V-Feedback condition, as Wilks' Lambda did not reach the significance level. Regarding the V-Feedback+ condition, large, significant effects were revealed for specificity, suggestions, questions and first person. In brief, practising feedback in face-to-face and digital video-based feedback environments enhanced pre-service teachers' feedback competence overall and in a variety of categories. The lack of significant results in the V-Feedback condition can cautiously be attributed to the small sample size, especially when considering the large effect size for specificity (see Table 2). Regarding research questions 2 (differences between V-Feedback/V-Feedback+ and CG) and 3 (differences between V-Feedback and V-Feedback+), we investigated differences between conditions in the development of pre-service teachers' overall feedback competence and feedback competence categories. To compare changes from pre- to post-test between conditions, we performed a one-way ANOVA with difference scores (see Table 3) and planned contrasts. Difference scores (post-test score minus pre-test score) were entered as the dependent variable (Huck & McLean, 1975; Maxwell & Howard, 1981, pp. 747–756). This analytical rationale also has been applied by other researchers (e.g., Heemsoth & Kleickmann, 2018).
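A difference-score ANOVA of this kind first reduces each participant to one score (post-test minus pre-test) and then compares the group means with an ordinary one-way F test. A sketch with invented difference scores (not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of scores per group."""
    scores = [x for g in groups for x in g]
    n, k = len(scores), len(groups)
    grand_mean = sum(scores) / n
    group_means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical post-minus-pre difference scores for three conditions.
cg = [0, 1, 1, 0, 1]
v_feedback = [1, 1, 2, 0, 1]
v_feedback_plus = [2, 2, 1, 2, 3]
print(round(one_way_anova_f([cg, v_feedback, v_feedback_plus]), 2))  # 6.0
```

With k = 3 groups and N = 103 participants, this F statistic has df = (2, 100), matching the F(2,100) values reported below; planned contrasts then test the specific pairwise hypotheses.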
The ANOVA showed that conditions differed significantly in terms of specificity, F(2,100) = 4.42, p = .014, and suggestions, F(2,100) = 3.37, p = .038. No significant effects were found for the other categories or for overall feedback competence, p > .05.

Table 2
Results of one-way repeated-measures MANOVAs for time on all feedback competence categories for individual conditions.

                      CG                       V-Feedback                V-Feedback+
                      F(1,64)  p       ηp²    F(1,15)  p      ηp²      F(1,21)  p       ηp²
Assessment criteria   0.095    .76     .001   1.000    .06    .06      2.783    .11     .12
Specificity           0.454    .50     .01    3.151    .10    .17      5.045    .036    .19
Suggestions           6.779    .011    .10    0.238    .63    .02      21.138   < .001  .50
Questions             1.957    .17     .03    0.00     1.00   .00      4.667    .042    .18
First person          11.468   < .001  .15    1.518    .24    .09      4.433    .047    .17
Positive/Negative     0.006    .94     .00    1.364    .26    .08      0.00     1.00    .00

Note: Wilks' Lambda: CG, F(6,59) = 2.630, p = .025, ηp² = 0.21; V-Feedback, F(6,10) = 1.084, p = .433, ηp² = 0.39; V-Feedback+, F(6,16) = 4.530, p = .007, ηp² = 0.63.

Table 3
Means, standard deviations and changes from pre- to post-test.

                           Pre-test         Post-test
                           M      SD        M      SD       Δ
Feedback competence  CG    4.16   2.04      4.90   1.90     0.73
                     VF    4.63   2.06      5.50   2.13     0.88
                     VF+   4.27   1.54      6.07   1.79     1.80
Assessment criteria  CG    1.16   0.48      1.14   0.46     −0.02
                     VF    1.06   0.44      0.94   0.57     −0.13
                     VF+   1.27   0.53      1.02   0.50     −0.25
Specificity          CG    0.39   0.70      0.35   0.65     −0.04
                     VF    0.25   0.68      0.58   0.73     0.31
                     VF+   0.41   0.57      0.82   0.81     0.41
Suggestions          CG    1.03   0.90      1.36   0.75     0.33
                     VF    1.38   0.96      1.50   0.73     0.13
                     VF+   0.86   0.77      1.66   0.62     0.80
Questions            CG    0.23   0.56      0.37   0.76     0.14
                     VF    0.31   0.70      0.31   0.60     0.00
                     VF+   0.18   0.59      0.55   0.91     0.36
First person         CG    0.88   0.94      1.32   0.85     0.34
                     VF    1.38   0.96      1.67   0.70     0.31
                     VF+   1.09   0.93      1.57   0.62     0.48
Positive/Negative    CG    0.37   0.67      0.38   0.62     0.00
                     VF    0.25   0.58      0.50   0.73     0.25
                     VF+   0.45   0.69      0.45   0.72     0.00

Note: CG = control group, VF = V-Feedback, VF+ = V-Feedback+; nCG = 65, nVF = 16, nVF+ = 22; feedback competence Min = 0, Max = 12; categories Min = 0, Max = 2.
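As a quick consistency check on Table 2: for a single-df repeated-measures effect, partial eta squared can be recovered from the F statistic and its degrees of freedom via ηp² = F·df_effect / (F·df_effect + df_error). The snippet below is illustrative (not the authors' analysis script) and reproduces two of the reported values.

```python
def partial_eta_squared(f, df_effect, df_error):
    """Recover partial eta squared from an F statistic:
    eta_p^2 = (F * df_effect) / (F * df_effect + df_error)."""
    return (f * df_effect) / (f * df_effect + df_error)

# Values reported in Table 2:
eta_suggestions_vfp = partial_eta_squared(21.138, 1, 21)   # reported as .50
eta_first_person_cg = partial_eta_squared(11.468, 1, 64)   # reported as .15
```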
Concerning research question 2, we analysed what impact blended digital video-based environments (V-Feedback, V-Feedback+) have on pre-service teachers' feedback competence in comparison to the face-to-face condition (CG). Planned contrasts revealed that V-Feedback and V-Feedback+ together differed significantly from the CG in specificity, with a medium effect size, t(100) = 2.87, p = .005, d = 0.55, but not in suggestions, t(100) = 0.85, p = .40, d = 0.16. Thus, practising feedback in blended digital video-based environments increased the specificity of feedback.

For research question 3, we investigated how expert feedback (V-Feedback+) influenced pre-service teachers' feedback competence compared with the condition without expert feedback (V-Feedback). The comparison of V-Feedback+ with V-Feedback revealed a significant difference between conditions for suggestions with a medium effect size, t(100) = 2.33, p = .022, d = 0.77, but not specificity, t(100) = 0.42, p = .68, d = 0.14. Hence, having experts present enhanced the quality of pre-service teachers' suggestions.

4. Discussion

Our study investigated the effects of two blended digital video-based feedback environments during a teaching practicum on pre-service teachers' feedback competence on teaching practice compared with a control group. The control group participated in a conventional practicum setting with face-to-face feedback. Besides face-to-face feedback sessions, one of the experimental groups practised feedback in a digital video-based environment without expert feedback (V-Feedback), and the other in the same setting with expert feedback included (V-Feedback+). We assumed that pre-service teachers' feedback competence would increase in all conditions, but with stronger improvements in the sheltered digital video-based environments. Furthermore, we expected participants in the V-Feedback+ group to profit from expert modelling compared with the V-Feedback group.
Contrary to our first assumption, overall feedback competence did not increase significantly in all conditions. Improvements were significant only for the V-Feedback+ condition and the control group. Consequently, these results (CG, V-Feedback+) align with Grossman and McDonald's (2008) assertion that novices require multiple practice occasions to start and develop a skill. However, analyses of individual feedback categories revealed a more comprehensive picture that already hints at research questions 2 and 3. In the V-Feedback+ condition, significant developments were established in four of the six categories (specificity, suggestions, questions, first person), two in the CG (suggestions, first person) and none in the V-Feedback group.

However, the lack of improvement in all groups in the positive/negative category is of particular interest. This might be the result of the complexity of classroom management for pre-service teachers. Various extant studies (Van den Bogert, Van Bruggen, Kostons, & Jochems, 2014; Wolff, Jarodzka, & Boshuizen, 2017) were able to establish that novices, contrary to experienced teachers, predominantly concentrate on negative events when analysing classroom situations. Thus, it might require more classroom observation to foster their professional vision (Gold & Holodynski, 2017) and enable pre-service teachers to spot more positive events, then incorporate them into their peer feedback. Although we provided methodological support in the lecture and seminar before the practicum by clarifying classroom management criteria and setting a specific focus for observation, additional assistance might be necessary. An observation sheet might be a viable solution, providing pre-service teachers with "a particular lens" (Santagata & Angelici, 2010, p. 339).

Concerning research question 2, we assumed that the digital video-based environments fostered feedback competence more than face-to-face feedback sessions.
On one hand, these approximations of practice limit complexity by involving no real-time pressure. On the other hand, the use of video has proven beneficial regarding feedback quality. The digital video-based feedback environments produced better effects than the control group concerning specificity. Thus, the comparison of the V-Feedback+ and V-Feedback conditions with the CG confirms extant research (Lee & Wu, 2006; Wu & Kao, 2008; Tripp & Rich, 2012a, 2012b). Practising with classroom videos seems to make feedback more specific, accurate and concrete. This is probably also a result of the higher degree of decomposition in the digital video-based environment as an approximation of practice. Pre-service teachers were provided with a specific sequence focusing on classroom management and did not practise giving feedback after having observed an entire lesson. On one hand, observing an entire lesson without video support could yield feedback that is too general, as Tripp and Rich (2012a) established. On the other hand, using their own and other teachers' video sequences in the V-Feedback and V-Feedback+ environments also trained their professional vision of classroom management, making their feedback more specific (Weber et al., 2018; Hellermann et al., 2015; Krammer et al., 2016; Wolff et al., 2015).

Regarding research question 3, expert modelling seems to be necessary to facilitate feedback competence. The V-Feedback+ group performed better than the V-Feedback group concerning the suggestions category. Thus, to foster feedback competence in digital video-based environments, experts need to participate. This aligns with previous findings of students having "more confidence in their supervisors' opinions than their own" (Tripp & Rich, 2012b, p. 683). This also illustrates that pre-service teachers need "modelers of practice" (Clarke et al., 2014, p. 177), whom they can mimic, as experts provide "more sophisticated feedback" (Weber et al., 2018, p. 46).

4.1.
Future directions and implications for teacher education

Some issues should be considered for future research, as well as for implementing digital video-based environments in university or teacher training courses. Regarding the construction of digital video-based environments, future studies should focus on the effects of scaffolding elements. Although it is assumed that video-based interventions for novices in particular require highly structured settings (e.g., Kleinknecht & Gröschner, 2016; Moreno & Valdez, 2007), Peters et al. (2018) suggest that scaffolding does not always elicit positive effects with respect to feedback. It can actually be harmful to students' motivation (S. Gielen et al., 2010) and lower peer feedback beliefs (Alqassab et al., 2018). Nevertheless, other studies (Gan, 2011; M. Gielen & De Wever, 2015) found positive effects of structuring feedback elements.

In terms of future feedback research on teaching, fostering pre-service teachers' professional vision of classroom practice needs to be addressed. Extant studies showed that novices pay more attention to negative teaching events (Van den Bogert et al., 2014). Consequently, professional vision and its connection to feedback should be analysed more comprehensively. The relationship among these components could be assessed by testing student teachers' professional vision during the teaching practicum (Gold & Holodynski, 2017). Furthermore, pre-service teachers possess a limited amount of teaching experience, but also limited opportunities to improve their feedback skills. Alqassab et al. (2018) assessed that "peer feedback is usually a new practice to most students, including pre-service teachers" (p. 15). However, although feedback might be less applicable when provided by novices (Carless, Chan, To, Lo, & Barrett, 2018), an expert level can be reached with training (Topping, 2017).
Therefore, more comprehensive training opportunities should be devised to foster feedback competence before the teaching practicum.

Concerning possible implementations, incorporating classroom video into digital environments requires a high degree of technical and legal preparation and assistance. First, students must be provided with cameras and trained in how to use them. Second, technical assistance is needed for uploading videos or editing video sequences. Third, written consent needs to be acquired from schools, students' parents and the university students being filmed, due to data-privacy laws. The latter factor in particular can hinder effective research and, thus, the development
of effective interventions. And yet, we consider applying classroom videos in teacher education to be a bottom-up process. The more university students participate in video-based seminars, lectures, or interventions, the more open future teachers and parents will be to allowing this technology into classrooms.

4.2. Limitations

Our study faced some limitations that encourage future research in this field. First, as noted above, data-privacy laws only allowed for partial randomisation. Pre-service teachers had to volunteer to participate in one of the video-based environments. Consequently, assignment of study participants to the CG and the digital video-based conditions (V-Feedback, V-Feedback+) was not random; therefore, pre-service teachers' performances in the video-based conditions could have been based on higher motivation. However, we were able to assign participants who had volunteered randomly to the V-Feedback and V-Feedback+ groups; thus, this limitation only relates to the CG. Second, our sample size is relatively small for the digital video-based conditions. As noted in the discussion, effects, or the lack thereof, in the V-Feedback condition in particular need to be viewed with caution. Generally, research comprising video recordings rarely involves large samples (e.g., Gröschner, Schindler, Holzberger, Alles, & Seidel, 2018; Körkkö et al., 2019; Tripp & Rich, 2012a). In a recent review, Major and Watson (2018) established that three quarters of video-based studies involved 19 or fewer participants. Yet, we expect video-based interventions to be conducted more easily in the future when the application of video becomes a more common feature of teacher education. Third, we used a short video sequence during the pre- and post-tests.
Considering that participants in the V-Feedback and V-Feedback+ conditions worked with video recordings throughout the practicum, this could have provided them with an advantage over the face-to-face CG. However, pre-service teachers were accustomed to working with video sequences of classroom situations during the preparatory lecture and seminar. Furthermore, this limitation only pertains to the comparison of the face-to-face and video groups, not to that of the V-Feedback and V-Feedback+ groups. Finally, conducting our intervention in an ecologically valid setting also entails a high degree of contextual factors that need to be controlled. Although we tried to assess possible influences, such as the number of feedback occasions, the feedback competence of mentors and peers, especially that of mentors, is likely to influence pre-service teachers (Kraft et al., 2018).

5. Conclusions

Our study increases the understanding of the effects of blended digital video-based feedback environments during teaching practicums and contributes to the field of feedback research. Our study is the first to assess pre-service teachers' feedback competence on teaching practice during a practicum, particularly in blended digital video-based environments. Results can inform teacher educators in both fields. Our results indicate that digital video-based feedback environments need to be combined with expert feedback to tap their full potential. Digital video-based environments make feedback time- and location-independent, thereby offering a viable substitute for face-to-face sessions, which often are not a standard component of teaching practicums or are limited in scope because of a lack of resources (Lee & Wu, 2006; Valencia et al., 2009). Providing pre-service teachers with the opportunity to practise feedback is an important prerequisite for facilitating future lifelong learning and, thus, for fostering (pre-service) teachers' professional knowledge (Hammerness et al., 2005).
Acknowledgements

We would like to thank the pre-service teachers who participated in this study and the school directors and teachers who provided them with the opportunity of a practical placement. Furthermore, we are grateful to our student assistants Karoline Glimm, Johanna Meyn and Kristina Lindstedt for helping to plan the practicum and coding the data.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.chb.2019.08.011.

References

Allen, J. P., Hafen, C. A., Gregory, A. C., Mikami, A. Y., & Pianta, R. (2015). Enhancing secondary school instruction and student achievement: Replication and extension of the My Teaching Partner-Secondary intervention. Journal of Research on Educational Effectiveness, 8(4), 475–489.
Alqassab, M., Strijbos, J.-W., & Ufer, S. (2018). Training peer-feedback skills on geometric construction tasks: Role of domain knowledge and peer-feedback levels. European Journal of Psychology of Education, 33(1), 11–30.
Bandura, A., & Cervone, D. (1986). Differential engagement of self-reactive influences in cognitive motivation. Organizational Behavior and Human Decision Processes, 38(1), 92–113.
Barth, V. L., Thiel, F., & Ophardt, D. (2018). Professionelle Wahrnehmung von Unterrichtsstörungen: Konzeption einer videobasierten Fallanalyse mit offenem Antwortformat [Professional vision of classroom disruptions: Development of a video-based case analysis with open questions]. In A. Krüger, F. Radisch, T. Häcker, & M. Walm (Eds.), Empirische Bildungsforschung im Kontext von Schule und Lehrer*innenbildung [Empirical education research in the context of school and teacher education] (pp. 141–153). Bad Heilbrunn: Julius Klinkhardt.
Blomberg, G., Renkl, A., Sherin, M., Borko, H., & Seidel, T. (2013). Five research-based heuristics for using video in pre-service teacher education. Journal of Educational Research Online, 5(1), 90–114.
Blömeke, S., Gustafsson, J.-E., & Shavelson, R. J. (2015).
Beyond dichotomies: Competence viewed as a continuum. Zeitschrift für Psychologie, 223(1), 3–13.
Borko, H., Whitcomb, J., & Liston, D. (2009). Wicked problems and other thoughts on issues of technology and teacher learning. Journal of Teacher Education, 60(1), 3–7.
Carless, D., Chan, K. K. H., To, J., Lo, M., & Barrett, E. (2018). Developing students' capacities for evaluative judgement through analysing exemplars. In D. Boud, R. Ajjawi, P. Dawson, & J. Tai (Eds.), Developing evaluative judgement in higher education: Assessment for knowing and producing quality work. London, UK: Routledge.
Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6(3), 271–315.
Clarke, A., Triggs, V., & Nielsen, W. (2014). Cooperating teacher participation in teacher education: A review of the literature. Review of Educational Research, 84(2), 163–202.
Derry, S., Sherin, M., & Sherin, B. (2014). Multimedia learning with video. In R. Mayer (Ed.), The Cambridge handbook of multimedia learning. Cambridge, UK: Cambridge University Press.
Dobie, T. E., & Anderson, E. R. (2015). Interaction in teacher communities: Three forms teachers use to express contrasting ideas in video clubs. Teaching and Teacher Education, 47, 230–240.
Ericsson, K. A. (2004). Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Academic Medicine, 79(10), 70–81.
Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363–406.
Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41, 1149–1160.
Fisher, D., Frey, N., & Lapp, D. (2011). Coaching middle-level teachers to think aloud improves comprehension instruction and student reading achievement.
The Teacher Educator, 46(3), 231–243.
Fleiss, J. L., & Cohen, J. (1973). The equivalence of weighted kappa and the intraclass correlation coefficient as measures of reliability. Educational and Psychological Measurement, 33(3), 613–619.
Fukkink, R. G., & Tavecchio, L. W. C. (2010). Effects of video interaction guidance on early childhood teachers. Teaching and Teacher Education, 26(8), 1652–1659.
Gan, M. (2011). The effects of prompts and explicit coaching on peer feedback quality (Unpublished doctoral dissertation). New Zealand: University of Auckland.
Gan, M. J. S., & Hattie, J. (2014). Prompting secondary students' use of criteria, feedback specificity and feedback levels during an investigative task. Instructional Science, 42(6), 861–878.
Gaudin, C., & Chaliès, S. (2015). Video viewing in teacher education and professional development: A literature review. Educational Research Review, 16, 41–67.
Gielen, M., & De Wever, B. (2015). Structuring peer assessment: Comparing the impact of the degree of structure on peer feedback content. Computers in Human Behavior, 52, 315–325.
Gielen, S., Peeters, E., Dochy, F., Onghena, P., & Struyven, K. (2010). Improving the effectiveness of peer feedback for learning. Learning and Instruction, 20(4), 304–315.
Gold, B., & Holodynski, M. (2017). Using digital video to measure the professional vision of elementary classroom management: Test validation and methodological challenges. Computers & Education, 107, 13–30.
Gregory, A., Ruzek, E., Hafen, C. A., Mikami, A. Y., Allen, J. P., & Pianta, R. C. (2017). My teaching partner-secondary: A video-based coaching model. Theory Into Practice, 56(1), 38–45.
Gröschner, A., Schindler, A.-K., Holzberger, D., Alles, M., & Seidel, T. (2018). How systematic video reflection in teacher professional development regarding classroom discourse contributes to teacher and student self-efficacy. International Journal of Educational Research, 90, 223–233.
Grossman, P., Hammerness, K., & McDonald, M. (2009). Redefining teaching, re-imagining teacher education. Teachers and Teaching: Theory and Practice, 15(2), 273–289.
Grossman, P., & McDonald, M. (2008). Back to the future: Directions for research in teaching and teacher education. American Educational Research Journal, 45(1), 184–205.
Hafner, J., & Hafner, P. (2003). Quantitative analysis of the rubric as an assessment tool: An empirical study of student peer-group rating. International Journal of Science Education, 25(12), 1509–1528.
Hammerness, K. M., Darling-Hammond, L., Bransford, J., Berliner, D. C., Cochran-Smith, M., McDonald, M., et al. (2005). How teachers learn and develop. In L. Darling-Hammond, J. Bransford, P. LePage, K. Hammerness, & H. Duffy (Eds.), Preparing teachers for a changing world: What teachers should learn and be able to do (pp. 358–389). San Francisco, CA: Jossey-Bass.
Hattie, J., & Timperley, H. (2007). The power of feedback. Review of Educational Research, 77(1), 81–112.
Heemsoth, T., & Kleickmann, T. (2018). Learning to plan self-controlled physical education: Good vs. problematic teaching examples. Teaching and Teacher Education, 71, 168–178.
Hellermann, C., Gold, B., & Holodynski, M. (2015). Förderung von Klassenführungsfähigkeiten im Lehramtsstudium. Die Wirkung der Analyse eigener und fremder Unterrichtsvideos auf das strategische Wissen und die professionelle Wahrnehmung [Fostering classroom management skills in teacher education: Effects of analysis of one's own or other teachers' classroom videos on strategic knowledge and professional vision]. Zeitschrift für Entwicklungspsychologie und Pädagogische Psychologie, 47(2), 97–109.
Hixon, E., & So, H.-J. (2009). Technology's role in field experiences for preservice teacher training. Educational Technology & Society, 12(4), 294–304.
Hollingsworth, H., & Clarke, D. (2017). Video as a tool for focusing teacher self-reflection: Supporting and provoking teacher learning. Journal of Mathematics Teacher Education, 70(5), 457–475.
Huck, S. W., & McLean, R. A. (1975). Using a repeated measures ANOVA to analyze the data from a pretest-posttest design: A potentially confusing task. Psychological Bulletin, 82(4), 511–518.
Huppertz, P., Massler, U., & Plötzner, R. (2005). V-share: Video-based analysis and reflection of teaching experiences in virtual groups. Proceedings of the international conference on computer support for collaborative learning (pp. 245–253). Mahwah, NJ: Lawrence Erlbaum Associates.
Joyce, B., & Showers, B. (1996). Staff development as a comprehensive service organisation. Journal of Staff Development, 17(1), 2–6.
Joyce, B. R., & Showers, B. (2002). Student achievement through staff development (3rd ed.). Alexandria, VA: ASCD.
Kang, H., & Van Es, E. A. (2018). Articulating design principles for productive use of video in preservice education. Journal of Teacher Education, 1–14.
Kersting, N. (2008). Using video clips of mathematics classroom instruction as item prompts to measure teachers' knowledge of teaching mathematics. Educational and Psychological Measurement, 68(5), 845–861.
Kleinknecht, M., & Gröschner, A. (2016). Fostering preservice teachers' noticing with structured video feedback: Results of an online- and video-based intervention study. Teaching and Teacher Education, 59, 45–56.
Kleinknecht, M., & Schneider, J. (2013). What do teachers think and feel when analyzing videos of themselves and other teachers teaching? Teaching and Teacher Education, 33, 13–23.
Kluger, A. N., & DeNisi, A. (1996).
The effects of feedback interventions on performance: A historical review, a meta-analysis and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284.
Körkkö, M., Morales Rios, S., & Kyrö-Ämmälä, O. (2019). Using a video app as a tool for reflective practice. Educational Research, 61(1), 22–37.
Kounin, J. S. (1970). Discipline and group management in classrooms. New York, NY: Holt, Rinehart & Winston.
Kraft, M. A., Blazar, D., & Hogan, D. (2018). The effect of teacher coaching on instruction and achievement: A meta-analysis of the causal evidence. Review of Educational Research, 88(4), 547–588.
Krammer, K., Hugener, I., Biaggi, S., Frommelt, M., Fürrer Auf der Maur, G., & Stürmer, K. (2016). Videos in der Ausbildung von Lehrkräften: Förderung der professionellen Unterrichtswahrnehmung durch die Analyse von eigenen und fremden Videos [Classroom videos in initial teacher education: Fostering professional vision by analysing one's own and other teachers' videos]. Unterrichtswissenschaft, 44(4), 357–372.
Lee, G. C., & Wu, C.-C. (2006). Enhancing the teaching experience of pre-service teachers through the use of videos in web-based computer-mediated communication (CMC). Innovations in Education & Teaching International, 43(4), 369–380.
Lu, H.-L. (2010). Research on peer-coaching in preservice teacher education – a review of literature. Teaching and Teacher Education, 26(4), 748–753.
Major, L., & Watson, S. (2018). Using video to support in-service teacher professional development: The state of the field, limitations and possibilities. Technology, Pedagogy and Education, 27(1), 49–68.
Malewski, E., Phillion, J., & Lehman, J. D. (2005). A Freirian framework for technology-based virtual field experiences. Contemporary Issues in Technology and Teacher Education, 4(4), 410–428.
Matsumura, L. C., Garnier, H. E., & Spybrook, J. (2013). Literacy coaching to improve student reading achievement: A multi-level mediation model.
Learning and Instruction, 25, 35–48.
Maxwell, S. E., & Howard, G. S. (1981). Change scores—necessarily anathema? Educational and Psychological Measurement, 41(3), 747–756.
Moreno, R., & Valdez, A. (2007). Immediate and delayed effects of using a classroom case exemplar in teacher education: The role of presentation format. Journal of Educational Psychology, 99(1), 194–206.
Narciss, S. (2013). Designing and evaluating tutoring feedback strategies for digital learning environments on the basis of the interactive feedback model. Digital Education Review, 23(1), 7–26.
Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199–218.
O'Brien, R. G., & Kaiser, M. K. (1985). MANOVA method for analyzing repeated measures designs: An extensive primer. Psychological Bulletin, 97(2), 316–333.
Peters, O., Körndle, H., & Narciss, S. (2018). Effects of a formative assessment script on how vocational students generate formative assessment feedback to a peer's or their own performance. European Journal of Psychology of Education, 33(1), 117–143.
Prins, F., Sluijsmans, D., & Kirschner, P. A. (2006). Feedback for general practitioners in training: Quality, styles and preferences. Advances in Health Sciences Education, 11, 289–303.
Questback (2017). Unipark - EFS survey, version summer 2017. Köln: Questback GmbH.
Rich, P., & Hannafin, M. J. (2008). Capturing and assessing evidence of student teacher inquiry: A case study. Teaching and Teacher Education, 24(6), 1426–1440.
Rotsaert, T., Panadero, E., Schellens, T., & Raes, A. (2018). 'Now you know what you're doing right or wrong!' Peer feedback quality in synchronous peer assessment in secondary education. European Journal of Psychology of Education, 33(2), 255–275.
Sailors, M., & Price, L. (2015).
Support for the improvement of practices through intensive coaching (SIPIC): A model of coaching for improving reading instruction and reading achievement. Teaching and Teacher Education, 45, 115–127.
Santagata, R., & Angelici, G. (2010). Studying the impact of the lesson analysis framework on preservice teachers' abilities to reflect on videos of classroom teaching. Journal of Teacher Education, 61(4), 339–349.
Santagata, R., & Guarino, J. (2011). Using video to teach future teachers to learn from teaching. ZDM, 43(1), 133–145.
Schmider, E., Ziegler, M., Danay, E., Beyer, L., & Bühner, M. (2010). Is it really robust? Reinvestigating the robustness of ANOVA against violations of the normal distribution assumption. Methodology: European Journal of Research Methods for the Behavioural and Social Sciences, 6(4), 147–151.
Seidel, T., & Stürmer, K. (2014). Modelling and measuring the structure of professional vision in pre-service teachers. American Educational Research Journal, 51(4), 739–771.
Seidel, T., Stürmer, K., Blomberg, G., Kobarg, M., & Schwindt, K. (2011). Teacher learning from analysis of videotaped classroom situations: Does it make a difference whether teachers observe their own teaching or that of others? Teaching and Teacher Education, 27(2), 259–267.
Sharpe, L., Hu, C., Crawford, L., Gopinathan, S., Khine, M. S., Moo, S. N., et al. (2003). Enhancing multipoint desktop video conferencing (MDVC) with lesson video clips: Recent developments in pre-service teaching practice in Singapore. Teaching and Teacher Education, 19(5), 529–541.
Sherin, M. (2007). New perspectives on the role of video in teacher education. In J. Brophy (Ed.), Advances in research on teaching (pp. 1–27). Bingley, UK: Emerald.
Sluijsmans, D. M. A., Brand-Gruwel, S., & Van Merriënboer, J. J. G. (2002). Peer assessment training in teacher education: Effects on performance and perceptions. Assessment & Evaluation in Higher Education, 27(5), 443–454.
Sluijsmans, D. M.
A., Brand-Gruwel, S., Van Merriënboer, J. J. G., & Bastiaens, T. J. (2003). The training of peer assessment skills to promote the development of reflection skills in teacher education. Studies in Educational Evaluation, 29(1), 23–42.
Sluijsmans, D. M. A., Brand-Gruwel, S., Van Merriënboer, J. J. G., & Martens, R. L. (2004). Training teachers in peer-assessment skills: Effects on performance and perceptions. Innovations in Education & Teaching International, 41(1), 59–78.
So, W. W., Pow, J. W., & Hung, V. H. (2009). The interactive use of a video database in teacher education: Creating a knowledge base for teaching through a learning community. Computers & Education, 53(3), 775–786.
Topping, K. J. (2017). Peer assessment: Learning by judging and discussing the work of other learners. Interdisciplinary Education and Psychology, 1(1), 1–17.
Tripp, T. R., & Rich, P. J. (2012a). The influence of video analysis on the process of teacher change. Teaching and Teacher Education, 28(5), 728–739.
Tripp, T. R., & Rich, P. J. (2012b). Using video to analyse one's own teaching. British Journal of Educational Technology, 43(4), 678–704.
Tschannen-Moran, M., & McMaster, P. (2009). Sources of self-efficacy: Four professional development formats and their relationship to self-efficacy and implementation of a new teaching strategy. The Elementary School Journal, 110(2), 228–245.
Tschannen-Moran, M., Woolfolk Hoy, A., & Hoy, W. K. (1998). Teacher efficacy: Its meaning and measure. Review of Educational Research, 68(2), 202–248.
Valencia, S. W., Martin, S. D., Place, N. A., & Grossman, P. (2009). Complex interactions in student teaching: Lost opportunities for learning. Journal of Teacher Education, 60(3), 304–322.
Van Steendam, E., Rijlaarsdam, G., Sercu, L., & Van den Bergh, H. (2010). The effect of instruction type and dyadic or individual emulation on the quality of higher-order peer feedback in EFL. Learning and Instruction, 20(4), 316–327.
Van den Bogert, N., Van Bruggen, J., Kostons, D., & Jochems, W. (2014). First steps into understanding teachers' visual perception of classroom events. Teaching and Teacher Education, 37, 208–216.
Vogt, F., & Rogalla, M. (2009). Developing adaptive teaching competency through coaching. Teaching and Teacher Education, 25(8), 1051–1060.
Weber, K. E., Gold, B., Prilop, C. N., & Kleinknecht, M. (2018). Promoting pre-service teachers' professional vision of classroom management during practical school
training: Effects of a structured online- and video-based self-reflection and feedback intervention. Teaching and Teacher Education, 76, 39–49.
Wolff, C. E. (2015). Revisiting 'withitness': Differences in teachers' representations, perceptions and interpretations of classroom management. Heerlen, Netherlands: Open University of the Netherlands.
Wolff, C. E., Jarodzka, H., & Boshuizen, H. (2017). See and tell: Differences between expert and novice teachers' interpretations of problematic classroom management events. Teaching and Teacher Education, 66(1), 295–308.
Wolff, C. E., Van den Bogert, N., Jarodzka, H., & Boshuizen, H. (2015). Keeping an eye on learning: Differences between expert and novice teachers' representations of classroom management events. Journal of Teacher Education, 66(1), 68–85.
Wu, C.-C., & Kao, H.-C. (2008). Streaming videos in peer assessment to support training pre-service teachers. Educational Technology & Society, 11(1), 45–55.
Zottmann, J. M., Stegmann, K., Strijbos, J.-W., Vogel, F., Wecker, C., & Fischer, F. (2013). Computer-supported collaborative learning with digital video cases in teacher education: The impact of teaching experience on knowledge convergence. Computers in Human Behavior, 29(5), 2100–2108.