Approaches to Assessment in CLIL Classrooms: Two Case Studies
This is a pre-print and does not represent the final version. Please reference the published
version.
O'Dwyer, Fergus & de Boer, Mark (2015) Approaches to assessment in CLIL classrooms:
Two case studies. Language Learning in Higher Education, 5(2), 397–421. DOI:
10.1515/cercles-2015-0019
Fergus O'Dwyer and Mark de Boer
Approaches to assessment in CLIL classrooms: Two case studies
Abstract: This article presents two case studies that show how learner involvement and
collaboration in assessment are valid pedagogic tools to encourage learner reflection and
engagement, particularly where a very traditional approach to language learning is the norm.
The authors, who teach in universities in Japan, discuss different but related approaches to
assessment in CLIL (Content and Language Integrated Learning) classrooms. The first case
study describes how assessment in the classrooms in focus requires more engagement on the
part of learners as they must work things out for themselves. Collaborating with classmates,
which entails discussing assessment decisions, can foster language development. If learners
engage in informal discussion about their learning performance, they can review previous
learning, affirm progress, and make suggestions about future learning goals and how to
improve their learning outcomes. In the authors' view these processes help learners to develop
self-regulation and self-efficacy. The second case study involves students developing
collaboration skills during project work in which they are also involved in the assessment
process. They give slide and poster presentations, write reports, and analyze scientific
information while collaborating with their classmates. During this process they are also
responsible for self- and peer-assessment. As a result of their collaboration they acquire
language, but they also develop the ability to collaborate further. The article concludes by
briefly discussing learner involvement and collaboration, and the central role that feedback
practices can play in learning.
Keywords: assessment, Dynamic Assessment, CLIL, feedback, learner involvement, learner
collaboration
Fergus O'Dwyer: School of Foreign Studies, Faculty of Language and Culture, Osaka
University (Japan). E-mail: fodwyerj@gmail.com
Mark de Boer: Global Education Center, Iwate University (Japan). E-mail: mark@iwate-u.ac.jp
1 Introduction
Content and Language Integrated Learning (CLIL) is an educational approach in which
subjects such as agriculture or linguistics are taught to students through a language that is
neither their first language nor the dominant medium of instruction in the respective education
system (Dalton-Puffer and Nikula 2014). CLIL involves different ways of using language as
the medium of instruction, depending on the specific context. All forms of CLIL are generally
based on methodological principles established by research on "language immersion". CLIL
"can provide effective opportunities for pupils to use their new language skills now, rather
than learn them now for use later" (Commission of the European Communities 2003). In the
two case studies presented here, learners use English to engage with academic content
relevant to their major. Previous learning of English in this context has been typically based
on rote learning of the structures of the language (especially grammar) in order to pass
entrance examinations. In general the type of CLIL used in these contexts, when compared
with studentsâ previous learning experience, demands a wider and higher range of skills that
must be deployed and assessed flexibly. We believe that assessment practices should require
and encourage more engagement on the part of learners and that the onus should be on them
to work things out for themselves. In comparison to learning prior to these classes, learners
should apply more self-regulation strategies, and provide a greater amount of constructive
feedback in peer assessment. The practices adopted in the first case study (O'Dwyer)
encourage learners to develop a greater range of skills; in comparison to the process writing
classes they have attended previously, they are less guided and require more engagement from
learners. OâDwyer describes practices designed to involve learners, for example the provision
of constructive feedback. The second case study (deBoer) explains how Dynamic Assessment
(DA) and classroom practices that encourage learner collaboration are implemented in a
similar context and analyzes the discourse learners used for collaboration.
We begin the theoretical framework with an explanation of how learners can be
involved in assessment, before going on to the related topics of collaboration and DA in
classrooms. We also outline a general framework of feedback practices used in these contexts.
2 Theoretical framework
The two case studies presented share the same framework of good feedback practice, but in
the first, assessment is formative in nature and shaped by learning-oriented assessment
(LOA), while in the second, assessment is based on DA and student-student mediated
learning. While the two approaches have different starting points, they can produce similarly
positive results in terms of feedback and learning outcomes.
2.1 Assessment
Japan's approach to English language teaching and learning is a structural one, largely
concerned with the teaching of grammar (Kikuchi and Browne 2009; Yoshida 2003), which
makes standardized assessment easy to implement. Although there is a consensus that this
approach to English language teaching is not effective for communicative learning (Gorsuch
1998; Kikuchi and Browne 2009; Yoshida 2003), the requirements of the university entrance
exam in English trickle down and influence the English curriculum in junior high and high
schools (Bailey 1999; Murphey 2004). This "negative backwash effect" has stalled changes in
the curriculum at these levels of education (Kikuchi and Browne 2009). Higher education
need not be affected, but this language teaching approach is so engrained (Hino 1988) that
even at university level, a structural approach is still favoured for testing purposes: "Once a
practice is accepted as a tradition, it becomes a norm ... and the longer history the tradition
has, the stronger the norm is" (Hino 1988: 52). Implementing CLIL-type activities is a
deviation from this norm, and a change in teaching approach requires a change in assessment
methods, which can include bringing students into the assessment process. Assessment know-
how provides learners with opportunities to develop their skills in self- and peer-assessment
via learner-learner mediation in tasks designed to stimulate productive learning practices.
Assessment in a CLIL classroom cannot focus on language structure, since the lessons are not
concerned with learning about language, but "invite the learner to act primarily as a language
user" (Van den Branden 2006: 8). In the practices outlined below learners take an active
attitude to learning, independently understanding learning tasks (Dickinson 1995: 165) and
how they can be assessed. This article describes practices that encourage learners to adopt the
habits of autonomous learning.
2.2 Learner involvement in assessment
Classroom assessment has a great potential to enhance language learning. Gipps (1994) and
Tunstall and Gipps (1996) emphasize the importance of learner engagement in effective
classroom assessment. There is also a need to promote learner agency in collaborative
classrooms by developing skills of self- and peer-assessment (Rea-Dickins 2009: 267). In
order to involve learners in assessment, Carless (2009) advocates learning-oriented
assessment (LOA), which generally works toward developing learners' ability to self-assess,
and embodies principles such as: assessment tasks should be designed to stimulate productive
learning practices amongst students; assessment should involve students in actively engaging
with transparent criteria (e.g., effective presentation formats), quality (e.g., quality of
explanations and information provided in a presentation), and learners' own and/or their peers'
performances; and feedback should be timely and forward-looking so as to support current and
future student learning (i.e., feedback should focus learners on improvement in carrying out
current and future learning tasks). As O'Dwyer et al. (2013) suggest, in order to highlight
underlying problems and implement suggestions for improvement, assessment can perform
the roles of a decision-making tool which contributes to the learning task, a participatory
pedagogical process, and a means of informing the next learning task or stage of learning. If assessment
fulfils these roles, it can help learners to progress autonomously. Engaging with transparent
measures of quality and criteria (e.g., the can do descriptors of the CEFR) in a focused way
involves learners, for example, in defining what an effective presentation is, or how to agree
and disagree politely. This can be viewed as contextualized learning (Bateson 2000) as
opposed to learning through facts, or in an abstract way by concentrating on function and
form (e.g., grammar drills). Involving learners in assessment practice is a key element to
enhance their language development.
2.3 Understanding Dynamic Assessment and learner collaboration
Learner involvement is a focus of Dynamic Assessment, in which assessment and instruction
are seen as a single activity that seeks simultaneous diagnosis and promotion of learner
development by highlighting underlying problems and implementing procedures that produce
suggestions for interventions to facilitate improved learner performance (Lantolf and Poehner
2008: 273). The mediation in the classroom occurs primarily between students as they discuss
their content, and the role of the teacher is to facilitate learning by moving around the
classroom and working with students and groups. Through this interaction, teachers can
assess and assist students in their development (Poehner 2008). Learning to assess is part of
the mediation process. In language learning classrooms, even where DA is involved, learning
the language is usually primary and mediation serves the purpose of error correction (Lantolf 2006;
Poehner 2008; Poehner and Lantolf 2013; Swain et al. 2011). In the context we are
concerned with, however, the difference lies in the fact that the content is learner-generated in the target
language: the students are not learning about the language, they are learning language
through social interaction in the target language and through assessment which facilitates language
development. To implement assessment practices and to study their effects in a situation where
the learners are the mediators and the teacher acts only as a facilitator, assessment parameters
need to be clearly outlined. The learners can play a role in setting these parameters. This
process, and the resulting assessment, includes both informative and reflective gap-closing
practices.
2.3.1 Using Longacre's procedural discourse to understand Dynamic Assessment
In case study 2 Longacre's (1996) model of discourse structures is used to analyse the
structure of the learners' discourse. Longacre bases his model on the thesis that language is
language only in context, and his model provides us with the tools to analyse procedural
dialogue, which is discourse that discusses goal-oriented routines: how something should be
done or how it was done, such as an experiment procedure or a scientific journal article. The
focus is on the procedure and not on the person carrying out the procedure. The analysis used
in this context examines three discourse types (Table 1). The first is a request for assistance or
advice from a learner (self-driven (+) projection), the second is a suggestion for change from
a learner (feedback-driven (+) projection), and the third is discourse describing how
something was done ((-) projection).
Procedural discourse: (−) Agent Orientation (Agent); (+) Contingent Succession (temporal)

Procedural discourse     Projection
How-to-do-it             self-driven (+) projection (+SD)
                         feedback-driven (+) projection (+FD)
How-it-was-done          (−) projection (−P)

Table 1: Procedural discourse (adapted from Longacre 1996: 10)
The learners work together on a presentation. The uploaded PowerPoint file will be
labeled as an "improvable object" (IO) (Scardamalia and Bereiter 2006), as it will be
"improved" after group members make comments and suggestions for changes. To analyse
the stages of the IO, de Boer designed a process analysis tool (Figure 1), which is used to
visualize the learning process by following the location of the improvable object and how it is
moved towards completion.
Figure 1: Process analysis tool
2.4 Good feedback practices
When learners are involved in assessment in a CLIL classroom, they are involved in decisions
about how to learn and what to learn and why they are learning, and they are also actively
involved in decisions about the criteria for assessment and the process of judging their own
and othersâ work. As a result, their relationship to their studies will probably be qualitatively
different from that of students who are treated as recipients of teaching and who are the object
of others' unilateral assessment (McConnell 2006: 92). This is a tools-and-results process
(Newman and Holzman 1993), where the tools the learners create to learn about assessment
practices are the same tools used in the final assessment process.
Similar assessment frameworks include assessment for learning, the purpose of which is
to provide feedback to both the teacher and learner regarding the learner's progress towards
achieving the learning objective(s) (Black et al. 2003). The assessment frameworks applied in
these classrooms and other contexts share many similarities but are also different in various
ways. It is not the purpose of this article to dwell on these differences; rather we aim to
analyse practices in terms of how they impact on learning. For this purpose we use broad
principles of good feedback practice as defined by Nicol and Macfarlane-Dick (2006):
1. Facilitates the development of self-assessment (reflection) in learning.
2. Encourages teacher and peer dialogue around learning.
3. Helps clarify what good performance is (goals, criteria, expected standards).
4. Provides opportunities to close the gap between current and desired performance.
5. Delivers high quality information to students about their learning.
6. Encourages positive motivational beliefs and self-esteem.
7. Provides information to teachers that can be used to help shape the teaching.
These relatively simple principles will be explained in more detail as necessary. For the
purposes of this article we divide them into two groups: informative and reflective gap-closing
practices. Informative feedback practices involve the exchange of information to improve
learning (2, 3, 5 and 7), while reflective gap-closing practices involve developing assessment,
self-assessment and reflection abilities in order to progress learning (1, 4 and 6).
We now turn to the case studies.
3 The implementation of assessment practices in CLIL
3.1 Case study 1: Learner involvement in assessment
The starting point of the first case study is that CLIL, in comparison with students' previous
learning experience, requires controlled use of a wide range of relatively high cognitive skills,
such as discussing academic content in English. The context is classes taken by English
language majors in the School of Foreign Studies, Faculty of Language and Culture, Osaka
University, Japan. In their first two years of the degree, the learners take between five and
seven 90-minute English classes per week in a semester of 15 weeks.
Classes focus on spoken interaction, spoken production, writing, reading, and integrated
skills. In their second year students are introduced to LOA in O'Dwyer's process writing
class. By introducing them to goal-setting and reflective practices alongside self- and peer-
assessment, it is hoped that those practices can help to shape a forward-looking learning style.
These LOA practices are then adapted for CLIL-type classes (called "reading and discussion"
classes), which require skills such as analysing the content of academic papers.
The reading and discussion classes, taken in the students' last two undergraduate years,
typically involve individual projects that engage with academic content. There are several
topics available, ranging from the more general (e.g., media literacy, preparing for and
discussing successful starts to future careers) to the more specific (e.g., English
language writing from the outer (e.g., Africa, India) and expanding (e.g., Japan, China)
circles) (Kachru 1992). The classes follow a cycle of focused reading leading to an
assignment that synthesizes, analyses and presents research material; they are based on
materials used for undergraduate classes in inner circle countries and have a linguistic focus.
The class introduced here is a World Englishes (WE) class, as it provides a good example of
how materials are scaffolded to support the anticipated development of learners' cognitive
abilities in the L2. In the first semester, learners analyse and discuss the issues that have led to
the spread of English around the world. Reading and research presentations guide learners to
discover more about this topic. In the second semester, there is a specific focus on the
connection between identity and intelligibility and the use of WE. The semester begins with
focused reading leading to an assignment that synthesizes, analyses, and presents research
material. Important issues in the area of WE are then introduced. The final section of the
course is dedicated to researching, student-led presentations of the research, and debating
important issues relating to the content. The next part explains how assessment practices in
these classes are adapted so that they correspond to the development of the learners.
Whereas learners are guided through the practices they are expected to develop in
writing classes, the focus in the CLIL-type classes is on more independent learning of higher-
level academic skills (e.g., synthesizing and critically analysing detailed academic papers). It
can be argued that assessment (e.g., providing constructive feedback) needs to be appropriate
developmentally, in line with the aims of CLIL (to develop academic skills such as critically
analysing content). The pedagogical goals should be mirrored and facilitated in the
assessment practices, and assessment tools and self-regulation practices need to be developed
and adjusted by both learners and instructors. At the beginning of the year, the instructor
emphasizes that learners should work things out for themselves: rather than presenting them
with goals, learners are encouraged to decide on and work toward relevant goals from lists
extracted from the CEFR (Council of Europe 2001) (see Table 2).
Possible year goals
Goal-setting and Self-assessment Checklist, Skill: Reading for Information
and Argument
Use this checklist, from the illustrative scales in the Common European Framework,
to (a) set personal learning goals and (b) record your progress in achieving these goals.
Possible Evaluative Criteria: I can do this *reasonably well **well ***very well
Next goal    *   **   ***
B1 I can recognise significant points in straightforward articles on familiar
subjects.
I can identify the main issues, arguments and conclusions of specialised articles,
even though I can only understand the detail reasonably well.
B2 I can understand general articles and reports in which the writers adopt
particular viewpoints.
I can understand specialised articles, provided I can use a dictionary
occasionally to confirm terminology.
I can obtain information, ideas and opinions from specialised sources.
C1 I can understand in detail a wide range of lengthy,
complex texts, identifying finer points of detail
including attitudes and implied as well as stated
opinions.
Goal-setting and Self-assessment Checklist, Skill: Formal Discussion
Next goal    *   **   ***
B1 âI can follow much of what is said that is related to
my field, provided speakers avoid very idiomatic usage
and speak clearly.
âĄI can take part in formal discussion of familiar
subjects, such as the exchange of factual information,
receiving instructions or the discussion of solutions to
practical problems.
âłI can put over a point of view clearly, and can engage
in debate reasonably well.
B2 âI can keep up with lively discussion, identifying
accurately arguments supporting and opposing points
of view.
I can follow the discussion on matters related to my
field, understanding in detail of significant points.
âĄI can participate actively in routine and non routine
formal discussion.
âłI can express my ideas and opinions with precision,
presenting and responding to complex lines of
argument convincingly.
I can contribute, account for and sustain my opinion,
evaluate alternative proposals and make and respond
to hypotheses.
C1 ② I can easily keep up with debate, even on abstract, complex unfamiliar topics.
③ I can argue a formal position convincingly, responding to questions and comments
and answering complex lines of counter argument fluently, spontaneously and
appropriately.
Possible Presentation goals: B1 I can give a short and straightforward presentation
on a chosen topic in a reasonably clear and precise manner
B2 I can give clear, detailed descriptions, expanding and supporting ideas with
subsidiary points and relevant examples
C1 I can give a clear, well-structured presentation on a complex subject, expanding
and supporting points of view with appropriate reasons and examples
Table 2: Possible year goals
Typical year goals could be based around formative assessment on the following can-do
statements:
Reading (for information and argument): B2 – I can understand specialized articles,
provided I can use a dictionary occasionally to confirm terminology.
(Formal) Discussion: B2 – I can keep up with lively discussion, accurately identifying
arguments supporting and opposing points of view.
Presentation: B2 – I can give clear, detailed descriptions, expanding and supporting
ideas with subsidiary points and relevant examples (Council of Europe 2001)
The assessment of the presentation in semester 1 (research presentations on variation in
varieties of World English) is the first step in getting students to engage with flexible
assessment criteria. In classes taken by learners in the first two years of their degree, a
presentation is assessed using a rigid assessment grid that rates criteria on a Likert scale (see
Table 3).
Features of World English Varieties Presentation Assignment
Name: Variety of English:
"The information provided in the background was sufficient, with relevant supporting
ideas"
1 2 3 4 5 6 7 8
"The pronunciation and grammar examples were relevant and appropriate"
1 2 3 4 5 6 7 8 9 10
"The examples in the vocabulary section were relevant and appropriate"
1 2 3 4 5 6 7
"The presenter made a good effort in the communication style section"
1 2 3 4 5
"The explanations were clear, detailed and easy to understand"
1 2 3 4 5
"The poster was informative and visually impressive"
1 2 3 4 5
"The comprehension and discussion questions helped me understand"
1 2 3 4 5
"Overall impression of the effort, presentation and discussion"
1 2 3 4 5
Scale = *(reasonably well) **(well) ***(very well)
You can research detailed information on varieties of English *, ** or ***
You can present clear, detailed descriptions, with supporting ideas and relevant
examples *, ** or ***
You can participate actively in discussion on academic topics (e.g. World Englishes) *,
** or ***
Total: /50
What was done well:
Advice for improvement:
Table 3: Assessment rubric
In order to encourage more engagement, learners are presented with simplified criteria
and asked to give constructive criticism directly to presenters orally. These simplified criteria,
projected on a screen in the classroom, are as follows:
• Background: sufficient, with relevant supporting ideas
• Pronunciation, grammar examples: relevant and appropriate
• Explanations: clear, detailed and easy to understand
• Communication style: clear and easy to understand
• Poster: informative and visually impressive
The instructor uses the assessment rubric (Table 3) to generate specific feedback and for
assignment grading purposes, with a focus on giving feedback about what students are doing
well and what they can improve. The emphasis in the feedback practices of these
presentations is on learners generating constructive criticism of learning, and working
together to generate ideas on how to continue to improve. The simplified criteria are designed
to encourage learners to be more involved with assessment criteria in a cognitively
demanding way than has been the case in their previous learning experience. O'Dwyer
assumes that informative practices encourage learners to engage in dialogue providing
information about their peers' learning (principles 2 and 5). This information is based on
criteria that clarify the nature of effective language task performance (principle 3). These
exchanges are aimed at closing the gap between current and desired performance (reflective
gap-closing, principle 4). The CLIL-type classrooms encourage the command of higher skills
than previously, and the feedback practices encourage learners to integrate the controlled use
of a wide range of assessment, self-assessment and reflective abilities (principle 1). In the
courses taught in 2014, insufficient time and the decision to prioritize learner-learner
engagement were responsible for a lack of instructor-learner dialogue about learning
(principle 2), and feedback was limited to the exchange of a completed assessment rubric.
This was, however, one element marked for future improvement.
In order for reflection to be forward-looking, learners are asked questions such as: Can I
adjust my year goals? What other skills do I need to learn/improve? In this way they are
encouraged to develop a range of skills independently. It is hoped that this will impact on their
motivation because it is part of a process that allows them to verify improvement in their
learning through feedback on their learning efforts. This effort to promote self-efficacy has a
link to self-esteem (principle 6).
The tasks in the second semester involve reading and discussing texts about
sociolinguistic theories and interpreting academic papers based on these theories. This is the
first time the majority of learners have attempted such activities, and they are required to
engage with more wide-ranging assessment criteria, in comparison to previous learning. The
first learning stage in semester 2 involves reading about WE and identity, and then presenting
a poster about an academic paper (e.g., taken from the journal World Englishes). The learners
are required to (a) explain the identity/language situation in easy-to-understand but thorough
detail, and (b) interpret the situation through keywords/phrases (e.g., identity is performed
rather than possessed, or there are subtle messages in language that construct an image or
identity of the speaker).
The simplified criteria, presented on a screen in the classroom, are as follows:
• Situation: explained in easy to understand but thorough detail
• Centre of the presentation: emphasized appropriately
• Clear interpretations, and following explanations
• Discussion questions: encouraged lively discussion
• I have gained a greater understanding of the relevant issues
• Generally well prepared
These criteria generally work well when the learning goal is: "summarize and present research
in a clear and interesting manner, explaining all relevant issues in sufficient detail".
The assessment style is similar to that used in semester 1 and again emphasizes the
importance of learnersâ exchanging constructive, relevant feedback. Furthermore, learners are
encouraged to develop their skills in preparing for the presentation by engaging with the
assessment criteria and can do statements in more meaningful ways. For example, in the
following CEFR descriptors, underlined and italicized text identifies features that students can
usefully focus on.
Possible presentation goals:
B1 – I can give a short and straightforward presentation on a chosen topic in a
reasonably clear and precise manner
B2 – I can give clear, detailed descriptions, expanding and supporting ideas with
subsidiary points and relevant examples
C1 – I can give a clear, well-structured presentation on a complex subject, expanding
and supporting points of view with appropriate reasons and examples
These CEFR descriptors are presented alongside the task assessment criteria to encourage
learners to provide relevant feedback. At the end of presentations, after groups of learners
have discussed their peers' performance, presenters are encouraged to self-assess based on the
criteria. The results are then compared orally with the feedback of the instructor, who
provides written feedback. This time no rubric is used: just a simple form in which to provide
notes on what has been done well, and what can be improved (See Table 4).
Presentation Assessment
Presenter: Topic:
Points to consider when assessing the presentation:
• The situation was explained in easy to understand but thorough detail
• The centre of the presentation was emphasized appropriately
• The interpretations, and following explanations, were clear
• The discussion questions were well planned and encouraged lively discussion
• I have gained a greater understanding of the relevant issues
• Did I become interested in the topic after the presentation and discussion?
• Overall this was a well prepared presentation
• Evaluation of presentation skills, maybe talking about the following scales:
talking about the following scales:
B1 I can give a short and straightforward presentation on a chosen topic in a
reasonably clear and precise manner
B2 I can give clear, detailed descriptions, expanding and supporting ideas with
subsidiary points and relevant examples
C1 I can give a clear, well-structured presentation on a complex subject, expanding
and supporting points of view with appropriate reasons and examples
What was done well:
Advice for improvement/other comments:
Table 4: Simplified assessment criteria
This gives the learners a chance to process peer feedback, discuss previous learning,
look forward to future learning, discuss instructor feedback, and raise any emerging issues
with the instructor. O'Dwyer found that these practices were a marked improvement on
principle 2: learner-teacher dialogue about learning. The integration of learning scales
provides a focus for learning goals and task performance (principle 3), and generates more
information feedback (principles 2 and 5: learner dialogue providing information about peers'
learning). The processes of assessment again provide plentiful time for reflective gap-closing
learning). The processes of assessment again provide plentiful time for reflective gap-closing
feedback, in much the same way as in semester 1.
In the two assignments outlined above – research presentations on variation in WE and
poster presentations about an academic paper – learners work independently on assessing and
reflecting on predominantly CEFR-based goals. In the final task they are encouraged to look
forward to future learning by focusing on common goals in life that go beyond the CEFR and
reach into their careers (e.g., researching, teaching, and project management). In the final
assignment (teaching and debating topics in WE such as How can we specifically decide what
is an "error" and an "innovation"?), learners are encouraged to base their learning objectives
on goals such as:
• Researching: I can critically analyse and synthesize new and complex information
• Teaching: I can present language content in ways which are appropriate for specific
groups of learners
• Project management: I can plan a timescale for my objectives
As before, assessment criteria are presented simply, for example:
ď Handout and presentation: clarified the relevant issues
ď Debate questions and material: well planned and encouraged lively discussion
ď Presented in an appropriate manner
ď I became interested and gained a greater understanding of the issues
Learners are again encouraged to give constructive feedback. These goals are also presented
to encourage learners to look forward to future learning. The final reflection focuses on
questions such as: As a presenter, project manager, researcher and teacher, what have I
improved? What am I going to do with what I have learned? What are my future plans? As a
presenter, project manager, researcher and teacher, what can I improve?
The processes described here are based on the belief that, in order to develop self-
assessment and reflection, learners must engage in critically constructive assessment of their
peers' learning performance. The instructor's main purpose is to facilitate learning based on
reflective gap-closing and informative feedback, and an effective way of doing this is to
engage learners in informal discussion about information related to learning performances.
The assessment of these performances is based on clear and scaled criteria, and discussion
aims to review previous learning and give suggestions for the improvement of future learning.
This process provides learners with information that confirms their progress and helps them to
define future learning goals; it also has the potential to strengthen motivation and foster self-
efficacy.
3.2 Case study 2: Collaboration and responsibility through dynamic assessment
Case study 2 reports on a first-year general English course for students in the Faculties of
Engineering and Agriculture at Iwate University, Japan. The course is 90 minutes per week
for 15 weeks, and the design of the syllabus emphasizes studentsâ responsibility for their own
learning with the teacher acting as a facilitator. The students work in groups on a series of
projects and the discourse of their mediation can be used to evaluate their assessment of each
other and the effectiveness of this type of learning.
3.2.1 Curriculum content and context
A five-week section of this course will be used to illustrate how collaborative practices and
DA are implemented. During this section of the course students prepare a five-minute
PowerPoint group presentation and an assessment rubric that will be used to peer-assess the
presentations. The flow of the activities is outlined in Table 5. Students use the ICT contents
platform (deBoer 2011; deBoer et al. 2012) which was built using Moodle (Dougiamas 2011)
as the framework.
Week | Activity | Description
1 | Videos | Watching seven short videos and choosing one to do a presentation about. Groups are formed from the students' choices, with a maximum of six students per group.
2 | Video quizzes | Formative assessment quizzes to make the students aware of the vocabulary for each of the videos.
3 | Presentation preparation | Students prepare their presentation using PowerPoint. They use the online forums to prepare their scripts, share information and upload their presentation slides.
4 | Peer-assessment rubric | Students make a rubric that they will use for assessing their presentations, based on their own ideas of what makes a good presentation.
5 | Presenting and assessment | Students present and peer-assess one another.
Table 5: The flow of activities in context 2
Videos – In the first week, students watch short science-related videos that provide a
small amount of information about a topic. The videos only provide the students with a
starting point for gathering the information they will need to prepare a presentation on the
topic in question. Groups are formed when students choose the topic they would like to
present, and each group chooses a leader who is responsible for organizing the group,
scheduling, mailing, discussing problems with the teacher, and finalizing documents for the
presentation. The video content is discussed in the online forums, and during the first week
students begin to research their topics and post URL links, diagrams, and information into the
forums.
Video quizzes – For each of the videos, there is a short quiz and students are required to
complete the quizzes for all the videos. The assessment of the quizzes is formative and the
purpose is to make the students generally aware of what the other groups are presenting about.
During the second week, forum discussions continue.
PowerPoint slides – By the third week students are required to start uploading and
sharing their PowerPoint presentation slides within their group. When a student uploads a
PowerPoint presentation file, the other members of the group access it, view it, examine and
understand the content, evaluate and make corrections and comments and then re-upload it.
This content is shared within the group and students work to co-create their presentations,
adding to the content and correcting it, improving each version as it goes through this process.
Assessment rubric – In the fourth week, students begin to create an assessment rubric
based on their own ideas of what makes a good presentation. They share ideas and create the
rubric during class time, with the teacher available to answer questions. When the rubric is
complete, the teacher incorporates it into Moodleâs workshop function (Dougiamas 2011).
During the presentations, the students use the rubric to assess each other's presentations. The
grades they give do not count towards the final result, however; peer-assessment is used to
give students the experience of learning to prepare a presentation, understanding how it will
be evaluated by creating the evaluation criteria, and then taking part in the evaluation process.
Because students are aware of the assessment criteria they are able to evaluate the quality of
their own presentation and strive to improve it.
Category | Elements | Grade 0 | Grade 1 | Grade 2 | Grade 3 | Grade 4
Attitude | Eye contact, Movement, Gesture, Voice, Expression, Communication | No eye contact; not suggesting the state of the other party | Not much move | A presenter is a little active; standard | A presenter is very active and present in a clearly voice | A presenter try to communicate with the audience; the presentation is very interesting
Slide | Simple words, Simple screen, Use color | A lot of words; the image is very complicated; pale color word | A lot of picture | Statements fairly | The slide so simple that the audience can understand easily; easy to understand; vivid color word | Clearly
Group | Smoothness, Cohesiveness, Teamwork | Everyone apart; the audience can't feel cooperation of the group | Some member work hard; the other member don't work | Moderately together | All member work hard; there is unity | Proceed smoothly
Picture | Easy to look, Size, Graph | Small; confused | Clearly | Normal | Little easy to understand | Easy to understand; bright color; good animation; beautiful
Other | Speak clearly, Time allocation, Spelling, Grammar | Difficult to understand; many misses; not possible to allocate time | Some miss spelling, pronunciation, and grammar | Normal | Speak clearly moderate voice; a few miss | Possible to allocate time exactly
Table 6: Student-generated assessment rubric
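To make the peer-grading step concrete, the aggregation of rubric scores can be sketched as follows. The course itself used Moodle's workshop function for this; the code below is only our own standalone illustration, and the numeric grades are invented for the example (the category names come from Table 6, each graded 0–4):

```python
# Hypothetical sketch of peer-grade aggregation (the course used Moodle's
# workshop module; this standalone code only illustrates the idea).
# Each peer assessment maps a Table 6 category to a grade from 0 to 4.

CATEGORIES = ["Attitude", "Slide", "Group", "Picture", "Other"]

def average_scores(assessments):
    """Return the mean grade per category and the overall mean."""
    per_category = {}
    for cat in CATEGORIES:
        grades = [a[cat] for a in assessments if cat in a]
        per_category[cat] = sum(grades) / len(grades) if grades else None
    valid = [v for v in per_category.values() if v is not None]
    overall = sum(valid) / len(valid)
    return per_category, overall

# Invented example: three peers assess one group's presentation.
peer_grades = [
    {"Attitude": 3, "Slide": 4, "Group": 3, "Picture": 2, "Other": 3},
    {"Attitude": 4, "Slide": 3, "Group": 3, "Picture": 3, "Other": 2},
    {"Attitude": 3, "Slide": 3, "Group": 4, "Picture": 3, "Other": 3},
]
per_category, overall = average_scores(peer_grades)
```

Because the grades students give do not count towards the final result, such an aggregate would serve only as feedback to the presenting group.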
Figure 2: Student uploading a file to the forum
Students upload their PowerPoint slides into the forums to share them with their group
members as shown in Figure 2. They also add text into the forum area indicating what the file
contains and a request that it be checked by other members of the group. This is the
introduction of student-generated output in the form of document and text. The content of the
document is introduced through the text in the forum and this output indicates the learnerâs
current thinking process and level of understanding. This file is input for the other group
members to download, view, comprehend and evaluate. The next section describes a sample
of the process.
3.2.2 A sample of the process
Table 7 presents a sample of data, the file upload of one student (S7) and the dialogue that
ensues, in order to illustrate the use of the process analysis tool referred to in section 2.3.1
(Figure 1) and the type of dialogue that takes place between the students. The labels used are
as follows: G – group, S# – student, U – file upload, D – file download, -P – minus projection,
+SD – self-driven plus projection, +FD – feedback-driven plus projection.
Line | Student | File | Projection | Dialogue
1 | S7 to G | U | -P, +SD | Finished. Please inspect. And please point out anything anytime. There is a possibility that you and I picked up same picture. In that case, please make contact with me.
2 | S7 to G | - | +SD | Test my slide show. (moving smoothly, speed, and so on...) My contents for speaking, I will tell you near the day by using e-mail. (If I do so, other people can't look my speaking contents.)
3 | S2 to S7 | D | -P, +FD | Good morning S7! I watched your slide. I think Before and After photos distinction is difficult. I recommend you to devise to better illustrate the before and after.
4 | S5 to S7 | D | -P, +FD | Your work is very fast! It's a nice slide but I think you had better delete Japanese in pictures.
5 | S7 to S2/S5 | U | +SD | Thank you for testing and replying. Oh right... well, how about this one? I changed picture, and speed up of animation. I tested my slide show with speaking, it took 19 seconds!
6 | S5 to S7 | D | -P | It's very good! Thank you for listening to my opinion.
7 | S1 to S7 | D | -P, +FD | Good evening. I see this slide. It is a good. But I think this pictures is not easy to understand which is before or after. To change place of subtitle from left side to above or under, more easy to understand.
15 | S7 to G | U | -P | I heard showing our slide show with any click can't be acceptable. So, I changed my page. When you put in order, please use this.
Table 7: Excerpt of student 7's improvable object.
3.2.3 Process analysis tool
In this section the process analysis tool is used to analyse the data in Table 7 from the
perspective of good feedback practice.
In line 1, S7 uploads his slide for the first time and indicates this with Finished (-P). S7
requests advice from the other group members, indicated by +SD, and this begins the process
of driving his improvable object forward. In line 2, S7 is still working on the slide and lets the
group know what he is doing. This exchange is aimed at requesting feedback that will close
the gap between current and desired performance (principles 2 and 4). In line 3, S2 gives S7
feedback with +FD by suggesting changes to the slide. In line 4, S5 also suggests changes
with +FD. In both of these lines, the IO has begun the cycle of evaluation. This discourse is
âpushingâ the file towards completion, thus facilitating self-assessment and closing the gap
between current and desired performance (principles 1, 2, 3, 4, and 5). In line 5, S7 replies to
S2 and S5 in the group forum. Changes have been made to the slide and S7 requests
evaluation with +SD. The IO has started a second round through the cycle (principles 1, 2, 3,
4, 5, and 6). In line 6, S5 replies acknowledging the changes. At this point the slide could be
considered complete, but in lines 7 and 8, S1 and S6 provide feedback (+FD) indicating that
additional changes could be made. This feedback facilitates development, encourages peer
dialogue around learning, clarifies performance, closes the gap between current and desired
performance, and delivers information to students about their learning (principles 1, 2, 3, 4,
and 5). In line 15, S7 revises his slide by removing the animation and then requests that his
slide be incorporated with the rest of the slides and put in the correct order. At this point S7âs
individual slide is complete. Line 16 has been added here to show that the individual slides
have now been combined in one complete file and have been uploaded, but the word
âprobablyâ indicates that even though the files have been put together in the correct order
changes may still be required (+SD). This now changes from individual slides into a group
slide file and from this point group members may indicate through feedback-driven
+projection any changes that need to be made. At this point the group file has re-entered the
cycle as a new file and is waiting to be downloaded and evaluated. There may be evidence
that the creation of the PowerPoint slides is influenced by the assessment rubric and studentsâ
realization that their slides will be assessed using that rubric.
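The coding scheme can also be expressed as data. The sketch below is our own illustration, not the actual process analysis tool: it encodes the first moves from Table 7 and tallies projection types per student, which makes visible, for example, how S7 repeatedly drives the improvable object forward with +SD while peers respond with +FD:

```python
# Sketch of the Table 7 coding scheme: each forum move records a student,
# an optional file action (U = upload, D = download) and one or more
# projection codes (-P, +SD, +FD). Tallying codes per student is our own
# illustration of how the process analysis data can be summarized.
from collections import Counter

moves = [
    {"line": 1, "student": "S7", "file": "U", "codes": ["-P", "+SD"]},
    {"line": 2, "student": "S7", "file": None, "codes": ["+SD"]},
    {"line": 3, "student": "S2", "file": "D", "codes": ["-P", "+FD"]},
    {"line": 4, "student": "S5", "file": "D", "codes": ["-P", "+FD"]},
    {"line": 5, "student": "S7", "file": "U", "codes": ["+SD"]},
    {"line": 6, "student": "S5", "file": "D", "codes": ["-P"]},
    {"line": 7, "student": "S1", "file": "D", "codes": ["-P", "+FD"]},
]

def tally_projections(moves):
    """Count projection codes per student: +SD shows who drives the
    improvable object forward, +FD who responds with feedback."""
    counts = {}
    for move in moves:
        counts.setdefault(move["student"], Counter()).update(move["codes"])
    return counts

projection_counts = tally_projections(moves)
```

A teacher scanning such tallies could spot, at a glance, group members who never produce self-driven projections and may need prompting.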
3.2.4 Good feedback practices
When learners receive feedback from other learners (principle 2), it helps them to reflect on
what they have done, and this provides information about their learning and the standard the
other group members are expecting (principles 1, 2, 3, 4, and 5). The feedback in the
discourse shown here gives the learner opportunities to assess and edit what he has done,
based on peer feedback (principle 1). Students in a group setting need to adhere to the
expected standards of the group, and the feedback gives them opportunities to close the gap
between their current performance and the desired performance (either their own desired
performance or the level desired by the group) (principles 3 and 4). In the dialogue analysis,
there were no instances of students attacking their peers; the discourse was all group-oriented
– we must do this, we need to improve our slides – which encourages motivation (principle 6).
From the dialogue, teachers can observe gaps in studentsâ thought processes and assist them
with ideas for new direction or for rethinking some of their current ideas (principle 7).
To recap from the perspective of good feedback, we can look at the practices in case
study 2 and associate dialogue with each practice.
1. Facilitates the development of self-assessment (reflection) in learning: student-
student dialogue that provides information about what should be changed helps
students reflect on what they have done and what they need to do to improve their
contribution to the group.
2. Encourages teacher and peer dialogue about learning: the activity in itself encourages
peer dialogue, in order to push the IO forward. This dialogue constitutes the learning
process.
3. Helps clarify what good performance is (goals, criteria, expected standards): the
feedback provided lets the students know what the expected standards are and what
needs to be done to reach the goals for the group. Each time the IO is submitted, the
group membersâ feedback clarifies the standards further.
4. Provides opportunities to close the gap between current and desired performance:
this is part of the process and the dialogue occurring between students is all centred
on closing the gap, as each line of feedback is acted upon and the IO is pushed closer
towards completion based on the standards of the group.
5. Delivers high quality information to students about their learning: peer feedback
provides clear information about what needs to be done and how to go about doing it.
This is evident throughout the dialogue in the forums.
6. Encourages positive motivational beliefs and self-esteem: as stated, all feedback was
encouraging and attacks on members of the group did not occur. Priority was given
to achieving the group's goal, to create a final presentation using the IO. This priority
is reflected in students' positive attitudes towards each other.
7. Provides information to teachers that can be used to help shape their teaching:
teachers examining the process can see how effective student learning is and whether
changes to the course would be likely to facilitate a higher level of agency.
4 Discussion and conclusion
There are differences between the practices described in the two case studies. The first
focuses on learner involvement in assessment, while the second focuses on learner
collaboration, which involves assessment decisions. However, in both studies classroom
processes facilitate learner-learner reflective gap-closing and informative feedback in various
ways. In the first case study, learners take CLIL-type reading and discussion classes after
taking process-writing classes. By comparison with their earlier learning experience, these
classes are less guided and require a wider and higher range of skills like interpreting
academic papers on sociolinguistic theories. The assessment of these activities should elicit
more engagement from learners because it requires them to work things out for themselves
and gives them a greater degree of control over the skills they use. Compared with previous
classes, assessment is less teacher- and grade-led, and there is more constructive discussion
about learning among learners. Assessment demands progress in line with the linguistic and
critical thinking demands of the classroom tasks; critical assessment skills are developed
partly as a result of the incremental cognitive load demanded by CLIL-type activities. In this
article, we generally argue that assessment in such classrooms should develop critical
assessment skills at a deeper cognitive level. This is consistent with the aim to develop the
academic skills needed for a critical analysis of content. The main purpose of learner
assessment tasks is to encourage reflective gap-closing and informative feedback, by engaging
in informal discussions about information related to learning performance. By reviewing past
learning and giving suggestions about future learning, learners exchange information that
confirms their progress and helps them to identify relevant future learning goals. This can
foster both motivation and self-efficacy.
In the second case study, content and assessment are primarily learner-generated.
Students' output, input and reflection constitute a cyclical process in which content provided
by learners for learners facilitates the development of self-assessment skills and generates
dialogue about the learning process. This dialogue clarifies standards, closes the gaps between
current and desired performances, encourages peer development and motivates students. It
makes language use more authentic because the core of learning is not language but content:
students begin to appropriate the language by using it as a tool. By analysing learner dialogue
the teacher can identify areas that may need to be adjusted in order to facilitate a higher level
of learner agency.
The practices discussed in this article involve learners in assessment decisions and help
them to use their new language skills effectively by critically engaging with language content.
On a more general level, autonomous learning has not been highlighted in previous classes
taken by these learners, and the practices they now encounter encourage them to engage
actively with learning and assessment. We hope that readers consider how such practices can
be integrated into their classrooms.
References
Bailey, Kathleen M. 1999. Washback in language testing. Princeton, NJ: Educational Testing
Service.
Bateson, Gregory. 2000 [1976]. The logical types of learning and communication. In Gregory
Bateson, Steps to an ecology of mind: Collected essays in anthropology, psychiatry,
evolution, and epistemology, 279–308. Chicago: University of Chicago Press.
Black, Paul, Chris Harrison, Clare Lee, Bethan Marshall & Dylan Wiliam. 2003. Assessment
for learning: Putting it into practice. Maidenhead: Open University Press.
Carless, David. 2009. Learning-oriented assessment: Principles, practice and a project. In
Luanna H. Meyer, Susan Davidson, Helen Anderson, Richard B. Fletcher, Patricia M.
Johnston & Malcolm Rees (eds.), Tertiary assessment and higher education student
outcomes: Policy, practice and research, 79–90. Wellington, New Zealand: Ako Aotearoa.
Commission of the European Communities. 2003. Promoting language learning and
linguistic diversity: An action plan 2004–2006. http://www.saaic.sk/eu-label/doc/2004-06_en.pdf (accessed 25 July 2015).
Council of Europe. 2001. Common European Framework of Reference for Languages:
Learning, teaching, assessment. Cambridge: Cambridge University Press.
Dalton-Puffer, Christiane & Tarja Nikula. 2014. Content and language integrated learning.
The Language Learning Journal 42(2). 117–122.
deBoer, Mark. 2011. ICT contents project at Iwate University. In Alison Stewart (ed.),
JALT2010 conference proceedings, 311–319. Tokyo: JALT.
deBoer, Mark, Natsumi Onaka & Takahiro Nakanishi. 2012. English ICT contents program
development through collaboration at Iwate University. In Alison Stewart & Naoko
Sonda (eds.), JALT2011 conference proceedings, 229–240. Tokyo: JALT.
Dickinson, Leslie. 1995. Autonomy and motivation: A literature review. System 23(2).
165–174.
Dougiamas, Martin. 2011. Moodle. http://www.moodle.org (accessed 20 July 2015).
Gipps, Caroline V. 1994. Beyond testing: Towards a theory of educational assessment.
London: Falmer Press.
Gorsuch, Greta J. 1998. Yakudoku EFL instruction in two Japanese high school classrooms:
An exploratory study. JALT Journal 20(1). 6–32.
Hino, Nobuyuki. 1988. Japan's dominant tradition in foreign language learning. The Japan
Association for Language Teaching Journal 10(1). 45–55.
Kachru, Braj. B. 1992. The other tongue. Oxford: Pergamon.
Kikuchi, Keita & Charles Browne. 2009. English educational policy for high schools in
Japan. Regional Language Center Journal 40(2). 172–191.
Lantolf, James P. 2006. Sociocultural theory and L2: State of the art. Studies in Second
Language Acquisition 28(1). 67–109.
Lantolf, James P. & Matthew E. Poehner. 2008. Sociocultural theory and the teaching of
second languages. London: Equinox Publishing.
Longacre, Robert E. 1996. The grammar of discourse. New York: Plenum Publishing
Corporation.
McConnell, David. 2006. E-learning groups and communities. New York: Open University
Press.
Murphey, Tim. 2004. Participation, (dis-)identification, and Japanese university entrance
exams. TESOL Quarterly 38(4). 700–710.
Newman, Fred & Lois Holzman. 1993. Lev Vygotsky: Revolutionary scientist. New York:
Routledge.
Nicol, David J. & Debra Macfarlane-Dick. 2006. Formative assessment and self-regulated
learning: A model and seven principles of good feedback practice. Studies in Higher
Education 31(2). 199–218.
OâDwyer, Fergus, Alexander Imig & Noriko Nagai. 2013. Connectedness through a strong
form of TBLT, classroom implementation of the CEFR, cyclical learning, and learning-
oriented assessment. Language Learning in Higher Education 3(2). 231–253.
Poehner, Matthew E. 2008. Dynamic Assessment: A Vygotskian approach to understanding
and promoting L2 development. Norwell, MA: Springer.
Poehner, Matthew E. & James P. Lantolf. 2013. Bringing the ZPD into the equation:
Capturing L2 development during Computerized Dynamic Assessment (C-DA).
Language Teaching Research 17(3). 323–342.
Rea-Dickins, Pauline. 2009. Classroom-based assessment. In Nancy H. Hornberger & Elana
Shohamy (eds.), Encyclopedia of language and education, vol. 7: Language testing and
assessment, 257–272. New York: Springer.
Scardamalia, Marlene & Carl Bereiter. 2006. Knowledge building: Theory, pedagogy, and
technology. In Keith Sawyer (ed.), Cambridge handbook of the learning sciences, 97–118.
New York: Cambridge University Press.
Tunstall, Pat & Caroline V. Gipps. 1996. Teacher feedback to young children in formative
assessment: A typology. British Educational Research Journal 22(4). 389–416.
Van den Branden, Kris. 2006. Task-based language education: From theory to practice.
Cambridge: Cambridge University Press.
Yoshida, Kensaku. 2003. Language education policy in Japan: The problem of espousing
objectives versus practice. Modern Language Journal 87(2). 290–292.
Bionotes
Fergus O'Dwyer
Fergus O'Dwyer is currently based at Osaka University. His interests include assessment,
classroom decision making and negotiation, World, Dublin and Irish Englishes, the European
Language Portfolio, and the pedagogy of introducing World Englishes in the ELT classroom.
Mark deBoer
Mark deBoer is an academic researcher at Iwate University in Japan, and a PhD candidate at
the University of Birmingham. His research interests are Dynamic Assessment and tool-
mediated learning and he is currently developing a new model for Dynamic Assessment. He is
a semi-professional cellist and enjoys road cycling.