SlideShare a Scribd company logo
1 of 53
Download to read offline
PBM4029ononaji21809224 Page 1
PORTFOLIO/ASSIGNMENT COVER SHEET
Programmed/Pathway MA EDUCATION
Module Start Date:
Module Code: PBM4029
Module Title: THE IMPACT OF EDUCATIONAL PRACTICE
Student's Name: AUGUSTINE CHIDI ONONAJI
Student ID Number
21809224
Tutor's Name HUGH SLOAN
Latest submission
Date:
12TH
JANUARY 2012
Actual submission
Date:
12TH
JANUARY 2012
Received by:
Date Received:
………………………………………………………………………
………………………………………………………………………
Instructions:
Please submit your portfolio electronically – you will receive e-notification that we have received it
Students are required to complete this coversheet and attach it to their assignment prior to handing in
The deadline for your submission is 4.15pm on or before the due date
On submission staff receiving the assignment will date and sign this form
Students should retain the student copy as a receipt for their work
If you wish a receipt for posted assignments, please include an SAE
Professional
Development
PBM4029ononaji21809224 Page 2
THE IMPACT OF ASSESSMENT FOR LEARNING IN SCHOOLS:
WHAT IMPACT DOES ASSESSMENT FOR LEARNING HAVE ON LEARNERS AND
PRACTITIONERS IN THE CLASSROOM WHEN EMBEDDED IN PRACTICE?
A CASE STUDY BASED ON MY EXPERIENCE AS A TEACHER AND HEAD OF ICT
DEPARTMENT
Submitted in part fulfilment of
the requirements of the
MA in Education
at Edge Hill University
Augustine Chidiegwu Ononaji Student ID: 21809224
PBM4029ononaji21809224 Page 3
TABLE OF CONTENTS
Acknowledgement ……………………………………………………………..4
Introduction……………………………………………………………………5-6
Rationale ……………………………………………………………………….7-
Research questions……………………………………………………………8
Literature review………………………………………………………………..9-13
Research methods……………………………………………………………..14-21
Research ethics …………………………………………………………………22-22
Findings Analysis and discussion………………………………………………23-31
Conclusion …………………………………………………………………………32-34
Appendices (Supporting evidence)
Appendix 1…………………… (Extract from college Ofsted inspection report) ….. pg. 35
Appendix 2…………… (Evidence of request for consent and college authorisation).. pg. 36
Appendix 3………………… (Participants consent form) ………………………………..pg. 37
Appendix 4………………………… (Evidence of monitoring sheet) …………………..pg. 38
Appendix 5………………… (Extract from department achievement reviews)………...pg. 39-41
Appendix 6………………………. (Research action plan) ……………………………….pg. 42
Appendix 7………………………. (Lesson observation commentary) ………………….pg. 43-44
Appendix 8……………………… (Student questionnaires and feedback) ……………..pg.45
Bibliography………………………………………………………………………..pg. 46-51
PBM4029ononaji21809224 Page 4
Acknowledgement
I am using this opportunity to express my gratitude to everyone who supported me throughout the course of my
MA in Education project. I am thankful for their aspiring guidance, invaluably constructive criticism and advice
during the project work. I am sincerely grateful to all for sharing their truthful and illuminating views on a
number of issues related to the project.
I express my warm thanks to my personal tutor, Mr. Hugh Sloan whose contributions in stimulating suggestions
and encouragements helped me coordinate my project.
Furthermore I would also like to acknowledge that with much appreciations the crucial role of the Principal of
Haringey Sixth Form Centre June Jarrett (my mentor) for her support and guidance through mentoring and
coaching in my role as the PAM for ICT department in her organisation, that I used as a case study in this
research.
In addition, I would like to express my special thanks of gratitude to my colleague Haileselassie Girmay who
had helped me in putting the project together.
Last but not the least, many thanks go to my dear wife Lilan and our children; Chizaram, Chigadi, Chinomso
and Kosimasichi for their kind co-operation and encouragement throughout the programme.
PBM4029ononaji21809224 Page 5
INTRODUCTION
According to Kirton, et al. (2007) in the late1990’s formative assessment was not a priority for many
teachers and policy makers in England and Scotland. This was because the emphasis was on
summative assessment in the form of national tests, formal examinations and the pressure of
frequent inspection and league table performance by the government. The interest in formative
assessment was triggered by two publications namely, ‘Assessment and classroom learning’ (Black &
William, 1998a) and, in more accessible form for teachers and policy makers, Inside the black box
(Black & Wiliam, 1998b). The latter having sold over 50, 000 copies indicated a healthy interest in
formative assessment, and subsequently gave rise to other researchers and groups such as the
King’s College London team (Black et al, 2002, 2003) to produce publications/guidelines on using
formative assessment in the classroom. The team in their work argued that Assessment for Learning
and Summative Assessment were used interchangeably and shared extended definition. Other
researchers like McDowell et al (2009) and QIA (2008) view Summative Assessment as a key part of
Assessment for Learning and key stage 3 strategy.
Since the publications of Black and Wiliam (1998b), Black et al, (2002, 2003), ARG (2009) and
others, AFL has become ‘a free brand name to attach to any practice’ Black (2006:11).
The official endorsement of AFL in Britain as good practice for improving teaching and learning in the
classroom is evidenced at various levels. For instance, in 2008 the Minister of States for Schools
encouraged all schools in England to use AFL to improve teaching and learning in schools. The
Minister reported that ‘the Government had invested £150 million over a period of three years for
continuing professional development for teachers in assessment for learning’ (DFCSF (2008:1).
Similarly, many policy-makers and educational institutions since then have developed programmes
and resources to support practitioners seeking to improve their teaching and learning practices.
Accordingly, many Universities and Colleges in UK are now providing internal and external CPD
PBM4029ononaji21809224 Page 6
programmes for practitioners while teacher training agencies have made AFL a core curriculum for
trainee teachers.
Black et al. (2004) defined formative assessment as:
Assessment refers to all those activities undertaken by teachers, and by the students in
assessing themselves, which provide information to be used as feedback to modify the
teaching and learning activities in which they are engaged. Such assessments become
formative assessment when the evidence is actually used to adapt teaching to meet the needs
(Black et al., 2004 p. 10).
Indeed, Hargreaves (2001) cited in ‘UKEssays’ website argues that:
Assessment for Learning is the beginning of a ‘revolution’ in education and the key driver of the
convergence between curriculum, assessment and pedagogy because it undermines the old
conception that assessment is something that follows teaching and learning (UKEssays).
Similarly, Assessment Reform Group (2009) highlighted the significance of AFL by stating that AFL
has become a proxy measure that is used to facilitate judgements on the quality of most elements of
educational systems which has been largely welcomed by teachers, head teachers, school, support
services, local authorities and even the government itself.
In the light of these developments, and the increasing emphasises on the benefits of AFL, as a newly
appointed head of a department in a school in London that needed to improve learners’ outcomes,
my tasks was to explore the potential use of AFL to improve the quality of teaching and learning
experiences in the ICT department I manage.
The aims of this research were first, to improve professional practice through a practice based
enquiry. Second, to explore the relationships between theoretical and practical knowledge, and
critically evaluate the impact on practice through a methodical investigation and collection of work
based evidence and wider professional context. The objective therefore was to plan, implement, test
and evaluate the impact of Formative Assessment (AFL) on practice using a case study research
method.
PBM4029ononaji21809224 Page 7
THE RATIONALE
In accordance with BERA (2011), Data Protection Act (1998), and for privacy and confidentiality
reasons, pseudonyms have been used throughout this enquiry. Hence, all identities used in the
supporting documents and the evidence attached in the appendices are pseudonyms.
For the past six years, I have worked as a teacher of ICT and the Head of the department in
Reflective Sixth Form College, which is located in a deprived part of London engulfed with
socioeconomic difficulties. Reflective college is inclusive, hence caters for young adults aged 16 to19
from different ethnic origins, able and disabled with mixed ability in attainment. The institution is made
up of six faculties with the population of one thousand and two hundred students. The management
structure includes a principal, three vice principals, six heads of faculties and ten programme area
managers. My role is in the faculty of business and ICT as a teacher of ICT and a programme area
manager. The faculty provides A level and vocational courses from entry 3 level to 3 levels
qualifications in ICT and business studies. Currently, there are eleven members of teaching staff and
350 mixed ability students in the faculty. The department was graded a grade 3; requiring
improvement in the last Ofsted inspections in 2012.
My rationale for undertaking this study is threefold. First, to improve practice, enhance autonomy and
motivation of learning in the classroom and develop more research and reflective practice skills;
hoping that on completion I will become a more competent and reflective person in my approach to
addressing issues and engaging in wider discussions. Second, as a teacher I want to use the skills
develop through the enquiry to promote good teaching and learning practices across the department
with the assumption that it will help improve the quality of teaching and learning as well as raising
learners’ outcomes as recommended by Ofsted in 2012 feedback. Ofsted advised the college to:
Continue to raise achievement by: − increasing the average grade and the proportion of
higher grades gained at A level and in BTEC Level 3 qualifications, so that all aspects of
attainments are firmly in line with national averages− ensuring that greater proportions of
students complete the courses that they start, consistently across all subjects and curriculum
areas (College Ofsted report, 2012:5) (see appendix 1).
PBM4029ononaji21809224 Page 8
Lastly, to accomplish a performance management target agreed during an annual review with my line
manager. It became crucial that I must research and pilot the use of formative assessment
techniques in the department. My rationales for embarking on this study are both personal and
professional thus, aided the formulation of the research questions I had designed below.
RESEARCH QUESTIONS
1. What impact does AFL have on [my] students, is there evidence that embedding AFL in
practice raises the standards of learners and the quality of teaching in the classroom?
2. Is there evidence that there is room for improvement in implementing AFL in the classroom, if
so what is the evidence and how can it be improved?
3. What were the impacts of this study on my practice as a teacher, a manager and my school?
PBM4029ononaji21809224 Page 9
ASSESSMENT FOR LEARNING: THEORY AND LITERATURE REVIEW.
Black and Wiliam (1998) claim that firm evidence indicates that formative assessment should be an
essential component of classroom work and that its development can raise standards of learners’
achievement. They ‘point out that they know of no other way of raising standards for which such a
strong prima facie case can be made than the use of formative assessment’(p:1). The authors, in
presenting their case for formative assessment, likened the classroom as ‘a black box’. In doing so,
suggested that the current approaches to improving achievement do not produce the desired results.
They argue that whatever is happening in the classroom (that black box) is affecting the output. Black
and Wiliam see formative assessment as the tool that lies at "the heart of effective teaching."
Miller (2005) reported that since the pioneering work of Black and Wiliam, formative assessment has
attracted increasing interest as a topic of research, classroom practice and educational policy due to
its potential of guiding teaching and learning process.Two most noticeable publications were
produced by the Assessment Reform Group (2002) and DCSF (2008). The group recommendations
on how to improve Assessment for Learning in the classroom were widely adopted and a revised
version of Ten Principles was incorporated into the Assessment for Learning Strategy (DCSF, 2008).
Assessment Reform Group (2009) adds that formative assessment represents a fundamental change
from the test and examination results that were predominantly used to determine what a pupil knew
and understood of a subject.
Furthermore, The Quality Improvement Agency for Lifelong Learning (QIA) (2008) reported that
Formative assessment is part of Assessment for Learning. They stated that:
Assessment for learning-sometimes called formative assessment- involves teachers and learners
using assessment to improve learning. It is about assessing progress and analysing and feeding
back the outcome of that assessment positively and constructively to:
 Agree actions to help the learner improve
 Adapt teaching methods to meet the learner’s identified needs (QIA, 2008:2).
QIA went further to argue that Assessment for Learning demonstrates a particular view of learning
which could enable all learners to gradually make progress and achieve their full potential. The group
PBM4029ononaji21809224 Page 10
confirms that Assessment for Learning can take place in teaching and learning sessions, through
written and verbal feedback and as part of a review, a target setting and an action planning. Teachers
are encouraged to use a range of approaches such as teacher-led assessment, learner-led
assessment, peer assessment and computer-based assessment to check learning and assess
progress in a classroom:
Effective checking for learning enables learners to be involved in the assessment process and
to make sense of what they are learning and how it relates to and builds on what they already
know (QIA, 2008:5)
Similarly, Black and Wiliam (1998a) argue that teachers need to have a deep understanding of
formative assessment to enable them employ strategies to assist students to identify the gaps
between their present achievements and the desired goals. Sadler (1989) cited in (Black 2000) states
that formative assessment should equip students with essential tools to manage their own learning.
This theory compliments the publication of the Assessment and Reporting Unit (2005:2) which claims
that:
Meaningful learning occurs when learners are actively involved and have opportunity to take
control of their own learning. This means that teachers should provide sensitive and
constructive feedback to students and use assessment practices that encourage self-
assessment and metacognition.
In addition, Bransford, et al. (2000) argue that learners construct knowledge and understanding on
the basis of what they already know and believe. Hence, teachers should use dialogues, questions
and answers to establish learners’ prior knowledge and monitor learners’ changing conceptions as
teaching and learning progresses in the classroom. They also state that assessment should help
expose students’ thinking processes to themselves and their teachers and that appropriate feedback
throughout the learning/teaching process should enable students to modify and refine their thinking.
Hayward and Hedge (2005:69) cited in Assessment and Reporting Unit (2005:4) report that data
emerging from research on AFL in Scotland ‘suggests that teachers not only find their involvement
energizing, but that they report positive changes in the quality of pupils’ work and commitment to
learning’. This research indicates that the implementation of formative assessment strategies aided
PBM4029ononaji21809224 Page 11
teachers and students and improved students learning and teachers’ satisfaction. Hayward and
Hedge argue that Black and Wiliam have ‘helped to take the emphasis in formative assessment
studies away from systems, its formative-summative interface, and relocate it on classroom
progresses’ (Black and Wiliam 2003:628). Hayward and Hedge claimed that the renewed emphasis
on pedagogy and assessment practices focusing on learning is central to the quest for improved
outcomes for all students. Hattie (2002) cited in the QIA (2008) states that feedback has more impact
on learning than any other general factor; but it requires an activity and a product. He also advises
that teachers should see feedback as a reflection of their expertise as teachers and not only as giving
information about their students and their grasp of the subject:
Giving feedback on learning errors and getting the learner to correct them as identify strategies
to improve future work is directly linked to significant improvement rates (Hattie. 2002) cited in
QIA (2008:4).
Butler (1998) states that using constructive comments in practice leads to improved performance by
33%, Butler also warns that marking using grades can have a negative effect on learner performance,
particularly for low achievers.
Black and William (2003), point out that Assessment for Learning includes summative assessment.
They state that the use of formative assessment does not preclude the use of summative
assessment. They argue that when summative assessment is aligned to curriculum and the students’
learning experiences, then it becomes integrated into the learning and assessment cycle and feeds
into improving students’ learning rather than just measuring.
There is need to align formative and summative work in new overall systems, so that teachers’
formative work would not be undermined by summative pressure, and indeed, so that
summative requirements might be better served by taking full advantage of improvements in
teachers’ assessment work (Black and Wiliam, 2003:623-4)
Similarly, other researchers and authors have dealt with the relationship between formative and
summative assessment methods and the underlying tensions. Brigggs (1998:105) states ‘Sensible
educational models make effective use of both FA (formative assessment) and SA (summative
assessment)’. Stiggins (2002) makes reference to the importance of both Assessment of Learning
PBM4029ononaji21809224 Page 12
(summative) and Assessment for Learning. Although Stiggins may not equate Assessment for
Learning to formative assessment, he suggests that Assessment for Learning goes further and it
involves the student in the process. He therefore argues that formative and summative assessments
are important in teaching and learning process and should be integral to the learning and teaching
cycle.
Although, Assessment for Learning has now become part of primary and secondary strategies for all
schools in England (Ofsted 2008), it is not without criticisms. Pollard (2008) argues that AFL helps
identify ‘bright and weak learners’, with consequent implications for the self-image and social status of
both groups. He demonstrated that there is research evidence that the assessment process itself can
place children under enormous pressure and can have a negative effect on children’s perceptions of
themselves and their peers and can reduce an individual’s self-esteem and motivation. He also
argues that high-attaining peers may become the subject of bullying as a consequence of good
results. Similarly, Brown and Smith (1997) cited in Brown (2004-5) argue that for assessment to be
effective in practice, it needs to be ‘fit-for-purpose’; that is it should enable evaluation of the extent to
which learners have learnt and the extent to which they can demonstrate that learning.
Kirton et al, (2007) argue that Formative Assessment ‘plays a crucial role in raising standards through
giving students a clear sense of themselves as learners, the goals they are trying to achieve and how
to reach them’ (p:607). They also stated that the overall criticism of AFL is dependent on its weak
implementation in the classroom. First, they argue that weak practice of AFL encourages superficial
and rota learning in the classroom thus limiting students from making progress.
Second that some teachers do not generally review the assessment questions they use and do not
discuss them critically with peers, which leads to little reflection on what is being assessed.
Third, they argue that the grading function is over emphasised whilst the learning function under-
emphasised.
PBM4029ononaji21809224 Page 13
Fourth, that there is a tendency to use normative rather than a criterion approach, which emphasises
competition between pupils rather than personal improvement of each. Black and Wiliam (1998)
debated that such practice is hostile to weaker students that lack the ability leading to de-motivation
and lose of confidence in their own capacity to learn.
Furthermore, Hodgson and Pyle (2010) argue that AFL has many generic features but some features
can be specifically honed for science teaching and learning. They argued that the classroom climate
is an important factor that enable learning in science and should be recognised as one of those AFL
techniques identified by Black and Wiliam. They contended that it is crucial that a non-threatening
environment and a reciprocal interaction is established in classroom for pupils to feel able to express
their ideas and allow the teacher to establish what the pupils know in order to develop teaching that
will move their understanding on.
Similarly, Kyriacou (2007) suggested that the use of AFL could prove problematic to practitioners due
to inadequate time and management issues. He stated that teaching involves a variety of tasks such
as; lesson planning, making use of appropriate resources, sharing data, marking, meeting parents
and liaising with colleagues, etc. in a short period of time. He therefore believes that these processes
take time to accomplish hence require caution and concentration by a teacher. Crooks emphasised
that a substantial proportion of teachers have little or no formal training in educational measurement
techniques, and many of those who do have such training find it little relevance to their classroom
evaluation activities.
Thus, Ofsted (2008) contended that where Assessment for Learning has had less impact in the
classroom is because the teacher had not understood how the approaches were supposed to be
used to improve pupil’s achievement. They argued that teachers may have used key aspects of
assessment for learning, such as identifying and explaining lesson objectives, questioning, reviewing
pupils’ progress and providing feedback precision and skills but have failed to embed these concepts
PBM4029ononaji21809224 Page 14
in their lessons in a way that both pupils and the teachers can understand and take advantages of
them to improve their learning and teaching respectively.
In conclusion, it is evident from the reporting of the reviews conducted by different authors and
researchers cited above that potential benefits of AFL are significant on students in many different
ways. According to Black and Wiliam, the impact of AFL affected students’ improvement and helped
reshape policy toward schools:
For public policy towards schools, the case to be made here is firstly that significant learning
gains lie within our grasp. The research reported here shows conclusively that formative
assessment does improve learning. The gains in achievement appear to be quite considerable
and as noted earlier, amongst the largest ever reported for educational intervention (Black and
Wiliam, 1998:46).
Crooks (2001:1-2) summarised by stating that implementation of classroom evaluation requires
caution but emphasised that it appears likely to benefit the greatest proportion of students, and in
particular guides students’ judgement of what is important to learn, it affects students’ motivation and
self-perceptions of competence; structures their approaches to and timing of personal study;
consolidates learning; and affects the development of enduring learning strategies and skills. It
appears to be one of the most potent forces influencing education: ‘Accordingly, it deserves very
careful planning and considerable investment of time from educators’ (Crooks 1988: 467).
In the light of this review, the primary quest in this enquiry is to identify strategies to use to embed
AFL in my practice, to examine its impact on my students, my practice as a teacher/head of the ICT
department, and finally report the findings generated through a research methodology.
PBM4029ononaji21809224 Page 15
METHODOLOGY: RATIONALE AND LIMITATIONS
Having established the research objectives, questions and noting that the research stance are
naturalistic, unique, individualistic and qualitative, I decided to adopt the interpretivism theoretical
perspective as my epistemological stance. This is because it blended well with the research
methodologies (the case study research) I had chosen. Gary (2009) suggests that a relationship
exists between the theoretical stance adopted by the researcher, the methodology and the methods
used to collect the data.
Yin (2003) defines the case study research method as:
an empirical inquiry that investigates a contemporary phenomenon within its real-life context;
when the boundaries between phenomenon and context are not clearly evident; and in which
multiple sources of evidence are used (Yin, 2003:23).
In addition, having considered other research methodologies such as experiment, action research,
and ethnographic research methods during planning, I chose the case study approach over others
because of the following reasons:
Firstly, the case study research provided the qualitative platform needed to address the research
topic qualitatively (looking in-depth at non-numerical data), and allowed the use multiple methods of
data collection and analysis to determine the research outcomes. Burton et al. (2011:84) confirmed
that case study research enables teachers to use a range of research methods to investigate a
particular issue as it relates to them.
Secondly, it was preferred because it blended well with the research field settings, my personal
context and the circumstances of my study. Applying it, I was able to implement the research in real
world context; using my place of work and the students to test the theories of assessment for
learning. Yin (2009:2) confirms that general case study is preferred when ‘(1) ‘how’ or ‘why’ questions
are being posed, (2) the investigator has little control over events, (3) the focus is on contemporary
phenomenon within a real-life context’.
PBM4029ononaji21809224 Page 16
Thirdly, the case study approach was favored because it supports educational researchers/
practitioner like me than other research methods I had considered.
McNally et al (2003:6) cited in Burton and Bartlett (2011) argued that they use case study approach to
their study of early professional learning (EPI) because it offered ‘a deep understanding of the nature
of EPI which requires sustained contact with the learners and their context’.
Finally, the case study approach was preferred because it is systematic, comprehensive and topic
oriented, and provided me the opportunity to relate and interact mutually with research participants
(students and colleagues). Walsh (2001:52) argues that its strategy involves a systematic
investigation into single individual, event or situation; that is, the researcher studies a single example,
or case of some phenomenon.
On the contrary, Hammersley (2002) claims that each research method has particular strengths and
weaknesses, and criticism arise from their weaknesses.
Yin (2009:14) argues that although case study approach is ‘a distinctive form of empirical enquiry it
remains one of the most challenging of all social science endeavours’. He claims that:
(1) case studies lack rigor…[because they] allow equivocal evidence or biased views to
influence the direction of the findings and conclusions,…(2) case studies provide little basis for
scientific generalization, (3) [they] take too long resulting in massive, unreadable documents,
and finally (4) they lack ‘true experiments’ [value]’(Yin,ibid:14-16).
Similarly, Simons (2010:24) questioned that ‘the personal involvement and/or subjectivity of the
researcher are a concern’. There was a valid reason for being aware of subjectivity or ‘self, because
as Simons (2010: 82) explains further, ‘[I was] the main instrument of data collection: [I] looked at
documents, observed participants and interacted with learners/colleagues in the field.’ Consequently,
‘[my] view, predilections and values influenced how [I acted]’ (Simons, 2010:26). Furthermore, critics
have argued that case study ‘failed to provide clear-cut solutions, presenting instead an overly
complex analysis of educational issues Nisbet (2000) cited in Burton and Bartlett (2005:25).
PBM4029ononaji21809224 Page 17
Despite the limitations and concerns noted above by Simons, Yin and Nisbet, Simons note that the
personal involvement or subjectivity of case researcher ‘are not all of them necessarily limitations…it
is a question of how they are perceived and interpreted by [the researcher]’ (Simons, 2010:24). In
fact, it has been argued by Wiersma and Jurs (2009), Williman and Buckler (2008) that it is
impossible to separate the ‘self’ from the context of the study.
In spite of these concerns, I still preferred the case study approach as it enabled me to carryout an
in-depth research using multiple data collection and analysis methods on the impact of AFL on my
students in natural settings which experimental research method could not do. Further, it helped me
to explore a holistic investigation where each student or group responded to and express their
understanding of themselves, their experiences on the impact of my practice on them.
On the other hand, Burton et al. (2011:37) described Action research as;
Curriculum development in the classroom that is concerned with how to improve education
practice and it is practitioners themselves who carry out the research in examining and
developing their teaching.
I personally refused to use Action research approach for my project even though Burton et al (2011)
and McNff and Whitehead (2010) sees it as a research use by practitioners (teachers) for improving
practice.
My main reasons are first, Action research does not entirely fit and merger into my research project
(questions). Second, I was concerned that its spiral nature that promotes a rigid approach could lead
the findings to uncertain outcomes. For example, addressing a particular research question could
lead to the discovery of new outcomes, and the enquiry can deviate from its original action plan
making the investigation complex Burton et al (2011). Third, I was even more concerned that my
research questions may change as research develops making it more daunting and confusing for a
novice like me to cope. Finally, I was also worried that its action-reflection cycle nature will
undoubtedly, impact on the project feasibility that is to say; it may make the project longer than
PBM4029ononaji21809224 Page 18
planned hence impacting on my accessibility to the participants, research methods, and the overall
project success.
PREJUDICE AND BIAS ISSUES
However, being aware of the concerns raised on the weaknesses of the case study research method,
to avoid prejudice and pre-formed judgment influencing my outcomes, I took the following
precautions;
First, I ensured that I had strategies in place to monitor myself and my activities throughout the
research process. The idea was to visualize ‘how [my] personal sense of self interacted with the
study, shaped the inquiry and outcomes, and [I] reflected on the dynamics this created’ (Simons,
2010: 82). A research diary was kept and reviewed regularly for the following reasons: to generate
and monitor the project life cycle, to provide material for reflection, to provide data on the research
process and to identify and deal with the values and subjective selves that could have influenced my
interpretation.
Second, I employed multiple data collection methods and analysis in order to minimise the impact of
my personal weaknesses. I used in-depth interviews, participant observations and questionnaires.
The interview (unstructured) approach enabled me to obtain participants (students/colleagues) ‘real’
views and beliefs during the inquiry. Through observations, I was able to carry out my research in
‘real’ life natural settings which helped in the collection of highly valid data while questionnaire made it
easy to collect, and compare large amount of data quickly from participants to ensure the reliability
and validity of the research outcomes (see appendix 8: questionnaires).
In addition, to enhance internal reliability of data which Burton and Barlett (2011:27) described as the
‘truthfulness’, ‘correctness’ or accuracy of research data, I applied the respondent validation
procedure. I checked with participants (students/colleagues) to certify that my reporting of their views
is in accordance with the feedback they provided (see appendix 3: participant consent form).
PBM4029ononaji21809224 Page 19
Furthermore, colleagues’ participants who observed my lessons provided me with reflective feedback;
which provided evidence of the strengths and weakness of the strategies I had employed. For
external validity, I asked colleagues and friends to determine whether my inquiry is credible and
useful by comparing my findings to previous publications or researches and their own experiences.
Finally, to increase the validity of the research and to address the overall concerns highlighted above,
I applied the triangulation concept. According to Burton and Barlett (ibid: 28), ‘triangulation is the
process carried out by researchers to increase the validity of their research and it means checking
one’s findings by using several points of reference’. I therefore triangulated my research by analyzing
all data generated from the various methods of data collection I had used, looking for congruency
(see tables 1 and 2 in pages 28 and 34).
PBM4029ononaji21809224 Page 20
RESEARCH METHODS AND ANALYSIS
With the remit to improve practice through this enquiry, I decided to apply Thousand and Villa (2000)
model of managing change. This model states that, for a change in programme to be successful and
sustainable, the initiator must consider; Vision, Skills, Incentives, Resources, Action plans, Success.
Using this model as a precursor, an action plan was prepared and followed to facilitate the process
(see appendix 6: research action plan).
Data was collected from the empirical study conducted on 50 BTEC level and A level ICT students
enrolled in 2012-2013 academic year and 10 colleagues participants who volunteered to participate;
data collected were analysed after the research enquiry. The purpose was to measure and compare
the findings gathered from participants in order to determine the impact of practice on students and
myself. Using interviews, observation and questionnaires as the primary data collection methods, I
had the privilege of reflecting on my practice based on the feedback I received from students and
colleagues. These enabled me ‘to gain a deeper insight into the real way of life, beliefs and activities
of the group in their ‘natural settings’ Walsh (2001:67). Observing students in this context helped to
provide elicit data that could not be gathered from a questionnaire or interview methods. For instance,
I was able to observe, note, and interact holistically the immediate impact of my practice on the
learners as a group than individualistically.
Conversely, Burton et al. (2011), argue that using observation technique to collate data could be
difficult and complex as ‘it is impossible to observe and note down everything that occurs in a
complex situation such as a classroom of 25 children and one or more adults’ (Burton and Barlett
2011:131).
Similarly, I used the questionnaires as a complementary data collection procedure; they provided a
cross-check on data obtained in the interviews, observation and document analysis to enhance the
validity of my account. They were used as a way to obtain anonymous and rich qualitative and
PBM4029ononaji21809224 Page 21
quantitative data on specific aspects about the impact of formative assessment techniques from the
large number of students and colleagues participating in the study.
However, the questionnaire method had some limitations as data collection tool. Example, due to its
design and format, (closed ended questions, or even open ended questions) participant’s feedback
were generally limited. The data collected sometimes lacked deeper understanding and were difficult
to establish whether or not respondents understood the questions properly or a question asked meant
the same to all respondents as they did to me, especially where I was not present to explain (see
appendix 8: questionnaires).
Semi-structured interviews with participants were used to gather in-depth qualitative information on
my practice. They were also used to encourage the active participation and learning for myself and
the respondents (students and colleagues), since engaging participants helps to identify the impact of
my investigation was central to the study (see research questions). Equally, it allowed me to adapt
the questions to suit different situations and respondents thereby produced detailed qualitative data
outcomes expressed in the respondent’s own words. Also, I was able to ‘pick up’ non-verbal clues
that were indiscernible from questionnaires, for example, the annoyance or pleasures shown by a
respondent over certain topics or questions.
Nevertheless, using interviews as a data source presented some difficulties for me particularly when
dealing with a large number of interviewees. I found it difficult to ask questions, listen to responses
and taking notes at the same time; it was a complex process that requires prioritizing and multi-
tasking skills.
Finally, the above methods used eased the way for me to compare previous data collected or held on
college records such as examination results, class progress data, lesson observation feedback and
programme area self-assessment reviews (SAR) which were reviewed to justify the impact of my
practice.
PBM4029ononaji21809224 Page 22
DATA ANALYSIS AND STRATEGY
Having used a qualitative research approach to conduct my study and seeing that the data set is
mostly qualitative generated from multiple data collections used; questionnaires, interviews,
participants’ observations and document reviews, I decided to analysis the data qualitatively. I applied
Miles and Humberman’s Qualitative Data Analysis techniques (cited in Simons, 2010), a systematic
approach in three steps. First, I organised the data into manageable formats, grouped them into
themes (Thematic analysis) by working through the data set looking for similarities or contrasting
ideas in the responses gathered from the various data collection methods used. (Taylor Powell and
Renner (2003). Second, having grouped the data into themes, knowing that data gathered from
observation notes, interviews and questionnaire are difficult to predict, I converted them into
quantitative data in order for them to make sense during analysis and discussion (see tables 1 and 2
in pages. 23 and 35).
Coolican (1994), cited in Eysenck (2004) stated that the benefits of using qualitative analysis of data
are; [1] it can shed much light on the motivation and values of individuals who are actively involved in
the collection and analysis of data collected using interviews, case studies, or observation. [2] Data
analysis often takes place alongside data collection to allow questions to be refined and new avenues
of inquiry to develop. [3] Textual data is typically explored inductively using content analysis to
generate categories and explanations; software packages can help with analysis but should not be
viewed as short cuts to rigorous and systematic analysis.
Critically, Eysenck (2004) argues that the greatest limitation of the qualitative approach is that the
findings that are reported tend to be unreliable and hard to replicate due to its subjective and
impressionistic, and the ways in which the information is categorised and then interpreted often differ
considerably from one investigator to another. Finally, noting the issues surrounding qualitative data
analysis strategy, to reduce subjectivity and reliability, I followed Simons (2010) and Coolican (1994)
PBM4029ononaji21809224 Page 23
advice who recommend that I use different methods of data analysis (research cycle) in order to
remain transparent and increase reliability.
RESEARCH ETHICS
As the underlying issues for doing this investigation were to improve practice, enhance autonomy and
motivation of learning in the classroom and develop more research and reflective skills, this study
was therefore, conducted in my place of work where both students and colleagues were co-
participants. It adhered to all ethical issues raised in BERA guidelines (2011). I was granted ethical
approval to conduct this research by the Research Ethics Committee at the university at which I am
working towards a Master degree in Education. Permission to undertake this enquiry was equally
sought from my school authorities and participants students and colleagues. Participants were clearly
and fully informed through a fair processing and voluntary notice which included seeking participants’
consent using a ‘consent form’ and a permission to execute research in the college from the college
authority (see appendix 2 and 3: consent letter and form). The students and colleagues participants
were fully informed that the research purpose was to improve my practice and were duly assured of
their protection and rights to be treated fairly, sensitively, with dignity, and within an ethic of respect
and freedom from prejudice, regardless of age, gender, sexuality, race, ethnicity, class, nationality,
cultural identity, partnership status, faith, disability, political belief or any other significant difference. I
ensured that the project was free from deception or subterfuge and that all sensitive information
collected were not disclosed to anyone without the permission of the individual or participants. I made
sure that only relevant data were collected in order not to jeopardise the welfare of the participants. In
addition, I guaranteed that participants were duly informed of their right to withdraw from participating
in the research at any time without any consequence. This was communicated through email and
consent form.
As recommended by the Data Protection Act (1998) and BERA guidelines (2011), I respected
participants’ privacy and confidentiality throughout the research. I used anonymity and pseudonyms
PBM4029ononaji21809224 Page 24
to ensure that participants were not identified. Furthermore, all documents such as; interview scripts,
lesson observations, questionnaires, forms, etc. used in the project were scrutinised and the data
gathering processes were conducted fairly and lawfully. Finally, all data and information collected
during the research process were securely kept and destroyed at the end of the project.
Introduction to findings, Analysis and Discussion
The findings and analysis presented in this report were generated from a case study in a college,
which investigated the impact of Assessment for Learning on students. The aims were to interrogate
the impact of Assessment in practice as identified in research questions 1, 2 and 3 on page 7.
Data collection employed documents analysis, questionnaires, observations and interviews with the
AS level and BTEC groups of 50 students and 10 teachers in the faculty of business and ICT who
kindly volunteered to participate in the project. For the purpose of data analysis, I abstracted key data
gathered from the various data collection methods and organised them into themes, then displayed
the data into figures and tables before engaging with data verification. The findings were evaluated in
light of the research questions followed by a critical discussion backed up with evidence in relation to
the literature review presented earlier and my experiences. The discoveries on the AFL theories were
conclusively positive on learners, practitioners and pedagogy.
Research question 1; findings and discussion
Table 1 shows the analysis of the outcomes reported on the impact of embedding AFL in classroom
practice during the case study. It discusses research question 1.
Results
Impact Very strong impact Strong impact Little impact Not impact
Improving student learning and skills. 45 (90%) 5 (10%) 0% 0%
Improving student confidence and motivation 28 (56%) 15 (30%) 7(14%) 0%
Improving student concentration and
understanding
30 (60%) 15 (30%) 5 (10%) 0%
Improving student awareness 25 (50%) 15 (30%) 10(20%) 0%
Improving student attainment 30 (60%) 20 (40%) 0% 0%
Improving the quality of student work 40 (80%) 10 (20%) 0% 0%
PBM4029ononaji21809224 Page 25
The wide range of AFL strategies adopted during the case study from 2011 to 2013 focused on the
following methods;
1. Engaging student collaboration through small groups, pairs and trios, to plan, discuss, draft
and redraft written work.
2. Using of peer and self-assessment in the classroom.
3. The use and the development of higher order questioning skills and
4. Giving feedback
The main outcome of this study was its impact in classroom practices. It was evident from the results
of the case study that engaging learners through group work increased the active participation of
students in the classroom. This involved ‘students to take ownership of their learning rather than
being passive recipients of the ‘delivery’ of the curriculum’ Kirton et al. (2007). The analysis of the
outcomes in table 1 shows that the large majority (90%) of the findings indicated that routinely use
AFL in practice is a very effective method of improving students learning and skills. A colleague, who
observed my AS ICT students engaged in designing an imaginative website in groups of four,
reported that the students claimed that working in a group is much fun than working on their own
because it enabled them to share ideas, skills and support each other.
In addition, during one of my observations, I observed that sharing the lesson objectives with learners
and using collaborative learning in the classroom, enhanced students understanding, helped them
monitored their progress and increased their confidence as we moved from one activity to the next.
My experience is similar to that of colleague participants and students identified in (Table1); where a
large majority of student participants reported that working in a group gives them the confidence and
motivation to engage in classroom discussions.
Furthermore, another colleague observer reported that using group work in a classroom helps to
increase the involvement of the whole class and increased more equitable participation of students in
the classroom. Admittedly, these conclusions connect with the research reported by Ofsted
PBM4029ononaji21809224 Page 26
(2008:24); they stated that ‘in order to raise standards, teachers should engage students in small
groups and in whole class dialogue’. Similarly, Black and Wiliam (2009:7) reporting on the importance
of student involvement stated that ‘since the responsibility for learning rests with both the teacher and
the learner, it is incumbent on each to do all they can to mitigate the impact of any failure of the
other’. Alexander (2004) argues that increasing the amount of ‘talk’ in the classrooms is crucial to
improve students’ thinking and learning through what he called dialogic teaching. Kirtton et al
(2007:617) claimed that such teaching is collective, reciprocal, supportive, cumulative and purposeful
and bears resemblance to the practises described above. For these reasons, sharing lesson
objectives with students and engaging them collaboratively are effective ways to improve teaching
and learning and other skills in the classroom.
Also, the statistic in table 1 shows that frequently use of AFL in practice helps to improve student
motivation. The analysis indicated that the 56% of the participants acknowledged the impact of
self/peer-assessment methods used in the classroom. I personally noticed that students were highly
motivated when they were involved in evaluating each other’s work or theirs; they saw value in the
activity, engaged cooperatively as well as reflective. The impacts of self/peer-assessment in my
lessons were evident in the improved learning and motivation commented in line managers’
observation feedback (see appendix 7 line manager’s lesson observation feedback).
Whilst the group work took place, using self/peer assessment, Austin monitored and worked
with each group, monitoring and prompting, and challenging or clarifying the key issues as
students prepared their feedback (line manager)
And students’ testimonies held that peer/self-assessment methods helps them understand how to
take ownership of their learning, construe marking criteria (marking scheme), identify their own
mistake and reflect deeply on their own learning. A student commented that:
I thought marking is only meant for teachers. I like it when we mark each other’s work because
it helps me to find out my mistakes quickly and also makes me feel responsible (student).
PBM4029ononaji21809224 Page 27
I also learnt that the main difficulty with self-assessment is not that students are dishonest or
unreliable in assessing their own work but rather that they need clear guidelines or examples of what
constitutes ‘good work’ (Hallam, 2001).
This finding again revealed that effective learning can occur when students serve as learning
resources for either themselves or one another. This is in line with the views of Ofsted (2008), Black
and Wiliam (1998), Cooks (2001), Hodgson and Pyle (2010). Black and Wiliam believed that two
elements are critical for peer assessment to be valuable. First, students must work as a group or
team. Second, each student must be held accountable in some ways. Equally, Harrson and Harlen
(2006:30) reported a number of advantages following a research about primary teachers who had
implemented self-assessment by children.
They identified that self-assessment is an essential component of AFL because it can help children
direct their learning activities towards their learning goals.
Harrison and Harlen further explained that peer-assessment builds on the ALF notion of learning as a
co-constructivist activity whereby learning occurs as a result of social interaction. In this way, AFL
contributes to learning that is in accord with current research into effective learning (e.g. Watkins et
al., 2001; Wells, 2008).
Thirdly, in table1, 60% of the participants reported that regularly use of AFL in practice help to
increase students’ concentration in the classroom. For instance, since I started using questioning
regularly in my lessons, I discovered that questioning helps both teachers and learners to monitor
learning and increase the learner and the teacher motivation in the classroom. A student said:
I like it when my teachers question me in the class because it allows me to demonstrate my
level of understanding.
In addition, colleagues interviewed acknowledge that they use questioning for different reasons: to
improve motivation in the classroom; to improve students’ behaviour and concentration and to
enhance a learning climate. He said:
PBM4029ononaji21809224 Page 28
I also use questioning in the classroom as a tool to control students’ behaviours and to
reposition the entire class particularly when the learners are being disruptive (colleague
participant)
Another colleague reported that since he increased his questioning time in his lessons, he has
noticed improvements in the entire class outcomes particular with the lower attaining and disruptive
students when targeted.
The findings agree with the work of Black and Wiliam (1999b:143) who stated that:
Opportunity for students to express their understandings should be designed into any piece of
teaching, for this will initiate the interaction through which formative assessment aids learning.
Equally, Black and Harrison (2004) explain that through questioning, teachers are able to collect
evidence about pupils’ understanding with the aim of finding out what they do know and what they
partly know. Questioning therefore provides a starting point for the teaching allowing for the pupils’
knowledge and understanding to move on, and developing their thinking skills.
Finally, these findings indicate principally (60%) that regular use of AFL improves learners’ attainment
in the classroom. I learnt during the enquiry that the use of constructive feedback as an aspect of
formative assessment helped students to be aware of their progresses, learning needs, the different
standards of work and what they needed to do to enhance progress in their learning.
Feedback helped improve the quality of students work and raised the students’ achievements. This
impact was evident in the 2012-13 BTEC Level 3 and AS level ICT results where the two groups that
participated in the case study increased their achievements by 20% and 26% respectively (see
appendix 5: 2013-14 academic year results).
Cognisant that feedback must be specific and constructive for it to have positive impact on practice,
one student wrote:
I like it when my teacher gives me feedback on what I have done well and where I need to
improve. I don’t like negative feedback every time.
Again, this finding relates to the works of Black and Wiliam (1998), who pointed out the importance of
feedback when they specified, ‘we know of no other way of raising standards for which such a prima
PBM4029ononaji21809224 Page 29
facie case can be made’. Hattie (1999), summarised his wide-ranging review of research on ‘’what
works’’ in education with the statement that, ‘’the most powerful single moderator that enhances
achievement is feedback’’. While, Crooks (2001:3) warned that:
Feedback should be specific and related to need. Simple knowledge of results should be
provided consistently (directly or implicitly), with more detailed feedback only where necessary
to help the student work through misconceptions or other weakness in performance. Praise
should be used sparingly and where used should be task-specific, whereas criticism (other
than simply identifying deficiencies) is usually counterproductive.
Similarly, Butler (1996) suggested that appropriate descriptive feedback could improve performance
by 33% while grading method can have a negative effect on learner performance. Likewise, Black et
al, (2002) wrote that the use of feedback as an AFL tool helps both the learner and the student to
make progress:
An assessment activity can help learning if it provides information to be used as feedback by
teachers and their pupils in assessing themselves and each other, to modify the teaching and
learning activities in which they are engaged. Such assessment becomes formative
assessment when the evidence is actually used to adapt the teaching to meet learning needs.
PBM4029ononaji21809224 Page 30
Table 2 presents finding and discusses research question 2 in relation to the improvements needed
and recommendations.
To draw a logical conclusion and to ensure that the recommendations for the areas of improvement in
practice are free from subjectivity, I sought the opinions of the 10 colleagues who participated in the
research project. The table above reveals the analysis of data gathered from the colleagues during
the study on the constraints limitations and tensions noted while implementing and sustaining
assessment policies in practice.
Formative assessment is time consuming
Notwithstanding, the main outcome of this research is that embedding formative assessment in
practice increases learners’ outcomes conversely, colleague participants identified ‘time’ as a
hindrance to implementation of AFL in practice. This is evident in the table 2 above where 80% of
colleague participants reported their experiences on lack of time as a burden to application. Even
more, they pointed out that involving learners in practice by ways of self and peer assessment in a
more discursive and interactive lessons, and improved questioning led to slowed down in the pace of
curriculum delivery as a consequence, that the curriculum may not be covered within the specified
deadlines. A colleague participant reported that:
This new approach has lots of benefits to students and teachers, but the problem is that it is
time consuming. It is difficult, or impossible to embed formative assessment every time in your
lessons. (Colleague)
Areas of Improvement Yes % Maybe % Not sure % No %
Time consuming 8 (80%) 2 (20%) 0 (0%) 0 (0%)
A change in educational philosophy is
needed
7 (70%) 2 (20%) 1(10%) 0 (0%)
assessment policy 0 (0%)
Provide support/CPD for staff. 10 (100%) 0 (0%) 0 (0%) 0 (0%)
PBM4029ononaji21809224 Page 31
Change in educational philosophy
Also, a large majority of the teachers (70%) involved in the survey discovered that adopting and
embedding formative assessment in practice requires changes in teachers’ thinking and actions;
indeed, their whole educational philosophy for it to have positive impact on practice. For example,
some staff found the implementation of the AFL strategies fundamental to their pedagogical approach
and were comfortable with the ‘democratic classroom practices and the increased student-centred
focus. However, the statistics in table 2 likewise shows 20% of colleagues’ participants regarded AFL
as a risky, stressful, uncomfortable and an unnecessary procedure based on the fact that learners
were taking more control of their practice (Kirton et al. 2007:625). Equally, Black et al. (2003:83) in a
critical discussion on the implementation of ALF theory in the classroom reported that having ‘to let
go’ by teachers and let students take some responsibility for the lessons is a limitation to formative
assessment that practitioners need to overcome through professional development.
The need for support and sustainability
Once more, all teachers involved in the research noted that constant self-awareness through CPD
and trainings are good ways to overcome the limitations facing AFL in practice. The conclusions
suggested that meetings, networks and opportunities for sharing good practice of assessment
materials are important elements in supporting teachers in changing and sustaining their practice.
Speaking about this, Black (2007) argued that the biggest problem associated with formative
assessment is that the practical implementation seems to be based on a limited understanding and a
superficial adoption of the strategy by teachers and policy makers.
In the light of this development, many researchers including Black and Wiliam (1998a) made
numerous recommendations on how to close these gaps. Black et al. (2002, 2003) recommended
that teachers should adapt the following strategies to improve formative assessment in the
classrooms;
PBM4029ononaji21809224 Page 32
1) Teachers should improve their questioning techniques by using a variety of questioning styles
to encourage classroom interactions; allow more time for students to respond to questions
(wait time) and to focus more on the discussion of wrong answers through effective use of oral
and written feedback rather than, or in addition to, grades or marks.
2) Teachers should adapt the use of self and peer assessment by pupils, for example, the use of
‘traffic lighting’ in which pupils use the icons of green, amber or red to assess whether their
understanding of the subject matter is good, partial or poor.
3) Teachers should share the criteria for assessment with pupils as an important way of initiating
learning and make effective use of class collaboration through small groups, pairs and trios, to
plan, discuss, draft and redraft written work, including summative tests drawing up and using
mark schemes.
Consequently, embedding Formative Assessment in practice can enhance students’ performance, aid
them to participate actively, reinforce their grasp of course material, and participate in their own self-
assessment. Besides, the results can help teachers immediately redirect the learning experience to
address learners’ difficulties. Teachers must listen to students, ask them appropriate questions, and
give them the opportunity to show what they know in a variety of ways because research has proven
these strategies as effective methods to increase learning in practice.
PBM4029ononaji21809224 Page 33
CONCLUSION
In conclusion, through this research I have discovered that Assessment for Learning is a powerful
teaching and learning procedure used to raise learners’ achievements, and a monitoring tool for
teachers to judge the impact of their practise on their learners. This section discusses research
question no: 3; the impact on practice:
Impact on students’ and the ICT department
A review of the departmental achievement records from 2012 to 2014 indicated that learners’ grades
and the departmental outcomes increased tremendously. There is a perception among staff that the
regularly use of ALF firmly in practice contributed to the significant improvements in the results. In
2013 and 2014 academic year, all results in the department were recorded ‘Good’ with significant
number of students achieving above their TMGs. For instance, 88% of L3ICT 90 credit, 56% of
ASICT and 23% of L3EXDIP of students in these cohorts achieved above their TMGs in in the
academic year. Also, retention and success rates improved rapidly in the department from 2012 to
2014 when compared to previous outcomes. We (research participants) believed that the involvement
of the department in this projected attributed to the results improvements.
Impact on teaching and tutoring
It was believed that the regular use of AFL helped enhance teaching and learning and improved
independent learning in the department; thus improved learners’ confidence, engagement and
outcomes. For example, both colleagues and students participants thought that the increased use of
questioning and feedback by teachers in classroom occasioned to improved learners’ literacy and
high grades in some programme areas (see appendix 5: results analysis). Equally, college senior
leadership team acknowledged the improvements in teaching and learning in their 2014 Quality
Improvement Review feedback conducted in the department. They held that:
Students found lessons challenging, interesting and engaging with meaningful discussions,
good teaching. They repeat information; help students to process and understand content (ICT
SMT quality review feedback Oct 2014).
PBM4029ononaji21809224 Page 34
Similarly, embedding AFL theories in practice enabled internal and external progressions in the
department; for example, from 2012-14, more than 95% of students in a cohort progressed to the
higher level of their programmes indeed; learners were achieving better grades and maintained
confidence in themselves due to the skills they developed in the class through differentiated
assessment techniques routinely used by their teachers.
Finally, the outcomes of this project resulted to an overhaul of teaching and learning in the
department. The department/college now has a teaching and learning community that meets once a
month to strategize on ways of improving teaching and learning across the college.
Impact on personal and professional development
For myself as a practitioner, the impacts are as follows;
The experiences gained have added more values to my professional development. First, It has
enabled me develop a deeper understanding of the role of AFL in teaching and learning; thereby
making me a better reflective practitioner. I am now more confident in identifying my personal
weaknesses as well as managing situations arising from it. That is to say, my classroom
management; tracking/monitoring and intervention skills have improved as a result of embedding
assessment policies routinely in my practice (see appendix 4: track and monitoring sheet).
Second, I have validated my pedagogical practice with current educational theories as a result,
enhanced my practice and learners outcomes. Third, through this project, I have now seen the needs
to recognise teaching and classroom activities as ‘artistry’ that requires careful planning and
implementation. As a consequence, I have begun to reflect much deeper on my role and practices in
the classroom; that is to explore more and better ways to build learning partnership with learners
using formative assessment strategies as opposed to solely relying on myself as the only leader in
the classroom. I believe working this way with students will not only help students develop mutual
trust, gain greater confidence to engage more in lessons, but also it would re-enforce my role as an
enabler facilitating the learning process for the benefits of all students.
PBM4029ononaji21809224 Page 35
Finally, I have noticed the importance and the benefits of using formative assessment in my everyday
practice. I have now understood in detail that formative assessment is ‘a range of formal and informal
assessment procedures undertaken by teachers in the classroom as an integral part of the normal
teaching and learning process in order to modify and enhance learning and understanding’ (Ministry
of Education, New Zealand, 1994).
LIMITATION AND CHALLENGES
Completing this project was not without personal challenges. The first was the difficulty of
accommodating my family life and teaching work with my studies. Overcrowded timetable made it
difficult for me to find time to engage fully in my studies.
Further, it was initially hard and time consuming to gathering and analysing data from participants due
to the complex nature of the school settings; for example, arranging interviews with students were not
easy because of their structured school timetables and the fact that students understandably could
not agree to stay behind after lessons. Also, I had limitations in planning and documenting the
research activities as I found it particularly demanding to generate and coordinate evidence in support
of the project due to personal circumstances and workload in my place of work; for instance, it was
challenging to organise interviews for the 50 students that participated, observe many lessons and to
compose understandable questionnaires that were used to gather learners’ experiences without
difficulties. However, with the support of the participants, colleagues and family I overcome the
challenges by managing my time wisely and prioritised my daily activities to free the weekends and
evenings for the purpose of the project.
In conclusion, I have enjoyed the experiences gained in this research and development project. My
participation has provided me with models of best assessment practices and techniques. The
learners and I have now developed more self-confidence in managing everyday classroom
challenges and complexities unlike before.
PBM4029ononaji21809224 Page 36
FURTHER RESEARCH
If I have the opportunity to participate in a future research work, I would like to investigate whether
gender and ethnicity have impact on learners’ during classroom assessment. My possible research
question would be:
Is there evidence that gender and ethnicity have impact on learners during Classroom Assessment?
PBM4029ononaji21809224 Page 37
Inspection report: 24–25 April 2012 5 of 12
Inspection grades: 1 is outstanding, 2 is good, 3 is satisfactory, and 4 is inadequate
Please turn to the glossary for a description of the grades and inspection terms
students’ spiritual, moral, social and cultural development are very good, and
this work does much to foster the highly inclusive ethos of the centre.
What does the school need to do to improve further?
Continue to raise achievement by:
increasing the average grade and the proportion of higher grades gained
at A level and in BTEC Level 3 qualifications, so that all aspects of
attainment are firmly in line with national averages
ensuring that greater proportions of students complete the courses that
they start, consistently across all subjects and curriculum areas.
Improve the effectiveness of teaching by:
ensuring that students are much better matched to and placed on
programmes which are appropriate to their previous learning, so that
teaching can focus more precisely both on their needs and the
requirements of the course
ensuring that teaching has a strong focus on developing deep subject specific
skills and knowledge, as well as providing students with strategies
to pass examinations and complete coursework tasks.
Improve students’ punctuality to lessons, so that maximum use is made of
teaching and learning time.
Commentary
This document is an extract from the 2012 Ofsted inspectors report. The report highlights what the
school need to do to improve further.
This inspector’s recommendation was my main reason for undertaken the MA programmes; which lead
to my research on the ‘impact of assessment for learning in the classroom when embedded firmly in
practice’.
APPENDIX 1
PBM4029ononaji21809224 Page 38
This is evidence of consent letter from the college authority to use the college/students for the case
study.
APPENDIX 2
PBM4029ononaji21809224 Page 39
RESEARCH ETHICS: PARTICITANTS CONSENT FORM
Full title of Project: The Impact of Educational Practice (PBM4029)
Please Initial Box
1. I confirm that I have read and understand the information sheet for
the above study and have had the opportunity to ask questions.
2. I understand that my participation is voluntary and that I
am free to withdraw at any time, without giving reason.
3. I agree to take part in the above study.
Note for researchers:
Include the following statements if appropriate, or delete from your consent
form:
4. I agree to the interview / focus group / consultation being audio
recorded
5. I agree to the interview / focus group / consultation being video
recorded
6. I agree to the use of anonymised quotes in publications
Name of Participant Date Signature
Name of Researcher Date Signature
APPENDIX 3
PBM4029ononaji21809224 Page 40
Sept to Dec 2011 Summative Test Record
Teacher: Austin Ononaji
AS ICT Level 2011-2013 (GroupD) Grade Modelling/Progress
Data
2012 units Exam Results
TMG Student TEST
1
(Sept 11)
TEST
2
(Nov
2011)
TEST
3
(Dec
2011)
TEST4
(Jan
2012)
TEST
4
(FEB/M
ARC
H
2012)
W
A
G
CH
G
RAD
E
INFO1 Jan 2012 Exam
INFO2 Jan 2012 Exam
Results
TOTALAS GRADE INFO3 INFO4 TOTAL A2 GRADE
D Abdullahi, Mohamed C C C D/C C/B C 0 U
E Ahmed, Yusuf C C B B C B B/A E 0 U
Bello, Ahmed C A 0 U
U Brown, Pierre A A A A A A A B 0 U
Douglas-Williams, Aaron A A A C B B A C 0 U
E Khan, Zainab C U(B) D D D D/C C/B D 0 U
D Ndulor, Promise A E(C) A B C B A/B D 0 U
D Onomousiuko, Ejowhokoghene B U(D) D C D D/C C/B E 0 U
D Osifeso, Oyinlade-Samson B B B B C B B/A D 0 U
C Piotrowicz, Przemyslaw B C C N/A B C/B B/A D 0 U
U Rhea Petters C U U D N/A U/E E/D B
WAG
Total No of students: 8 CG
Male: 9
Female: 2
Predicted class achievement 100% 160-200 A 320-400 A
140-159 B 280-319 B
120-139 C 240-279 C
100-119 D 200-239 D
80-99 E 160-199 E
0-79 U 0-159 U
AS Grades A2 Grades
Rhea has attendance problem that is preventing her from achieving to
her potential. Ejo is a very quiet/shy person in class and does not like
to contribute to class discussions unless targeted. Mohamed
sometimes find it difficult to assimilate (concentrate) and talks out of
context in some occasion. Yusuf, Pierre, Aaron, Samson, Promise,
Zainab, Przemyslaw need to be constantly challenged to get the
best out of them.
Working at Ggrade
Challenging Grade
Comment
This is a monitoring sheet used to track and monitor learners against the TMGs. It helped me to understand,
support, predict grades and provide feedback to my students.
APPENDIX 4
PBM4029ononaji21809224 Page 41
ACHIEVEMENT REVIEW 2013
1. COURSE ACHIEVEMENT DATA (High pass rate = A*-B (AS/A2); MMD - D*D*D* (BED3); MM - D*D* (BD3); M -D* (BSD3/BC3); D -D*
(BD2 )
Course & Level Completion year: 2011 2012 2013 BM
Target
2013
Target
2014
Comment and analysis
(achievement and retention against benchmark,
significance of value added, attendance, high
pass rate, reasons for students leaving early, etc)
AS LEVEL
No. of starts 26 24 13 16 AS ICT shows a 26% increase in A-E and
47.8% A-B grades when compared to
2012 results.
Possible reasons:
Good use of ALF in lessons.
The Value added is 0.9%; (0.7% better
than 2012 cohort).
% retention 77% 79% 89% 91% 91%
% pass rate 100% 74% 100% 100% 80%
% high pass rate 20% 11% 53.8% 18.8% 35%%
% success rate 77% 59%. 89% 79.3 79.3%
value added +0.7 +.03 +0.4 NA
% attendance 89.1 90.9% 90%
A2 ICT
No. of starts 8 16 7 6 In A2, the overall grade is 85% in A-E grades
and 71% in A-C. 38% increase in A-C grades
when compared to 2012 results. Overall results
was not as anticipated as no student achieved
A*-B grades.
Possible reasons:
(15%) (1) Student failed because a review
of scripts showed that they had difficulties
in answering most questions in the exams.
That student did not use a scriber as
recommended.
A2 ICT has exceeded all/most achievements
benchmarks and targets in 2013. Value added
is +0.3; (0.4%) better than 2012 record.
% retention 100% 94% 100% 96.1% 95%
% pass rate 100% 100% 85% 96.6% 91%
% high pass rate 25% 7% 0 19.8% 10%
% success rate 100% 94% 85% 83% 87%
value added +0.3 -0.1 +0.3
% attendance
87.7% 82.9%
90%
LEVEL 2 DIPLOMA ICT
No. of starts 23 22 16 16 L2 recorded 100% pass rate with 60%
APPENDIX 5
This document shows a breakdown of the departmental results for the last
three years. The date shows that there is significant improvement in 2013
resulting from my improved practice (AFL)
PBM4029ononaji21809224 Page 42
% retention 87% 64% 94% 89% 89% high grades. 31% high grades better than
2012.
 4% above benchmark.
 5% above retention.
 Retention 94%, 30% better than
2012 and 5% above benchmark.
 1 student withdrawn for not
attending by senior tutor having
followed all college procedures.
 L2 Dip ICT achievement
exceeded all benchmarks in 2013.
% pass rate 100% 93%
(100%)
100% 96% 96%
% high pass rate 50% 29% 60% NA 30%
% success rate 87% 59% 94% 86% 86%
value added +13 +6 NA NA
% attendance
80% 86.7%
Course & Level Completion year: 2011 2012 2013 BM
Target
2013
Target
2014 Comments
BTEC LEVEL 1 ICT No. of starts 10 12 100% pass rate, 8% better than 2012.
100% retention.
% retention 80% 108% 100%
% pass rate 100% 92% 100%
% high pass rate N/A N/A NA
% success rate 80% 100%
value added N/A N/A
% attendance 93.7 82.6%
BTEC NATIONAL
DIPLOMA ICT YR2
No. of starts 16 16 21 19 No non achiever.
100% pass rate with
58% high pass rate (D*D*-MM)
4% above benchmark
Retention:
93.3%, 20% better than 2012 and
20.3% above benchmark.
2 students left for an
apprenticeship in March 2013.
L3 Dip ICT achievement
exceeded all benchmarks in 2013.
% retention 88% 75% 93.3% 73% 75%
% pass rate 100% 92%
(100%)
100% 96% 92%
% high pass rate 83% 58% NA 55%
% success rate 88% 75% 90.5% 69% 80%
value added -01 -
% attendance 82.6 71.9% 90%
BTEC NATIONAL EXT
DIPLOMA YR2
No. of starts 26 15 11 10 No non achiever.
100% pass rate with 82% high
pass rate (D*D*D*-MMM) same as
2012.
% retention 50% 73% 91 68 76%
% pass rate 100% 100% 100% 65% 90%
% high pass rate 62% 82% 82% 55%
PBM4029ononaji21809224 Page 43
% success rate 50% 73% 71% 44% 75% 35% above benchmark.
Retention:
91%, 18% better than 2012 and
23% above benchmark.
L3 EXT Dip ICT achievement exceeded all
benchmarks in 2013.
value added +0.5 +0.3
% attendance 90% 80%
BTEC L 3 90 DIPLOMA
YR1
No. of starts 19 No non achiever.
100% pass rate with 28% high
pass rate (D*D*-MM)
Retention:
73.7 due to a difficult cohort with
new programme piloted. (see note
on progression)
% retention 73*
% pass rate 100%
% high pass rate 28.5%
% success rate 73.7%
value added
% attendance
PBM4029ononaji21809224 Page 44
1. Research and Development Plan
Action to be taken By whom By when Performance indicator Reference
To review 2009/12 academic year results
in all programmes and set targets to
improve underperformed area/s.
Me
Sept 2011
Result analysis and review of
2010/11 SEF and development
plan.
Faculty Development
plan
To implement teaching and learning
strategy that will be use to monitor and
support students. (Assessment for
learning)
Same On going
Team meeting and CPD and
sharing good practise.
Lesson plan
Lesson observation report
SOW
Faculty develop plan
To Set up a homework and grade tracking
system for old and new students based on
TMG or exam results.
Continuous assessment/Summative test
Same Sept
2011/On
going
Students grades
Teachers feedback Same
To update SOW, set up revision and mock
exam timetables for AS/A2 by focusing on
developing examination writing techniques
in preparation for Jan/May/June 2012
exams. (Assessment for learning)
Same
Sept 2012/
On going
Set deadlines and monitor
progress and compare grades
with TMGs.
Same
To ensure that there is effective monitoring
and support for all learners in
underperforming areas.
Same
On going
Provide feedback through
‘Personalised learning (ILP)
Same
Review process and evaluate outcome of
embedding assessment for learning firmly
in my planning and practice.
Me /
mentor
April 2012 Lesson observation report
Students grades
Students’ feedback etc.
Same
Sharing good practice on assessment for
learning with staff through internal CPD
Me Sept 2012 Same
Commentary
An action plan used to implement my reflective practice programme. This action plan enabled me to conduct logical investigation and
to collect factual evidence and data analysis throughout my research programme. The table shows the stages of activities used to collect
facts and figures from college and learners before and after my investigation.
APPENDIX 6
PBM4029ononaji21809224 Page 45
CLASSROOM OBSERVATION FEEDBACK FORM
Observed: Austin Ononaji Observer: xxxxxxxx Date: 27
th
May 2013
Course Title: AS ICT Session length: 2 - 3pm Observation time: 1 hour
Students on register: 10 Students in class: 10 Students arriving late: 0
Focus of observation
Line management (as acting HoF)
Brief overview of lesson
Design of a solution recap, formal exposition, group
work & mini written assessment
Evaluative comments on key issues for feedback – what went well and
areas for improvement.
Please include comments on: student engagement, progression of
learning, behaviour, differentiation (personalisation, inclusion &
challenge)
The lesson started promptly with Austin asking directed questions to students as a
recap on previous learning. This prompted a good response from all but one student -
Austin received the answers with comments such as ‘fantastic’ or ‘I will accept that’. It
was a good idea to show the key words used in the topic and to reinforce their
meanings/definitions by again challenging students to explain them; after a brief power
point presentation, a group work exercise followed in which students worked in pairs or
threes.
It was clear that some thought had gone into the pairings and membership of the
groups ensuring that the most able shared their knowledge and confidence with weaker
students. The work was timed and the results were to be recorded on a sheet and fed
back to all students.
Each group had different but complementary task. The students worked well together,
discussing their ideas, listening to each other with one member recording the key items.
In some groups one student rather dominated proceedings and took over the
scribe/design from another student; in another pair one (weaker) student recorded the
feedback whilst the more able student ‘dictated’ most of the content. Whilst the group
work took place, using self/peer assessment, Austin monitored and worked with each
group, monitoring and prompting, and challenging or clarifying the key issues as
students prepared their feedback. It might have been better to ensure that members of
each group had clear notes and that there was a method of attributing individual
contributions which would have helped to monitor progress more effectively.
Students listened attentively to each group’s feedback and Austin prompted and also
challenged the content/feedback on the designs and plans produced. Most students
contributed to the feedback although the effectiveness of the feedback would have
been improved by using larger flipchart sized paper.
Students were rather passive though attentive during the last part of the lesson on
design feedback. Non-directed questioning during this part of the lesson was
dominated by one particular student. Students used textbooks or the Topic B handout
to complete a table on tasks & software and types; the ensuing discussion and line of
questioning (directed and non-directed) help to consolidate progress and learning.
Austin finished the session by asking if any student had questions - by this time some
students had packed up ready to leave the lesson. An open discussion followed which
involved only a few of the most able students.
Prompts
Teaching
 Preparation
 Variety
 Structure
 Work or real world
applications
 Skills development
 Lesson start
 Lesson ending
 Objectives
 Differentiation
 AfL
 Questioning
technique
 Task management
 Behaviour
management
 Clear
 Challenging
 Expertise
 Enthusiasm
 Inspiring
 Use of learning
resources
 ICT
 Seating Learning
environment
 Equality/diversity
Learning
 Good working reins
 Behaviour
 Group work
 Good use of time
 Independent learning
 Enjoyment
 Motivation
 Individual challenge
 Individual needs
 Variety of task &
assessment
 Students aware how
to improve
 All contribute
 ICT
 Attendance
 Punctuality
Attainment
 Progress
 Standards & level
APPENDIX 7
Lesson observation report showing improvement in practice
PBM4029ononaji21809224 Page 46
Skills (specific and
general
Please comment on and grade the following areas of the lesson from 1 (outstanding) to 4 (inadequate)
Aspect of lesson Grade Comment (reason for grade)
Student learning and progress
1
Students made good progress in learning about design
solutions; clean monitoring through directed Q/A, group work,
written work and response.
Demonstration of teacher subject
knowledge 1
Confident, up to date & relevant.
Effective use of time & structure
of activities 2
A lesson of good pace, well structured with a range of tasks
and activities in which all students were engaged.
Effective use of assessment to
inform teaching & learning
*Validation & Verifications*
1
There were a number of opportunities used to assess learning
including directed and non-directed Q/A, group work, written
work & student contributions.
Effective use of questioning
and/or tasks/activities to
challenge & assess
1
Directed questioning was used affectively, non-directed was
at times dominated by a few very capable individuals - XXXX
the potential contributions of others.
Students understand how to
improve their work 2
Assessment of student contributions and feedback given
during the lesson was mostly effective although feedback to
individuals needed to be clearer at times
Effective use of resources
(including technology where
appropriate)
1
Good use of IWB as a teaching resource.
Give details of any strategies used to support language development & literacy skills, maths skills and ICT
skills
None observed
Is there evidence of use of the error code for assessed work (if in doubt, ask to see
examples of work or ask students)
Yes No
X
Lesson Grade: 1  2 3 4 IQRs only
PBM4029ononaji21809224 Page 47
1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4
1 1 1 1 1 1 1 1 1 1
2 1 1 1 1 1 1 1 1 1
3 1 1 1 1 1 1 1 1 1
4 1 1 1 1 1 1 1 1 1
5 1 1 1 1 1 1 1 1 1
6 1 1 1 1 1 1 1 1 1
7 1 1 1 1 1 1 1 1 1
8 1 1 1 1 1 1 1 1 1
9 1 1 1 1 1 1 1 1 1
10 1 1 1
11 1
12 1
13 1
14 1
Total 9 0 0 0 5 5 0 0 9 0 0 0 8 2 0 0 5 4 0 0 5 4 0 0 1 7 1 0 8 5 0 0 4 5 0 0
Percentage 64% 0% 0% 0% 36% 36% 0% 0% 64% 0% 0% 0% 57% 14% 0% 0% 36% 29% 0% 0% 36% 29% 0% 0% 7% 50% 7% 0% 57% 36% 0% 0% 29% 36% 0% 0%
No of
Questionnaire
completed
Q3a. How much do you think
this will help you to learn
Q4 Do you do you think your
teacher explain how what you
are learning will help you do
well in the subject
Q4a. How much do you think
this will helps you learn
Q5. How often does your
teacher check your learning
before the end of the lesson
Q6. Do you think is good or bad
for the teacher to share the
learning objectives with you at
the start of the lesson
AS ICT Student Assessment for LearningFeedback 2012/13
Learning Objectives
Q1. How often does your
teacher clearly explain
what you are trying to learn
Q1a. How much do you think
this helps you to learn
Q2. How often does your test
your knowledge during lesson
to see if you are understanding
Q3. How often does your
teacher explain or show you
what need to be done to
achieve the learning objectives
APPENDIX 8
Student questionnaire and feedback summary
PBM4029ononaji21809224 Page 48
Bibliography
Black, P., Harrison, C., Lee, C., Marshall, B. and William, D. (2006) Assessment
for Learning, Putting it into practice. London. Open University Press.
Black, P., Harrison, C., Lee, C., Marshall, B. & Wiliam, D. (2002) Working inside the black
box: assessment for learning in the classroom (London King’s College).
Black, P. and Harrison, C. (2004) Science Inside the Black Box. London: nferNelson.
Bolton, G. (2010) Reflective Practice: Writing and Professional Development. London:
SAGE.
Bransford, J., Brown, A.L., Cocking, R.R., Donovan, M.S. & Pellegrino, J.W. (eds).
(2000). How People Learn, Brain, Mind, Experience, and School. Expanded Edition,
National Research Council, National Academy Press, Washington.
Burton, D. M. and Bartlett, S. (2011) Practitioner Research for Teachers. London: Paul
Chapman.
Butler, (1988), Enhancing and understanding intrinsic motivation; the effects of task-
involving and ego-involving evaluation on interest and performance’, British journal of
Education Psychology, 58, 1-14.
Clark, S. (2005) Formative Assessment in the Secondary Classroom. London: Hodder
Murray.
Coolican, H. (2011) Introduction to Research Methods in Psychology. UK: Hodder
Education.
Dobbert, M.L. (1982), Ethnographic research: Theory and Application for Modern Schools
and Societies, New York, NY; Praeger Publishers.
Eysenck, M.W. (2004), Psychology: An International Perspective, Research Methods:
Data Analysis. Psychology press limited.
Fetterman, D.M. (1989), Ethnography Step by Step Newberry Park, CA; Sage
Publications.
Forde, C., McMahon, M., McPhee, A.D. and Patrick, F. (2006) Professional Development,
Reflection and Enquiry. London: Paul Chapman.
Ghaye, K. and Ghaye, T. (2004) Teaching and Learning Through Critical Reflective
Practice. London: David Fulton Publishers Ltd.
Hammersley (2002) Educational Research, Policymaking and Practice. London: Paul
Chapman Publishing.
PBM4029ononaji21809224 Page 49
Kyriacou, C. (2009) Effective Teaching In Schools: Theory And Practice. Cheltenham:
Nelson Thornes.
Kyriacou, C. (2007) Essential Teaching Skills: Nelson Thornes.
McNiff, J & Whitehead, J (2010) You and Your Action Research Project 3rd
Ed.
Routledge: London.
Moon, J. (2006) Learning Journals: A Handbook for Reflective Practice and Professional
Development. Oxon: Routledge Falmer.
Moon, J. (2008) Critical Thinking: An Exploration of Theory and Practice. London:
Routledge.
Pollard, A. (2008) Reflective Teaching: Evidence-Informed Professional Practice. London:
Continuum International Publishing Group Ltd.
Paulson L. and Wallace M. (2004) Learning to Read Critically in Teaching and Learning.
London: SAGE.
Powell, R. (2010), Outstanding Teaching, Learning and Assessment ‘The Handbook’.
London Robert Powell Publication LTD.
Shon, D.A. (1987) The Reflective Practitioner: How Professionals Think in Action, San
Francisco, Josey Bass.
Silvan, R.E., Huurley, E.A, & Chamberlain, A.M (2005) Cooperative learning and
achievement. In W.M. Reynolds & G.J. Miller (Eds.), Handbook of psychology:
Educational psychology (Vol. 7, pp.177-198). Hoboken, NJ: Wiley.
Simons, H. (2010). Case study Research in Practice. London Sage
Taylor Powell, E. and Renner, M. (2003) Analysing Qualitative Data. Maddison, WI:
University of Wisconsin.
Available at: http://learningstore.uwex.edu/assets/pdfs/g3658-12.pdf.
Walsh, M. (2001) Research Made Real (A guide for students). Nelson Thornes.
Watford, G. (2001) Doing Educational Qualitative Research: A personal guide to the
research process. Published by Anthony Rowe LTD Chippenham, Wilshire, GB.
Watkins, C., Carnell, E., Lodge C., Wagner, P. and Whalley. C. (2001) ‘Learning about
Learning enhances performance’, NSIN Research Matters, 13.
Wells, G. (2008). ‘Dialogue, inquery and the construction of Learning Communities’. In:
Lingard. Nixon, J. and Ranson, S. (Eds) Transforming Learning in schools and
communities: the Remarking of Education for Cosmopolitan Society. London Continuum.
Wiersma, W, & Jurs, S., G. (2009) Research Methods in Education: an introduction, 9th
ed. Pearson Education.
PBM4029ononaji21809224 Page 50
William, N. and Buckler, S. (2008) Your Dissertation in Education. London: Sage.
Yates, S. (2004) Doing Social Science Research. London: Sage/Open University Press.
Yin, R, K (2009) Case Study Research: design and methods. London: Sage.
Journals and related publications
Assessment and Rporting Unit Learning Policies Branch Office of Learning and Teaching
(2005) Current Perspective on Assessment.
Assessment Reform Group (2002) ‘Assessment for Learning’: 10 principles.
Assessment Reform Group (2009) Assessment in schools Fit for purpose? A
Commentary by teaching and learning research programme
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in
Education: Principles, Policy & Practices Mar 1998, Vol.5 issue 1.
Black, P. and Wiliam, D. (1998a) Inside the black box: raising standards through
classroom assessment (London King’s College).
Black, P and Wiliam, D (1999), Assessment for learning: beyond the black box. London:
Kings College London.
Black, P. and Wiliam, D. (1998b) Assessment and classroom learning, Assessment in
Education, 5(1), 7-74.
Black, P. (2007) Formative Assessment: Promise or problems?. King’s College London.
Briggs, J. (1998). Assessment and classroom learning: a role for summative
assessment? Assessment in Education. Vol.5, No.1, pp.103-110.
Brown (2004-5), Assessment for Learning: Learning and teaching in Higher Education,
Issue 1, 2004-05.
Butler, R (1988). ‘Enhancing and undermining intrinsic motivation: effects of task-
involving and ego-involving evaluation on interest and performance’. British Journal
Of Educational Psychology 56 (51–63).
Crooks, T, J (1988). ‘The Impact of Classroom Evaluation Practice on Students’. Source
Review of Educational Research, Vol. 58 No.4, Winter, 1988, pp 438-481.
Department for Children, Schools and Families (2009) Report of the expert group on
assessment. London DCSF.
PBM4029ononaji21809224 Page 51
Elwood, J. (2004) Gender and achievement: new issues or old problems in assessment for
learning? Keynote lecture, SEED and SQL Assessment is for Learning Conference Glasgow,
4 June 2004.
Gagne, R. M., Briggs, L, J., & Wager, W, W. (1988). Principles of instrumental design.
New York: Holt, Rinehart and Winston.
Hargreaves, D.H. (2001) A future for the school curriculum. Available online at:
http://www.qca.uk/ca/14-19/dh_QCA.
Harrison, C. and Harlen, W. (2006). ‘Children’s self- and peer-assessment’. In: Harlen, W.
(Ed) ASE Guide to primary Science Education. Hatfield: Association for Science
Education.
Hattie, J. (1999). Influence on student learning: University of Auckland, New Zealand:
Inaugural professorial lecture.
Hayward, L. & Hedge, N. (2005). Travelling towards change in assessment: policy,
practice and research in education. Assessment in Education, Vol.12, No.1, pp55-75.
Hodgson, C.& Pyle, K. (2010) A literature review Of Assessment for Learning in Science.
National Foundation for Education Research.
Kirton, et al. (2007) Revolution, evolution or a Trojan horse? Piloting assessment for
learning in some Scottish primary schools.
McDowell, Liz, Sambell, Kay and Davison, Gillian (2009) Assessment for learning: a brief
history and review of terminology. In: Improving Student Learning Through the
Curriculum. Oxford Centre for Staff and Learning Development, Oxford, pp. 56-64. ISBN
1873576786
Miller, M (2005) Assessment: Literature Review, Bulletin number 19.
Miller, R. and Hames, V. (2002). ‘EPSE Project 1: Using diagnostic assessment to
improve science teaching and learning’, School Science Review, 84, 307, 21-24.
Miller, A. H. Imrie, B. W. & Cox, K. (1998) Student Assessment in Higher Education: a
handbook for assessing performance, London: Kogan Page.
Ministry of Education, New Zealand. (1994) Assessment: policy to practice. Wellington,
New Zealand: Learning Media.
Natriello, G. (1987) The Impact pf evaluation processes on students, Educational
Psychologist, 22, pp.155-175.
PBM4029ononaji21809224 Page 52
Office for Standard in Education (2010) The quality of teaching and the use of
assessment to support learning.
Office for Standard in Education (2008) Assessment for learning: the impact of National
Strategy support.
Office for Standard in Education (2012) College report. Quality Improvement Agency for
Lifelong Learning (2008). Guidance for assessment and learning.
Rust, C. (2002) The impact of assessment on student learning: how can the research
literature practically help to inform the development of departmental assessment
strategies and learner-centred assessment practices? Active learning in Higher
Education, vol 3, no.2, pp.145-158.
Salthouse, T., A. (2011) All Data Collection and Analysis Methods Have Limitations:
Reply to Rabbitt (2011) and Raz and Lindenberger (2011). Psycological Bulletin, 137,
No.5, 796-799.
Stiggins, R.J. (2002). Assessment in Crisis: The Absence Of Assessment FOR Learning.
Phi Delta Kappan. Vol.83, No.10, pp758-765.
Winter, R [1991] ‘Fictional-critical writing as a method for educational research,’ British
Educational Research Journal, 17, 3, pp251-262.
Young, W. (2005) Assessment for Learning: Embedding and Extending. Assessment is
for Learning.
Websites
www.learning-teaching-update.com
www.sflip.org.uk
www.niu.edu/assessment/Resources/Assessment_Glossary.htm
www.ofsted.gov.uk
British Education Index (BEI) http://www.leeds.ac.uk/bei
British Educational Research Association (BERA)http://www.bera.ac.uk
Department for Education http://www.education.gov.uk
ERIC (The Education Resources Information Centre) http://www.eric.ed.gov
ITSLIFE - Learning for Teaching on Reflective
Practice.http://www.itslifejimbutnotasweknowit.org.uk/RefPractice.htm
Professor John Hattie’s website is at:
www.arts.auckland.ac.nz/staff/index.cfm?p+8650
NFER (National Foundation for Education Research) http://www.nfer.ac.uk/index.cfm
http://www.bera.ac.uk/
PBM4029ononaji21809224 Page 53
www.lancsngfl.ac.uk/secondary/.../getfile.php?src...questionnaire
http://nrl.northumbria.ac.uk/2906/
http://beyondcrossroads.amatyc.org/doc/CH5.html
http://www.cscc.edu/about/assessment/
http://beyondcrossroads.matyc.org/doc/CH5.html
http://www.ukessays.com/essays/education/assessment-for-the-purpose-of-
instruction.php

More Related Content

What's hot

Student Perceptions of Online Learning
Student Perceptions of Online LearningStudent Perceptions of Online Learning
Student Perceptions of Online LearningK.M. Smith, Ph.D.
 
Assessment and feedback in large classes
Assessment and feedback in large classesAssessment and feedback in large classes
Assessment and feedback in large classesDavid Carless
 
Supporting the development of student feedback literacy
Supporting the development of student feedback literacySupporting the development of student feedback literacy
Supporting the development of student feedback literacyDavid Carless
 
Creating learning environments for self-generated feedback to thrive
Creating learning environments for  self-generated feedback to thriveCreating learning environments for  self-generated feedback to thrive
Creating learning environments for self-generated feedback to thriveDavid Carless
 
Assessment for digital futures
Assessment for digital futuresAssessment for digital futures
Assessment for digital futuresDavid Carless
 
Developing students' feedback through literacy
Developing students' feedback through literacyDeveloping students' feedback through literacy
Developing students' feedback through literacyDavid Carless
 
Partnership with students in designing feedback processes for large classes
Partnership with students in designing feedback processes for large classesPartnership with students in designing feedback processes for large classes
Partnership with students in designing feedback processes for large classesDavid Carless
 
Assessment possibilities
Assessment possibilitiesAssessment possibilities
Assessment possibilitiesDavid Carless
 
Feedback literacy as a key to ongoing improvement
Feedback literacy as a key to ongoing improvementFeedback literacy as a key to ongoing improvement
Feedback literacy as a key to ongoing improvementDavid Carless
 
Developing staff & student feedback literacy
Developing staff & student feedback literacyDeveloping staff & student feedback literacy
Developing staff & student feedback literacyDavid Carless
 
Feedback literacy for digital futures
Feedback literacy for digital futuresFeedback literacy for digital futures
Feedback literacy for digital futuresDavid Carless
 
Teacher feedback literacy & feedback designs for student learning
Teacher feedback literacy & feedback designs for student learningTeacher feedback literacy & feedback designs for student learning
Teacher feedback literacy & feedback designs for student learningDavid Carless
 
Feedback in online learning environments
Feedback in online learning environmentsFeedback in online learning environments
Feedback in online learning environmentsDavid Carless
 
Effective learning-oriented assessment: Developing student feedback literacy
Effective learning-oriented assessment: Developing student feedback literacyEffective learning-oriented assessment: Developing student feedback literacy
Effective learning-oriented assessment: Developing student feedback literacyDavid Carless
 
Sustainable assessment & online learning environments
Sustainable assessment & online learning environmentsSustainable assessment & online learning environments
Sustainable assessment & online learning environmentsDavid Carless
 
Designing satisfying feedback experiences
Designing satisfying feedback experiencesDesigning satisfying feedback experiences
Designing satisfying feedback experiencesDavid Carless
 
The use of the internet in higher education
The use of the internet in higher educationThe use of the internet in higher education
The use of the internet in higher educationNorshim Hashim
 
Dr. Norman Butler & Dr. W.A. Kritsonis
Dr. Norman Butler & Dr. W.A. KritsonisDr. Norman Butler & Dr. W.A. Kritsonis
Dr. Norman Butler & Dr. W.A. KritsonisWilliam Kritsonis
 

What's hot (20)

Student Perceptions of Online Learning
Student Perceptions of Online LearningStudent Perceptions of Online Learning
Student Perceptions of Online Learning
 
Assessment and feedback in large classes
Assessment and feedback in large classesAssessment and feedback in large classes
Assessment and feedback in large classes
 
Supporting the development of student feedback literacy
Supporting the development of student feedback literacySupporting the development of student feedback literacy
Supporting the development of student feedback literacy
 
Creating learning environments for self-generated feedback to thrive
Creating learning environments for  self-generated feedback to thriveCreating learning environments for  self-generated feedback to thrive
Creating learning environments for self-generated feedback to thrive
 
Assessment for digital futures
Assessment for digital futuresAssessment for digital futures
Assessment for digital futures
 
Developing students' feedback through literacy
Developing students' feedback through literacyDeveloping students' feedback through literacy
Developing students' feedback through literacy
 
Partnership with students in designing feedback processes for large classes
Partnership with students in designing feedback processes for large classesPartnership with students in designing feedback processes for large classes
Partnership with students in designing feedback processes for large classes
 
Assessment possibilities
Assessment possibilitiesAssessment possibilities
Assessment possibilities
 
Education service delivery presentation
Education service delivery   presentationEducation service delivery   presentation
Education service delivery presentation
 
Feedback literacy as a key to ongoing improvement
Feedback literacy as a key to ongoing improvementFeedback literacy as a key to ongoing improvement
Feedback literacy as a key to ongoing improvement
 
19 Nov08 Student Engagement K Duffner
19 Nov08  Student Engagement  K Duffner19 Nov08  Student Engagement  K Duffner
19 Nov08 Student Engagement K Duffner
 
Developing staff & student feedback literacy
Developing staff & student feedback literacyDeveloping staff & student feedback literacy
Developing staff & student feedback literacy
 
Feedback literacy for digital futures
Feedback literacy for digital futuresFeedback literacy for digital futures
Feedback literacy for digital futures
 
Teacher feedback literacy & feedback designs for student learning
Teacher feedback literacy & feedback designs for student learningTeacher feedback literacy & feedback designs for student learning
Teacher feedback literacy & feedback designs for student learning
 
Feedback in online learning environments
Feedback in online learning environmentsFeedback in online learning environments
Feedback in online learning environments
 
Effective learning-oriented assessment: Developing student feedback literacy
Effective learning-oriented assessment: Developing student feedback literacyEffective learning-oriented assessment: Developing student feedback literacy
Effective learning-oriented assessment: Developing student feedback literacy
 
Sustainable assessment & online learning environments
Sustainable assessment & online learning environmentsSustainable assessment & online learning environments
Sustainable assessment & online learning environments
 
Designing satisfying feedback experiences
Designing satisfying feedback experiencesDesigning satisfying feedback experiences
Designing satisfying feedback experiences
 
The use of the internet in higher education
The use of the internet in higher educationThe use of the internet in higher education
The use of the internet in higher education
 
Dr. Norman Butler & Dr. W.A. Kritsonis
Dr. Norman Butler & Dr. W.A. KritsonisDr. Norman Butler & Dr. W.A. Kritsonis
Dr. Norman Butler & Dr. W.A. Kritsonis
 

Similar to The Impact of Assessment for Learning

Enhancing learning and_teaching_in_higher_education_in_northern_ireland
Enhancing learning and_teaching_in_higher_education_in_northern_irelandEnhancing learning and_teaching_in_higher_education_in_northern_ireland
Enhancing learning and_teaching_in_higher_education_in_northern_irelandViewpoints, University of Ulster
 
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docx
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docxACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docx
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docxEmmanuel Gamana
 
Tbl, pbl, ebl, scale up, buzz, virtual or what?
Tbl, pbl, ebl, scale up, buzz, virtual or what?Tbl, pbl, ebl, scale up, buzz, virtual or what?
Tbl, pbl, ebl, scale up, buzz, virtual or what?SEDA
 
Teamwork seda may 2018
Teamwork seda may 2018Teamwork seda may 2018
Teamwork seda may 2018Peter Hartley
 
LS Curriculum & Assessment Guide
LS Curriculum & Assessment GuideLS Curriculum & Assessment Guide
LS Curriculum & Assessment GuideWai-Kwok Wong
 
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copy
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copyMUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copy
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copymuhammadakbarzahidi
 
Successful implementation of the EPQ
Successful implementation of the EPQSuccessful implementation of the EPQ
Successful implementation of the EPQCranleigh School
 
Exploring scholarship and scholarly activity in college-based Higher Education
Exploring scholarship and scholarly activity in college-based Higher EducationExploring scholarship and scholarly activity in college-based Higher Education
Exploring scholarship and scholarly activity in college-based Higher EducationThe Education and Training Foundation
 
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...Tuấn Phan Đình
 
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...Erica Thompson
 
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...iosrjce
 
Julia Parra Research for VSS 2009
Julia Parra Research for VSS 2009Julia Parra Research for VSS 2009
Julia Parra Research for VSS 2009Julia Parra
 
Using formative assessment to assess students in Bac Hai Primary school (1) (...
Using formative assessment to assess students in Bac Hai Primary school (1) (...Using formative assessment to assess students in Bac Hai Primary school (1) (...
Using formative assessment to assess students in Bac Hai Primary school (1) (...TheDivergent
 
21St Century Skills In The Case Critical Thinking In The Higher Education In ...
21St Century Skills In The Case Critical Thinking In The Higher Education In ...21St Century Skills In The Case Critical Thinking In The Higher Education In ...
21St Century Skills In The Case Critical Thinking In The Higher Education In ...Courtney Esco
 
E mentoring as a tool muzaffer cetin
E mentoring as a tool muzaffer cetinE mentoring as a tool muzaffer cetin
E mentoring as a tool muzaffer cetinMuzaffer Çetin
 

Similar to The Impact of Assessment for Learning (20)

Enhancing learning and_teaching_in_higher_education_in_northern_ireland
Enhancing learning and_teaching_in_higher_education_in_northern_irelandEnhancing learning and_teaching_in_higher_education_in_northern_ireland
Enhancing learning and_teaching_in_higher_education_in_northern_ireland
 
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docx
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docxACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docx
ACTION RESEARCH RATIONALE OF THE STUDY LAC SESSION.docx
 
Tbl, pbl, ebl, scale up, buzz, virtual or what?
Tbl, pbl, ebl, scale up, buzz, virtual or what?Tbl, pbl, ebl, scale up, buzz, virtual or what?
Tbl, pbl, ebl, scale up, buzz, virtual or what?
 
Teamwork seda may 2018
Teamwork seda may 2018Teamwork seda may 2018
Teamwork seda may 2018
 
LS Curriculum & Assessment Guide
LS Curriculum & Assessment GuideLS Curriculum & Assessment Guide
LS Curriculum & Assessment Guide
 
Building Foundations for Para-professionalism
Building Foundations for Para-professionalismBuilding Foundations for Para-professionalism
Building Foundations for Para-professionalism
 
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copy
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copyMUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copy
MUHAMMAD AKBAR ZAHIDI - american journal education 2-10-2 copy
 
Successful implementation of the EPQ
Successful implementation of the EPQSuccessful implementation of the EPQ
Successful implementation of the EPQ
 
Exploring scholarship and scholarly activity in college-based Higher Education
Exploring scholarship and scholarly activity in college-based Higher EducationExploring scholarship and scholarly activity in college-based Higher Education
Exploring scholarship and scholarly activity in college-based Higher Education
 
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...
The Teacher Use of ET for Motivating English Majors in EFL Speaking Class at ...
 
Synthesis of research report of personnel: Case study on Faculty of Education...
Synthesis of research report of personnel: Case study on Faculty of Education...Synthesis of research report of personnel: Case study on Faculty of Education...
Synthesis of research report of personnel: Case study on Faculty of Education...
 
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...
A Tracer Study on the Graduates of Bachelor of Secondary Education Major in E...
 
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...
Project-Based Learning Guided Lesson Study Improve the Achievement of Learnin...
 
Julia Parra Research for VSS 2009
Julia Parra Research for VSS 2009Julia Parra Research for VSS 2009
Julia Parra Research for VSS 2009
 
EL7001-8
EL7001-8EL7001-8
EL7001-8
 
Minutes of the Tenth AMCOA Meeting, May 1st, 2012 -- Framingham State University
Minutes of the Tenth AMCOA Meeting, May 1st, 2012 -- Framingham State UniversityMinutes of the Tenth AMCOA Meeting, May 1st, 2012 -- Framingham State University
Minutes of the Tenth AMCOA Meeting, May 1st, 2012 -- Framingham State University
 
Using formative assessment to assess students in Bac Hai Primary school (1) (...
Using formative assessment to assess students in Bac Hai Primary school (1) (...Using formative assessment to assess students in Bac Hai Primary school (1) (...
Using formative assessment to assess students in Bac Hai Primary school (1) (...
 
21St Century Skills In The Case Critical Thinking In The Higher Education In ...
21St Century Skills In The Case Critical Thinking In The Higher Education In ...21St Century Skills In The Case Critical Thinking In The Higher Education In ...
21St Century Skills In The Case Critical Thinking In The Higher Education In ...
 
Iain Mansfield
Iain MansfieldIain Mansfield
Iain Mansfield
 
E mentoring as a tool muzaffer cetin
E mentoring as a tool muzaffer cetinE mentoring as a tool muzaffer cetin
E mentoring as a tool muzaffer cetin
 

The Impact of Assessment for Learning

  • 1. PBM4029ononaji21809224 Page 1 PORTFOLIO/ASSIGNMENT COVER SHEET Programmed/Pathway MA EDUCATION Module Start Date: Module Code: PBM4029 Module Title: THE IMPACT OF EDUCATIONAL PRACTICE Student's Name: AUGUSTINE CHIDI ONONAJI Student ID Number 21809224 Tutor's Name HUGH SLOAN Latest submission Date: 12TH JANUARY 2012 Actual submission Date: 12TH JANUARY 2012 Received by: Date Received: ……………………………………………………………………… ……………………………………………………………………… Instructions: Please submit your portfolio electronically – you will receive e-notification that we have received it Students are required to complete this coversheet and attach it to their assignment prior to handing in The deadline for your submission is 4.15pm on or before the due date On submission staff receiving the assignment will date and sign this form Students should retain the student copy as a receipt for their work If you wish a receipt for posted assignments, please include an SAE Professional Development
  • 2. PBM4029ononaji21809224 Page 2 THE IMPACT OF ASSESSMENT FOR LEARNING IN SCHOOLS: WHAT IMPACT DOES ASSESSMENT FOR LEARNING HAVE ON LEARNERS AND PRACTITIONERS IN THE CLASSROOM WHEN EMBEDDED IN PRACTICE? A CASE STUDY BASED ON MY EXPERIENCE AS A TEACHER AND HEAD OF ICT DEPARTMENT Submitted in part fulfilment of the requirements of the MA in Education at Edge Hill University Augustine Chidiegwu Ononaji Student ID: 21809224
  • 3. PBM4029ononaji21809224 Page 3 TABLE OF CONTENTS Acknowledgement ……………………………………………………………..4 Introduction……………………………………………………………………5-6 Rationale ……………………………………………………………………….7- Research questions……………………………………………………………8 Literature review………………………………………………………………..9-13 Research methods……………………………………………………………..14-21 Research ethics …………………………………………………………………22-22 Findings Analysis and discussion………………………………………………23-31 Conclusion …………………………………………………………………………32-34 Appendices (Supporting evidence) Appendix 1…………………… (Extract from college Ofsted inspection report) ….. pg. 35 Appendix 2…………… (Evidence of request for consent and college authorisation).. pg. 36 Appendix 3………………… (Participants consent form) ………………………………..pg. 37 Appendix 4………………………… (Evidence of monitoring sheet) …………………..pg. 38 Appendix 5………………… (Extract from department achievement reviews)………...pg. 39-41 Appendix 6………………………. (Research action plan) ……………………………….pg. 42 Appendix 7………………………. (Lesson observation commentary) ………………….pg. 43-44 Appendix 8……………………… (Student questionnaires and feedback) ……………..pg.45 Bibliography………………………………………………………………………..pg. 46-51
  • 4. PBM4029ononaji21809224 Page 4 Acknowledgement I am using this opportunity to express my gratitude to everyone who supported me throughout the course of my MA in Education project. I am thankful for their aspiring guidance, invaluably constructive criticism and advice during the project work. I am sincerely grateful to all for sharing their truthful and illuminating views on a number of issues related to the project. I express my warm thanks to my personal tutor, Mr. Hugh Sloan whose contributions in stimulating suggestions and encouragements helped me coordinate my project. Furthermore I would also like to acknowledge that with much appreciations the crucial role of the Principal of Haringey Sixth Form Centre June Jarrett (my mentor) for her support and guidance through mentoring and coaching in my role as the PAM for ICT department in her organisation, that I used as a case study in this research. In addition, I would like to express my special thanks of gratitude to my colleague Haileselassie Girmay who had helped me in putting the project together. Last but not the least, many thanks go to my dear wife Lilan and our children; Chizaram, Chigadi, Chinomso and Kosimasichi for their kind co-operation and encouragement throughout the programme.
  • 5. PBM4029ononaji21809224 Page 5 INTRODUCTION According to Kirton, et al. (2007) in the late1990’s formative assessment was not a priority for many teachers and policy makers in England and Scotland. This was because the emphasis was on summative assessment in the form of national tests, formal examinations and the pressure of frequent inspection and league table performance by the government. The interest in formative assessment was triggered by two publications namely, ‘Assessment and classroom learning’ (Black & William, 1998a) and, in more accessible form for teachers and policy makers, Inside the black box (Black & Wiliam, 1998b). The latter having sold over 50, 000 copies indicated a healthy interest in formative assessment, and subsequently gave rise to other researchers and groups such as the King’s College London team (Black et al, 2002, 2003) to produce publications/guidelines on using formative assessment in the classroom. The team in their work argued that Assessment for Learning and Summative Assessment were used interchangeably and shared extended definition. Other researchers like McDowell et al (2009) and QIA (2008) view Summative Assessment as a key part of Assessment for Learning and key stage 3 strategy. Since the publications of Black and Wiliam (1998b), Black et al, (2002, 2003), ARG (2009) and others, AFL has become ‘a free brand name to attach to any practice’ Black (2006:11). The official endorsement of AFL in Britain as good practice for improving teaching and learning in the classroom is evidenced at various levels. For instance, in 2008 the Minister of States for Schools encouraged all schools in England to use AFL to improve teaching and learning in schools. The Minister reported that ‘the Government had invested £150 million over a period of three years for continuing professional development for teachers in assessment for learning’ (DFCSF (2008:1). Similarly, many policy-makers and educational institutions since then have developed programmes and resources to support practitioners seeking to improve their teaching and learning practices. Accordingly, many Universities and Colleges in UK are now providing internal and external CPD
  • 6. PBM4029ononaji21809224 Page 6 programmes for practitioners while teacher training agencies have made AFL a core curriculum for trainee teachers. Black et al. (2004) defined formative assessment as: Assessment refers to all those activities undertaken by teachers, and by the students in assessing themselves, which provide information to be used as feedback to modify the teaching and learning activities in which they are engaged. Such assessments become formative assessment when the evidence is actually used to adapt teaching to meet the needs (Black et al., 2004 p. 10). Indeed, Hargreaves (2001) cited in ‘UKEssays’ website argues that: Assessment for Learning is the beginning of a ‘revolution’ in education and the key driver of the convergence between curriculum, assessment and pedagogy because it undermines the old conception that assessment is something that follows teaching and learning (UKEssays). Similarly, Assessment Reform Group (2009) highlighted the significance of AFL by stating that AFL has become a proxy measure that is used to facilitate judgements on the quality of most elements of educational systems which has been largely welcomed by teachers, head teachers, school, support services, local authorities and even the government itself. In the light of these developments, and the increasing emphasises on the benefits of AFL, as a newly appointed head of a department in a school in London that needed to improve learners’ outcomes, my tasks was to explore the potential use of AFL to improve the quality of teaching and learning experiences in the ICT department I manage. The aims of this research were first, to improve professional practice through a practice based enquiry. Second, to explore the relationships between theoretical and practical knowledge, and critically evaluate the impact on practice through a methodical investigation and collection of work based evidence and wider professional context. The objective therefore was to plan, implement, test and evaluate the impact of Formative Assessment (AFL) on practice using a case study research method.
  • 7. PBM4029ononaji21809224 Page 7 THE RATIONALE In accordance with BERA (2011), Data Protection Act (1998), and for privacy and confidentiality reasons, pseudonyms have been used throughout this enquiry. Hence, all identities used in the supporting documents and the evidence attached in the appendices are pseudonyms. For the past six years, I have worked as a teacher of ICT and the Head of the department in Reflective Sixth Form College, which is located in a deprived part of London engulfed with socioeconomic difficulties. Reflective college is inclusive, hence caters for young adults aged 16 to19 from different ethnic origins, able and disabled with mixed ability in attainment. The institution is made up of six faculties with the population of one thousand and two hundred students. The management structure includes a principal, three vice principals, six heads of faculties and ten programme area managers. My role is in the faculty of business and ICT as a teacher of ICT and a programme area manager. The faculty provides A level and vocational courses from entry 3 level to 3 levels qualifications in ICT and business studies. Currently, there are eleven members of teaching staff and 350 mixed ability students in the faculty. The department was graded a grade 3; requiring improvement in the last Ofsted inspections in 2012. My rationale for undertaking this study is threefold. First, to improve practice, enhance autonomy and motivation of learning in the classroom and develop more research and reflective practice skills; hoping that on completion I will become a more competent and reflective person in my approach to addressing issues and engaging in wider discussions. Second, as a teacher I want to use the skills develop through the enquiry to promote good teaching and learning practices across the department with the assumption that it will help improve the quality of teaching and learning as well as raising learners’ outcomes as recommended by Ofsted in 2012 feedback. Ofsted advised the college to: Continue to raise achievement by: − increasing the average grade and the proportion of higher grades gained at A level and in BTEC Level 3 qualifications, so that all aspects of attainments are firmly in line with national averages− ensuring that greater proportions of students complete the courses that they start, consistently across all subjects and curriculum areas (College Ofsted report, 2012:5) (see appendix 1).
  • 8. PBM4029ononaji21809224 Page 8 Lastly, to accomplish a performance management target agreed during an annual review with my line manager. It became crucial that I must research and pilot the use of formative assessment techniques in the department. My rationales for embarking on this study are both personal and professional thus, aided the formulation of the research questions I had designed below. RESEARCH QUESTIONS 1. What impact does AFL have on [my] students, is there evidence that embedding AFL in practice raises the standards of learners and the quality of teaching in the classroom? 2. Is there evidence that there is room for improvement in implementing AFL in the classroom, if so what is the evidence and how can it be improved? 3. What were the impacts of this study on my practice as a teacher, a manager and my school?
  • 9. PBM4029ononaji21809224 Page 9 ASSESSMENT FOR LEARNING: THEORY AND LITERATURE REVIEW. Black and Wiliam (1998) claim that firm evidence indicates that formative assessment should be an essential component of classroom work and that its development can raise standards of learners’ achievement. They ‘point out that they know of no other way of raising standards for which such a strong prima facie case can be made than the use of formative assessment’(p:1). The authors, in presenting their case for formative assessment, likened the classroom as ‘a black box’. In doing so, suggested that the current approaches to improving achievement do not produce the desired results. They argue that whatever is happening in the classroom (that black box) is affecting the output. Black and Wiliam see formative assessment as the tool that lies at "the heart of effective teaching." Miller (2005) reported that since the pioneering work of Black and Wiliam, formative assessment has attracted increasing interest as a topic of research, classroom practice and educational policy due to its potential of guiding teaching and learning process.Two most noticeable publications were produced by the Assessment Reform Group (2002) and DCSF (2008). The group recommendations on how to improve Assessment for Learning in the classroom were widely adopted and a revised version of Ten Principles was incorporated into the Assessment for Learning Strategy (DCSF, 2008). Assessment Reform Group (2009) adds that formative assessment represents a fundamental change from the test and examination results that were predominantly used to determine what a pupil knew and understood of a subject. Furthermore, The Quality Improvement Agency for Lifelong Learning (QIA) (2008) reported that Formative assessment is part of Assessment for Learning. They stated that: Assessment for learning-sometimes called formative assessment- involves teachers and learners using assessment to improve learning. It is about assessing progress and analysing and feeding back the outcome of that assessment positively and constructively to:  Agree actions to help the learner improve  Adapt teaching methods to meet the learner’s identified needs (QIA, 2008:2). QIA went further to argue that Assessment for Learning demonstrates a particular view of learning which could enable all learners to gradually make progress and achieve their full potential. The group
  • 10. PBM4029ononaji21809224 Page 10 confirms that Assessment for Learning can take place in teaching and learning sessions, through written and verbal feedback and as part of a review, a target setting and an action planning. Teachers are encouraged to use a range of approaches such as teacher-led assessment, learner-led assessment, peer assessment and computer-based assessment to check learning and assess progress in a classroom: Effective checking for learning enables learners to be involved in the assessment process and to make sense of what they are learning and how it relates to and builds on what they already know (QIA, 2008:5) Similarly, Black and Wiliam (1998a) argue that teachers need to have a deep understanding of formative assessment to enable them employ strategies to assist students to identify the gaps between their present achievements and the desired goals. Sadler (1989) cited in (Black 2000) states that formative assessment should equip students with essential tools to manage their own learning. This theory compliments the publication of the Assessment and Reporting Unit (2005:2) which claims that: Meaningful learning occurs when learners are actively involved and have opportunity to take control of their own learning. This means that teachers should provide sensitive and constructive feedback to students and use assessment practices that encourage self- assessment and metacognition. In addition, Bransford, et al. (2000) argue that learners construct knowledge and understanding on the basis of what they already know and believe. Hence, teachers should use dialogues, questions and answers to establish learners’ prior knowledge and monitor learners’ changing conceptions as teaching and learning progresses in the classroom. They also state that assessment should help expose students’ thinking processes to themselves and their teachers and that appropriate feedback throughout the learning/teaching process should enable students to modify and refine their thinking. Hayward and Hedge (2005:69) cited in Assessment and Reporting Unit (2005:4) report that data emerging from research on AFL in Scotland ‘suggests that teachers not only find their involvement energizing, but that they report positive changes in the quality of pupils’ work and commitment to learning’. This research indicates that the implementation of formative assessment strategies aided
  • 11. PBM4029ononaji21809224 Page 11 teachers and students and improved students learning and teachers’ satisfaction. Hayward and Hedge argue that Black and Wiliam have ‘helped to take the emphasis in formative assessment studies away from systems, its formative-summative interface, and relocate it on classroom progresses’ (Black and Wiliam 2003:628). Hayward and Hedge claimed that the renewed emphasis on pedagogy and assessment practices focusing on learning is central to the quest for improved outcomes for all students. Hattie (2002) cited in the QIA (2008) states that feedback has more impact on learning than any other general factor; but it requires an activity and a product. He also advises that teachers should see feedback as a reflection of their expertise as teachers and not only as giving information about their students and their grasp of the subject: Giving feedback on learning errors and getting the learner to correct them as identify strategies to improve future work is directly linked to significant improvement rates (Hattie. 2002) cited in QIA (2008:4). Butler (1998) states that using constructive comments in practice leads to improved performance by 33%, Butler also warns that marking using grades can have a negative effect on learner performance, particularly for low achievers. Black and William (2003), point out that Assessment for Learning includes summative assessment. They state that the use of formative assessment does not preclude the use of summative assessment. They argue that when summative assessment is aligned to curriculum and the students’ learning experiences, then it becomes integrated into the learning and assessment cycle and feeds into improving students’ learning rather than just measuring. There is need to align formative and summative work in new overall systems, so that teachers’ formative work would not be undermined by summative pressure, and indeed, so that summative requirements might be better served by taking full advantage of improvements in teachers’ assessment work (Black and Wiliam, 2003:623-4) Similarly, other researchers and authors have dealt with the relationship between formative and summative assessment methods and the underlying tensions. Brigggs (1998:105) states ‘Sensible educational models make effective use of both FA (formative assessment) and SA (summative assessment)’. Stiggins (2002) makes reference to the importance of both Assessment of Learning
  • 12. PBM4029ononaji21809224 Page 12 (summative) and Assessment for Learning. Although Stiggins may not equate Assessment for Learning to formative assessment, he suggests that Assessment for Learning goes further and it involves the student in the process. He therefore argues that formative and summative assessments are important in teaching and learning process and should be integral to the learning and teaching cycle. Although, Assessment for Learning has now become part of primary and secondary strategies for all schools in England (Ofsted 2008), it is not without criticisms. Pollard (2008) argues that AFL helps identify ‘bright and weak learners’, with consequent implications for the self-image and social status of both groups. He demonstrated that there is research evidence that the assessment process itself can place children under enormous pressure and can have a negative effect on children’s perceptions of themselves and their peers and can reduce an individual’s self-esteem and motivation. He also argues that high-attaining peers may become the subject of bullying as a consequence of good results. Similarly, Brown and Smith (1997) cited in Brown (2004-5) argue that for assessment to be effective in practice, it needs to be ‘fit-for-purpose’; that is it should enable evaluation of the extent to which learners have learnt and the extent to which they can demonstrate that learning. Kirton et al, (2007) argue that Formative Assessment ‘plays a crucial role in raising standards through giving students a clear sense of themselves as learners, the goals they are trying to achieve and how to reach them’ (p:607). They also stated that the overall criticism of AFL is dependent on its weak implementation in the classroom. First, they argue that weak practice of AFL encourages superficial and rota learning in the classroom thus limiting students from making progress. Second that some teachers do not generally review the assessment questions they use and do not discuss them critically with peers, which leads to little reflection on what is being assessed. Third, they argue that the grading function is over emphasised whilst the learning function under- emphasised.
  • 13. PBM4029ononaji21809224 Page 13 Fourth, that there is a tendency to use normative rather than a criterion approach, which emphasises competition between pupils rather than personal improvement of each. Black and Wiliam (1998) debated that such practice is hostile to weaker students that lack the ability leading to de-motivation and lose of confidence in their own capacity to learn. Furthermore, Hodgson and Pyle (2010) argue that AFL has many generic features but some features can be specifically honed for science teaching and learning. They argued that the classroom climate is an important factor that enable learning in science and should be recognised as one of those AFL techniques identified by Black and Wiliam. They contended that it is crucial that a non-threatening environment and a reciprocal interaction is established in classroom for pupils to feel able to express their ideas and allow the teacher to establish what the pupils know in order to develop teaching that will move their understanding on. Similarly, Kyriacou (2007) suggested that the use of AFL could prove problematic to practitioners due to inadequate time and management issues. He stated that teaching involves a variety of tasks such as; lesson planning, making use of appropriate resources, sharing data, marking, meeting parents and liaising with colleagues, etc. in a short period of time. He therefore believes that these processes take time to accomplish hence require caution and concentration by a teacher. Crooks emphasised that a substantial proportion of teachers have little or no formal training in educational measurement techniques, and many of those who do have such training find it little relevance to their classroom evaluation activities. Thus, Ofsted (2008) contended that where Assessment for Learning has had less impact in the classroom is because the teacher had not understood how the approaches were supposed to be used to improve pupil’s achievement. They argued that teachers may have used key aspects of assessment for learning, such as identifying and explaining lesson objectives, questioning, reviewing pupils’ progress and providing feedback precision and skills but have failed to embed these concepts
  • 14. PBM4029ononaji21809224 Page 14 in their lessons in a way that both pupils and the teachers can understand and take advantages of them to improve their learning and teaching respectively. In conclusion, it is evident from the reporting of the reviews conducted by different authors and researchers cited above that potential benefits of AFL are significant on students in many different ways. According to Black and Wiliam, the impact of AFL affected students’ improvement and helped reshape policy toward schools: For public policy towards schools, the case to be made here is firstly that significant learning gains lie within our grasp. The research reported here shows conclusively that formative assessment does improve learning. The gains in achievement appear to be quite considerable and as noted earlier, amongst the largest ever reported for educational intervention (Black and Wiliam, 1998:46). Crooks (2001:1-2) summarised by stating that implementation of classroom evaluation requires caution but emphasised that it appears likely to benefit the greatest proportion of students, and in particular guides students’ judgement of what is important to learn, it affects students’ motivation and self-perceptions of competence; structures their approaches to and timing of personal study; consolidates learning; and affects the development of enduring learning strategies and skills. It appears to be one of the most potent forces influencing education: ‘Accordingly, it deserves very careful planning and considerable investment of time from educators’ (Crooks 1988: 467). In the light of this review, the primary quest in this enquiry is to identify strategies to use to embed AFL in my practice, to examine its impact on my students, my practice as a teacher/head of the ICT department, and finally report the findings generated through a research methodology.
  • 15. PBM4029ononaji21809224 Page 15 METHODOLOGY: RATIONALE AND LIMITATIONS Having established the research objectives, questions and noting that the research stance are naturalistic, unique, individualistic and qualitative, I decided to adopt the interpretivism theoretical perspective as my epistemological stance. This is because it blended well with the research methodologies (the case study research) I had chosen. Gary (2009) suggests that a relationship exists between the theoretical stance adopted by the researcher, the methodology and the methods used to collect the data. Yin (2003) defines the case study research method as: an empirical inquiry that investigates a contemporary phenomenon within its real-life context; when the boundaries between phenomenon and context are not clearly evident; and in which multiple sources of evidence are used (Yin, 2003:23). In addition, having considered other research methodologies such as experiment, action research, and ethnographic research methods during planning, I chose the case study approach over others because of the following reasons: Firstly, the case study research provided the qualitative platform needed to address the research topic qualitatively (looking in-depth at non-numerical data), and allowed the use multiple methods of data collection and analysis to determine the research outcomes. Burton et al. (2011:84) confirmed that case study research enables teachers to use a range of research methods to investigate a particular issue as it relates to them. Secondly, it was preferred because it blended well with the research field settings, my personal context and the circumstances of my study. Applying it, I was able to implement the research in real world context; using my place of work and the students to test the theories of assessment for learning. Yin (2009:2) confirms that general case study is preferred when ‘(1) ‘how’ or ‘why’ questions are being posed, (2) the investigator has little control over events, (3) the focus is on contemporary phenomenon within a real-life context’.
  • 16. PBM4029ononaji21809224 Page 16 Thirdly, the case study approach was favored because it supports educational researchers/ practitioner like me than other research methods I had considered. McNally et al (2003:6) cited in Burton and Bartlett (2011) argued that they use case study approach to their study of early professional learning (EPI) because it offered ‘a deep understanding of the nature of EPI which requires sustained contact with the learners and their context’. Finally, the case study approach was preferred because it is systematic, comprehensive and topic oriented, and provided me the opportunity to relate and interact mutually with research participants (students and colleagues). Walsh (2001:52) argues that its strategy involves a systematic investigation into single individual, event or situation; that is, the researcher studies a single example, or case of some phenomenon. On the contrary, Hammersley (2002) claims that each research method has particular strengths and weaknesses, and criticism arise from their weaknesses. Yin (2009:14) argues that although case study approach is ‘a distinctive form of empirical enquiry it remains one of the most challenging of all social science endeavours’. He claims that: (1) case studies lack rigor…[because they] allow equivocal evidence or biased views to influence the direction of the findings and conclusions,…(2) case studies provide little basis for scientific generalization, (3) [they] take too long resulting in massive, unreadable documents, and finally (4) they lack ‘true experiments’ [value]’(Yin,ibid:14-16). Similarly, Simons (2010:24) questioned that ‘the personal involvement and/or subjectivity of the researcher are a concern’. There was a valid reason for being aware of subjectivity or ‘self, because as Simons (2010: 82) explains further, ‘[I was] the main instrument of data collection: [I] looked at documents, observed participants and interacted with learners/colleagues in the field.’ Consequently, ‘[my] view, predilections and values influenced how [I acted]’ (Simons, 2010:26). Furthermore, critics have argued that case study ‘failed to provide clear-cut solutions, presenting instead an overly complex analysis of educational issues Nisbet (2000) cited in Burton and Bartlett (2005:25).
  • 17. PBM4029ononaji21809224 Page 17 Despite the limitations and concerns noted above by Simons, Yin and Nisbet, Simons note that the personal involvement or subjectivity of case researcher ‘are not all of them necessarily limitations…it is a question of how they are perceived and interpreted by [the researcher]’ (Simons, 2010:24). In fact, it has been argued by Wiersma and Jurs (2009), Williman and Buckler (2008) that it is impossible to separate the ‘self’ from the context of the study. In spite of these concerns, I still preferred the case study approach as it enabled me to carryout an in-depth research using multiple data collection and analysis methods on the impact of AFL on my students in natural settings which experimental research method could not do. Further, it helped me to explore a holistic investigation where each student or group responded to and express their understanding of themselves, their experiences on the impact of my practice on them. On the other hand, Burton et al. (2011:37) described Action research as; Curriculum development in the classroom that is concerned with how to improve education practice and it is practitioners themselves who carry out the research in examining and developing their teaching. I personally refused to use Action research approach for my project even though Burton et al (2011) and McNff and Whitehead (2010) sees it as a research use by practitioners (teachers) for improving practice. My main reasons are first, Action research does not entirely fit and merger into my research project (questions). Second, I was concerned that its spiral nature that promotes a rigid approach could lead the findings to uncertain outcomes. For example, addressing a particular research question could lead to the discovery of new outcomes, and the enquiry can deviate from its original action plan making the investigation complex Burton et al (2011). Third, I was even more concerned that my research questions may change as research develops making it more daunting and confusing for a novice like me to cope. Finally, I was also worried that its action-reflection cycle nature will undoubtedly, impact on the project feasibility that is to say; it may make the project longer than
  • 18. PBM4029ononaji21809224 Page 18 planned hence impacting on my accessibility to the participants, research methods, and the overall project success. PREJUDICE AND BIAS ISSUES However, being aware of the concerns raised on the weaknesses of the case study research method, to avoid prejudice and pre-formed judgment influencing my outcomes, I took the following precautions; First, I ensured that I had strategies in place to monitor myself and my activities throughout the research process. The idea was to visualize ‘how [my] personal sense of self interacted with the study, shaped the inquiry and outcomes, and [I] reflected on the dynamics this created’ (Simons, 2010: 82). A research diary was kept and reviewed regularly for the following reasons: to generate and monitor the project life cycle, to provide material for reflection, to provide data on the research process and to identify and deal with the values and subjective selves that could have influenced my interpretation. Second, I employed multiple data collection methods and analysis in order to minimise the impact of my personal weaknesses. I used in-depth interviews, participant observations and questionnaires. The interview (unstructured) approach enabled me to obtain participants (students/colleagues) ‘real’ views and beliefs during the inquiry. Through observations, I was able to carry out my research in ‘real’ life natural settings which helped in the collection of highly valid data while questionnaire made it easy to collect, and compare large amount of data quickly from participants to ensure the reliability and validity of the research outcomes (see appendix 8: questionnaires). In addition, to enhance internal reliability of data which Burton and Barlett (2011:27) described as the ‘truthfulness’, ‘correctness’ or accuracy of research data, I applied the respondent validation procedure. I checked with participants (students/colleagues) to certify that my reporting of their views is in accordance with the feedback they provided (see appendix 3: participant consent form).
  • 19. PBM4029ononaji21809224 Page 19 Furthermore, colleagues’ participants who observed my lessons provided me with reflective feedback; which provided evidence of the strengths and weakness of the strategies I had employed. For external validity, I asked colleagues and friends to determine whether my inquiry is credible and useful by comparing my findings to previous publications or researches and their own experiences. Finally, to increase the validity of the research and to address the overall concerns highlighted above, I applied the triangulation concept. According to Burton and Barlett (ibid: 28), ‘triangulation is the process carried out by researchers to increase the validity of their research and it means checking one’s findings by using several points of reference’. I therefore triangulated my research by analyzing all data generated from the various methods of data collection I had used, looking for congruency (see tables 1 and 2 in pages 28 and 34).
  • 20. PBM4029ononaji21809224 Page 20 RESEARCH METHODS AND ANALYSIS With the remit to improve practice through this enquiry, I decided to apply Thousand and Villa (2000) model of managing change. This model states that, for a change in programme to be successful and sustainable, the initiator must consider; Vision, Skills, Incentives, Resources, Action plans, Success. Using this model as a precursor, an action plan was prepared and followed to facilitate the process (see appendix 6: research action plan). Data was collected from the empirical study conducted on 50 BTEC level and A level ICT students enrolled in 2012-2013 academic year and 10 colleagues participants who volunteered to participate; data collected were analysed after the research enquiry. The purpose was to measure and compare the findings gathered from participants in order to determine the impact of practice on students and myself. Using interviews, observation and questionnaires as the primary data collection methods, I had the privilege of reflecting on my practice based on the feedback I received from students and colleagues. These enabled me ‘to gain a deeper insight into the real way of life, beliefs and activities of the group in their ‘natural settings’ Walsh (2001:67). Observing students in this context helped to provide elicit data that could not be gathered from a questionnaire or interview methods. For instance, I was able to observe, note, and interact holistically the immediate impact of my practice on the learners as a group than individualistically. Conversely, Burton et al. (2011), argue that using observation technique to collate data could be difficult and complex as ‘it is impossible to observe and note down everything that occurs in a complex situation such as a classroom of 25 children and one or more adults’ (Burton and Barlett 2011:131). Similarly, I used the questionnaires as a complementary data collection procedure; they provided a cross-check on data obtained in the interviews, observation and document analysis to enhance the validity of my account. They were used as a way to obtain anonymous and rich qualitative and
  • 21. PBM4029ononaji21809224 Page 21 quantitative data on specific aspects about the impact of formative assessment techniques from the large number of students and colleagues participating in the study. However, the questionnaire method had some limitations as data collection tool. Example, due to its design and format, (closed ended questions, or even open ended questions) participant’s feedback were generally limited. The data collected sometimes lacked deeper understanding and were difficult to establish whether or not respondents understood the questions properly or a question asked meant the same to all respondents as they did to me, especially where I was not present to explain (see appendix 8: questionnaires). Semi-structured interviews with participants were used to gather in-depth qualitative information on my practice. They were also used to encourage the active participation and learning for myself and the respondents (students and colleagues), since engaging participants helps to identify the impact of my investigation was central to the study (see research questions). Equally, it allowed me to adapt the questions to suit different situations and respondents thereby produced detailed qualitative data outcomes expressed in the respondent’s own words. Also, I was able to ‘pick up’ non-verbal clues that were indiscernible from questionnaires, for example, the annoyance or pleasures shown by a respondent over certain topics or questions. Nevertheless, using interviews as a data source presented some difficulties for me particularly when dealing with a large number of interviewees. I found it difficult to ask questions, listen to responses and taking notes at the same time; it was a complex process that requires prioritizing and multi- tasking skills. Finally, the above methods used eased the way for me to compare previous data collected or held on college records such as examination results, class progress data, lesson observation feedback and programme area self-assessment reviews (SAR) which were reviewed to justify the impact of my practice.
  • 22. PBM4029ononaji21809224 Page 22 DATA ANALYSIS AND STRATEGY Having used a qualitative research approach to conduct my study and seeing that the data set is mostly qualitative generated from multiple data collections used; questionnaires, interviews, participants’ observations and document reviews, I decided to analysis the data qualitatively. I applied Miles and Humberman’s Qualitative Data Analysis techniques (cited in Simons, 2010), a systematic approach in three steps. First, I organised the data into manageable formats, grouped them into themes (Thematic analysis) by working through the data set looking for similarities or contrasting ideas in the responses gathered from the various data collection methods used. (Taylor Powell and Renner (2003). Second, having grouped the data into themes, knowing that data gathered from observation notes, interviews and questionnaire are difficult to predict, I converted them into quantitative data in order for them to make sense during analysis and discussion (see tables 1 and 2 in pages. 23 and 35). Coolican (1994), cited in Eysenck (2004) stated that the benefits of using qualitative analysis of data are; [1] it can shed much light on the motivation and values of individuals who are actively involved in the collection and analysis of data collected using interviews, case studies, or observation. [2] Data analysis often takes place alongside data collection to allow questions to be refined and new avenues of inquiry to develop. [3] Textual data is typically explored inductively using content analysis to generate categories and explanations; software packages can help with analysis but should not be viewed as short cuts to rigorous and systematic analysis. Critically, Eysenck (2004) argues that the greatest limitation of the qualitative approach is that the findings that are reported tend to be unreliable and hard to replicate due to its subjective and impressionistic, and the ways in which the information is categorised and then interpreted often differ considerably from one investigator to another. Finally, noting the issues surrounding qualitative data analysis strategy, to reduce subjectivity and reliability, I followed Simons (2010) and Coolican (1994)
  • 23. PBM4029ononaji21809224 Page 23 advice who recommend that I use different methods of data analysis (research cycle) in order to remain transparent and increase reliability. RESEARCH ETHICS As the underlying issues for doing this investigation were to improve practice, enhance autonomy and motivation of learning in the classroom and develop more research and reflective skills, this study was therefore, conducted in my place of work where both students and colleagues were co- participants. It adhered to all ethical issues raised in BERA guidelines (2011). I was granted ethical approval to conduct this research by the Research Ethics Committee at the university at which I am working towards a Master degree in Education. Permission to undertake this enquiry was equally sought from my school authorities and participants students and colleagues. Participants were clearly and fully informed through a fair processing and voluntary notice which included seeking participants’ consent using a ‘consent form’ and a permission to execute research in the college from the college authority (see appendix 2 and 3: consent letter and form). The students and colleagues participants were fully informed that the research purpose was to improve my practice and were duly assured of their protection and rights to be treated fairly, sensitively, with dignity, and within an ethic of respect and freedom from prejudice, regardless of age, gender, sexuality, race, ethnicity, class, nationality, cultural identity, partnership status, faith, disability, political belief or any other significant difference. I ensured that the project was free from deception or subterfuge and that all sensitive information collected were not disclosed to anyone without the permission of the individual or participants. I made sure that only relevant data were collected in order not to jeopardise the welfare of the participants. In addition, I guaranteed that participants were duly informed of their right to withdraw from participating in the research at any time without any consequence. This was communicated through email and consent form. As recommended by the Data Protection Act (1998) and BERA guidelines (2011), I respected participants’ privacy and confidentiality throughout the research. I used anonymity and pseudonyms
  • 24. PBM4029ononaji21809224 Page 24 to ensure that participants were not identified. Furthermore, all documents such as; interview scripts, lesson observations, questionnaires, forms, etc. used in the project were scrutinised and the data gathering processes were conducted fairly and lawfully. Finally, all data and information collected during the research process were securely kept and destroyed at the end of the project. Introduction to findings, Analysis and Discussion The findings and analysis presented in this report were generated from a case study in a college, which investigated the impact of Assessment for Learning on students. The aims were to interrogate the impact of Assessment in practice as identified in research questions 1, 2 and 3 on page 7. Data collection employed documents analysis, questionnaires, observations and interviews with the AS level and BTEC groups of 50 students and 10 teachers in the faculty of business and ICT who kindly volunteered to participate in the project. For the purpose of data analysis, I abstracted key data gathered from the various data collection methods and organised them into themes, then displayed the data into figures and tables before engaging with data verification. The findings were evaluated in light of the research questions followed by a critical discussion backed up with evidence in relation to the literature review presented earlier and my experiences. The discoveries on the AFL theories were conclusively positive on learners, practitioners and pedagogy. Research question 1; findings and discussion Table 1 shows the analysis of the outcomes reported on the impact of embedding AFL in classroom practice during the case study. It discusses research question 1. Results Impact Very strong impact Strong impact Little impact Not impact Improving student learning and skills. 45 (90%) 5 (10%) 0% 0% Improving student confidence and motivation 28 (56%) 15 (30%) 7(14%) 0% Improving student concentration and understanding 30 (60%) 15 (30%) 5 (10%) 0% Improving student awareness 25 (50%) 15 (30%) 10(20%) 0% Improving student attainment 30 (60%) 20 (40%) 0% 0% Improving the quality of student work 40 (80%) 10 (20%) 0% 0%
  • 25. PBM4029ononaji21809224 Page 25 The wide range of AFL strategies adopted during the case study from 2011 to 2013 focused on the following methods; 1. Engaging student collaboration through small groups, pairs and trios, to plan, discuss, draft and redraft written work. 2. Using of peer and self-assessment in the classroom. 3. The use and the development of higher order questioning skills and 4. Giving feedback The main outcome of this study was its impact in classroom practices. It was evident from the results of the case study that engaging learners through group work increased the active participation of students in the classroom. This involved ‘students to take ownership of their learning rather than being passive recipients of the ‘delivery’ of the curriculum’ Kirton et al. (2007). The analysis of the outcomes in table 1 shows that the large majority (90%) of the findings indicated that routinely use AFL in practice is a very effective method of improving students learning and skills. A colleague, who observed my AS ICT students engaged in designing an imaginative website in groups of four, reported that the students claimed that working in a group is much fun than working on their own because it enabled them to share ideas, skills and support each other. In addition, during one of my observations, I observed that sharing the lesson objectives with learners and using collaborative learning in the classroom, enhanced students understanding, helped them monitored their progress and increased their confidence as we moved from one activity to the next. My experience is similar to that of colleague participants and students identified in (Table1); where a large majority of student participants reported that working in a group gives them the confidence and motivation to engage in classroom discussions. Furthermore, another colleague observer reported that using group work in a classroom helps to increase the involvement of the whole class and increased more equitable participation of students in the classroom. Admittedly, these conclusions connect with the research reported by Ofsted
  • 26. PBM4029ononaji21809224 Page 26 (2008:24); they stated that ‘in order to raise standards, teachers should engage students in small groups and in whole class dialogue’. Similarly, Black and Wiliam (2009:7) reporting on the importance of student involvement stated that ‘since the responsibility for learning rests with both the teacher and the learner, it is incumbent on each to do all they can to mitigate the impact of any failure of the other’. Alexander (2004) argues that increasing the amount of ‘talk’ in the classrooms is crucial to improve students’ thinking and learning through what he called dialogic teaching. Kirtton et al (2007:617) claimed that such teaching is collective, reciprocal, supportive, cumulative and purposeful and bears resemblance to the practises described above. For these reasons, sharing lesson objectives with students and engaging them collaboratively are effective ways to improve teaching and learning and other skills in the classroom. Also, the statistic in table 1 shows that frequently use of AFL in practice helps to improve student motivation. The analysis indicated that the 56% of the participants acknowledged the impact of self/peer-assessment methods used in the classroom. I personally noticed that students were highly motivated when they were involved in evaluating each other’s work or theirs; they saw value in the activity, engaged cooperatively as well as reflective. The impacts of self/peer-assessment in my lessons were evident in the improved learning and motivation commented in line managers’ observation feedback (see appendix 7 line manager’s lesson observation feedback). Whilst the group work took place, using self/peer assessment, Austin monitored and worked with each group, monitoring and prompting, and challenging or clarifying the key issues as students prepared their feedback (line manager) And students’ testimonies held that peer/self-assessment methods helps them understand how to take ownership of their learning, construe marking criteria (marking scheme), identify their own mistake and reflect deeply on their own learning. A student commented that: I thought marking is only meant for teachers. I like it when we mark each other’s work because it helps me to find out my mistakes quickly and also makes me feel responsible (student).
  • 27. PBM4029ononaji21809224 Page 27 I also learnt that the main difficulty with self-assessment is not that students are dishonest or unreliable in assessing their own work but rather that they need clear guidelines or examples of what constitutes ‘good work’ (Hallam, 2001). This finding again revealed that effective learning can occur when students serve as learning resources for either themselves or one another. This is in line with the views of Ofsted (2008), Black and Wiliam (1998), Cooks (2001), Hodgson and Pyle (2010). Black and Wiliam believed that two elements are critical for peer assessment to be valuable. First, students must work as a group or team. Second, each student must be held accountable in some ways. Equally, Harrson and Harlen (2006:30) reported a number of advantages following a research about primary teachers who had implemented self-assessment by children. They identified that self-assessment is an essential component of AFL because it can help children direct their learning activities towards their learning goals. Harrison and Harlen further explained that peer-assessment builds on the ALF notion of learning as a co-constructivist activity whereby learning occurs as a result of social interaction. In this way, AFL contributes to learning that is in accord with current research into effective learning (e.g. Watkins et al., 2001; Wells, 2008). Thirdly, in table1, 60% of the participants reported that regularly use of AFL in practice help to increase students’ concentration in the classroom. For instance, since I started using questioning regularly in my lessons, I discovered that questioning helps both teachers and learners to monitor learning and increase the learner and the teacher motivation in the classroom. A student said: I like it when my teachers question me in the class because it allows me to demonstrate my level of understanding. In addition, colleagues interviewed acknowledge that they use questioning for different reasons: to improve motivation in the classroom; to improve students’ behaviour and concentration and to enhance a learning climate. He said:
  • 28. PBM4029ononaji21809224 Page 28 I also use questioning in the classroom as a tool to control students’ behaviours and to reposition the entire class particularly when the learners are being disruptive (colleague participant) Another colleague reported that since he increased his questioning time in his lessons, he has noticed improvements in the entire class outcomes particular with the lower attaining and disruptive students when targeted. The findings agree with the work of Black and Wiliam (1999b:143) who stated that: Opportunity for students to express their understandings should be designed into any piece of teaching, for this will initiate the interaction through which formative assessment aids learning. Equally, Black and Harrison (2004) explain that through questioning, teachers are able to collect evidence about pupils’ understanding with the aim of finding out what they do know and what they partly know. Questioning therefore provides a starting point for the teaching allowing for the pupils’ knowledge and understanding to move on, and developing their thinking skills. Finally, these findings indicate principally (60%) that regular use of AFL improves learners’ attainment in the classroom. I learnt during the enquiry that the use of constructive feedback as an aspect of formative assessment helped students to be aware of their progresses, learning needs, the different standards of work and what they needed to do to enhance progress in their learning. Feedback helped improve the quality of students work and raised the students’ achievements. This impact was evident in the 2012-13 BTEC Level 3 and AS level ICT results where the two groups that participated in the case study increased their achievements by 20% and 26% respectively (see appendix 5: 2013-14 academic year results). Cognisant that feedback must be specific and constructive for it to have positive impact on practice, one student wrote: I like it when my teacher gives me feedback on what I have done well and where I need to improve. I don’t like negative feedback every time. Again, this finding relates to the works of Black and Wiliam (1998), who pointed out the importance of feedback when they specified, ‘we know of no other way of raising standards for which such a prima
  • 29. PBM4029ononaji21809224 Page 29 facie case can be made’. Hattie (1999), summarised his wide-ranging review of research on ‘’what works’’ in education with the statement that, ‘’the most powerful single moderator that enhances achievement is feedback’’. While, Crooks (2001:3) warned that: Feedback should be specific and related to need. Simple knowledge of results should be provided consistently (directly or implicitly), with more detailed feedback only where necessary to help the student work through misconceptions or other weakness in performance. Praise should be used sparingly and where used should be task-specific, whereas criticism (other than simply identifying deficiencies) is usually counterproductive. Similarly, Butler (1996) suggested that appropriate descriptive feedback could improve performance by 33% while grading method can have a negative effect on learner performance. Likewise, Black et al, (2002) wrote that the use of feedback as an AFL tool helps both the learner and the student to make progress: An assessment activity can help learning if it provides information to be used as feedback by teachers and their pupils in assessing themselves and each other, to modify the teaching and learning activities in which they are engaged. Such assessment becomes formative assessment when the evidence is actually used to adapt the teaching to meet learning needs.
  • 30. PBM4029ononaji21809224 Page 30 Table 2 presents finding and discusses research question 2 in relation to the improvements needed and recommendations. To draw a logical conclusion and to ensure that the recommendations for the areas of improvement in practice are free from subjectivity, I sought the opinions of the 10 colleagues who participated in the research project. The table above reveals the analysis of data gathered from the colleagues during the study on the constraints limitations and tensions noted while implementing and sustaining assessment policies in practice. Formative assessment is time consuming Notwithstanding, the main outcome of this research is that embedding formative assessment in practice increases learners’ outcomes conversely, colleague participants identified ‘time’ as a hindrance to implementation of AFL in practice. This is evident in the table 2 above where 80% of colleague participants reported their experiences on lack of time as a burden to application. Even more, they pointed out that involving learners in practice by ways of self and peer assessment in a more discursive and interactive lessons, and improved questioning led to slowed down in the pace of curriculum delivery as a consequence, that the curriculum may not be covered within the specified deadlines. A colleague participant reported that: This new approach has lots of benefits to students and teachers, but the problem is that it is time consuming. It is difficult, or impossible to embed formative assessment every time in your lessons. (Colleague) Areas of Improvement Yes % Maybe % Not sure % No % Time consuming 8 (80%) 2 (20%) 0 (0%) 0 (0%) A change in educational philosophy is needed 7 (70%) 2 (20%) 1(10%) 0 (0%) assessment policy 0 (0%) Provide support/CPD for staff. 10 (100%) 0 (0%) 0 (0%) 0 (0%)
  • 31. PBM4029ononaji21809224 Page 31 Change in educational philosophy Also, a large majority of the teachers (70%) involved in the survey discovered that adopting and embedding formative assessment in practice requires changes in teachers’ thinking and actions; indeed, their whole educational philosophy for it to have positive impact on practice. For example, some staff found the implementation of the AFL strategies fundamental to their pedagogical approach and were comfortable with the ‘democratic classroom practices and the increased student-centred focus. However, the statistics in table 2 likewise shows 20% of colleagues’ participants regarded AFL as a risky, stressful, uncomfortable and an unnecessary procedure based on the fact that learners were taking more control of their practice (Kirton et al. 2007:625). Equally, Black et al. (2003:83) in a critical discussion on the implementation of ALF theory in the classroom reported that having ‘to let go’ by teachers and let students take some responsibility for the lessons is a limitation to formative assessment that practitioners need to overcome through professional development. The need for support and sustainability Once more, all teachers involved in the research noted that constant self-awareness through CPD and trainings are good ways to overcome the limitations facing AFL in practice. The conclusions suggested that meetings, networks and opportunities for sharing good practice of assessment materials are important elements in supporting teachers in changing and sustaining their practice. Speaking about this, Black (2007) argued that the biggest problem associated with formative assessment is that the practical implementation seems to be based on a limited understanding and a superficial adoption of the strategy by teachers and policy makers. In the light of this development, many researchers including Black and Wiliam (1998a) made numerous recommendations on how to close these gaps. Black et al. (2002, 2003) recommended that teachers should adapt the following strategies to improve formative assessment in the classrooms;
  • 32. PBM4029ononaji21809224 Page 32 1) Teachers should improve their questioning techniques by using a variety of questioning styles to encourage classroom interactions; allow more time for students to respond to questions (wait time) and to focus more on the discussion of wrong answers through effective use of oral and written feedback rather than, or in addition to, grades or marks. 2) Teachers should adapt the use of self and peer assessment by pupils, for example, the use of ‘traffic lighting’ in which pupils use the icons of green, amber or red to assess whether their understanding of the subject matter is good, partial or poor. 3) Teachers should share the criteria for assessment with pupils as an important way of initiating learning and make effective use of class collaboration through small groups, pairs and trios, to plan, discuss, draft and redraft written work, including summative tests drawing up and using mark schemes. Consequently, embedding Formative Assessment in practice can enhance students’ performance, aid them to participate actively, reinforce their grasp of course material, and participate in their own self- assessment. Besides, the results can help teachers immediately redirect the learning experience to address learners’ difficulties. Teachers must listen to students, ask them appropriate questions, and give them the opportunity to show what they know in a variety of ways because research has proven these strategies as effective methods to increase learning in practice.
  • 33. PBM4029ononaji21809224 Page 33 CONCLUSION In conclusion, through this research I have discovered that Assessment for Learning is a powerful teaching and learning procedure used to raise learners’ achievements, and a monitoring tool for teachers to judge the impact of their practise on their learners. This section discusses research question no: 3; the impact on practice: Impact on students’ and the ICT department A review of the departmental achievement records from 2012 to 2014 indicated that learners’ grades and the departmental outcomes increased tremendously. There is a perception among staff that the regularly use of ALF firmly in practice contributed to the significant improvements in the results. In 2013 and 2014 academic year, all results in the department were recorded ‘Good’ with significant number of students achieving above their TMGs. For instance, 88% of L3ICT 90 credit, 56% of ASICT and 23% of L3EXDIP of students in these cohorts achieved above their TMGs in in the academic year. Also, retention and success rates improved rapidly in the department from 2012 to 2014 when compared to previous outcomes. We (research participants) believed that the involvement of the department in this projected attributed to the results improvements. Impact on teaching and tutoring It was believed that the regular use of AFL helped enhance teaching and learning and improved independent learning in the department; thus improved learners’ confidence, engagement and outcomes. For example, both colleagues and students participants thought that the increased use of questioning and feedback by teachers in classroom occasioned to improved learners’ literacy and high grades in some programme areas (see appendix 5: results analysis). Equally, college senior leadership team acknowledged the improvements in teaching and learning in their 2014 Quality Improvement Review feedback conducted in the department. They held that: Students found lessons challenging, interesting and engaging with meaningful discussions, good teaching. They repeat information; help students to process and understand content (ICT SMT quality review feedback Oct 2014).
  • 34. PBM4029ononaji21809224 Page 34 Similarly, embedding AFL theories in practice enabled internal and external progressions in the department; for example, from 2012-14, more than 95% of students in a cohort progressed to the higher level of their programmes indeed; learners were achieving better grades and maintained confidence in themselves due to the skills they developed in the class through differentiated assessment techniques routinely used by their teachers. Finally, the outcomes of this project resulted to an overhaul of teaching and learning in the department. The department/college now has a teaching and learning community that meets once a month to strategize on ways of improving teaching and learning across the college. Impact on personal and professional development For myself as a practitioner, the impacts are as follows; The experiences gained have added more values to my professional development. First, It has enabled me develop a deeper understanding of the role of AFL in teaching and learning; thereby making me a better reflective practitioner. I am now more confident in identifying my personal weaknesses as well as managing situations arising from it. That is to say, my classroom management; tracking/monitoring and intervention skills have improved as a result of embedding assessment policies routinely in my practice (see appendix 4: track and monitoring sheet). Second, I have validated my pedagogical practice with current educational theories as a result, enhanced my practice and learners outcomes. Third, through this project, I have now seen the needs to recognise teaching and classroom activities as ‘artistry’ that requires careful planning and implementation. As a consequence, I have begun to reflect much deeper on my role and practices in the classroom; that is to explore more and better ways to build learning partnership with learners using formative assessment strategies as opposed to solely relying on myself as the only leader in the classroom. I believe working this way with students will not only help students develop mutual trust, gain greater confidence to engage more in lessons, but also it would re-enforce my role as an enabler facilitating the learning process for the benefits of all students.
  • 35. PBM4029ononaji21809224 Page 35 Finally, I have noticed the importance and the benefits of using formative assessment in my everyday practice. I have now understood in detail that formative assessment is ‘a range of formal and informal assessment procedures undertaken by teachers in the classroom as an integral part of the normal teaching and learning process in order to modify and enhance learning and understanding’ (Ministry of Education, New Zealand, 1994). LIMITATION AND CHALLENGES Completing this project was not without personal challenges. The first was the difficulty of accommodating my family life and teaching work with my studies. Overcrowded timetable made it difficult for me to find time to engage fully in my studies. Further, it was initially hard and time consuming to gathering and analysing data from participants due to the complex nature of the school settings; for example, arranging interviews with students were not easy because of their structured school timetables and the fact that students understandably could not agree to stay behind after lessons. Also, I had limitations in planning and documenting the research activities as I found it particularly demanding to generate and coordinate evidence in support of the project due to personal circumstances and workload in my place of work; for instance, it was challenging to organise interviews for the 50 students that participated, observe many lessons and to compose understandable questionnaires that were used to gather learners’ experiences without difficulties. However, with the support of the participants, colleagues and family I overcome the challenges by managing my time wisely and prioritised my daily activities to free the weekends and evenings for the purpose of the project. In conclusion, I have enjoyed the experiences gained in this research and development project. My participation has provided me with models of best assessment practices and techniques. The learners and I have now developed more self-confidence in managing everyday classroom challenges and complexities unlike before.
  • 36. PBM4029ononaji21809224 Page 36 FURTHER RESEARCH If I have the opportunity to participate in a future research work, I would like to investigate whether gender and ethnicity have impact on learners’ during classroom assessment. My possible research question would be: Is there evidence that gender and ethnicity have impact on learners during Classroom Assessment?
  • 37. PBM4029ononaji21809224 Page 37 Inspection report: 24–25 April 2012 5 of 12 Inspection grades: 1 is outstanding, 2 is good, 3 is satisfactory, and 4 is inadequate Please turn to the glossary for a description of the grades and inspection terms students’ spiritual, moral, social and cultural development are very good, and this work does much to foster the highly inclusive ethos of the centre. What does the school need to do to improve further? Continue to raise achievement by: increasing the average grade and the proportion of higher grades gained at A level and in BTEC Level 3 qualifications, so that all aspects of attainment are firmly in line with national averages ensuring that greater proportions of students complete the courses that they start, consistently across all subjects and curriculum areas. Improve the effectiveness of teaching by: ensuring that students are much better matched to and placed on programmes which are appropriate to their previous learning, so that teaching can focus more precisely both on their needs and the requirements of the course ensuring that teaching has a strong focus on developing deep subject specific skills and knowledge, as well as providing students with strategies to pass examinations and complete coursework tasks. Improve students’ punctuality to lessons, so that maximum use is made of teaching and learning time. Commentary This document is an extract from the 2012 Ofsted inspectors report. The report highlights what the school need to do to improve further. This inspector’s recommendation was my main reason for undertaken the MA programmes; which lead to my research on the ‘impact of assessment for learning in the classroom when embedded firmly in practice’. APPENDIX 1
  • 38. PBM4029ononaji21809224 Page 38 This is evidence of consent letter from the college authority to use the college/students for the case study. APPENDIX 2
  • 39. PBM4029ononaji21809224 Page 39 RESEARCH ETHICS: PARTICITANTS CONSENT FORM Full title of Project: The Impact of Educational Practice (PBM4029) Please Initial Box 1. I confirm that I have read and understand the information sheet for the above study and have had the opportunity to ask questions. 2. I understand that my participation is voluntary and that I am free to withdraw at any time, without giving reason. 3. I agree to take part in the above study. Note for researchers: Include the following statements if appropriate, or delete from your consent form: 4. I agree to the interview / focus group / consultation being audio recorded 5. I agree to the interview / focus group / consultation being video recorded 6. I agree to the use of anonymised quotes in publications Name of Participant Date Signature Name of Researcher Date Signature APPENDIX 3
  • 40. PBM4029ononaji21809224 Page 40 Sept to Dec 2011 Summative Test Record Teacher: Austin Ononaji AS ICT Level 2011-2013 (GroupD) Grade Modelling/Progress Data 2012 units Exam Results TMG Student TEST 1 (Sept 11) TEST 2 (Nov 2011) TEST 3 (Dec 2011) TEST4 (Jan 2012) TEST 4 (FEB/M ARC H 2012) W A G CH G RAD E INFO1 Jan 2012 Exam INFO2 Jan 2012 Exam Results TOTALAS GRADE INFO3 INFO4 TOTAL A2 GRADE D Abdullahi, Mohamed C C C D/C C/B C 0 U E Ahmed, Yusuf C C B B C B B/A E 0 U Bello, Ahmed C A 0 U U Brown, Pierre A A A A A A A B 0 U Douglas-Williams, Aaron A A A C B B A C 0 U E Khan, Zainab C U(B) D D D D/C C/B D 0 U D Ndulor, Promise A E(C) A B C B A/B D 0 U D Onomousiuko, Ejowhokoghene B U(D) D C D D/C C/B E 0 U D Osifeso, Oyinlade-Samson B B B B C B B/A D 0 U C Piotrowicz, Przemyslaw B C C N/A B C/B B/A D 0 U U Rhea Petters C U U D N/A U/E E/D B WAG Total No of students: 8 CG Male: 9 Female: 2 Predicted class achievement 100% 160-200 A 320-400 A 140-159 B 280-319 B 120-139 C 240-279 C 100-119 D 200-239 D 80-99 E 160-199 E 0-79 U 0-159 U AS Grades A2 Grades Rhea has attendance problem that is preventing her from achieving to her potential. Ejo is a very quiet/shy person in class and does not like to contribute to class discussions unless targeted. Mohamed sometimes find it difficult to assimilate (concentrate) and talks out of context in some occasion. Yusuf, Pierre, Aaron, Samson, Promise, Zainab, Przemyslaw need to be constantly challenged to get the best out of them. Working at Ggrade Challenging Grade Comment This is a monitoring sheet used to track and monitor learners against the TMGs. It helped me to understand, support, predict grades and provide feedback to my students. APPENDIX 4
  • 41. PBM4029ononaji21809224 Page 41 ACHIEVEMENT REVIEW 2013 1. COURSE ACHIEVEMENT DATA (High pass rate = A*-B (AS/A2); MMD - D*D*D* (BED3); MM - D*D* (BD3); M -D* (BSD3/BC3); D -D* (BD2 ) Course & Level Completion year: 2011 2012 2013 BM Target 2013 Target 2014 Comment and analysis (achievement and retention against benchmark, significance of value added, attendance, high pass rate, reasons for students leaving early, etc) AS LEVEL No. of starts 26 24 13 16 AS ICT shows a 26% increase in A-E and 47.8% A-B grades when compared to 2012 results. Possible reasons: Good use of ALF in lessons. The Value added is 0.9%; (0.7% better than 2012 cohort). % retention 77% 79% 89% 91% 91% % pass rate 100% 74% 100% 100% 80% % high pass rate 20% 11% 53.8% 18.8% 35%% % success rate 77% 59%. 89% 79.3 79.3% value added +0.7 +.03 +0.4 NA % attendance 89.1 90.9% 90% A2 ICT No. of starts 8 16 7 6 In A2, the overall grade is 85% in A-E grades and 71% in A-C. 38% increase in A-C grades when compared to 2012 results. Overall results was not as anticipated as no student achieved A*-B grades. Possible reasons: (15%) (1) Student failed because a review of scripts showed that they had difficulties in answering most questions in the exams. That student did not use a scriber as recommended. A2 ICT has exceeded all/most achievements benchmarks and targets in 2013. Value added is +0.3; (0.4%) better than 2012 record. % retention 100% 94% 100% 96.1% 95% % pass rate 100% 100% 85% 96.6% 91% % high pass rate 25% 7% 0 19.8% 10% % success rate 100% 94% 85% 83% 87% value added +0.3 -0.1 +0.3 % attendance 87.7% 82.9% 90% LEVEL 2 DIPLOMA ICT No. of starts 23 22 16 16 L2 recorded 100% pass rate with 60% APPENDIX 5 This document shows a breakdown of the departmental results for the last three years. The date shows that there is significant improvement in 2013 resulting from my improved practice (AFL)
  • 42. PBM4029ononaji21809224 Page 42 % retention 87% 64% 94% 89% 89% high grades. 31% high grades better than 2012.  4% above benchmark.  5% above retention.  Retention 94%, 30% better than 2012 and 5% above benchmark.  1 student withdrawn for not attending by senior tutor having followed all college procedures.  L2 Dip ICT achievement exceeded all benchmarks in 2013. % pass rate 100% 93% (100%) 100% 96% 96% % high pass rate 50% 29% 60% NA 30% % success rate 87% 59% 94% 86% 86% value added +13 +6 NA NA % attendance 80% 86.7% Course & Level Completion year: 2011 2012 2013 BM Target 2013 Target 2014 Comments BTEC LEVEL 1 ICT No. of starts 10 12 100% pass rate, 8% better than 2012. 100% retention. % retention 80% 108% 100% % pass rate 100% 92% 100% % high pass rate N/A N/A NA % success rate 80% 100% value added N/A N/A % attendance 93.7 82.6% BTEC NATIONAL DIPLOMA ICT YR2 No. of starts 16 16 21 19 No non achiever. 100% pass rate with 58% high pass rate (D*D*-MM) 4% above benchmark Retention: 93.3%, 20% better than 2012 and 20.3% above benchmark. 2 students left for an apprenticeship in March 2013. L3 Dip ICT achievement exceeded all benchmarks in 2013. % retention 88% 75% 93.3% 73% 75% % pass rate 100% 92% (100%) 100% 96% 92% % high pass rate 83% 58% NA 55% % success rate 88% 75% 90.5% 69% 80% value added -01 - % attendance 82.6 71.9% 90% BTEC NATIONAL EXT DIPLOMA YR2 No. of starts 26 15 11 10 No non achiever. 100% pass rate with 82% high pass rate (D*D*D*-MMM) same as 2012. % retention 50% 73% 91 68 76% % pass rate 100% 100% 100% 65% 90% % high pass rate 62% 82% 82% 55%
  • 43. PBM4029ononaji21809224 Page 43 % success rate 50% 73% 71% 44% 75% 35% above benchmark. Retention: 91%, 18% better than 2012 and 23% above benchmark. L3 EXT Dip ICT achievement exceeded all benchmarks in 2013. value added +0.5 +0.3 % attendance 90% 80% BTEC L 3 90 DIPLOMA YR1 No. of starts 19 No non achiever. 100% pass rate with 28% high pass rate (D*D*-MM) Retention: 73.7 due to a difficult cohort with new programme piloted. (see note on progression) % retention 73* % pass rate 100% % high pass rate 28.5% % success rate 73.7% value added % attendance
  • 44. PBM4029ononaji21809224 Page 44 1. Research and Development Plan Action to be taken By whom By when Performance indicator Reference To review 2009/12 academic year results in all programmes and set targets to improve underperformed area/s. Me Sept 2011 Result analysis and review of 2010/11 SEF and development plan. Faculty Development plan To implement teaching and learning strategy that will be use to monitor and support students. (Assessment for learning) Same On going Team meeting and CPD and sharing good practise. Lesson plan Lesson observation report SOW Faculty develop plan To Set up a homework and grade tracking system for old and new students based on TMG or exam results. Continuous assessment/Summative test Same Sept 2011/On going Students grades Teachers feedback Same To update SOW, set up revision and mock exam timetables for AS/A2 by focusing on developing examination writing techniques in preparation for Jan/May/June 2012 exams. (Assessment for learning) Same Sept 2012/ On going Set deadlines and monitor progress and compare grades with TMGs. Same To ensure that there is effective monitoring and support for all learners in underperforming areas. Same On going Provide feedback through ‘Personalised learning (ILP) Same Review process and evaluate outcome of embedding assessment for learning firmly in my planning and practice. Me / mentor April 2012 Lesson observation report Students grades Students’ feedback etc. Same Sharing good practice on assessment for learning with staff through internal CPD Me Sept 2012 Same Commentary An action plan used to implement my reflective practice programme. This action plan enabled me to conduct logical investigation and to collect factual evidence and data analysis throughout my research programme. The table shows the stages of activities used to collect facts and figures from college and learners before and after my investigation. APPENDIX 6
  • 45. PBM4029ononaji21809224 Page 45 CLASSROOM OBSERVATION FEEDBACK FORM Observed: Austin Ononaji Observer: xxxxxxxx Date: 27 th May 2013 Course Title: AS ICT Session length: 2 - 3pm Observation time: 1 hour Students on register: 10 Students in class: 10 Students arriving late: 0 Focus of observation Line management (as acting HoF) Brief overview of lesson Design of a solution recap, formal exposition, group work & mini written assessment Evaluative comments on key issues for feedback – what went well and areas for improvement. Please include comments on: student engagement, progression of learning, behaviour, differentiation (personalisation, inclusion & challenge) The lesson started promptly with Austin asking directed questions to students as a recap on previous learning. This prompted a good response from all but one student - Austin received the answers with comments such as ‘fantastic’ or ‘I will accept that’. It was a good idea to show the key words used in the topic and to reinforce their meanings/definitions by again challenging students to explain them; after a brief power point presentation, a group work exercise followed in which students worked in pairs or threes. It was clear that some thought had gone into the pairings and membership of the groups ensuring that the most able shared their knowledge and confidence with weaker students. The work was timed and the results were to be recorded on a sheet and fed back to all students. Each group had different but complementary task. The students worked well together, discussing their ideas, listening to each other with one member recording the key items. In some groups one student rather dominated proceedings and took over the scribe/design from another student; in another pair one (weaker) student recorded the feedback whilst the more able student ‘dictated’ most of the content. Whilst the group work took place, using self/peer assessment, Austin monitored and worked with each group, monitoring and prompting, and challenging or clarifying the key issues as students prepared their feedback. It might have been better to ensure that members of each group had clear notes and that there was a method of attributing individual contributions which would have helped to monitor progress more effectively. Students listened attentively to each group’s feedback and Austin prompted and also challenged the content/feedback on the designs and plans produced. Most students contributed to the feedback although the effectiveness of the feedback would have been improved by using larger flipchart sized paper. Students were rather passive though attentive during the last part of the lesson on design feedback. Non-directed questioning during this part of the lesson was dominated by one particular student. Students used textbooks or the Topic B handout to complete a table on tasks & software and types; the ensuing discussion and line of questioning (directed and non-directed) help to consolidate progress and learning. Austin finished the session by asking if any student had questions - by this time some students had packed up ready to leave the lesson. An open discussion followed which involved only a few of the most able students. Prompts Teaching  Preparation  Variety  Structure  Work or real world applications  Skills development  Lesson start  Lesson ending  Objectives  Differentiation  AfL  Questioning technique  Task management  Behaviour management  Clear  Challenging  Expertise  Enthusiasm  Inspiring  Use of learning resources  ICT  Seating Learning environment  Equality/diversity Learning  Good working reins  Behaviour  Group work  Good use of time  Independent learning  Enjoyment  Motivation  Individual challenge  Individual needs  Variety of task & assessment  Students aware how to improve  All contribute  ICT  Attendance  Punctuality Attainment  Progress  Standards & level APPENDIX 7 Lesson observation report showing improvement in practice
  • 46. PBM4029ononaji21809224 Page 46 Skills (specific and general Please comment on and grade the following areas of the lesson from 1 (outstanding) to 4 (inadequate) Aspect of lesson Grade Comment (reason for grade) Student learning and progress 1 Students made good progress in learning about design solutions; clean monitoring through directed Q/A, group work, written work and response. Demonstration of teacher subject knowledge 1 Confident, up to date & relevant. Effective use of time & structure of activities 2 A lesson of good pace, well structured with a range of tasks and activities in which all students were engaged. Effective use of assessment to inform teaching & learning *Validation & Verifications* 1 There were a number of opportunities used to assess learning including directed and non-directed Q/A, group work, written work & student contributions. Effective use of questioning and/or tasks/activities to challenge & assess 1 Directed questioning was used affectively, non-directed was at times dominated by a few very capable individuals - XXXX the potential contributions of others. Students understand how to improve their work 2 Assessment of student contributions and feedback given during the lesson was mostly effective although feedback to individuals needed to be clearer at times Effective use of resources (including technology where appropriate) 1 Good use of IWB as a teaching resource. Give details of any strategies used to support language development & literacy skills, maths skills and ICT skills None observed Is there evidence of use of the error code for assessed work (if in doubt, ask to see examples of work or ask students) Yes No X Lesson Grade: 1  2 3 4 IQRs only
  • 47. PBM4029ononaji21809224 Page 47 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 2 3 4 1 1 1 1 1 1 1 1 1 1 2 1 1 1 1 1 1 1 1 1 3 1 1 1 1 1 1 1 1 1 4 1 1 1 1 1 1 1 1 1 5 1 1 1 1 1 1 1 1 1 6 1 1 1 1 1 1 1 1 1 7 1 1 1 1 1 1 1 1 1 8 1 1 1 1 1 1 1 1 1 9 1 1 1 1 1 1 1 1 1 10 1 1 1 11 1 12 1 13 1 14 1 Total 9 0 0 0 5 5 0 0 9 0 0 0 8 2 0 0 5 4 0 0 5 4 0 0 1 7 1 0 8 5 0 0 4 5 0 0 Percentage 64% 0% 0% 0% 36% 36% 0% 0% 64% 0% 0% 0% 57% 14% 0% 0% 36% 29% 0% 0% 36% 29% 0% 0% 7% 50% 7% 0% 57% 36% 0% 0% 29% 36% 0% 0% No of Questionnaire completed Q3a. How much do you think this will help you to learn Q4 Do you do you think your teacher explain how what you are learning will help you do well in the subject Q4a. How much do you think this will helps you learn Q5. How often does your teacher check your learning before the end of the lesson Q6. Do you think is good or bad for the teacher to share the learning objectives with you at the start of the lesson AS ICT Student Assessment for LearningFeedback 2012/13 Learning Objectives Q1. How often does your teacher clearly explain what you are trying to learn Q1a. How much do you think this helps you to learn Q2. How often does your test your knowledge during lesson to see if you are understanding Q3. How often does your teacher explain or show you what need to be done to achieve the learning objectives APPENDIX 8 Student questionnaire and feedback summary
  • 48. PBM4029ononaji21809224 Page 48 Bibliography Black, P., Harrison, C., Lee, C., Marshall, B. and William, D. (2006) Assessment for Learning, Putting it into practice. London. Open University Press. Black, P., Harrison, C., Lee, C., Marshall, B. & Wiliam, D. (2002) Working inside the black box: assessment for learning in the classroom (London King’s College). Black, P. and Harrison, C. (2004) Science Inside the Black Box. London: nferNelson. Bolton, G. (2010) Reflective Practice: Writing and Professional Development. London: SAGE. Bransford, J., Brown, A.L., Cocking, R.R., Donovan, M.S. & Pellegrino, J.W. (eds). (2000). How People Learn, Brain, Mind, Experience, and School. Expanded Edition, National Research Council, National Academy Press, Washington. Burton, D. M. and Bartlett, S. (2011) Practitioner Research for Teachers. London: Paul Chapman. Butler, (1988), Enhancing and understanding intrinsic motivation; the effects of task- involving and ego-involving evaluation on interest and performance’, British journal of Education Psychology, 58, 1-14. Clark, S. (2005) Formative Assessment in the Secondary Classroom. London: Hodder Murray. Coolican, H. (2011) Introduction to Research Methods in Psychology. UK: Hodder Education. Dobbert, M.L. (1982), Ethnographic research: Theory and Application for Modern Schools and Societies, New York, NY; Praeger Publishers. Eysenck, M.W. (2004), Psychology: An International Perspective, Research Methods: Data Analysis. Psychology press limited. Fetterman, D.M. (1989), Ethnography Step by Step Newberry Park, CA; Sage Publications. Forde, C., McMahon, M., McPhee, A.D. and Patrick, F. (2006) Professional Development, Reflection and Enquiry. London: Paul Chapman. Ghaye, K. and Ghaye, T. (2004) Teaching and Learning Through Critical Reflective Practice. London: David Fulton Publishers Ltd. Hammersley (2002) Educational Research, Policymaking and Practice. London: Paul Chapman Publishing.
  • 49. PBM4029ononaji21809224 Page 49 Kyriacou, C. (2009) Effective Teaching In Schools: Theory And Practice. Cheltenham: Nelson Thornes. Kyriacou, C. (2007) Essential Teaching Skills: Nelson Thornes. McNiff, J & Whitehead, J (2010) You and Your Action Research Project 3rd Ed. Routledge: London. Moon, J. (2006) Learning Journals: A Handbook for Reflective Practice and Professional Development. Oxon: Routledge Falmer. Moon, J. (2008) Critical Thinking: An Exploration of Theory and Practice. London: Routledge. Pollard, A. (2008) Reflective Teaching: Evidence-Informed Professional Practice. London: Continuum International Publishing Group Ltd. Paulson L. and Wallace M. (2004) Learning to Read Critically in Teaching and Learning. London: SAGE. Powell, R. (2010), Outstanding Teaching, Learning and Assessment ‘The Handbook’. London Robert Powell Publication LTD. Shon, D.A. (1987) The Reflective Practitioner: How Professionals Think in Action, San Francisco, Josey Bass. Silvan, R.E., Huurley, E.A, & Chamberlain, A.M (2005) Cooperative learning and achievement. In W.M. Reynolds & G.J. Miller (Eds.), Handbook of psychology: Educational psychology (Vol. 7, pp.177-198). Hoboken, NJ: Wiley. Simons, H. (2010). Case study Research in Practice. London Sage Taylor Powell, E. and Renner, M. (2003) Analysing Qualitative Data. Maddison, WI: University of Wisconsin. Available at: http://learningstore.uwex.edu/assets/pdfs/g3658-12.pdf. Walsh, M. (2001) Research Made Real (A guide for students). Nelson Thornes. Watford, G. (2001) Doing Educational Qualitative Research: A personal guide to the research process. Published by Anthony Rowe LTD Chippenham, Wilshire, GB. Watkins, C., Carnell, E., Lodge C., Wagner, P. and Whalley. C. (2001) ‘Learning about Learning enhances performance’, NSIN Research Matters, 13. Wells, G. (2008). ‘Dialogue, inquery and the construction of Learning Communities’. In: Lingard. Nixon, J. and Ranson, S. (Eds) Transforming Learning in schools and communities: the Remarking of Education for Cosmopolitan Society. London Continuum. Wiersma, W, & Jurs, S., G. (2009) Research Methods in Education: an introduction, 9th ed. Pearson Education.
  • 50. PBM4029ononaji21809224 Page 50 William, N. and Buckler, S. (2008) Your Dissertation in Education. London: Sage. Yates, S. (2004) Doing Social Science Research. London: Sage/Open University Press. Yin, R, K (2009) Case Study Research: design and methods. London: Sage. Journals and related publications Assessment and Rporting Unit Learning Policies Branch Office of Learning and Teaching (2005) Current Perspective on Assessment. Assessment Reform Group (2002) ‘Assessment for Learning’: 10 principles. Assessment Reform Group (2009) Assessment in schools Fit for purpose? A Commentary by teaching and learning research programme Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education: Principles, Policy & Practices Mar 1998, Vol.5 issue 1. Black, P. and Wiliam, D. (1998a) Inside the black box: raising standards through classroom assessment (London King’s College). Black, P and Wiliam, D (1999), Assessment for learning: beyond the black box. London: Kings College London. Black, P. and Wiliam, D. (1998b) Assessment and classroom learning, Assessment in Education, 5(1), 7-74. Black, P. (2007) Formative Assessment: Promise or problems?. King’s College London. Briggs, J. (1998). Assessment and classroom learning: a role for summative assessment? Assessment in Education. Vol.5, No.1, pp.103-110. Brown (2004-5), Assessment for Learning: Learning and teaching in Higher Education, Issue 1, 2004-05. Butler, R (1988). ‘Enhancing and undermining intrinsic motivation: effects of task- involving and ego-involving evaluation on interest and performance’. British Journal Of Educational Psychology 56 (51–63). Crooks, T, J (1988). ‘The Impact of Classroom Evaluation Practice on Students’. Source Review of Educational Research, Vol. 58 No.4, Winter, 1988, pp 438-481. Department for Children, Schools and Families (2009) Report of the expert group on assessment. London DCSF.
  • 51. PBM4029ononaji21809224 Page 51 Elwood, J. (2004) Gender and achievement: new issues or old problems in assessment for learning? Keynote lecture, SEED and SQL Assessment is for Learning Conference Glasgow, 4 June 2004. Gagne, R. M., Briggs, L, J., & Wager, W, W. (1988). Principles of instrumental design. New York: Holt, Rinehart and Winston. Hargreaves, D.H. (2001) A future for the school curriculum. Available online at: http://www.qca.uk/ca/14-19/dh_QCA. Harrison, C. and Harlen, W. (2006). ‘Children’s self- and peer-assessment’. In: Harlen, W. (Ed) ASE Guide to primary Science Education. Hatfield: Association for Science Education. Hattie, J. (1999). Influence on student learning: University of Auckland, New Zealand: Inaugural professorial lecture. Hayward, L. & Hedge, N. (2005). Travelling towards change in assessment: policy, practice and research in education. Assessment in Education, Vol.12, No.1, pp55-75. Hodgson, C.& Pyle, K. (2010) A literature review Of Assessment for Learning in Science. National Foundation for Education Research. Kirton, et al. (2007) Revolution, evolution or a Trojan horse? Piloting assessment for learning in some Scottish primary schools. McDowell, Liz, Sambell, Kay and Davison, Gillian (2009) Assessment for learning: a brief history and review of terminology. In: Improving Student Learning Through the Curriculum. Oxford Centre for Staff and Learning Development, Oxford, pp. 56-64. ISBN 1873576786 Miller, M (2005) Assessment: Literature Review, Bulletin number 19. Miller, R. and Hames, V. (2002). ‘EPSE Project 1: Using diagnostic assessment to improve science teaching and learning’, School Science Review, 84, 307, 21-24. Miller, A. H. Imrie, B. W. & Cox, K. (1998) Student Assessment in Higher Education: a handbook for assessing performance, London: Kogan Page. Ministry of Education, New Zealand. (1994) Assessment: policy to practice. Wellington, New Zealand: Learning Media. Natriello, G. (1987) The Impact pf evaluation processes on students, Educational Psychologist, 22, pp.155-175.
  • 52. PBM4029ononaji21809224 Page 52 Office for Standard in Education (2010) The quality of teaching and the use of assessment to support learning. Office for Standard in Education (2008) Assessment for learning: the impact of National Strategy support. Office for Standard in Education (2012) College report. Quality Improvement Agency for Lifelong Learning (2008). Guidance for assessment and learning. Rust, C. (2002) The impact of assessment on student learning: how can the research literature practically help to inform the development of departmental assessment strategies and learner-centred assessment practices? Active learning in Higher Education, vol 3, no.2, pp.145-158. Salthouse, T., A. (2011) All Data Collection and Analysis Methods Have Limitations: Reply to Rabbitt (2011) and Raz and Lindenberger (2011). Psycological Bulletin, 137, No.5, 796-799. Stiggins, R.J. (2002). Assessment in Crisis: The Absence Of Assessment FOR Learning. Phi Delta Kappan. Vol.83, No.10, pp758-765. Winter, R [1991] ‘Fictional-critical writing as a method for educational research,’ British Educational Research Journal, 17, 3, pp251-262. Young, W. (2005) Assessment for Learning: Embedding and Extending. Assessment is for Learning. Websites www.learning-teaching-update.com www.sflip.org.uk www.niu.edu/assessment/Resources/Assessment_Glossary.htm www.ofsted.gov.uk British Education Index (BEI) http://www.leeds.ac.uk/bei British Educational Research Association (BERA)http://www.bera.ac.uk Department for Education http://www.education.gov.uk ERIC (The Education Resources Information Centre) http://www.eric.ed.gov ITSLIFE - Learning for Teaching on Reflective Practice.http://www.itslifejimbutnotasweknowit.org.uk/RefPractice.htm Professor John Hattie’s website is at: www.arts.auckland.ac.nz/staff/index.cfm?p+8650 NFER (National Foundation for Education Research) http://www.nfer.ac.uk/index.cfm http://www.bera.ac.uk/