Jane Chapman PGCE Secondary
A CRITICAL ANALYSIS OF THE MATCH OR MISMATCH BETWEEN THE
PERCEIVED AND ACTUAL UNDERSTANDING OF YEAR 7 STUDENTS, STUDYING
PARTICLE SOLUTIONS, WHEN ENGAGING IN SELF-ASSESSMENT LEARNING
ACTIVITIES
Introduction
This study aims to investigate the match or mismatch between the perceived and
actual understanding of students, when engaging in self-assessment. Educators frequently use
self-assessment to assess student understanding in order to identify and respond to their
needs. However, it is important to question how well students know what they know. It has
been illustrated that people often make self-assessment errors, and when they do, they are
often overconfident. A further issue this research must address is how best to measure student understanding. In this study, students will be investigated individually,
based on tests, confidence scorings and questionnaires. Interviews will further be used to
examine the effect of self-efficacy on student understanding. The context of this enquiry is a
coeducational school in Cambridgeshire, England. The chosen class is a high-achieving
cohort of 29 students in a Science class. This paper will first give a background on the
previous research done on this topic and then explain the methodological approach taken to
address the current research questions. Later, findings will be analysed and finally,
conclusions will be made, with implications of this study.
1. Literature review
1.1. Self-Assessment for Learning
Effective assessment by schools and teachers not only needs to measure student progress,
but also identify their learning needs and respond to them. Forms of ‘summative’
assessment, such as tests and examinations are a classic way to measure student progress, in
addition to making schools and the education system accountable (Ball, 2003). However, to
be truly effective, there must also be ‘formative’ assessment. In classrooms, this includes the
teacher making frequent and interactive assessments of the students’ understanding in order
to identify and respond to their needs. This informs future teaching as it allows the teacher to
adapt to the changing needs of the students. Teachers should also involve students actively,
encouraging them to develop skills that better aid their learning.
Formative assessment is known to be highly effective in raising the standards of student
achievement, gains which are ‘among the largest ever reported for educational interventions’
(Black & Wiliam, 1998). In addition, formative assessment methods may also promote
‘greater equity of student outcomes’, as teachers ‘adjust methods to recognise individual,
cultural, and linguistic differences between children’ (CERI, 2008). Furthermore, this type of
assessment also builds students’ ‘learning how to learn’ skills by actively involving students
in the process of teaching and learning, helping them understand their own learning and
building students’ skills for peer- and self-assessment. In primary and secondary schooling,
self-assessment has been shown to improve student communication skills, engage and
empower students, enhance their self-regulation and metacognition, and create better
understandings of the criteria used to evaluate students’ work (Andrade, 2010; Topping,
2003).
The reliability of self-assessments is typically high, as demonstrated by studies of 11-12 year old students rating their performance in mathematical problem solving (Ross et al., 2002) and in English (Ross et al., 1999). However, consistency has been shown to be lower over longer time periods, particularly with younger children
(Blatchford, 1997). Evidence about the concurrent validity of self-assessments is mixed. In
general, student self-assessments are higher than teacher ratings. Furthermore, a study
comparing student self-assessment to standardised tests found that age moderated the
relationship. Self-assessment was correlated with achievement at age 16 but not at age 7
(Blatchford, 1997). It should be taken into consideration that any form of self-assessment that
takes place in a public space may trigger threats to psychological safety and interpersonal
relationships (Brown & Harris, 2013). Furthermore, many students have doubts about their
ability to assess themselves (Brown et al., 2009) and there is evidence to suggest that school
students are relatively inaccurate assessors (Ross, 2006). One study of 23 Canadian primary
and secondary classrooms found that although students appreciated self-assessment, there
were concerns over possible cheating and inaccuracy (Ross et al., 1998). Additionally, a New
Zealand study of self-assessment reported that students preferred more traditional teacher-
controlled assessments, a belief reinforced by school grading and reporting methods. The
same study also proposed that students in high-stakes environments for educational
assessments such as the UK Key Stage testing may be more likely to resist self-assessment
because their assessment experiences have not allowed them to appraise their own
evaluations of their work. This is highlighted by the student response: “my teacher’s
judgement matters more than mine” (Brown & Harris, 2013). To improve the accuracy of
self-assessment and improve student confidence in their evaluations, school students need
support, direction and teacher involvement for self-assessment to work effectively (Dunning
et al., 2004).
One commonly used self-assessment practice is the Traffic Light technique, developed
out of the King’s-Medway-Oxfordshire Formative Assessment Project in England. This
popular method involves students holding up a green, amber or red sign to highlight whether
they understand, think they understand but are not quite sure, or do not understand a certain
concept. Teachers would then spend more time with students who held up amber or red
(OECD, 2005). This ‘assessment for learning’ technique can also be used by students to label
their work, indicating how confident they are of their success. However, it is important to
question how well students know what they know. It has been illustrated that people often
make self-assessment errors, and when they do, they are frequently overconfident. For
example, in Hacker et al. (2000), many students predicted they would receive examination
scores more than 30% higher than their actual scores. This overconfidence effect was
greatest for people with lower abilities. Moreover, the same study reports that higher-scoring
students were more accurate at predicting their examination scores than lower-scoring
students. The reason for this metacognitive inaccuracy is debated. The leading interpretation
is that lower ability students lack awareness of the knowledge that they do and do not possess
(Ehrlinger, 2008). However, a study testing this theory found that low-performing students
were less subjectively confident in their predictions than high-performing students, implying
low-performers are aware of their ineptitude. This literature demonstrates dissociation
between metacognitive ability and awareness of this ability (Miller & Geraci, 2011).
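The prediction bias discussed above can be illustrated with a small, hedged sketch. The figures below are invented and merely mimic the pattern Hacker et al. (2000) report, namely larger overconfidence among lower-scoring students; they are not data from the cited study.

```python
# Illustrative sketch only: invented (predicted %, actual %) examination scores
# mimicking the reported pattern, not data from Hacker et al. (2000).

def mean_bias(pairs):
    """Average of predicted minus actual scores; positive values = overconfidence."""
    return sum(pred - actual for pred, actual in pairs) / len(pairs)

lower_scoring = [(75, 40), (80, 45), (70, 50)]   # hypothetical lower-scoring students
higher_scoring = [(85, 82), (90, 88), (80, 79)]  # hypothetical higher-scoring students

print(f"Lower-scoring mean bias:  {mean_bias(lower_scoring):+.1f} percentage points")
print(f"Higher-scoring mean bias: {mean_bias(higher_scoring):+.1f} percentage points")
```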
1.2. Self-efficacy
Self-efficacy is one of the essential components of Bandura’s (1977) social cognitive
theory. He identified that behaviour could be affected by self-efficacy – the belief that
a person can successfully do whatever is required to achieve a desired outcome. Key factors
which affect a person’s efficacy expectations are: vicarious experiences (seeing other people
doing something successfully), verbal persuasion (being told that you can do something) and
emotional arousal (high levels of anxiety can reduce a person’s self-efficacy). Furthermore,
contextual factors such as social, situational and temporal circumstances might also affect
expectations of personal efficacy (Weiner, 1972). In the past several decades, studies have
shown that students’ motivation, cognition and actual performance are strongly influenced by
self-efficacy (Sungur, 2007; Usher & Pajares, 2006). In general, students with higher levels
of self-efficacy have been found to set higher goals, adopt flexible and varied learning
strategies, exert greater effort to complete academic tasks and obtain better academic
performance levels (Liem et al., 2008). In contrast, students with low self-efficacy tend to
avoid tasks they deem to be beyond their capabilities (Lin & Tsai, 2013). In the past, when
determining the relationship between self-belief and outcome, there has often been an
incorrect judgement of self-efficacy (Zimmermann, 1996). This has been because self-efficacy beliefs were not assessed at the level of specificity corresponding to the task being studied.
General self-efficacy assessments are thought to transform beliefs of self-efficacy into an
indiscriminate personality trait instead of the context-specific judgements Bandura suggests they are. Bandura (1986) proposed that judgements of self-efficacy should be consistent with
the domain of task being investigated. An example of this would be a mathematics self-
efficacy instrument used to investigate the confidence students had of succeeding in
mathematics courses and comparing this to their performance in maths-related tasks (Pajares,
1996). Furthermore, students with higher self-efficacy often report higher levels of self-
knowledge judgement than students with lower self-efficacy (Gravill et al., 2002) and
students who believe in their learning efficacy develop and sustain their effort needed for
learning. Therefore, self-efficacy contributes to knowledge acquisition and skill development
(Tsai et al., 2011).
Several studies have examined the more specific self-perceived competence in science
education. Evidence highlights that students who feel more efficacious in science
demonstrate higher achievement in this subject (Borman & Overman, 2004). Bandura’s
theory would suggest that this may be due to student persistence, even when tasks are
difficult (Bandura, 1997). The literature also implies that student academic anxiety (Britner &
Pajares, 2006) and gender (i.e. being a girl; Fast et al., 2010) also contribute to students
having a lower science self-efficacy. Interestingly however, another study found no gender
differences in their science self-efficacy (Griggs et al., 2013). One explanation for these
different findings may be because the latter study controlled for science anxiety, which was
greater among girls. Therefore, once anxiety was controlled, both genders believed
themselves to be similarly efficacious. Student experiences at school also play a role,
demonstrated by enhanced student self-efficacy through a focus on creating caring,
emotionally supportive learning environments (Zins & Elias, 2006). As mentioned
previously, the judgement of self-efficacy on predicting performance is shown to be
discipline- and situation-specific. However, situational conditions do not establish perceived
self-efficacy, but rather act as performance requirements for the judgement of efficacy
(Bandura, 1997). Research has previously revealed the role of self-efficacy in science
learning and has suggested that self-efficacy mediates people’s interpretation of their
knowledge (Liu et al., 2006).
1.3. Students’ Science Learning Self-Efficacy
As mentioned previously, when considering studies that aim to explore students’ self-
efficacy, the use of an instrument with general self-efficacy items would be insufficient
(Pajares, 1996). It would be more appropriate to instead develop measures that can be
adapted to several contexts. In the case of students’ self-efficacy in science, it should not be
thought of as one global measurement, but should be separated into several distinctive aspects
for more detailed study (Lin & Tsai, 2013). Science education literature has established that
there are several major aspects of science learning. Duschl (2008) has highlighted that
conceptual understanding of scientific knowledge, together with higher-order thinking skills
such as reasoning and critical thinking are of great importance.
The development of conceptual understanding and critical thinking, together with
problem solving ability has been suggested to be promoted by practical work. Furthermore,
practical work has also been proposed to stimulate and maintain students’ interest, attitude,
satisfaction, open-mindedness and curiosity in science, promote aspects of scientific thinking
and the scientific method and also allow students to develop practical abilities (Hofstein,
1988). These all contribute in helping students learn science, learn about science and allow
them to do science (Tsai, 2003). Based on the PISA 2006 survey (OECD, 2007a, b), Finnish
students obtained the highest score in the Scientific Literacy Assessment among all OECD countries. A large-scale study looking at how they succeeded found that a robust
predictor of the high results in Finland was frequent use of practical work in the classroom
(Lavonen & Laaksonen, 2009). It is also important for students to be literate in science,
meaning ‘to use scientific knowledge, to identify questions and to draw evidence-based
conclusions in order to understand and help make decisions about the natural world’ (OECD,
1999). There are many reasons why everyday science applications should be integrated into
school science. Firstly, empirical studies have shown that using everyday contexts enhances
student enjoyment (Dlamini et al., 1996), allows for conceptual development, provides teachers with an opportunity to address misconceptions (Lubben et al., 1996) and gives relevance to school science learning (Campbell et al., 1994). Furthermore, incorporating everyday science applications into school science is fundamental to the students’ mastery of
science learning in school (Driver et al., 1994). However, many students view ‘school science
as having little or no relevance to their life-world subcultures’ (Aikenhead, 1996). When
there is no bridging between school science learning and daily experiences, students may
practice ‘cognitive apartheid’, referring to the isolation of knowledge systems relating to
science: one for school science and one for everyday lives (Cobern, 1994). Learning, like
doing science, is a social activity that takes place ‘through communication or interaction with
others where ideas are constructed or shared’ (Vygotsky, 1978). Communication through
discussion, argumentation, reading and writing can promote students’ constructs of
understanding science (Chang et al., 2011), with studies revealing the importance of students’ interpersonal communication with adults and peers for improved learning (Stamovlasis et al., 2005).
As discussed above, there are various features of science literacy and there have been
several successful studies measuring students’ science learning self-efficacy (SLSE) in
conformity with these features (e.g. Baldwin et al., 1999; Lin & Tsai, 2013; Uzuntiryaki & Capa Aydin, 2009). In the research by Lin and Tsai (2013), several current SLSE instruments
were collected and modified to develop their own validated ‘Science Learning Self-Efficacy’
(SLSE) instrument. This consisted of five distinct domains (‘Conceptual Understanding’,
‘Higher-Order Cognitive Skills’, ‘Practical Work’, ‘Everyday Application’ and ‘Science
Communication’) that conform to the existing notion of science literacy. Furthermore, this
study also investigated the relationship between high school students’ SLSE and their
approaches to learning science. Through correlation analyses, they found that students’ deep
strategies and deep motive were strong predictors of their SLSE. This SLSE instrument has
also been useful in a later cross-cultural study (Lin et al., 2013), and in another, revealing a
significant association between students’ conceptions of learning science and their self-
efficacy (Lin & Tsai, 2013). This study found that students in strong agreement with learning
science as understanding and seeing in a new way are likely to possess a higher science self-
efficacy than students who consider learning science in terms of preparing for tests and
examinations. These studies indicate that this multi-dimensional SLSE instrument is relevant
and valid for advancing current understandings in the line of SLSE research.
Science educators have explored the relationships between student science learning self-
efficacy and both their conceptions of learning science and approaches to learning science.
However, there do not seem to be any studies exploring students’ science learning self-efficacy and their matched or mismatched actual and perceived understanding.
purposes of this study were first, to explore the match or mismatch between the perceived and
actual understanding of high school students studying Science. Secondly, this study aimed to
explore the relationship between students’ SLSE and students’ perceptions of their own
understanding. Derived from the research purposes, this study addressed the following
questions:
1. Does students’ perception of their understanding match their actual understanding?
2. Is students’ perception of their understanding influenced by their self-efficacy?
2. Methodology and Methods
2.1. Methodology
A case study approach was taken, one that Denscombe (2010) has characterised as a
‘focus on just one instance of the thing that is to be investigated’. Focussing on individual
instances rather than many may reveal insights about the general by looking at the particular.
An advantage that a case study has over a survey approach is that there is greater opportunity
to explore things in more detail and discover new insights that may not have been possible
with a more superficial approach. Therefore, the complexities of a given situation can also be
exposed. This is useful as relationships and processes within social settings are often
interrelated and interconnected. Furthermore, the case being explored is a ‘naturally
occurring’ phenomenon (Yin, 2009); something that usually already exists, and is not a
situation that has been artificially created for the purpose of research. This gives the case study the additional benefit of a natural setting.
The range of potential ‘cases’ is very wide but for good case-study research, it
is essential for the unit to have distinct boundaries, where the researcher is able to keep it
separate from other similar things of its kind and distinct from its social context. The case
study approach also requires the researcher to select one case from a wider range of examples
of the type of thing that is being explored, based on their distinctive features.
In addition to the advantages mentioned earlier, the variety of methods used in case
studies results in multiple sources of data. This facilitates validation of the data through
triangulation. The major criticism of case studies is in relation to the credibility of
generalisations made from its findings. Therefore, researchers must highlight the extent to
which the case is similar and dissimilar to others of its type. The approach is also often accused of lacking rigour, relying on qualitative data and interpretative procedures instead of quantitative data and statistical methods. Another limitation of case
studies is the problem of the ‘observer effect’. This can arise because case studies usually involve prolonged engagement over time, potentially causing those being researched to behave differently from normal, as they may be aware that they are being observed. This may
be the case in the present study, given my dual role as teacher and researcher.
2.1.1. Details of the case
The present study involved 27 students in a year 7 class (around 11-12 years old)
from a secondary school in the county of Cambridgeshire. Among the participants, 16 were male and 11 female, and all were from a class with high academic achievement. The
surveyed students came from a variety of villages in the catchment area with different socio-
economic backgrounds. Permission to gather data was provided by the school administration
and students were informed that the data collection process was anonymous and voluntary.
The topic studied during this research was ‘Particle Solutions’. Within this, students explored the properties of solids, liquids and gases and gained a greater understanding of the ‘Particle Model’. Next, they applied the Particle Model to explain everyday phenomena such as gas pressure, expansion, diffusion and contraction, and to consider the
motion, forces and closeness of particles to help explain observations obtained during
practical work, such as changes of state. Later lessons aimed to develop students’ understanding of what mixtures are and how they can be separated, and of the process of dissolving. Students then had to use their knowledge about separating
mixtures to obtain a sample of salt from rock salt and understand that salt comes from a
variety of sources and has many uses.
Since students build their own concepts, their constructions of chemical concepts
sometimes differ from the ones the teacher holds and has tried to present (Nakhleh, 1992).
These misconceptions differ from commonly accepted scientific understanding and may
interfere with subsequent learning. The scientifically accepted model that matter is made up
of discrete particles, which have empty space between them and are in constant motion, is a concept
students of all ages have trouble understanding (Novick & Nussbaum, 1981). Their study
revealed that over half of the students perceived matter as a continuous medium that was
static. In addition, the authors highlighted that aspects of the particulate model were
differentially accepted by students. For example, the idea that liquefaction of gases involved
the merging of particles was accepted by 70%, whereas only 40% accepted the idea that
particles in a gaseous phase have empty spaces between them. In another study of 300
students aged 15-16, nearly half of the students believed that the properties of the substance
were also properties of the individual atom (Ben-Zvi et al., 1986). This concept of the
particulate nature of matter is an important foundation for understanding many chemical
concepts (Krajcik, 1990). Misconceptions about the concepts of atoms and molecules have
also been revealed by researchers. In a study with 17-18 year old Canadian students, half of
the students in the sample believed: molecules are much larger than in reality; molecules of
the same substance may vary in size; molecules of the same substance can change shapes in
different phases; molecules have different weights in different phases and also that atoms are
alive (Griffiths & Preston, 1989).
2.2. Data Collection Methods
To allow for the examination of actual and perceived understanding of students within
its real-life context, this study compared test results with confidence indicator colours after
every question of the test, together with subject interviews. To study whether students’
perceptions of their understanding were influenced by their self-efficacy, a Science Learning
Self-Efficacy (SLSE) questionnaire was administered. Further data were collected through
subject interviews, student focus groups and teacher interviews. A summary of the research
questions and data sources is given below in Table 1.
Table 1. Research questions, data sources, and details of when the data were collected

Title: A critical analysis of the match or mismatch between the perceived and actual understanding of year 7 students, studying particle solutions, when engaging in self-assessment learning activities.
Proposed methodology: Case study.
Overarching research question: When engaged in self-assessment activities, what influences year 7 students’ perceptions of their own understanding?

Sub-question 1: Does students’ perception of their understanding match their actual understanding?
Data sources and timing:
- Red/amber/green traffic lights immediately after every question (end of the first two lessons in the sequence).
- Test results for every question (end of the first two lessons in the sequence).
- Subject interviews, checking those with green ratings but bottom marks and red ratings but top marks (during lunchtime, after the last lesson in the topic).

Sub-question 2: Is students’ perception of their understanding influenced by their self-efficacy?
Data sources and timing:
- SLSE questionnaire (end of the third lesson in the sequence).
- Subject interviews (during lunchtime, after the last lesson in the topic).
- Focus groups (during lunchtime, after the sixth lesson in the sequence).
- Teacher interview (end of the school term).
2.2.1. Assessing Students’ Actual and Perceived Understanding
Two distinct tests were administered over two lessons, and each consisted of six questions which aimed to assess students’ understanding of the learning objectives from the preceding lesson (Appendices 3 and 4). Students were
asked to indicate how confident they were that their given answer was correct by using red,
amber and green traffic light colours after each question in the space provided.
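As a minimal sketch of how such data might be tallied (the function names and responses below are hypothetical illustrations, not the study’s actual analysis), each question can be labelled as a match or mismatch between confidence and correctness:

```python
# Hedged sketch: classify each test question as a match or mismatch between a
# student's traffic-light confidence and whether their answer was correct.
# The response data below are invented for illustration.

def classify(colour, correct):
    """Label one question by comparing the confidence colour with correctness."""
    if colour == "green" and not correct:
        return "overconfident mismatch"   # sure of success, but answer was wrong
    if colour == "red" and correct:
        return "underconfident mismatch"  # expected failure, but answer was right
    return "match"  # amber, or a colour consistent with the outcome

# One hypothetical student's six questions: (traffic light, answered correctly?)
responses = [("green", True), ("green", False), ("amber", True),
             ("red", True), ("red", False), ("green", True)]

labels = [classify(colour, ok) for colour, ok in responses]
mismatches = [label for label in labels if label != "match"]
print(f"{len(mismatches)} of {len(responses)} questions show a mismatch")
```

Students flagged by such a tally (green with an incorrect answer, or red with a correct one) are precisely those the study follows up with subject interviews.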
Tests are a useful way of collecting evidence about the knowledge and understanding
of students. However, as with all assessment data, the validity of outcomes will strongly
depend upon whether the test items are actually testing the knowledge and understanding
they claim to. Creating tests that are both valid and reliable is known to be difficult (Taber,
2013). For example, contextualised questions which are meant to be less abstract and
unfamiliar to the student, may complicate matters: students have to ‘process’ more
information, the context may elicit ‘everyday’ ways of thinking that do not match academic
learning, and the context may be more familiar to some students than to others (causing a
potential gender- and cultural-bias) (Taber, 2003).
To confirm the results of the tests, subject interviews were carried out to further
analyse students who signalled green but had an incorrect answer and those who signalled red
but had a correct answer (a mismatch of actual and perceived understanding). In this case,
‘structured’ interviews were carried out. Structured interviews consist of a pre-determined list
of questions, asked of each respondent, who is given a limited set of response options. The tight control over the format of questions and answers lends itself to the advantage of
‘standardisation’. Furthermore, the selection of pre-coded answers ensures relatively easy
data analysis, lending itself to the collection of quantitative data and which is useful for
‘checking’ data (Taber, 2013).
Self-assessment, as mentioned before, increases student engagement in assessment
tasks and is also a key factor in maintaining student attention. Self-assessment data also have the strength of providing information that is not otherwise easily determined, such as how much effort a student has made in preparing for a certain task (Ross, 2006). In addition, numerous
researchers have reported a high level of reliability in self-assessment in terms of consistency
across tasks (Fitzgerald et al., 2000) and over short time periods (Chang et al., 2005). These
studies all involved students that had been taught how to evaluate their work effectively.
The main limitation of self-assessment appears to be concern over its validity. The literature
suggests that there are discrepancies between self-assessments and scores on other measures
(Ross, 2006). Furthermore, student self-assessment is generally shown to be higher than
teacher ratings (McEnery & Blanchard, 1999).
2.2.2. Assessing Students’ Self-Efficacy in Learning Science
Unlike tests that aim to measure learning, questionnaires contain questions that all
respondents should be able to respond to (Taber, 2013). There are strengths and limitations of
the different types of items and scales used in questionnaires. Closed questions are simple to
analyse but only investigate which of the offered options respondents chose. There is
opportunity in open questions for respondents to give answers that more closely match their own views, but these must later be categorised to be reported economically. The selection of comments can also raise questions of how representative they are of the actual
data. It has been proposed that questionnaires can have increased validity and reliability if their scales do not contain central (neutral) points, as this forces the respondent to decide how
they feel. However, if they genuinely have neutral or mixed feelings about the statement, this
may result in false claims being made. Consistency of responses should be probed by
including several similar, or directly opposite items. Another limitation of questionnaires that
consist of many scale-type items is that they are known to be occasionally completed with
little thought. To reduce the risk of this happening, questionnaires should aim to not contain
too many items. Furthermore, they could also contain some statements that are reversed,
encouraging respondents to think more carefully about each item (Taber, 2013).
In general, questionnaires have the advantages of being economical, relatively easy to
arrange and supplying standardised answers. As respondents are posed with identical, pre-coded questions, there is no scope for variation via face-to-face contact, with the added benefit that the data are not affected by ‘interpersonal factors’. However, there are
disadvantages to pre-coded questions which should be considered. Together with being
frustrating and restricting for the respondents, they could also ‘bias the findings towards the
researcher’s, rather than the respondent’s, way of seeing things’ (Denscombe, 2010).
A 28-item Science Learning Self-Efficacy (SLSE) instrument was adopted to assess the
participants’ self-efficacy in learning science (Lin and Tsai, 2013) (Appendix 2). The items
of the SLSE instrument were rated on a four-point Likert scale (4 = strongly agree, 3 = agree, 2 = disagree, 1 = strongly
disagree), assessing the dimensions discussed in Table 2.
Table 2. The five dimensions assessed with the SLSE instrument (each describes what is assessed in participants’ confidence)

- Conceptual Understanding: ability to use fundamental cognitive skills such as science concepts, laws or theories.
- Higher-Order Cognitive Skills: ability to utilize sophisticated cognitive skills including problem-solving, critical thinking or scientific inquiry.
- Practical Work: ability to conduct science experiments in laboratory activities.
- Everyday Application: ability to apply science concepts and skills in daily life.
- Science Communication: ability to communicate or discuss science with classroom peers or others.
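Responses on such an instrument are typically summarised per dimension. The short sketch below illustrates one way this could be done; the item-to-dimension mapping and the answers are invented for illustration, as the real 28-item mapping is defined by Lin and Tsai (2013).

```python
# Hedged sketch: average a respondent's four-point Likert answers within each
# SLSE dimension. Item groupings and answers are illustrative assumptions only.
from statistics import mean

dimensions = {                       # hypothetical dimension -> item numbers
    "Conceptual Understanding": [1, 2, 3],
    "Practical Work": [4, 5],
}

# One respondent: item number -> score (4 = strongly agree ... 1 = strongly disagree)
answers = {1: 4, 2: 3, 3: 3, 4: 2, 5: 4}

scores = {dim: mean(answers[i] for i in items) for dim, items in dimensions.items()}
for dim, value in scores.items():
    print(f"{dim}: {value:.2f}")
```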
2.2.3. Interviews
For an interview to take place there must be consent from the interviewee who agrees
and understands that the material obtained will be used for research purposes. The
interviewee must also agree that their words can be treated ‘on the record’ and ‘for the
record’ unless they specify otherwise and that the agenda for the discussion will be controlled
by the researcher. Interviews are useful for in-depth exploration of complex and subtle
phenomena, such as people’s experiences, opinions, feelings and emotions.
There are several types of research interviews, in addition to the ‘structured’ interview
mentioned previously (in section 2.2.1). With semi-structured interviews, the researcher still
has a set of issues to be considered and questions to be answered but is flexible in the order in
which the topics are addressed. Additionally, there is greater flexibility for the interviewee as
answers are open-ended. This allows the interviewee to develop and elaborate on points
which are of interest to them. With unstructured interviews, the role of the researcher is to be as unobtrusive as possible. Semi-structured and unstructured interviews lie on a continuum, so it is likely that both will feature in a single interview. Due to the nature of the
interviewee having freedom to ‘speak their mind’, semi- and unstructured interviews are
useful for discovering ideas about complex issues (Denscombe, 2013).
The teacher and student interviews were conducted one-to-one. This type of interview has the advantage of being easy to arrange; additionally, the opinions and views expressed throughout the interview originate from one source. This makes it easy for the interviewer to match specific ideas with a certain person. However, one-to-one interviews do
have a disadvantage of providing limited opinions and views (Denscombe, 2013).
Additionally, face-to-face interviews involve social cues, such as voice, intonation and body
language from the interviewer which may influence the answers given from the interviewee
(Opdenakker, 2006).
To address the research question ‘Does students’ perception of their understanding match their actual understanding?’, four students were chosen to be interviewed with regard to their test papers. Students were selected based on whether they had a mismatch of actual and
perceived understanding. The interview began with structured questions that probed their
understanding of the test questions to double check their responses. The same students were
then interviewed to answer the second research question, ‘Is students’ perception of their
understanding influenced by their self-efficacy?’ To investigate this, semi-structured
questions were asked of the students to explore this question. All interviews were audio
recorded and there was confirmation and reassurance about the confidentiality of the discussion.
2.2.4. Focus groups
Focus groups contain a small number of people brought together by the researcher to
investigate feelings, perceptions, attitudes and ideas about a certain topic. They are helpful
for exploring the extent to which shared views exist amongst a group of individuals relating
to a specific topic. As the name suggests, focus groups have a ‘focus’ to the session, with discussion based around a topic of which all participants have similar knowledge. There is also a particular emphasis on the group’s interaction as a means of acquiring information, in which the moderator’s role is to facilitate this interaction and not to lead the discussion.
For this research, a group of four students from the class being studied were chosen to
take part in the focus group. As the overall aim of the research was to explore a particular topic in depth, with a view to examining the specifics (Denscombe, 2010), students were
deliberately chosen to ensure members of the group were likely to hold opposing views on
the topic for discussion. As the students are under the protection of responsible others,
permission was sought from the school organisation and authorisation to conduct the
interview was gained before the interview took place. The prospective interviewees were
contacted in advance and the interview was arranged for the following week, lasting 20
minutes. Semi-structured questions were asked that probed the research question ‘Is students’
perception of their understanding influenced by their self-efficacy?’ and the conversations in
focus groups were audio recorded. These questions covered the themes of self-assessment,
self-efficacy in school and self-efficacy in science. A full list of prompt questions is included
in Appendix 1.
2.3. Analysis of data
2.3.1. Constant comparative method
For the subject interviews, focus groups and the teacher interview, audio recordings
were analysed using the constant comparative method. For this, audio recordings were
listened through, focussing on the questions that guided the research. Parts of the interviews
that I believed to be important were written down and themes that underpinned what people
were saying were identified. These temporary constructs were used to compare against the
recordings again, and further notes and observations were written down. Temporary
constructs deemed to be unsuitable were deleted and, after another listen, a list of second-order constructs was made that seemed to explain the data (Wilson, 2013). This process
helped to summarise the important themes in the data.
2.3.2. Descriptive statistics
Answers to the test questions were compared to the students’ confidence (red/amber/green)
responses after each question. Test answers were marked either correct or incorrect and were
only compared to red and green responses (i.e. not at all confident and very confident that
their given answer was correct). Amber responses were discounted. A correct answer with a
green response and an incorrect answer with a red response were regarded as a match of
actual and perceived understanding. A correct answer with a red response and an incorrect
answer with a green response were regarded as a mismatch of actual and perceived
understanding. Matched and mismatched understandings were given as percentages of the
students’ responses. The mean and standard deviation for girls’ and boys’ test scores together
with matched versus mismatched responses were calculated.
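The matching logic described above can be sketched as follows; the response data here are illustrative, not the study’s actual answers:

```python
# Sketch of the match/mismatch classification: amber responses are
# discounted; green + correct and red + incorrect count as a match;
# the other two combinations count as a mismatch.

def classify(answer_correct, confidence):
    """Return 'match', 'mismatch', or None (amber is discounted)."""
    if confidence == "amber":
        return None
    confident = (confidence == "green")
    return "match" if confident == answer_correct else "mismatch"

# Illustrative (answer_correct, confidence) pairs for one student.
responses = [(True, "green"), (False, "red"), (False, "green"), (True, "amber")]
labels = [classify(c, conf) for c, conf in responses]
kept = [label for label in labels if label is not None]
matched_pct = 100 * kept.count("match") / len(kept)  # percentage of kept responses
```
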
For the SLSE instrument, scale scores were computed by calculating the mean of the
items in each domain for each individual and the mean scores for boys and girls.
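The scale-score computation is a per-dimension mean of item scores; a minimal sketch for one hypothetical student (the item groupings and Likert scores below are illustrative, not taken from the instrument):

```python
# Mean of the four-point Likert item scores within each SLSE dimension,
# for one hypothetical student. Item scores are illustrative only.
student_items = {
    "Conceptual Understanding": [3, 2, 3, 4],
    "Practical Work": [4, 4, 3, 4],
}
scale_scores = {dim: sum(items) / len(items) for dim, items in student_items.items()}
```
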
2.3.3. Association statistics
To initially explore the relationship between the students’ match or mismatch of
actual and perceived understanding and SLSE, Pearson correlation analysis of the students’
responses on the test papers and SLSE instrument was undertaken. There are a number of
assumptions that are made with respect to Pearson’s correlation. Included in these are that the
two variables should be measured at the interval or ratio level and that there needs to be a
linear relationship between these two variables. Also, as Pearson’s r is very sensitive to
outliers, these should be minimised or omitted, and finally, the variables should be
approximately normally distributed (Spiegelman, 2010).
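As a sketch of the statistic itself (the study’s analysis would in practice use standard statistical software), Pearson’s r is the covariance of the two variables divided by the product of their standard deviations:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Numerator: sum of cross-products of deviations from the means.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    # Denominator: product of the two deviation magnitudes.
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A perfectly linear increasing relationship yields r = 1, and a perfectly linear decreasing one yields r = -1.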
2.4. Validity and reliability
Validity means that both the methods and the data are ‘right’ in reflecting reality and truth. Methods used to obtain data should measure suitable indicators of the concept, giving
accurate results. A good level of reliability means that a research instrument will consistently
provide the same data, time and time again. If there were to be any variation, this would be
due to variation in the thing being measured and not due to the volatile nature of the research
instrument itself (Denscombe, 2010).
To ensure the validity of what was said in interviews, interview data was corroborated with other data sources on the topic. Furthermore, there was often confirmation of what was meant by the interviewee, to avoid misinterpretation. Quantitative methods produce numerical data that is independent of, and so should not be influenced by, the researcher. In this study, test papers, confidence indicator colours and SLSE questionnaires gave standardised data; furthermore, the SLSE instrument was validated by exploratory factor analysis in the study of Lin and Tsai (2013). As a check on the qualitative data in this study, there has been an explicit account of the methods, analysis and decision making, showing readers in as much detail as possible the lines of analysis that led to certain conclusions (Denscombe, 2010). Qualitative data was also checked for external reliability against other comparable studies.
2.5. Ethics
This educational research followed guidelines set out by the British Educational
Research Association (BERA, 2011), summarised in the table below.
Table 3. Adherence to Ethical Guidelines for Educational Research

Responsibilities to Participants
Voluntary Informed Consent
• All persons involved were treated within an ethic of respect.
• Participants understood and agreed to their participation, prior to the research getting underway.
Openness and Disclosure
• Participants’ voluntary informed consent was secured before research was carried out, and there was no deception or subterfuge from the researchers.
Right to Withdraw
• Participants were informed that they had the right to withdraw from the research for any or no reason, and at any time.
Children, Vulnerable Young People and Vulnerable Adults
• In all actions, the best interests of the child were the primary consideration.
• Children capable of forming their own views were granted the right to express their views freely in all matters affecting them.
• Researchers ensured that they themselves and any collaborators complied with legal requirements in relation to working with schoolchildren.
• All necessary steps were taken to reduce the sense of intrusion to the children and to put them at their ease.
• The impact of the research on the normal working and workloads of participants was minimised.
Incentives
• Use of incentives to encourage participation was commensurate with good sense and avoided choices which had undesirable effects.
• There was acknowledgement that the use of incentives had the potential to create a bias in sampling or in participant responses.
Detriment Arising from Participation in Research
• Researchers made known to the participants that any unexpected detriment to participants, which arose during the research, must be brought immediately to their attention or to the attention of their guardians.
• Steps were taken to minimise the effects of designs that advantage, or are perceived to advantage, one group of participants over others.
Privacy
• There was confidential and anonymous treatment of participants’ data.
• Researchers complied with the legal requirements in relation to the storage and use of personal data as set down by the Data Protection Act (1998) and any subsequent similar acts.
Disclosure
• Researchers who judge that the agreements they have made with participants on confidentiality and anonymity will allow the continuation of illegal behaviour, which has come to light in the course of the research, must make disclosure to the appropriate authorities. If the behaviour is likely to be harmful to the participants or to others, the researchers must also consider disclosure.
• At all times, the decision to override agreements on confidentiality and anonymity must be taken after careful and thorough deliberation.

Responsibilities to Sponsors of Research
Methods
• Only methods fit for the purpose of the research undertaken were employed.
• Researchers have communicated the extent to which their data collection and analysis techniques, and the inferences to be drawn from their findings, are reliable and valid.
Publication
• The researchers have made themselves familiar with the BERA research writing guidelines.

Responsibilities to the Community of Educational Researchers
Misconduct
• This research was conducted to the highest standards.
• Subject to any limitations imposed by agreements to protect confidentiality and anonymity, data and methods were made amenable to reasonable external scrutiny.
• There is a contribution of critical analysis and constructive criticism.
Authorship
• Academic status or other indicator of seniority has not determined first authorship.

Responsibilities to Educational Professionals, Policy Makers and the General Public
• Researchers will seek to make public the results of their research for the benefit of educational professionals, policy makers and a wider public understanding of educational policy and practice.
• Communication of findings, and the practical significance of the research, will be given in a clear, straightforward fashion.
3. Analysis of Findings
3.1. Does students’ perception of their understanding match their actual
understanding?
The data collected from test results with confidence indicator colours and subject
interviews presented several interesting findings. Firstly, it was highlighted that, overall, girls’ perceptions of their understanding were more accurate than boys’. Additionally, girls were
more likely to believe that they had an incorrect answer when they were actually correct. This
is in contrast to another finding that demonstrated boys were more likely to believe that they
had a correct answer when they were actually incorrect. Because there is evidence of the
effect of gender in the literature (Fast et al., 2010; Griggs et al., 2013), these findings were
investigated further.
3.1.1. Overall, girls’ perception of their understanding was more accurate than boys’
The class’ mean percentage of matched understanding responses was 71% (±0.18 SD)
with girls having a higher mean percentage of 73% (±0.16 SD) than boys of 70% (±0.20 SD)
(Figs 1 and 2). This demonstrates that pupils’ perceived understanding largely matched their
actual understanding. This is in contrast to previous studies that found evidence to suggest
school students to be relatively inaccurate assessors (Ross, 2006). A possible explanation for
this may be due to the current study using a high-achieving cohort of students. Indeed, a
similar study reported that higher-scoring students were more accurate at predicting their
examination scores than lower-scoring students (Hacker et al., 2000). A potential explanation is that the higher-ability students studied in this research have an awareness of the knowledge that they do and do not possess.
In general, girls in this study were more accurate at judging whether they did or did
not know the questions asked of them. This is in concordance with other researchers
investigating differences in accuracy of self-perception between male and female students
(Pajares, 1996). Accurate self-perceptions may allow students to more accurately assess their
strategies of problem solving. However, ‘realistic’ self-appraisals may be in danger of
lowering optimism and therefore, lowering levels of effort, perseverance and persistence
(Bandura, 1997). Consequently, just as much attention should be paid to students’ perceived competence as to their actual capability, as it is their perceptions that may more accurately predict motivation and academic choices in the future (Hackett &amp; Betz, 1989). Accuracy in
self-assessment by distinguishing strengths and weaknesses is critical for students to make
more effective decisions about where to apply their learning efforts. This will allow students
to take responsibility for their education and improve autonomy in gaining and improving on
their skills and knowledge (Dunning et al, 2004).
Figure 1. Comparing the percentages of matched versus mismatched actual and perceived
understanding between girls and boys
Comparing matched responses (i.e. red + incorrect and green + correct) and mismatched responses (i.e. red +
correct and green + incorrect) between boys and girls. Boys = dark blue; Girls = light blue.
3.1.2. Girls were more likely to believe that they had an incorrect answer when they
were actually correct
The class’ mean percentage of mismatched understanding responses of red but correct was very low at 4% (±0.09 SD), with girls having a higher mean percentage of 6% (±0.09 SD) than boys of 3% (±0.09 SD) (Figs 1 and 2). These mismatched responses were validated through subject interviews, which confirmed that both selected girls were sure of their responses. This highlights that the girls in this study were less confident of being correct when compared to the boys.
Figure 2. Percentage of student responses with matched and mismatched actual and
perceived understanding.
Matched (blue) = correct + green response and incorrect + red response. Mismatched = correct + red response
(red) and incorrect + green response (green). Boys = left side; Girls = right side
These results are consistent with other research that found girls, and in particular
gifted girls, to have a general tendency toward underconfidence (Lundeberg et al., 1994).
Furthermore, research shows that female students often have problems with self-confidence
and report greater stress over their competence than male students (Moffat et al., 2004). This
could have negative implications as students who lack confidence in skills they possess are
prone to avoiding tasks in which those skills are required, and more likely to give up in the
face of difficulty. Lent & Hackett (1987) studied the perceived and actual competence of
mathematical skills in college students. They demonstrated that generally, it is the
underestimation of competency and not the lack of capability that is responsible for
avoidance of math-related courses and careers, and this is more likely to be the case with women than men. When this is the case, identifying and modifying these perceptions would
prove beneficial.
3.1.3. Boys were more likely to believe that they had a correct answer when they were
actually incorrect
The class’ mean percentage of mismatched understanding responses of green but
incorrect was 24% (±0.17 SD) with girls having a lower mean percentage of 20% (±0.15 SD)
than boys of 26% (±0.18 SD) (Figs 1 and 2).
These mismatched responses were validated through subject interviews which
confirmed both selected boys were sure of their responses. Previous findings have shown a
disparity of results. Several studies have reported males to be more likely to overestimate
through self-assessment (Lind et al, 2002; Rees & Shepherd, 2005). Lind et al (2002)
assessed the ability of students to self-assess using a competency-based evaluation and
further found females to underestimate their performance, despite outperforming the male
students. However, another study involving an intervention to improve student understanding
of assessment criteria, found no identifiable difference between male and female self-
assessment (Rust et al., 2010). A possible explanation for this lack of gender difference is that exposing students to good-quality exemplar assignments caused them to underestimate their own work, perhaps with a more pronounced effect on previously over-confident males.
3.2. Is students’ perception of their understanding influenced by their self-efficacy?
3.2.1. Students scored highest on the ‘Practical Work’ dimension of the SLSE
instrument
The participants’ scores on the Science Learning Self-Efficacy instrument were calculated. The resulting class’ mean scores and standard deviations of the SLSE dimensions are shown in Table 4.
Table 4. Class’ mean scores and standard deviations of the SLSE instrument
Distinct Dimension Class’ mean score Standard deviation
Conceptual Understanding 2.99 0.45
Higher-Order Cognitive Skills 3.00 0.39
Practical Work 3.71 0.41
Everyday Application 2.97 0.51
Science Communication 3.07 0.55
As shown by Table 4 and Figure 3, students scored highest on the ‘Practical Work’ dimension, mirroring a result from the previous study by Lin and Tsai (2013), who found a mean of 3.44 for this dimension. Furthermore, another study found that one of the most positive predictors of students’ science-related self-efficacy was practical work (Lavonen &amp; Laaksonen, 2009). The same study also found that self-efficacy related to science was the most powerful predictor of student performance, a result similar to Valijarvi et al. (2007).
This study asked students in interviews and focus groups a number of questions about
their self-efficacy in science classrooms, and the influence practical work has on this and on
science learning. During interviews, students unanimously agreed that their level of
confidence depended on the environment of the classroom. The following comments were
typical, suggesting that students felt less confident when they felt observed by their
classmates and were put off by the ‘big crowd’:
It depends on who is around…the atmosphere of the class. When it’s really silent you wouldn’t
feel confident. (Student F)
The most anxious thing is answering the questions if you were to put your hand up, because
you might be wrong, and feel anxious that people will laugh at you. (Student E)
As one might expect, students felt more confident and likely to contribute in group discussion
when they were doing smaller group activities, such as with practical work, as these
comments suggest:
It’s not so tense when we’re in groups and doing activities. All the pressure’s off…it’s a
relaxed environment. Everyone is doing their own thing so they’re not concentrating on you.
When you are in groups, everyone feels more confident in commenting so you want to
contribute more. (Student H)
In class if you put your hand up, you know that everyone is hearing and everyone is watching
but when you’re in groups you don’t get that feeling…everything else goes away. (Student E)
It appears that practical work promotes greater confidence in students because they are less
likely to feel judged by the whole class and feel less anxious in the science classroom. In
addition to self-efficacy, practical work was illustrated to increase student enjoyment and
stimulate learning, as these students comment:
When you think ‘science’, you think ‘experiments’. When you’re doing the experiments,
you’re learning about it. It explains what you are learning in front of you. (Student G)
People get excited [about practical work]…they enjoy it so much and in the end they realise
they learnt something. When you write the conclusion, it surprises you how much you
know…because you just did one practical. (Student E)
If you’re in groups, you are doing it yourself instead of watching the teacher do it so you can
make sure you know something by doing your own experiment. You can get more involved
and do your own investigations rather than just sitting, listening and writing down. (Student A)
This is in line with previous findings that highlighted practical work stimulates student
interest and curiosity in science, promoting aspects of scientific thinking and allowing
students to develop practical abilities (Hofstein, 1988).
3.2.2. Boys had a higher mean score in every dimension of the SLSE instrument
For each dimension of the SLSE instrument, boys had higher mean scores than the
girls (Table 5 and Figure 3).
Table 5. Boys’ and girls’ mean scores and standard deviations of the SLSE instrument
Distinct Dimension | Boys’ mean score (SD) | Girls’ mean score (SD)
Conceptual Understanding | 3.09 (0.46) | 2.84 (0.42)
Higher-Order Cognitive Skills | 3.04 (0.41) | 2.94 (0.38)
Practical Work | 3.80 (0.36) | 3.60 (0.48)
Everyday Application | 3.11 (0.50) | 2.77 (0.49)
Science Communication | 3.23 (0.41) | 2.83 (0.66)
This fits with the studies mentioned previously, which state that gender (i.e. being a girl; Fast et al., 2010) contributed to students having lower science self-efficacy. This was confirmed by students during the focus group interviews. For example, one boy commented that the level of his confidence ‘depends on the topic, really’. However, most girls stated that it was the relationships in the classroom that were a large predictor of their confidence, as these excerpts reveal:
If you’ve got people around you that you trust, you feel confident. It’s better in forms
because you get to know the people a lot more but if you’re in Science, you don’t really
know them as well so you don’t know what they’re going to say. Sometimes you think that if
you say something it might go around the whole school. (Student E)
Girls are more worried about what people think and what they’re going to say. (Student F)
For girls, some people laugh and you don’t feel comfortable or know why. In Science, girls
worry about themselves because everyone frets about tying their hair up [for practical
work]…you don’t really want other people looking at you with your hair tied up because it
makes you feel awkward. (Student E)
As Science classes in this school are set by prior attainment and not by form, this may
explain the lower levels of self-efficacy for girls in the SLSE instrument. Furthermore, the
worries of girls about their appearance during practical work could account for their lower
mean score in the ‘Practical Work’ dimension of the instrument, in relation to boys. However,
in another interview with a student, she explained that she did not feel worried about what
others thought of her answers and felt quite confident in Science because she felt she
understood quite a lot of it. The following statement highlights this:
I wouldn’t be the only one who didn’t understand… [when asked if she ever worried about
answering questions in class]. I feel pretty confident because most of time, I understand the
things you are teaching us. I feel much more confident in school now because we’re in our last
term…we’ve been here for longer. (Student A)
This student was then asked a number of questions about whether she personally thought
there was a disparity between boys’ and girls’ confidence in science, and whether certain
factors might predict a person’s confidence. She stated that it depended on the type of person
they were, together with their knowledge and enjoyment of science. Furthermore, she thought
that whether the person had siblings or not could be a factor affecting a person’s confidence, as these comments suggest:
Not necessarily. I think it depends on the person and how much they enjoy science and how
much they know. I think if you enjoy it more, you’re more confident because if you enjoy it,
you are more relaxed. (Student A)
You could be more confident if you are younger because you know your brother or sister has
done it before and you can ask them if you are unsure…I’ve got an older brother in year 8 and
I know that for Maths or Science, I can ask him if he knows the answers to help me. (Student
A)
Figure 3. Boys’ and Girls’ Mean Scores in the Science Learning Self-Efficacy Instrument
CU = Conceptual Understanding, HOCS = Higher-Order Cognitive Skills, PW = Practical Work,
EA = Everyday Application, SC = Science Communication. Boys = dark blue; Girls = light blue
3.2.3. A strong correlation was found between girls’ SLSE score and both matched and
mismatched (false negative) responses
To understand the relationship between the students’ self-efficacy in learning science
and their perceptions of understanding, Pearson correlation analysis based on their responses
to the SLSE and test was performed. As shown in Table 6, the three measures of perceived understanding (i.e. matched, mismatched (red + correct) and mismatched (green + incorrect)) were related to all mean self-efficacy scores of the SLSE instrument (i.e. whole class, boys and girls), suggesting weak (the boys factor), medium (the whole class factor) and large (the girls factor) effect-size coefficients (Cohen, 1992).
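The weak/medium/large labels used here follow Cohen’s (1992) conventional thresholds for the magnitude of r; a minimal sketch of that labelling:

```python
# Cohen's (1992) conventional effect-size bands for a correlation
# coefficient: |r| >= 0.5 large, |r| >= 0.3 medium, otherwise weak.
def effect_size_label(r):
    r = abs(r)
    if r >= 0.5:
        return "large"
    if r >= 0.3:
        return "medium"
    return "weak"
```
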
Table 6. Correlation of the students’ science learning self-efficacy and their perceptions of
understanding
Mean Self-Efficacy Score | Matched | Mismatched (Red + Correct) | Mismatched (Green + Incorrect)
Whole Class | 0.13 (weak) | -0.42 (medium) | 0.08 (weak)
Boys | -0.04 (weak) | -0.13 (weak) | 0.12 (weak)
Girls | 0.57 (large) | -0.75 (large) | -0.13 (weak)
From these results, the strongest positive correlation was between girls’ mean self-efficacy score and matched responses (Fig 4). From this graph, it is clear that as girls’ self-efficacy in science increases, the percentage of responses with a match of perceived and actual understanding also increases. This is similar to the results of previous studies, whereby higher-scoring students with higher self-efficacy were more accurate at predicting their examination scores than lower-scoring students (Hacker et al., 2000).
The strongest negative correlation was between girls’ mean self-efficacy score and
mismatched (red + correct) responses (Fig 5). These findings imply that in general, as girls’
self-efficacy in science increases, they are more likely to know when they do and do not
understand something. Furthermore, as girls’ self-efficacy increases, they are also less likely
to think they are incorrect when they are actually correct.
Figure 4. Correlation of the girls’ science learning self-efficacy and their matched
perceptions of understanding
(Figure 4 scatter: girls’ mean SLSE score against matched responses (%); R² = 0.3224.)
Figure 5. Correlation of the girls’ science learning self-efficacy and their mismatched (false
negative) perceptions of understanding
4. Conclusions and Implications
The purpose of this study was to investigate whether students’ perception of their
understanding matched their actual understanding and whether students’ perception of their
understanding was influenced by their self-efficacy. In response to the first research question,
the results indicate that overall, girls’ perception of their understanding was more accurate
than boys’. Furthermore, girls were more likely to believe that they had an incorrect answer
when they were actually correct and boys were more likely to believe that they had a correct
answer when they were actually incorrect. In response to the second question, students scored
highest on the ‘Practical Work’ dimension of the SLSE instrument; boys had a higher mean
score in every dimension of the SLSE instrument and a strong correlation was found between
girls’ SLSE score and both matched and mismatched (false negative) responses. It is
important to note that cognitive appraisal of a situation might affect expectations of personal
efficacy. Factors such as social, situational and temporal circumstances all influence the
micro-analysis of perceived coping capabilities that represent self-efficacy. It is not simply
down to personality traits (Bandura, 1977).
(Figure 5 scatter: girls’ mean SLSE score against mismatched (false negative) responses (%); R² = 0.5661.)
4.1. Conclusions and their wider significance
Developing reliable and valid assessment tools of student performance that accurately
indicate student learning is difficult. One feedback tool, developed by Gardner-Medwin et al. at University College London, has been to use confidence-based marking (CBM). This
methodology measures a learner’s knowledge quality by determining both the correctness of
the learner’s knowledge and confidence in that knowledge (Gardner-Medwin, 2006). With
this method, students assign each answer a confidence rating of low (1), medium (2) or
high (3), reflecting how sure they are of their knowledge. A correct answer earns marks
equal to the rating (1, 2 or 3); a wrong answer at those ratings scores 0, -2 or -6
respectively. The scheme uses negative, graded marking for the
upper two confidence levels with the relative cost of a wrong answer increasing at higher
confidence levels. This gradation ensures that the scoring scheme is properly motivating
(Gardner-Medwin & Gahan, 2003). Assessment by CBM is a simple, valid and reliable
method for challenging students to think discriminately (Barr & Burke, 2013).
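The CBM marking rules just described can be expressed as a short sketch. The two helper functions are illustrative, not part of Gardner-Medwin's published materials; the second one shows why the graded penalties reward honest confidence:

```python
# Marks for a wrong answer at confidence levels 1, 2 and 3 (Gardner-Medwin's scheme).
PENALTIES = {1: 0, 2: -2, 3: -6}

def cbm_score(confidence: int, correct: bool) -> int:
    """Correct answers earn the confidence level (1, 2 or 3);
    wrong answers score 0, -2 or -6 respectively."""
    if confidence not in (1, 2, 3):
        raise ValueError("confidence must be 1, 2 or 3")
    return confidence if correct else PENALTIES[confidence]

def best_confidence(p: float) -> int:
    """Confidence level that maximises the expected mark for a student who
    believes their answer is correct with probability p. Level 2 pays off
    only when p > 2/3, and level 3 only when p > 4/5, so honest reporting
    is the optimal strategy."""
    expected = {c: p * c + (1 - p) * PENALTIES[c] for c in (1, 2, 3)}
    return max(expected, key=expected.get)
```

For example, a student who is only 50% sure maximises their expected mark by declaring low confidence, while declaring high confidence only pays off above 80% certainty; this is what makes the scheme 'properly motivating'.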
4.2. Implications for research
Given that this study appears to be the first to examine the relationship between self-
efficacy and perception of understanding, future research is needed on students from different
grade-levels, schools and geographical areas to generalise beyond this sample. Furthermore,
due to the small sample size, future studies should investigate the present relationship with a
larger cohort to obtain greater reliability and precision (Biau, 2008). Previous research
looking into the link between students’ SLSE and their approaches to learning science found
that students’ deep strategies and deep motive were strong predictors of their SLSE. Future
studies could explore the associations of underlying learning variables, such as conceptions,
approaches, self-efficacy, motivation and outcomes to build a more elaborated model of these
relationships.
4.3. Implications for practice
For students to become better assessors of their understanding, educators should aim
to improve self-efficacy amongst students. In a study by Zimmerman et al. (1996), students
were asked to predict their efficacy before undertaking an assignment or test and then later
graph those judgements alongside their actual scores. Once students could visually see the
dissociation between their predicted and actual scores, their accuracy in subsequent self-efficacy
judgements improved. Furthermore, this self-evaluating task was also shown to help students
improve their studying methods and academic achievement (Campillo et al., 1999).
By using such tasks, teachers could not only help students develop stronger and more
accurate self-assessment, but also increase their self-efficacy and foster more autonomous,
independent learners.
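The self-monitoring exercise described above amounts to a simple calculation; the sketch below is an illustrative reading of the task (assuming predicted and actual scores on the same scale), not code from Zimmerman et al.:

```python
def efficacy_bias(predicted, actual):
    """Mean signed gap between predicted and actual scores on the same
    scale: positive values indicate overconfidence, negative values
    under-confidence, and values near zero well-calibrated judgements."""
    if len(predicted) != len(actual) or not predicted:
        raise ValueError("need one prediction per actual score")
    return sum(p - a for p, a in zip(predicted, actual)) / len(predicted)
```

A student who predicted 80% and 70% but scored 60% and 70% would see a bias of +10 points, making the dissociation between predicted and actual performance visible at a glance.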
References
Aikenhead, G. (1996) Science Education: Border Crossing into the Subculture of Science,
Studies in Science Education, 27, 1-52.
Andrade, H. (2010) Students as the definitive source of formative assessment: academic self-
assessment and the self-regulation of learning. In Andrade, H. L. & Cizek, G. J. (Eds.)
Handbook of Formative Assessment, 90-105. New York: Routledge.
Ball, S. (2003) The teacher’s soul and the terrors of performativity, Journal of Educational
Policy, 18 (2), 215-228.
Bandura, A. (1977) ‘Self-efficacy: toward a unifying theory of behavioural change’,
Psychological Review 84, 191-215.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, NJ: Prentice Hall.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.
Barr, D. A. and Burke, J. R. (2013) ‘Using confidence-based marking in a laboratory setting:
A tool for student self-assessment and learning’, The Journal of Chiropractic Education 27: 1
Ben-Zvi, R. et al. (1986) Is an atom of copper malleable? Journal of Chemical Education,
63, 64-66.
BERA (2011) Ethical guidelines for educational research, Southwell, Notts.: British
Educational Research Association.
Biau, D. (2008) Statistics in Brief: The Importance of Sample Size in the Planning and
Interpretation of Medical Research, Clinical Orthopaedics and Related Research, 466(9),
2282-2288.
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in
Education: Principles, Policy and Practice, 5(1), 7-74.
Blatchford, P. (1997). Students’ self assessment of academic attainment: Accuracy and
stability from 7 to 16 years and influence of domain and social comparison group.
Educational Psychology, 17(3), 345-360.
Brown, G. & Harris, L. (2013) ‘Student self-assessment’, In J. H. McMillan (Ed.), SAGE
Handbook of Research on Classroom Assessment, 367-393, Los Angeles: SAGE.
Brown, G. et al. (2009). Use of interactive-informal assessment practices: New Zealand
secondary students’ conceptions of assessment. Learning & Instruction, 19(2), 97-111.
Brown, G. & Harris, L. (2013) ‘Opportunities and obstacles to consider when using peer- and
self-assessment to improve student learning: Case studies into teachers' implementation’,
Teaching and Teacher Education, 36, 101-111
Britner, S. L., & Pajares, F. (2006). Sources of science self-efficacy beliefs of middle school
students. Journal of Research in Science Teaching, 43, 485– 499.
Borman, G. & Overman, L. (2004) Academic resilience in mathematics among poor and
minority students. The Elementary School Journal, 104, 177– 195
Campbell, B. (1994) Science: the Salters’ Approach – a case study of the process of large-
scale curriculum development. Science Education, 78 (5), 415-447.
Campillo, M. et al. (1999) Enhancing academic study skill, self-efficacy, and achievement
through self-regulatory training, Paper presented at the annual meeting of the American
Psychological Association, Boston, MA.
CERI (2008) Assessment for Learning: Formative Assessment, Available from:
http://www.oecd.org/site/educeri21st/40600533.pdf (Accessed 15 Apr 2014)
Chang, H.-P. et al. (2011) The Development of a Competence Scale for Learning Science:
Inquiry and Communication, International Journal of Science and Mathematics Education, 9,
1213-1233.
Cobern, W. (1994) ‘Worldview theory and conceptual change in science education’, Paper
presented to the annual meeting of the National Association for Research in Science
Teaching, Anaheim, CA.
Cohen, J. (1992). A power primer. Psychological Bulletin, 112, 155–159
Denscombe, M. (2010) The Good Research Guide: For Small-Scale Social Research Projects,
Fourth Edition. Open University Press: McGraw Hill Education
Driver, R. et al. (1994) Constructing scientific knowledge in the classroom. Educational
Researcher, 23 (7), 5-12.
Dunning, D., et al. (2004). Flawed self-assessment: implications for health, education, and
the workplace. Psychological Science in the Public Interest, 5(3), 69-106.
Duschl, R. (2008) ‘Science education in three-part harmony: Balancing conceptual,
epistemic, and social learning goals. Review of Research in Education, 32, 268–291.
Dlamini, B. et al. (1996) Liked and disliked learning activities: responses of Swazi students
to science materials with a technological approach, Research in Science and Technological
Education, 14 (2), 221-235.
Ehrlinger, J. et al. (2008) Why the unskilled are unaware: Further explorations of (absent)
self-insight among the incompetent. Organisational Behaviour and Human Decision
Processes, 105, 98-121.
Fast, L. et al. (2010). Does math self-efficacy mediate the effect of the perceived classroom
environment on standardized math test performance. Journal of Educational
Psychology, 102, 729– 740.
Gardner-Medwin, A. R. (2006) ‘Confidence-based marking - towards deeper learning and
better exams’. In Bryan, C. & Clegg, K. (Eds.) Innovative Assessment in Higher Education.
London: Routledge, Taylor and Francis Group, 141-149.
Gardner-Medwin A.R., Gahan M. (2003) ‘Formative and Summative Confidence-Based
Assessment’, Proceedings of the 7th International CAA Conference, Loughborough
University, UK, 147-155 ( www.caaconference.com).
Gravill, J. et al. (2002). Metacognition and IT: the influence of self-efficacy and self-
awareness. Paper presented in the meeting of eighth Americas Conference on Information
Systems, Dallas, TX.
Griffiths, A. & Preston, K. (1989) Paper presented at the National Association for Research
in Science Teaching.
Hacker, D. et al. (2000) Test prediction and performance in a classroom context, Journal of
Educational Psychology, 92, 160-170.
Hackett, G. & Betz, N. (1996) An exploration of the mathematics self-efficacy/mathematics
performance correspondence, Journal for Research in Mathematics Education, 20, 261– 273.
Hoffstein, A. (1988) ‘Practical work and scientific investigation II’, In Development and
dilemmas in science education, Chapter 10.
Krajcik, J. (1990) In Glynn, S., Yeany, R. & Britton, B. (Eds.), The Psychology of Learning
Science. Hillsdale, NJ: Erlbaum.
Lavonen, J. & Laaksonen, S. (2009) Context of Teaching and Learning School Science in
Finland: Reflections on PISA 2006 Results, Journal of Research in Science Teaching, 46(8),
922-944.
Lent, R. & Hackett, G. (1987) Career self-efficacy: Empirical status and future directions,
Journal of Vocational Behavior, 30, 347–382.
Lin, T.-J. and Tsai, C.-C. (2013) ‘An investigation of Taiwanese high school students’
science learning self-efficacy in relation to their conceptions of learning science’, Research in
Science & Technological Education, 31(3), 308-323.
Lin, T.-J. and Tsai, C.-C. (2013) ‘A multi-dimensional instrument for evaluating Taiwanese
high school students’ science learning self-efficacy in relation to their approaches to learning
science’, International Journal of Science and Mathematics Education 11, 1275-1301
Lin, T.-J., et al. (2013) ‘A Cross-Cultural Comparison of Singaporean and Taiwanese Eighth
graders’ Science Learning Self-Efficacy from a Multidimensional Perspective.’ International
Journal of Science Education 35: 1083–1109.
Lind, D. et al. (2002) Competency-based student self-assessment on a surgery rotation,
Journal of Surgical Research; 105: 31–4.
Liem, A. et al. (2008). The role of self-efficacy, task value, and achievement goals in
predicting learning strategies, task disengagement, peer relationship, and achievement
outcome. Contemporary Educational Psychology, 33, 486–512.
Liu, M., et al. (2006) ‘Middle school students’ self-efficacy, attitude, and achievement in a
computer enhanced problem-based learning environment. Journal of Interactive Learning
Research, 17(3), 225-242.
Lubben, F. et al. (1996) Contextualising science teaching in Swaziland: some student
reaction. International Journal of Science Education, 18(3), 311-320.
Lundeberg, M. et al. (1994) Highly confident but wrong: Gender differences and similarities
in confidence judgments. Journal of Educational Psychology, 86, 114–121.
OECD (2005) ‘Formative Assessment: Improving Learning in Secondary Classrooms’,
Available from: http://www.oecd.org/edu/ceri/35661078.pdf (Accessed 15 Apr 2014)
OECD. (2007a). PISA 2006: Science Competencies for Tomorrow’s World, Volume 1:
Analysis. Paris: OECD.
OECD. (2007b). PISA 2006: Volume 2: Data. Paris: OECD.
McEnery, J. & Blanchard, P. (1999) Validity of multiple ratings of business student
performance in a management simulation, Human Resource Development Quarterly, 10(2),
155-172.
Miller, T. & Geraci, L. (2011) ‘Unskilled but Aware: Reinterpreting Overconfidence in Low-
Performing Students’, Journal of Experimental Psychology: Learning, Memory, and
Cognition, 37(2), 502–506
Moffat, K. et al. (2004) First year medical student stress and coping in a problem-based
learning medical curriculum, Medical Education, 38, 482–491.
Novick, S. & Nussbaum, J. (1981) Pupils’ Understanding of the Particulate Nature of Matter:
A Cross-Age Study, Science Education, 65(2), 87-196.
Nakhleh, M. (1992) Why some students don’t learn chemistry: Chemical misconceptions,
Journal of Chemical Education, 69(3), 191-195.
Organisation for Economic Co-operation and Development (1999) Measuring student
knowledge and skills: A new framework for assessment. Paris: Author.
Pajares, F. (1996) Self-Efficacy Beliefs in Academic Settings, Review of Educational
Research, 66(4), 543-578.
Pajares, F. (1996) Self-Efficacy Beliefs and Mathematical Problem-Solving of Gifted
Students, Contemporary Educational Psychology, 21, 325-344.
Rees, C. & Shepherd, M. (2005) Students’ and assessors’ attitudes towards students’ self-
assessment of their personal and professional behaviours, Medical Education, 39, 30-39.
Ross, J. A., et al. (2002) Self-Evaluation in grade 11 mathematics: Effects on achievement
and student beliefs about ability. In D. McDougall (Ed.), OISE Papers on Mathematical
Education. Toronto: University of Toronto
Ross, J. A. (2006). The reliability, validity, and utility of self-assessment. Practical
Assessment, Research, and Evaluation, 11(10), 1-13.
Ross, J. et al. (1998) Skills training versus action research in-service: impact on student
attitudes to self-evaluation, Teaching and Teacher Education, 14 (5), 463–477
Ross, J. et al. (1999) Effect of self-evaluation on narrative writing. Assessing Writing, 6(1),
107-132.
Rust, C. et al. (2010) Improving Students' Learning by Developing their Understanding of
Assessment Criteria and Processes, Assessment and Evaluation in Higher Education, 28(2),
147-164.
Spiegelman, D. (2010) Commentary: Some remarks on the seminal 1904 paper of Charles
Spearman ‘The Proof and Measurement of Association between Two Things’, International
Journal of Epidemiology, 39(5), 1156-1159.
Sungur, S. (2007) ‘Modelling the relationships among students’ motivational beliefs,
metacognitive strategy use, and effort regulation’, Scandinavian Journal of Educational
Research, 51, 315–326.
Smith, M. L. (2006) ‘Multiple methods in education research’. In Green, J. L. et al. (Eds.),
Handbook of complementary methods in education research, LEA, Mahwah, NJ, 457–475
Stamovlasis, D. et al. (2005) A study of group interaction processes in learning lower
secondary physics. Journal of Research in Science Teaching, 43(6), 556-576.
Griggs, M. et al. (2013) The Responsive Classroom Approach and Fifth Grade Students'
Math and Science Anxiety and Self-Efficacy, School Psychology Quarterly, 28(4), 360-373.
Taber, K. (2013) Classroom-based Research and Evidence-based Practice: A Guide for
Teachers, London: SAGE Publications.
Topping, K. (2003) Self and peer assessment in school and university: reliability, validity and
utility. In Segers, M., Dochy, F. & Cascallar, E. (Eds.) Optimising News Modes of
Assessment: In search of qualities and standards, 55-87, Dordrecht, NL: Springer
Netherlands
Tsai, C.-C. (2003). Taiwanese science students’ and teachers’ perceptions of the laboratory
learning environments: Exploring epistemological gaps. International Journal of Science
Education, 25, 847–860.
Tsai, C.-C. et al. (2011) ‘Scientific epistemic beliefs, conceptions of learning science and self-
efficacy of learning science among high school students’, Learning and Instruction, 21, 757-
769.
Usher, E. L. & Pajares, F. (2006) ‘Sources of academic and self-regulatory efficacy beliefs of
entering middle school students’, Contemporary Educational Psychology, 31, 125–141.
Valijarvi, L. et al. (2007) The Finnish success in PISA—and some reasons behind it 2.
Jyvaskyla: Institute for Educational Research
Vygotsky, L. (1978) Mind in society: The development of higher psychological processes.
Cambridge, MA: Harvard University Press.
Weiner, B. (1972) Theories of motivation. Chicago: Markham.
Wilson, E. (2013) School-based research: a guide for education students, London: SAGE
Publications
Zins, J. & Elias, M. (2006) Social and emotional learning. In G. G. Bear & K. M. Minke
(Eds.), Children’s needs III: Development, prevention, and intervention, 1-13. Bethesda,
MD: National Association of School Psychologists.
Zimmerman, B. (1996) ‘Measuring and mismeasuring academic self-efficacy: Dimensions,
problems, and misconceptions.’ Symposium presented at the Annual Meeting of the
American Educational Research Association, New York.
Appendix 1. Prompt questions for interviews
Self-Assessment
• Did marking how confident you were in each question change your understanding?
• With self-assessment in the classroom, how do you feel about how other people might
think or act towards you?
• How accurately do you feel you were able to assess yourselves?
• Do you prefer marking your own work or having the teacher mark it? Why?
• When you were marking your test questions with red, amber or green, do you think
you played it safe instead of taking a risk (i.e. put red or amber to be on the safe side)
or do you think you took a risk and were over-confident (i.e. put a green)?
• How confident did you have to be, as a percentage, to award yourself green for a
question, as opposed to amber or red?
Self-Efficacy in School
• Does your confidence in subjects affect how high you set your goals or how much
effort you put in?
• Does your confidence depend on how anxious you are in class?
• How confident do you generally feel in school? Is this affected by the people in your
class? By the time of day? By different subjects?
Self-Efficacy in Science
• How confident do you generally feel in science lessons?
• Do you think self-confidence in science differs between genders? If so, why?
• When we do practical work in class, do you think your understanding of science
changes?
• Does practical work alter your interest in science?
• Do you ever relate school science to your everyday lives?
• When you think about whether you know something or not, do you feel this is
affected by how confident you are in science?
Appendix 2. Example Science Learning Self-Efficacy Questionnaire with Answers – James
Example Science Learning Self-Efficacy Questionnaire with Answers – Alicja
Appendix 3. Example Test Paper 1 with Answers and Confidence Indicators – James
Example Test Paper 1 with Answers and Confidence Indicators – Alicja
Appendix 4. Example Test Paper 2 with Answers and Confidence Indicators – James
Example Test Paper 2 with Answers and Confidence Indicators – Alicja
More Related Content

Similar to 1c assignment

Fostering Autonomy, Purpose, and Competence in Math
Fostering Autonomy, Purpose, and Competence in MathFostering Autonomy, Purpose, and Competence in Math
Fostering Autonomy, Purpose, and Competence in Math
Jordan Yoshihara
 
Effects of Multiple Intellgences on Academic Education
Effects of Multiple Intellgences on Academic EducationEffects of Multiple Intellgences on Academic Education
Effects of Multiple Intellgences on Academic Education
Quinn Collor
 
Mini Grant Proposal
Mini Grant ProposalMini Grant Proposal
Mini Grant Proposal
Katherine Feliciano
 
Grading Ethical Issues
Grading Ethical IssuesGrading Ethical Issues
Language testing and evaluation
Language testing and evaluationLanguage testing and evaluation
Language testing and evaluation
esra66
 
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
William Kritsonis
 
Arrrsa mid sem sample test anxiety
Arrrsa   mid sem sample test anxietyArrrsa   mid sem sample test anxiety
Arrrsa mid sem sample test anxiety
Hafizul Mukhlis
 
report in chem
report in chemreport in chem
report in chem
sealdrago02
 
Stress 18
Stress 18Stress 18
Stress 18
Priya Anand
 
Action Research
Action ResearchAction Research
Action Research
David Gebler
 
Action Research Final
Action Research FinalAction Research Final
Action Research Final
Donald Lance
 
Academic performance mapping traits of engineering students
Academic performance mapping traits of engineering studentsAcademic performance mapping traits of engineering students
Academic performance mapping traits of engineering students
Alexander Decker
 
1Methodology AssignmentParticipantProcedures
1Methodology AssignmentParticipantProcedures1Methodology AssignmentParticipantProcedures
1Methodology AssignmentParticipantProcedures
AnastaciaShadelb
 
G11-2Describe how a change in the exchange rate affected your fi
G11-2Describe how a change in the exchange rate affected your fiG11-2Describe how a change in the exchange rate affected your fi
G11-2Describe how a change in the exchange rate affected your fi
JeanmarieColbert3
 
Mathematics instruction for secondary students with learning disabilities
Mathematics instruction for secondary students with learning disabilitiesMathematics instruction for secondary students with learning disabilities
Mathematics instruction for secondary students with learning disabilities
pschlein
 
Ez35875879
Ez35875879Ez35875879
Ez35875879
IJERA Editor
 
A study on academic anxiety among adolescents of minicoy island
A study on academic anxiety among adolescents of minicoy islandA study on academic anxiety among adolescents of minicoy island
A study on academic anxiety among adolescents of minicoy island
International Journal of Science and Research (IJSR)
 
Self-efficacy in Instructional Technology Contexts
Self-efficacy in Instructional Technology ContextsSelf-efficacy in Instructional Technology Contexts
Self-efficacy in Instructional Technology Contexts
Georgia Southern University
 
Vela1
Vela1Vela1
Diagnosis: The Missing Ingredient from RTI
Diagnosis: The Missing Ingredient from RTIDiagnosis: The Missing Ingredient from RTI
Diagnosis: The Missing Ingredient from RTI
rathx039
 

Similar to 1c assignment (20)

Fostering Autonomy, Purpose, and Competence in Math
Fostering Autonomy, Purpose, and Competence in MathFostering Autonomy, Purpose, and Competence in Math
Fostering Autonomy, Purpose, and Competence in Math
 
Effects of Multiple Intellgences on Academic Education
Effects of Multiple Intellgences on Academic EducationEffects of Multiple Intellgences on Academic Education
Effects of Multiple Intellgences on Academic Education
 
Mini Grant Proposal
Mini Grant ProposalMini Grant Proposal
Mini Grant Proposal
 
Grading Ethical Issues
Grading Ethical IssuesGrading Ethical Issues
Grading Ethical Issues
 
Language testing and evaluation
Language testing and evaluationLanguage testing and evaluation
Language testing and evaluation
 
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
Borgemenke, arthur j examining recurring critical events schooling v5 n1 2014
 
Arrrsa mid sem sample test anxiety
Arrrsa   mid sem sample test anxietyArrrsa   mid sem sample test anxiety
Arrrsa mid sem sample test anxiety
 
report in chem
report in chemreport in chem
report in chem
 
Stress 18
Stress 18Stress 18
Stress 18
 
Action Research
Action ResearchAction Research
Action Research
 
Action Research Final
Action Research FinalAction Research Final
Action Research Final
 
Academic performance mapping traits of engineering students
Academic performance mapping traits of engineering studentsAcademic performance mapping traits of engineering students
Academic performance mapping traits of engineering students
 
1Methodology AssignmentParticipantProcedures
1Methodology AssignmentParticipantProcedures1Methodology AssignmentParticipantProcedures
1Methodology AssignmentParticipantProcedures
 
G11-2Describe how a change in the exchange rate affected your fi
G11-2Describe how a change in the exchange rate affected your fiG11-2Describe how a change in the exchange rate affected your fi
G11-2Describe how a change in the exchange rate affected your fi
 
Mathematics instruction for secondary students with learning disabilities
Mathematics instruction for secondary students with learning disabilitiesMathematics instruction for secondary students with learning disabilities
Mathematics instruction for secondary students with learning disabilities
 
Ez35875879
Ez35875879Ez35875879
Ez35875879
 
A study on academic anxiety among adolescents of minicoy island
A study on academic anxiety among adolescents of minicoy islandA study on academic anxiety among adolescents of minicoy island
A study on academic anxiety among adolescents of minicoy island
 
Self-efficacy in Instructional Technology Contexts
Self-efficacy in Instructional Technology ContextsSelf-efficacy in Instructional Technology Contexts
Self-efficacy in Instructional Technology Contexts
 
Vela1
Vela1Vela1
Vela1
 
Diagnosis: The Missing Ingredient from RTI
Diagnosis: The Missing Ingredient from RTIDiagnosis: The Missing Ingredient from RTI
Diagnosis: The Missing Ingredient from RTI
 

1c assignment

  • 1. Jane Chapman PGCE Secondary 1 A CRITIAL ANALYSIS OF THE MATCH OR MISMATCH BETWEEN THE PERCEIVED AND ACTUAL UNDERSTANDING OF YEAR 7 STUDENTS, STUDYING PARTICLE SOLUTIONS, WHEN ENGAGING IN SELF-ASSESSMENT LEARNING ACTIVITIES Introduction This study aims to investigate the match or mismatch between the perceived and actual understanding of students, when engaging in self-assessment. Educators frequently use self-assessment to assess student understanding in order to identify and respond to their needs. However, it is important to question how well students know what they know. It has been illustrated that people often make self-assessment errors, and when they do, they are often overconfident. An issue which needs to be addressed with this research is how best we can measure student understanding. In this study, students will be investigated individually, based on tests, confidence scorings and questionnaires. Interviews will further be used to examine the effect of self-efficacy on student understanding. The context of this enquiry is a coeducational school in Cambridgeshire, England. The chosen class is a high-achieving cohort of 29 students in a Science class. This paper will first give a background on the previous research done on this topic and then explain the methodological approach taken to address the current research questions. Later, findings will be analysed and finally, conclusions will be made, with implications of this study. 1. Literature review 1.1. Self-Assessment for Learning Effective assessment by schools and teachers not only needs to measure student progress, but also identify their learning needs and respond to them. Forms of ‘summative’ assessment, such as tests and examinations are a classic way to measure student progress, in addition to making schools and the education system accountable (Ball, 2003). However, to be truly effective, there must also be ‘formative’ assessment. 
In classrooms, this includes the teacher making frequent and interactive assessments of the students’ understanding in order
  • 2. Jane Chapman PGCE Secondary 2 to identify and respond to their needs. This informs future teaching as it allows the teacher to adapt to the changing needs of the students. Teachers should also involve students actively, encouraging them to develop skills that better aid their learning. Formative assessment is known to be highly effective in raising the standards of student achievement, gains which are ‘among the largest ever reported for educational interventions’ (Black & Wiliam, 1998). In addition, formative assessment methods may also promote ‘greater equity of student outcomes’, as teachers ‘adjust methods to recognise individual, cultural, and linguistic differences between children’ (CERI, 2008). Furthermore, this type of assessment also builds students’ ‘learning how to learn’ skills by actively involving students in the process of teaching and learning, helping them understand their own learning and building students’ skills for peer- and self-assessment. In primary and secondary schooling, self-assessment has been shown to improve student communication skills, engage and empower students, enhance their self-regulation and metacognition, and create better understandings of the criteria used to evaluate students’ work (Andrade, 2010; Topping, 2003). Reliability of self-assessments is typically high, demonstrated by a study of 11-12 year old students rating their performance in mathematical problem solving (Ross et al., 2002) and self-assessments in English (Ross et al., 1999). However, there has been shown to be less consistency over longer time periods, particularly involving younger children (Blatchford, 1997). Evidence about the concurrent validity of self-assessments is mixed. In general, student self-assessments are higher than teacher ratings. Furthermore, a study comparing student self-assessment to standardised tests found that age moderated the relationship. 
Self-assessment was correlated with achievement at age 16 but not at age 7 (Blatchford, 1997). It should be taken into consideration that any form of self-assessment that takes place in a public space may trigger threats to psychological safety and interpersonal relationships (Brown & Harris, 2013). Furthermore, many students have doubts about their ability to assess themselves (Brown et al., 2009) and there is evidence to suggest that school students are relatively inaccurate assessors (Ross, 2006). One study of 23 Canadian primary and secondary classrooms found that although students appreciated self-assessment, there were concerns over possible cheating and inaccuracy (Ross et al., 1998). Additionally, a New Zealand study of self-assessment reported that students preferred more traditional teacher- controlled assessments, a belief reinforced by school grading and reporting methods. The same study also proposed that students in high-stakes environments for educational assessments such as the UK Key Stage testing may be more likely to resist self-assessment
  • 3. Jane Chapman PGCE Secondary 3 because their assessment experiences have not allowed them to appraise their own evaluations of their work. This is highlighted by the student response: “my teacher’s judgement matters more than mine” (Brown & Harris, 2013). To improve the accuracy of self-assessment and improve student confidence in their evaluations, school students need support, direction and teacher involvement for self-assessment to work effectively (Dunning et al., 2004). One commonly used self-assessment practice is the Traffic Light technique, developed out of the King’s-Medway-Oxfordshire Formative Assessment Project in England. This popular method involves students holding up a green, amber or red sign to highlight whether they understand, think they understand but are not quite sure, or do not understand a certain concept. Teachers would then spend more time with students who held up amber or red (OECD, 2005). This ‘assessment for learning’ technique can also be used by students to label their work, indicating how confident they are of their success. However, it is important to question how well students know what they know. It has been illustrated that people often make self-assessment errors, and when they do, they are frequently overconfident. For example, in Hacker et al. (2000), many students predicted they would receive examination scores greater than 30% higher than their actual scores. This overconfidence effect was greatest for people with lower abilities. Moreover, the same study reports that higher-scoring students were more accurate at predicting their examination scores than lower-scoring students. The reason for this metacognitive inaccuracy is debated. The leading interpretation is that lower ability students lack awareness of the knowledge that they do and do not possess (Ehrlinger, 2008). 
However, a study testing this theory found that low-performing students were less subjectively confident in their predictions than high-performing students, implying low-performers are aware of their ineptitude. This literature demonstrates dissociation between metacognitive ability and awareness of this ability (Miller & Geraci, 2011). 1.2. Self-efficacy Self-efficacy is one of the essential components of Bandura’s (1977) social cognitive theory. He identified that behaviour could be affected by self-efficacy theory – the belief that a person can successfully do whatever is required to achieve a desired outcome. Key factors which affect a person’s efficacy expectations are; vicarious experiences (seeing other people doing something successfully), verbal persuasion (being told that you can do something) and emotional arousal (high levels of anxiety can reduce a person’s self-efficacy). Furthermore,
contextual factors such as social, situational and temporal circumstances might also affect expectations of personal efficacy (Weiner, 1972). In the past several decades, studies have shown that students’ motivation, cognition and actual performance are strongly influenced by self-efficacy (Sungur, 2007; Usher & Pajares, 2006). In general, students with higher levels of self-efficacy have been found to set higher goals, adopt flexible and varied learning strategies, exert greater effort to complete academic tasks and obtain better academic performance levels (Liem et al., 2008). In contrast, students with low self-efficacy tend to avoid tasks they deem to be beyond their capabilities (Lin & Tsai, 2013). In the past, when determining the relationship between self-belief and outcome, there has often been an incorrect judgement of self-efficacy (Zimmermann, 1996). This has been due to self-efficacy beliefs not being assessed at the level of specificity that corresponds to the specific task being studied. General self-efficacy assessments are thought to transform beliefs of self-efficacy into an indiscriminate personality trait instead of the context-specific judgement Bandura suggests they are. Bandura (1986) proposed that judgements of self-efficacy should be consistent with the domain of the task being investigated. An example of this would be a mathematics self-efficacy instrument used to investigate the confidence students had of succeeding in mathematics courses, comparing this to their performance in maths-related tasks (Pajares, 1996). Furthermore, students with higher self-efficacy often report higher levels of self-knowledge judgement than students with lower self-efficacy (Gravill et al., 2002), and students who believe in their learning efficacy develop and sustain the effort needed for learning. Therefore, self-efficacy contributes to knowledge acquisition and skill development (Tsai et al., 2011).
Several studies have examined the more specific self-perceived competence in science education. Evidence highlights that students who feel more efficacious in science demonstrate higher achievement in this subject (Borman & Overman, 2004). Bandura’s theory would suggest that this may be due to student persistence, even when tasks are difficult (Bandura, 1997). The literature also implies that student academic anxiety (Britner & Pajares, 2006) and gender (i.e. being a girl; Fast et al., 2010) contribute to students having a lower science self-efficacy. Interestingly, however, another study found no gender differences in science self-efficacy (Griggs et al., 2013). One explanation for these different findings may be that the latter study controlled for science anxiety, which was greater among girls. Therefore, once anxiety was controlled for, both genders believed themselves to be similarly efficacious. Student experiences at school also play a role,
demonstrated by enhanced student self-efficacy through a focus on creating caring, emotionally supportive learning environments (Zins & Elias, 2006). As mentioned previously, the judgement of self-efficacy in predicting performance is shown to be discipline- and situation-specific. However, situational conditions do not establish perceived self-efficacy, but rather act as performance requirements for the judgement of efficacy (Bandura, 1997). Research has previously revealed the role of self-efficacy in science learning and has suggested that self-efficacy mediates people’s interpretation of their knowledge (Liu et al., 2006).

1.3. Students’ Science Learning Self-Efficacy

As mentioned previously, when considering studies that aim to explore students’ self-efficacy, the use of an instrument with general self-efficacy items would be insufficient (Pajares, 1996). It would be more appropriate to instead develop measures that can be adapted to several contexts. In the case of students’ self-efficacy in science, it should not be thought of as one global measurement, but should be separated into several distinctive aspects for more detailed study (Lin & Tsai, 2013). Science education literature has established that there are several major aspects of science learning. Duschl (2008) has highlighted that conceptual understanding of scientific knowledge, together with higher-order thinking skills such as reasoning and critical thinking, is of great importance. Practical work has been suggested to promote the development of conceptual understanding and critical thinking, together with problem-solving ability. Furthermore, practical work has also been proposed to stimulate and maintain students’ interest, attitude, satisfaction, open-mindedness and curiosity in science, promote aspects of scientific thinking and the scientific method, and allow students to develop practical abilities (Hofstein, 1988).
These all contribute to helping students learn science, learn about science and do science (Tsai, 2003). Based on the PISA 2006 survey (OECD, 2007a, b), Finnish students obtained the highest score in the Scientific Literacy Assessment among students from all OECD countries. A large-scale study looking at how they succeeded found that a robust predictor of the high results in Finland was frequent use of practical work in the classroom (Lavonen & Laaksonen, 2009). It is also important for students to be literate in science, meaning ‘to use scientific knowledge, to identify questions and to draw evidence-based conclusions in order to understand and help make decisions about the natural world’ (OECD, 1999). There are many reasons why everyday science applications should be integrated into
school science. Firstly, empirical studies have shown that using everyday contexts enhances student enjoyment (Dlamini et al., 1996), allows for conceptual development, provides teachers with an opportunity to address misconceptions (Lubben et al., 1996) and gives relevance to school science learning (Campbell et al., 1994). Furthermore, incorporating everyday science applications into school science is fundamental to students’ mastery of science learning in school (Driver et al., 1994). However, many students view ‘school science as having little or no relevance to their life-world subcultures’ (Aikenhead, 1996). When there is no bridging between school science learning and daily experiences, students may practise ‘cognitive apartheid’, referring to the isolation of knowledge systems relating to science: one for school science and one for everyday lives (Cobern, 1994). Learning, like doing science, is a social activity that takes place ‘through communication or interaction with others where ideas are constructed or shared’ (Vygotsky, 1978). Communication through discussion, argumentation, reading and writing can promote students’ constructs of understanding science (Chang et al., 2011), with studies revealing the importance of students’ interpersonal communication with adults and peers for improved learning (Stamovlasis et al., 2005). As discussed above, there are various features of science literacy, and there have been several successful studies measuring students’ science learning self-efficacy (SLSE) in conformity with these features (e.g. Baldwin et al., 1999; Lin & Tsai, 2013; Uzuntiryaki & Capa Aydin, 2009). In the research by Lin and Tsai (2013), several current SLSE instruments were collected and modified to develop their own validated ‘Science Learning Self-Efficacy’ (SLSE) instrument.
This consisted of five distinct domains (‘Conceptual Understanding’, ‘Higher-Order Cognitive Skills’, ‘Practical Work’, ‘Everyday Application’ and ‘Science Communication’) that conform to the existing notion of science literacy. Furthermore, this study also investigated the relationship between high school students’ SLSE and their approaches to learning science. Through correlation analyses, they found that students’ deep strategies and deep motives were strong predictors of their SLSE. This SLSE instrument has also been useful in a later cross-cultural study (Lin et al., 2013), and in another that revealed a significant association between students’ conceptions of learning science and their self-efficacy (Lin & Tsai, 2013). This study found that students in strong agreement with learning science as understanding and seeing in a new way are likely to possess a higher science self-efficacy than students who consider learning science in terms of preparing for tests and examinations. These studies indicate that this multi-dimensional SLSE instrument is relevant and valid for advancing current understanding in this line of SLSE research.
Science educators have explored the relationships between students’ science learning self-efficacy and both their conceptions of learning science and their approaches to learning science. However, there do not seem to be any studies exploring students’ science learning self-efficacy and their matched or mismatched actual and perceived understanding. The main purposes of this study were, first, to explore the match or mismatch between the perceived and actual understanding of high school students studying Science. Secondly, this study aimed to explore the relationship between students’ SLSE and students’ perceptions of their own understanding. Derived from the research purposes, this study addressed the following questions:
1. Does students’ perception of their understanding match their actual understanding?
2. Is students’ perception of their understanding influenced by their self-efficacy?

2. Methodology and Methods

2.1. Methodology

A case study approach was taken, one that Denscombe (2010) has characterised as a ‘focus on just one instance of the thing that is to be investigated’. Focussing on individual instances rather than many may reveal insights about the general by looking at the particular. An advantage that a case study has over a survey approach is that there is greater opportunity to explore things in more detail and discover new insights that may not have been possible with a more superficial approach. Therefore, the complexities of a given situation can also be exposed. This is useful as relationships and processes within social settings are often interrelated and interconnected. Furthermore, the case being explored is a ‘naturally occurring’ phenomenon (Yin, 2009); something that usually already exists, and is not a situation that has been artificially created for the purpose of research. This gives the case study the additional benefit of a natural setting.
The range of potential ‘cases’ is very wide, but for good case-study research it is essential for the unit to have distinct boundaries, so that the researcher is able to keep it separate from other similar things of its kind and distinct from its social context. The case study approach also requires the researcher to select one case from a wider range of examples of the type of thing being explored, based on its distinctive features.
In addition to the advantages mentioned earlier, the variety of methods used in case studies results in multiple sources of data. This facilitates validation of the data through triangulation. The major criticism of case studies relates to the credibility of generalisations made from their findings. Therefore, researchers must highlight the extent to which the case is similar and dissimilar to others of its type. The approach is also often accused of lacking a degree of rigour, and of relying on qualitative data and interpretative procedures instead of quantitative data and statistical methods. Another limitation of case studies is the problem of ‘the observer effect’. This can arise because case studies usually involve prolonged engagement over time, potentially causing those being researched to behave differently from normal, as they may be aware that they are being observed. This may be the case in the present study, given my dual role as teacher and researcher.

2.1.1. Details of the case

The present study involved 27 students in a year 7 class (around 11-12 years old) from a secondary school in the county of Cambridgeshire. Among the participants, 16 were male and 11 were female, and the class had high academic achievement. The surveyed students came from a variety of villages in the catchment area, with different socio-economic backgrounds. Permission to gather data was provided by the school administration and students were informed that the data collection process was anonymous and voluntary. The topic being studied during this research was ‘Particle Solutions’. Within this, students explored the properties of solids, liquids and gases and gained a greater understanding of the ‘Particle Model’.
Next, they applied the Particle Model theory to explain everyday phenomena such as gas pressure, expansion, diffusion and contraction, and to consider the motion, forces and closeness of particles to help explain observations obtained during practical work, such as changes of state. Later lessons aimed at students’ understanding of what mixtures are and how they can be separated, and also their understanding of the process of dissolving. Furthermore, students then had to use their knowledge about separating mixtures to obtain a sample of salt from rock salt, and to understand that salt comes from a variety of sources and has many uses. Since students build their own concepts, their constructions of chemical concepts sometimes differ from the one the teacher holds and has tried to present (Nakhleh, 1992). These misconceptions differ from commonly accepted scientific understanding and may interfere with subsequent learning. The scientifically accepted model that matter is made up
of discrete particles, with empty space between them, which are in constant motion is a concept students of all ages have trouble understanding (Novick & Nussbaum, 1981). Their study revealed that over half of the students perceived matter as a continuous medium that was static. In addition, the authors highlighted that aspects of the particulate model were differentially accepted by students. For example, the idea that liquefaction of gases involved the merging of particles was accepted by 70%, whereas only 40% accepted the idea that particles in a gaseous phase have empty spaces between them. In another study of 300 students aged 15-16, nearly half of the students believed that the properties of the substance were also properties of the individual atom (Ben-Zvi et al., 1986). This concept of the particulate nature of matter is an important foundation for understanding many chemical concepts (Krajcik, 1990). Misconceptions about the concepts of atoms and molecules have also been revealed by researchers. In a study with 17-18 year old Canadian students, half of the students in the sample believed that: molecules are much larger than in reality; molecules of the same substance may vary in size; molecules of the same substance can change shape in different phases; molecules have different weights in different phases; and atoms are alive (Griffiths & Preston, 1989).

2.2. Data Collection Methods

To allow for the examination of the actual and perceived understanding of students within its real-life context, this study compared test results with confidence indicator colours given after every question of the test, together with subject interviews. To study whether students’ perceptions of their understanding were influenced by their self-efficacy, a Science Learning Self-Efficacy (SLSE) questionnaire was administered. Further data were collected through subject interviews, student focus groups and teacher interviews.
A summary of the research questions and data sources is given below in Table 1.
Table 1. Research questions, data sources, and details of when the data was collected

Proposed methodology: case study.

Title: A critical analysis of the match or mismatch between the perceived and actual understanding of year 7 students, studying particle solutions, when engaging in self-assessment learning activities.

Research question: When engaged in self-assessment activities, what influences year 7 students’ perceptions of their own understanding?

Sub-question 1: Does students’ perception of their understanding match their actual understanding?
- Red/amber/green traffic lights immediately after every question (collected at the end of the first two lessons in the sequence).
- Test results for every question (collected at the end of the first two lessons in the sequence).
- Subject interviews, checking on those with green responses but bottom marks and red responses but top marks (collected during lunchtime, after the last lesson in the topic).

Sub-question 2: Is students’ perception of their understanding influenced by their self-efficacy?
- SLSE questionnaire (collected at the end of the third lesson in the sequence).
- Subject interviews (collected during lunchtime, after the last lesson in the topic).
- Focus groups (collected during lunchtime, after the sixth lesson in the sequence).
- Teacher interview (collected at the end of the school term).

2.2.1. Assessing Students’ Actual and Perceived Understanding

Two distinct tests were administered over two lessons. Each test consisted of six questions which aimed to assess students’ understanding of the learning objectives from the lesson they had previously done (Appendices 3 and 4). Students were asked to indicate how confident they were that their given answer was correct by using red, amber and green traffic light colours after each question, in the space provided. Tests are a useful way of collecting evidence about the knowledge and understanding of students.
However, as with all assessment data, the validity of outcomes will strongly depend upon whether the test items are actually testing the knowledge and understanding they claim to. Creating tests that are both valid and reliable is known to be difficult (Taber, 2013). For example, contextualised questions, which are meant to be less abstract and unfamiliar to the student, may complicate matters: students have to ‘process’ more information, the context may elicit ‘everyday’ ways of thinking that do not match academic
learning, and the context may be more familiar to some students than others (causing a potential gender and cultural bias) (Taber, 2003). To confirm the results of the tests, subject interviews were carried out to further analyse students who signalled green but had an incorrect answer and those who signalled red but had a correct answer (a mismatch of actual and perceived understanding). In this case, ‘structured’ interviews were carried out. Structured interviews consist of a pre-determined list of questions, asked of each respondent, to which they are given a limited set of response options. The tight control over the format of questions and answers lends itself to the advantage of ‘standardisation’. Furthermore, the selection of pre-coded answers ensures relatively easy data analysis, lending itself to the collection of quantitative data, which is useful for ‘checking’ data (Taber, 2013). Self-assessment, as mentioned before, increases student engagement in assessment tasks and is also a key factor in maintaining student attention. Self-assessment data also have the strength of providing information that is not easily determined otherwise, such as how much effort a student has made in preparing for a certain task (Ross, 2006). In addition, numerous researchers have reported a high level of reliability in self-assessment in terms of consistency across tasks (Fitzgerald et al., 2000) and over short time periods (Chang et al., 2005). These studies all involved students who had been taught how to evaluate their work effectively. A limitation of self-assessment appears to be the concern over its validity. The literature suggests that there are discrepancies between self-assessments and scores on other measures (Ross, 2006). Furthermore, student self-assessments are generally shown to be higher than teacher ratings (McEnery & Blanchard, 1999).

2.2.2.
Assessing Students’ Self-Efficacy in Learning Science

Unlike tests that aim to measure learning, questionnaires contain questions that all respondents should be able to respond to (Taber, 2013). There are strengths and limitations of the different types of items and scales used in questionnaires. Closed questions are simple to analyse but only investigate which of the offered options respondents chose. Open questions give respondents the opportunity to give answers that more closely match their own views, but these need to be categorised later to be reported in an economical way. The selection of comments can also raise questions about how representative they are of the actual data. It has been proposed that questionnaires can have increased validity and reliability if their scales do not contain a central (neutral) point, as this forces the respondent to decide on how
they feel. However, if they genuinely have neutral or mixed feelings about the statement, this may result in false claims being made. Consistency of responses should be probed by including several similar, or directly opposite, items. Another limitation of questionnaires that consist of many scale-type items is that they are known to be occasionally completed with little thought. To reduce the risk of this happening, questionnaires should aim not to contain too many items. Furthermore, they could also contain some statements that are reversed, encouraging respondents to think more carefully about each item (Taber, 2013). In general, questionnaires have the advantages of being economical, relatively easy to arrange and supplying standardised answers. As respondents are posed with identical, pre-coded questions, there is no scope for variation via face-to-face contact, with the added benefit of the data not being affected by ‘interpersonal factors’. However, there are disadvantages to pre-coded questions which should be considered. Together with being frustrating and restricting for the respondents, they could also ‘bias the findings towards the researcher’s, rather than the respondent’s, way of seeing things’ (Denscombe, 2010). A 28-item Science Learning Self-Efficacy (SLSE) instrument was adopted to assess the participants’ self-efficacy in learning science (Lin & Tsai, 2013) (Appendix 2). The items of the SLSE instrument were presented as bipolar strongly agree/strongly disagree statements on a four-point Likert scale (4 = strongly agree, 3 = agree, 2 = disagree, 1 = strongly disagree), assessing the dimensions discussed in Table 2.

Table 2. The five dimensions assessed with the SLSE instrument (each describes what is assessed in terms of participants’ confidence)

- Conceptual Understanding: ability to use fundamental cognitive skills such as science concepts, laws or theories.
- Higher-Order Cognitive Skills: ability to utilise sophisticated cognitive skills including problem-solving, critical thinking or scientific inquiry.
- Practical Work: ability to conduct science experiments in laboratory activities.
- Everyday Application: ability to apply science concepts and skills in daily life.
- Science Communication: ability to communicate or discuss with classroom peers or others.
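As a concrete illustration of how such Likert-scale data might be scored by domain (including the reverse-worded statements mentioned above), a minimal sketch follows. The item-to-domain mapping and the student responses are hypothetical examples, not the actual 28-item SLSE assignment from Lin and Tsai (2013).

```python
# Sketch of scoring a four-point Likert questionnaire by domain.
# The mapping and responses below are hypothetical, for illustration only.

# Hypothetical mapping: domain -> list of (item_number, reverse_coded) pairs.
DOMAINS = {
    "Conceptual Understanding": [(1, False), (2, False), (3, True)],
    "Practical Work": [(4, False), (5, True)],
}

def score_domains(responses, domains=DOMAINS, scale_max=4):
    """Return the mean score per domain for one respondent.

    responses: dict mapping item number -> raw response (1..scale_max).
    Reverse-coded items are flipped so that a higher score always
    indicates higher self-efficacy.
    """
    scores = {}
    for domain, items in domains.items():
        values = []
        for item, is_reversed in items:
            raw = responses[item]
            values.append((scale_max + 1 - raw) if is_reversed else raw)
        scores[domain] = sum(values) / len(values)
    return scores

# Example: one student's (hypothetical) responses.
student = {1: 4, 2: 3, 3: 1, 4: 2, 5: 2}
print(score_domains(student))
# Item 3 is reverse-coded, so a raw 1 becomes 4: domain mean = (4 + 3 + 4) / 3
```

Group means (e.g. for boys versus girls, as in section 2.3.2) would then simply average these per-respondent domain scores.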
2.2.3. Interviews

For an interview to take place there must be consent from the interviewee, who agrees and understands that the material obtained will be used for research purposes. The interviewee must also agree that their words can be treated ‘on the record’ and ‘for the record’ unless they specify otherwise, and that the agenda for the discussion will be controlled by the researcher. Interviews are useful for in-depth exploration of complex and subtle phenomena, such as people’s experiences, opinions, feelings and emotions. There are several types of research interview, in addition to the ‘structured’ interview mentioned previously (in section 2.2.1). With semi-structured interviews, the researcher still has a set of issues to be considered and questions to be answered but is flexible in the order in which the topics are addressed. Additionally, there is greater flexibility for the interviewee, as answers are open-ended. This allows the interviewee to develop and elaborate on points which are of interest to them. With unstructured interviews, the role of the researcher is to be as unobtrusive as possible. Semi-structured and unstructured interviews lie on a continuum, so it is likely for both to feature in a single interview. Because the interviewee has the freedom to ‘speak their mind’, semi-structured and unstructured interviews are useful for discovering ideas about complex issues (Denscombe, 2013). The teacher and student interviews were carried out one-to-one. This type of interview has the advantages of being easy to arrange, and the opinions and views expressed throughout the interview originate from one source. This makes it easy for the interviewer to match specific ideas with a certain person. However, one-to-one interviews do have the disadvantage of providing a limited range of opinions and views (Denscombe, 2013).
Additionally, face-to-face interviews involve social cues, such as voice, intonation and body language from the interviewer, which may influence the answers given by the interviewee (Opdenakker, 2006). To address the research question ‘Does students’ perception of their understanding match their actual understanding?’, four students were chosen to interview with regard to their test papers. Students were selected based on whether they had a mismatch of actual and perceived understanding. The interview began with structured questions that probed their understanding of the test questions, to double-check their responses. The same students were then interviewed to answer the second research question, ‘Is students’ perception of their understanding influenced by their self-efficacy?’ To investigate this, semi-structured questions were asked of the students. All interviews were audio
recorded and there was confirmation and reassurance about the confidentiality of the discussion.

2.2.4. Focus groups

Focus groups contain a small number of people brought together by the researcher to investigate feelings, perceptions, attitudes and ideas about a certain topic. They are helpful for exploring the extent to which shared views exist amongst a group of individuals relating to a specific topic. As the name suggests, focus groups have a ‘focus’ to the session, with a discussion being based around a topic of which all participants have similar knowledge. There is also a particular emphasis on the group’s interaction as a means of acquiring information, in which the moderator’s role is to facilitate this interaction and not to lead the discussion. For this research, a group of four students from the class being studied was chosen to take part in the focus group. As the overall aim of the research was to explore a particular situation in depth with a view to exploring the specifics (Denscombe, 2010), students were deliberately chosen to ensure members of the group were likely to hold opposing views on the topic for discussion. As the students are under the protection of responsible others, permission was sought from the school organisation and authorisation to conduct the interview was gained before the interview took place. The prospective interviewees were contacted in advance and the interview was arranged for the following week, lasting 20 minutes. Semi-structured questions were asked that probed the research question ‘Is students’ perception of their understanding influenced by their self-efficacy?’, and the conversations in focus groups were audio recorded. These questions covered the themes of self-assessment, self-efficacy in school and self-efficacy in science. A full list of prompt questions is included in Appendix 1.

2.3. Analysis of data

2.3.1.
Constant comparative method

For the subject interviews, focus groups and the teacher interview, audio recordings were analysed using the constant comparative method. For this, audio recordings were listened to in full, focussing on the questions that guided the research. Parts of the interviews that I believed to be important were written down, and themes that underpinned what people
were saying were identified. These temporary constructs were used to compare against the recordings again, and further notes and observations were written down. Temporary constructs deemed to be unsuitable were deleted, and after another listen, a list of second-order constructs was made that seemed to explain the data (Wilson, 2013). This process helped to summarise the important themes in the data.

2.3.2. Descriptive statistics

Students’ answers to the test questions were compared to their confidence (red/amber/green) responses after each question. Test answers were marked either correct or incorrect and were only compared to red and green responses (i.e. not at all confident and very confident that the given answer was correct). Amber responses were discounted. A correct answer with a green response and an incorrect answer with a red response were regarded as a match of actual and perceived understanding. A correct answer with a red response and an incorrect answer with a green response were regarded as a mismatch of actual and perceived understanding. Matched and mismatched understandings were given as percentages of the students’ responses. The mean and standard deviation of girls’ and boys’ test scores, together with matched versus mismatched responses, were calculated. For the SLSE instrument, scale scores were computed by calculating the mean of the items in each domain for each individual, together with the mean scores for boys and girls.

2.3.3. Association statistics

To initially explore the relationship between the students’ match or mismatch of actual and perceived understanding and their SLSE, Pearson correlation analysis of the students’ responses on the test papers and the SLSE instrument was undertaken. There are a number of assumptions made with respect to Pearson’s correlation.
Included in these are that the two variables should be measured at the interval or ratio level and that there needs to be a linear relationship between the two variables. Also, as Pearson’s r is very sensitive to outliers, these should be minimised or omitted. Finally, the variables should be approximately normally distributed (Spiegelman, 2010).
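To make the analysis in sections 2.3.2 and 2.3.3 concrete, the sketch below classifies per-question responses into matches and mismatches under the rules described above (amber responses discounted), and computes Pearson's r from its standard formula. The data are invented for illustration and are not taken from the study.

```python
import statistics

def match_rate(answers):
    """Percentage of matched responses under the study's rules.

    answers: list of (correct: bool, colour: str) pairs, one per question.
    Green + correct and red + incorrect are matches; green + incorrect and
    red + correct are mismatches; amber responses are discounted.
    """
    matched = mismatched = 0
    for correct, colour in answers:
        if colour == "amber":
            continue
        if (colour == "green") == correct:
            matched += 1
        else:
            mismatched += 1
    total = matched + mismatched
    return 100 * matched / total if total else None

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented example: one student's six test questions.
student = [(True, "green"), (False, "red"), (True, "amber"),
           (False, "green"), (True, "green"), (False, "red")]
print(match_rate(student))  # 4 matches out of 5 non-amber responses -> 80.0
```

In practice, each student's match rate would then be correlated (via `pearson_r`, or an equivalent statistics package) against their SLSE domain scores, subject to the assumptions on measurement level, linearity, outliers and normality noted above.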
2.4. Validity and reliability

Validity means that both the methods and data are ‘right’ in reflecting reality and truth. Methods used to obtain data should measure suitable indicators of the concept, giving accurate results. A good level of reliability means that a research instrument will consistently provide the same data, time and time again. If there were to be any variation, this would be due to variation in the thing being measured and not due to the volatile nature of the research instrument itself (Denscombe, 2010). To ensure the validity of what was being said in interviews, interview data were corroborated with other data sources on the topic. Furthermore, there was often confirmation of what was meant by the interviewee, to avoid misinterpretation. Quantitative methods produce numerical data that are independent of, and so should not be influenced by, the researcher. In this study, test papers, confidence indicator colours and SLSE questionnaires gave standardised data; furthermore, the SLSE instrument was validated by the method of exploratory factor analysis in the study of Lin and Tsai (2013). As a check on the qualitative data in this study, there has been an explicit account of the methods, analysis and decision making, showing the reader in as much detail as possible the lines of analysis that led to certain conclusions (Denscombe, 2010). Qualitative data were also checked for external reliability against other comparable studies.

2.5. Ethics

This educational research followed guidelines set out by the British Educational Research Association (BERA, 2011), summarised in the table below.

Table 3. Adherence to ethical guidelines for educational research: responsibilities to participants, and what was done in the research to comply with the guidelines

Voluntary Informed Consent
- All persons involved were treated within an ethic of respect.
 Participants understood and agreed to their participation, prior to the research getting underway. Openness and Disclosure  Participants’ voluntary informed consent was secured,before research was carried out, and there was no deception or subterfuge from the researchers. Right to Withdraw  Participants were informed that they had the right to withdraw from the research for any or no reason,and at any time. Children, Vulnerable Young People and Vulnerable Adults  In all actions, the best interests of the child were the primary consideration.  Children capable of forming their own views were granted the right to express their
  • 17. Jane Chapman PGCE Secondary 17 views freely in all matters affecting them.  Researchers ensured that they themselves and any collaborators complied with legal requirements in relation to working with schoolchildren.  All necessary steps were taken to reduce the sense of intrusion to the children and to put them at their ease.  Impact of research on the normal working and workloads of participants was minimised. Incentives  Use of incentives to encourage participation was commensurate with good sense and avoided choices which had undesirable effects.  There was acknowledgement that the use of incentives had the potential to create a bias in sampling or in participant responses. Detriment Arising from Participation in Research  Researchers made known to the participants that any unexpected detriment to participants, which arose during the research, must be brought immediately to their attention or to the attention of their guardians.  Steps were taken to minimize the effects of designs that advantage or are perceived to advantage one group of participants over others. Privacy  There was confidential and anonymous treatment of participants’ data  Researchers complied with the legal requirements in relation to the storage and use of personal data as set down by the Data Protection Act (1998) and any subsequent similar acts. Disclosure  Researchers who judge that the effect of the agreements they have made with participants, on confidentiality and anonymity, will allow the continuation of illegal behaviour, which has come to light in the course of the research, must making disclosure to the appropriate authorities. If the behaviouris likely to be harmful to the participants or to others,the researchers must also consider disclosure.  At all times the decision to override agreements on confidentiality and anonymity must be taken after careful and thorough deliberation. 
Sponsors of Research Methods  Only methods fit for the purpose of the research undertaken were employed.  Researchers have communicated the extent to which their data collection and analysis techniques,and the inferences to be drawn from their findings, are reliable and valid. Publication  The researchers have make themselves familiar with the BERA research writing guidelines The Community of Educational Researchers Misconduct  This research was conducted to the highest standards.  Subject to any limitations imposed by agreements to protect confidentiality and anonymity, data and methods amenable to reasonable external scrutiny.  There is contribution of critical analysis and constructive criticism. Authorship  Academic status or otherindicator of seniority has not determined first authorship Educational Professionals, Policy Makers and the General Public  Researchers will seek to make public the results of their research for the benefit of educational professionals,policy makers and a wider public understanding of educational policy and practice.  Communication of findings, and the practical significance of their research, will be given in a clear, straightforward fashion.
3. Analysis of Findings

3.1. Does students' perception of their understanding match their actual understanding?

The data collected from test results with confidence indicator colours and subject interviews presented several interesting findings. Firstly, overall, girls' perceptions of their understanding were more accurate than boys'. Additionally, girls were more likely to believe that they had an incorrect answer when they were actually correct. In contrast, boys were more likely to believe that they had a correct answer when they were actually incorrect. Because there is evidence of a gender effect in the literature (Fast et al., 2010; Griggs et al., 2013), these findings were investigated further.

3.1.1. Overall, girls' perception of their understanding was more accurate than boys'

The class's mean percentage of matched understanding responses was 71% (±0.18 SD), with girls having a higher mean percentage (73%, ±0.16 SD) than boys (70%, ±0.20 SD) (Figs 1 and 2). This demonstrates that pupils' perceived understanding largely matched their actual understanding, in contrast to previous studies which found school students to be relatively inaccurate assessors (Ross, 2006). A possible explanation is that the current study used a high-achieving cohort of students. Indeed, a similar study reported that higher-scoring students were more accurate at predicting their examination scores than lower-scoring students (Hacker et al., 2000). A potential explanation is that the higher-ability students studied in this research are aware of the knowledge that they do and do not possess. In general, girls in this study were more accurate at judging whether they did or did not know the questions asked of them. 
This is in concordance with other researchers investigating differences in accuracy of self-perception between male and female students (Pajares, 1996). Accurate self-perceptions may allow students to assess their problem-solving strategies more accurately. However, 'realistic' self-appraisals risk lowering optimism and, therefore, levels of effort, perseverance and persistence (Bandura, 1997). Consequently, just as much attention should be paid to students' perceived competence as to their actual capability, as it is their perceptions that may more accurately predict motivation and academic choices in the future (Hackett & Betz, 1989). Accuracy in
self-assessment, by distinguishing strengths and weaknesses, is critical for students to make more effective decisions about where to apply their learning efforts. This allows students to take responsibility for their education and improves autonomy in gaining and improving their skills and knowledge (Dunning et al., 2004).

Figure 1. Comparing the percentages of matched versus mismatched actual and perceived understanding between girls and boys. Matched responses (i.e. red + incorrect and green + correct) and mismatched responses (i.e. red + correct and green + incorrect) are compared between boys and girls. Boys = dark blue; Girls = light blue.

3.1.2. Girls were more likely to believe that they had an incorrect answer when they were actually correct

The class's mean percentage of mismatched understanding responses of red but correct was very low at 0.04% (±0.09 SD), with girls having a higher mean percentage (6%, ±0.09 SD) than boys (0.03%, ±0.09 SD) (Figs 1 and 2). These mismatched responses were validated through subject interviews, which confirmed both selected girls were sure of their responses. This highlights that the girls in this study had less confidence of being correct than the boys.
Figure 2. Percentage of student responses with matched and mismatched actual and perceived understanding. Matched (blue) = correct + green response and incorrect + red response. Mismatched = correct + red response (red) and incorrect + green response (green). Boys = left side; Girls = right side.

These results are consistent with other research that found girls, and in particular gifted girls, to have a general tendency toward underconfidence (Lundeberg et al., 1994). Furthermore, research shows that female students often have problems with self-confidence and report greater stress over their competence than male students (Moffat et al., 2004). This could have negative implications, as students who lack confidence in skills they possess are prone to avoiding tasks in which those skills are required, and are more likely to give up in the face of difficulty. Lent & Hackett (1987) studied the perceived and actual competence of mathematical skills in college students. They demonstrated that, generally, it is the underestimation of competency and not the lack of capability that is responsible for avoidance of maths-related courses and careers, and that this is more likely to be the case for women than men. When this is the case, identifying and modifying these perceptions would prove beneficial.
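The matched/mismatched classification underlying Figures 1 and 2 — pairing each answer's confidence colour with its correctness — can be sketched as follows. The response data here is invented for illustration and is not the study's actual dataset.

```python
from collections import Counter

# Each response: (confidence colour chosen by the student, whether the answer
# was correct). Hypothetical data for one student -- not the study's responses.
responses = [
    ("green", True), ("green", True), ("red", False), ("green", False),
    ("red", True), ("green", True), ("red", False), ("green", True),
]

def classify(colour, correct):
    """Matched: green + correct or red + incorrect. Otherwise a mismatch,
    labelled by its direction (red + correct = false negative,
    green + incorrect = false positive)."""
    if (colour == "green") == correct:
        return "matched"
    return "red + correct" if correct else "green + incorrect"

counts = Counter(classify(colour, ok) for colour, ok in responses)
total = len(responses)
for category, n in counts.items():
    print(f"{category}: {100 * n / total:.0f}%")
```

The per-student percentages produced this way are what Figures 1 and 2 aggregate by gender.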
3.1.3. Boys were more likely to believe that they had a correct answer when they were actually incorrect

The class's mean percentage of mismatched understanding responses of green but incorrect was 24% (±0.17 SD), with girls having a lower mean percentage (20%, ±0.15 SD) than boys (26%, ±0.18 SD) (Figs 1 and 2). These mismatched responses were validated through subject interviews, which confirmed both selected boys were sure of their responses. Previous findings on this question are mixed. Several studies have reported males to be more likely to overestimate through self-assessment (Lind et al., 2002; Rees & Shepherd, 2005). Lind et al. (2002) assessed the ability of students to self-assess using a competency-based evaluation and further found females to underestimate their performance, despite outperforming the male students. However, another study, involving an intervention to improve student understanding of assessment criteria, found no identifiable difference between male and female self-assessment (Rust et al., 2010). One explanation for the lack of a gender difference is that exposing students to good-quality exemplar assignments may have caused them to underestimate their own work, with perhaps a more pronounced effect on previously over-confident males.

3.2. Is students' perception of their understanding influenced by their self-efficacy?

3.2.1. Students scored highest on the 'Practical Work' dimension of the SLSE instrument

The participants' scores on the Science Learning Self-Efficacy instrument were calculated. The class's mean scores and standard deviations for the SLSE dimensions are shown in Table 4.

Table 4. 
Class's mean scores and standard deviations of the SLSE instrument

Dimension                        Mean score   SD
Conceptual Understanding         2.99         0.45
Higher-Order Cognitive Skills    3.00         0.39
Practical Work                   3.71         0.41
Everyday Application             2.97         0.51
Science Communication            3.07         0.55
As shown by Table 4 and Figure 3, students scored highest on the 'Practical Work' dimension, matching a result from a previous study by Lin et al. (2013), who found the same pattern (M = 3.44). Furthermore, another study found that practical work was one of the most positive predictors of students' science-related self-efficacy (Lavonen & Laaksonen, 2009). The same study also found that science-related self-efficacy was the most powerful predictor of student performance, a result similar to Valijarvi et al. (2007). The present study asked students, in interviews and focus groups, a number of questions about their self-efficacy in science classrooms and the influence practical work has on this and on science learning. During interviews, students unanimously agreed that their level of confidence depended on the environment of the classroom. The following comments were typical, suggesting that students felt less confident when they felt observed by their classmates and were put off by the 'big crowd':

It depends on who is around…the atmosphere of the class. When it's really silent you wouldn't feel confident. (Student F)

The most anxious thing is answering the questions if you were to put your hand up, because you might be wrong, and feel anxious that people will laugh at you. (Student E)

As one might expect, students felt more confident and likely to contribute in group discussion when they were doing smaller group activities, such as practical work, as these comments suggest:

It's not so tense when we're in groups and doing activities. All the pressure's off…it's a relaxed environment. Everyone is doing their own thing so they're not concentrating on you. When you are in groups, everyone feels more confident in commenting so you want to contribute more. (Student H)

In class if you put your hand up, you know that everyone is hearing and everyone is watching but when you're in groups you don't get that feeling…everything else goes away. 
(Student E)

It appears that practical work promotes greater confidence because students are less likely to feel judged by the whole class and feel less anxious in the science classroom. In addition to raising self-efficacy, practical work was shown to increase student enjoyment and stimulate learning, as these students comment:
When you think 'science', you think 'experiments'. When you're doing the experiments, you're learning about it. It explains what you are learning in front of you. (Student G)

People get excited [about practical work]…they enjoy it so much and in the end they realise they learnt something. When you write the conclusion, it surprises you how much you know…because you just did one practical. (Student E)

If you're in groups, you are doing it yourself instead of watching the teacher do it so you can make sure you know something by doing your own experiment. You can get more involved and do your own investigations rather than just sitting, listening and writing down. (Student A)

This is in line with previous findings that practical work stimulates student interest and curiosity in science, promoting aspects of scientific thinking and allowing students to develop practical abilities (Hofstein, 1988).

3.2.2. Boys had a higher mean score in every dimension of the SLSE instrument

For each dimension of the SLSE instrument, boys had higher mean scores than girls (Table 5 and Figure 3).

Table 5. Boys' and girls' mean scores and standard deviations of the SLSE instrument

Dimension                        Boys' mean   SD     Girls' mean   SD
Conceptual Understanding         3.09         0.46   2.84          0.42
Higher-Order Cognitive Skills    3.04         0.41   2.94          0.38
Practical Work                   3.80         0.36   3.60          0.48
Everyday Application             3.11         0.50   2.77          0.49
Science Communication            3.23         0.41   2.83          0.66

This fits with the studies mentioned previously, which found that gender (i.e. being a girl; Fast et al., 2010) contributed to lower science self-efficacy. This was confirmed by students in the focus interviews. For example, one boy commented that the level of his confidence 'depends on the topic, really'. 
However, most girls stated that relationships in the classroom were a large predictor of their confidence, as these excerpts reveal:
If you've got people around you that you trust, you feel confident. It's better in forms because you get to know the people a lot more but if you're in Science, you don't really know them as well so you don't know what they're going to say. Sometimes you think that if you say something it might go around the whole school. (Student E)

Girls are more worried about what people think and what they're going to say. (Student F)

For girls, some people laugh and you don't feel comfortable or know why. In Science, girls worry about themselves because everyone frets about tying their hair up [for practical work]…you don't really want other people looking at you with your hair tied up because it makes you feel awkward. (Student E)

As Science classes in this school are set by prior attainment and not by form, this may explain the girls' lower levels of self-efficacy in the SLSE instrument. Furthermore, the girls' worries about their appearance during practical work could account for their lower mean score, relative to boys, in the 'Practical Work' dimension of the instrument. However, in another interview, a student explained that she did not feel worried about what others thought of her answers and felt quite confident in Science because she felt she understood quite a lot of it. The following statement highlights this:

I wouldn't be the only one who didn't understand… [when asked if she ever worried about answering questions in class]. I feel pretty confident because most of the time, I understand the things you are teaching us. I feel much more confident in school now because we're in our last term…we've been here for longer. (Student A)

This student was then asked a number of questions about whether she personally thought there was a disparity between boys' and girls' confidence in science, and whether certain factors might predict a person's confidence. 
She stated that it depended on the type of person they were, together with their knowledge and enjoyment of science. Furthermore, she thought that whether the person had siblings could be a factor affecting their confidence, as these comments suggest:

Not necessarily. I think it depends on the person and how much they enjoy science and how much they know. I think if you enjoy it more, you're more confident because if you enjoy it, you are more relaxed. (Student A)
You could be more confident if you are younger because you know your brother or sister has done it before and you can ask them if you are unsure…I've got an older brother in year 8 and I know that for Maths or Science, I can ask him if he knows the answers to help me. (Student A)

Figure 3. Boys' and Girls' Mean Scores in the Science Learning Self-Efficacy Instrument. CU = Conceptual Understanding, HOCS = Higher-Order Cognitive Skills, PW = Practical Work, EA = Everyday Application, SC = Science Communication. Boys = dark blue; Girls = light blue.

3.2.3. A strong correlation was found between girls' SLSE score and both matched and mismatched (false negative) responses

To understand the relationship between the students' self-efficacy in learning science and their perceptions of understanding, Pearson correlation analysis was performed on their responses to the SLSE instrument and the test. As shown in Table 6, the three measures of perceived understanding (i.e. matched, mismatched (red + correct) and mismatched (green + incorrect)) were related to all mean self-efficacy scores of the SLSE instrument (i.e. whole class, boys and girls), with effect sizes ranging from weak (the boys factor) through medium (the whole-class factor) to large (the girls factor) (Cohen, 1992).
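The effect-size labels attached to the coefficients in Table 6 follow Cohen's (1992) benchmarks for r, with everything below the 0.30 (medium) benchmark grouped as 'weak'. A minimal sketch of that labelling rule:

```python
def cohen_label(r):
    """Label |r| using Cohen's (1992) benchmarks as applied in Table 6:
    below 0.30 -> 'weak', 0.30 to 0.50 -> 'medium', 0.50 and above -> 'large'."""
    magnitude = abs(r)
    if magnitude >= 0.50:
        return "large"
    if magnitude >= 0.30:
        return "medium"
    return "weak"

# Reproducing the labels reported for the girls' coefficients:
for r in (0.57, -0.75, -0.13):
    print(r, "->", cohen_label(r))  # large, large, weak
```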
Table 6. Correlation of the students' science learning self-efficacy and their perceptions of understanding

Mean Self-Efficacy Score   Matched          Mismatched (Red + Correct)   Mismatched (Green + Incorrect)
Whole Class                0.13 (weak)      -0.42 (medium)               0.08 (weak)
Boys                       -0.04 (weak)     -0.13 (weak)                 0.12 (weak)
Girls                      0.57 (large)     -0.75 (large)                -0.13 (weak)

From these results, the strongest positive correlation was between girls' mean self-efficacy score and matched responses (Fig 4): as girls' science self-efficacy scores increased, the percentage of responses with a match of perceived and actual understanding also increased. This is similar to the results of previous studies, whereby higher-scoring students with higher self-efficacy were more accurate at predicting their examination scores than lower-scoring students (Hacker et al., 2000). The strongest negative correlation was between girls' mean self-efficacy score and mismatched (red + correct) responses (Fig 5). These findings imply that, in general, as girls' self-efficacy in science increases, they are more likely to know when they do and do not understand something. Furthermore, as girls' self-efficacy increases, they are also less likely to think they are incorrect when they are actually correct.

Figure 4. Correlation of the girls' science learning self-efficacy and their matched perceptions of understanding (R² = 0.32).
Figure 5. Correlation of the girls' science learning self-efficacy and their mismatched (false negative) perceptions of understanding (R² = 0.57).

4. Conclusions and Implications

The purpose of this study was to investigate whether students' perception of their understanding matched their actual understanding, and whether that perception was influenced by their self-efficacy. In response to the first research question, the results indicate that overall, girls' perception of their understanding was more accurate than boys'. Furthermore, girls were more likely to believe that they had an incorrect answer when they were actually correct, and boys were more likely to believe that they had a correct answer when they were actually incorrect. In response to the second question, students scored highest on the 'Practical Work' dimension of the SLSE instrument; boys had a higher mean score in every dimension of the SLSE instrument; and a strong correlation was found between girls' SLSE score and both matched and mismatched (false negative) responses. It is important to note that cognitive appraisal of a situation might affect expectations of personal efficacy. Factors such as social, situational and temporal circumstances all influence the micro-analysis of perceived coping capabilities that represents self-efficacy; it is not simply down to personality traits (Bandura, 1977).
4.1. Conclusions and their wider significance

Developing reliable and valid assessment tools that accurately indicate student learning is difficult. One feedback tool, developed by Gardner-Medwin et al. at University College London, is confidence-based marking (CBM). This methodology measures a learner's knowledge quality by determining both the correctness of the learner's knowledge and their confidence in that knowledge (Gardner-Medwin, 2006). With the CBM method, students select a confidence rating of low (1), medium (2) or high (3) for a question. If the student's answer is correct, they are awarded that many marks (i.e. 1, 2 or 3); if the answer is wrong, the marks awarded at these confidence levels are 0, -2 or -6 respectively. The scheme uses negative, graded marking for the upper two confidence levels, with the relative cost of a wrong answer increasing at higher confidence levels. This graduation ensures that the scoring scheme is properly motivating (Gardner-Medwin & Gahan, 2003). Assessment by CBM is a simple, valid and reliable method for challenging students to think discriminately (Barr & Burke, 2013).

4.2. Implications for research

Given that this study appears to be the first to examine the relationship between self-efficacy and perception of understanding, future research is needed on students from different grade levels, schools and geographical areas in order to generalise beyond this sample. Furthermore, due to the small sample size, future studies should investigate the present relationship with a larger cohort to obtain greater reliability and precision (Biau, 2008). Previous research into the link between students' SLSE and their approaches to learning science found that students' deep strategies and deep motive were strong predictors of their SLSE. 
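The CBM scoring rule described in section 4.1 (marks of 1, 2 or 3 when correct; 0, -2 or -6 when wrong) can be expressed as a short function. This is an illustrative sketch of the rule as stated, not Gardner-Medwin's implementation.

```python
# Penalty for a wrong answer at each confidence level, as stated above.
PENALTY = {1: 0, 2: -2, 3: -6}

def cbm_mark(confidence, correct):
    """Return the CBM mark for one answer (confidence must be 1, 2 or 3)."""
    if confidence not in PENALTY:
        raise ValueError("confidence must be 1, 2 or 3")
    return confidence if correct else PENALTY[confidence]

# A student answering at high, medium and high confidence:
print(cbm_mark(3, True), cbm_mark(2, False), cbm_mark(3, False))  # 3 -2 -6
```

The asymmetry of the scheme is what makes guessing at high confidence costly: a confident wrong answer loses twice as much as a confident right answer gains.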
Future studies could explore the associations of underlying learning variables, such as conceptions, approaches, self-efficacy, motivation and outcomes, to build a more elaborated model of these relationships.

4.3. Implications for practice

For students to become better assessors of their understanding, educators should aim to improve students' self-efficacy. In a study by Zimmerman et al. (1996), students were asked to predict their efficacy before undertaking an assignment or test and then graph those judgements alongside their actual scores. Once students could visually see the discrepancy between their predicted and actual scores, the accuracy of their subsequent self-efficacy
judgements improved. Furthermore, this self-evaluating task was also shown to help students improve their studying methods and academic achievement (Campillo et al., 1999). Therefore, teachers could not only help students to develop a stronger and more accurate practice of self-assessment but also increase their self-efficacy, promoting more autonomous and independent learners.

References

Aikenhead, G. (1996) Science Education: Border Crossing into the Subculture of Science, Studies in Science Education, 27, 1-52.
Andrade, H. (2010) Students as the definitive source of formative assessment: academic self-assessment and the self-regulation of learning. In Andrade, H. L. & Cizek, G. J. (Eds.) Handbook of Formative Assessment, 90-105. New York: Routledge.
Ball, S. (2003) The teacher's soul and the terrors of performativity, Journal of Educational Policy, 18(2), 215-228.
Bandura, A. (1977) Self-efficacy: toward a unifying theory of behavioural change, Psychological Review, 84, 191-215.
Bandura, A. (1986) Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.
Bandura, A. (1997) Self-efficacy: The exercise of control. New York, NY: Freeman.
Barr, D. A. & Burke, J. R. (2013) Using confidence-based marking in a laboratory setting: A tool for student self-assessment and learning, The Journal of Chiropractic Education, 27(1).
Ben-Zvi, R. et al. (1986) Is an atom of copper malleable? Journal of Chemical Education, 63, 64-66.
BERA (2011) Ethical guidelines for educational research, Southwell, Notts.: British Educational Research Association.
Biau, D. (2008) Statistics in Brief: The Importance of Sample Size in the Planning and Interpretation of Medical Research, Clinical Orthopaedics and Related Research, 466(9), 2282-2288.
Black, P. & Wiliam, D. (1998) Assessment and Classroom Learning, Assessment in Education: Principles, Policy and Practice, 5(1), 7-74.
Blatchford, P. (1997) Students' self-assessment of academic attainment: Accuracy and stability from 7 to 16 years and influence of domain and social comparison group, Educational Psychology, 17(3), 345-360.
Borman, G. & Overman, L. (2004) Academic resilience in mathematics among poor and minority students, The Elementary School Journal, 104, 177-195.
Britner, S. L. & Pajares, F. (2006) Sources of science self-efficacy beliefs of middle school students, Journal of Research in Science Teaching, 43, 485-499.
Brown, G. & Harris, L. (2013) Student self-assessment. In J. H. McMillan (Ed.), SAGE Handbook of Research on Classroom Assessment, 367-393. Los Angeles: SAGE.
Brown, G. & Harris, L. (2013) Opportunities and obstacles to consider when using peer- and self-assessment to improve student learning: Case studies into teachers' implementation, Teaching and Teacher Education, 36, 101-111.
Brown, G. et al. (2009) Use of interactive-informal assessment practices: New Zealand secondary students' conceptions of assessment, Learning & Instruction, 19(2), 97-111.
Campbell, B. (1994) Science: the Salters' Approach – a case study of the process of large-scale curriculum development, Science Education, 78(5), 415-447.
Campillo, M. et al. (1999) Enhancing academic study skill, self-efficacy, and achievement through self-regulatory training. Paper presented at the annual meeting of the American Psychological Association, Boston, MA.
CERI (2008) Assessment for Learning: Formative Assessment. Available from: http://www.oecd.org/site/educeri21st/40600533.pdf (Accessed 15 Apr 2014)
Chang, H.-P. et al. (2011) The Development of a Competence Scale for Learning Science: Inquiry and Communication, International Journal of Science and Mathematics Education, 9, 1213-1233.
Cobern, W. (1994) Worldview theory and conceptual change in science education. Paper presented to the annual meeting of the National Association for Research in Science Teaching, Anaheim, CA.
Cohen, J. (1992) A power primer, Psychological Bulletin, 112, 155-159.
Denscombe, M. (2010) The Good Research Guide: For Small-Scale Social Research Projects, Fourth Edition. Open University Press: McGraw-Hill Education.
Driver, R. et al. (1994) Constructing scientific knowledge in the classroom, Educational Researcher, 23(7), 5-12.
Dunning, D. et al. (2004) Flawed self-assessment: implications for health, education, and the workplace, Psychological Science in the Public Interest, 5(3), 69-106.
Duschl, R. (2008) Science education in three-part harmony: Balancing conceptual, epistemic, and social learning goals, Review of Research in Education, 32, 268-291.
Dlamini, B. et al. (1996) Liked and disliked learning activities: responses of Swazi students to science materials with a technological approach, Research in Science and Technological Education, 14(2), 221-235.
Ehrlinger, J. et al. (2008) Why the unskilled are unaware: Further explorations of (absent) self-insight among the incompetent, Organisational Behaviour and Human Decision Processes, 105, 98-121.
Fast, L. et al. (2010) Does math self-efficacy mediate the effect of the perceived classroom environment on standardized math test performance? Journal of Educational Psychology, 102, 729-740.
Gardner-Medwin, A. R. (2006) Confidence-based marking – towards deeper learning and better exams. In Bryan, C. & Clegg, K. (Eds.), Innovative Assessment in Higher Education, 141-149. London: Routledge, Taylor and Francis Group.
Gardner-Medwin, A. R. & Gahan, M. (2003) Formative and Summative Confidence-Based Assessment, Proceedings of the 7th International CAA Conference, Loughborough University, UK, 147-155 (www.caaconference.com).
Gravill, J. et al. (2002) Metacognition and IT: the influence of self-efficacy and self-awareness. Paper presented at the Eighth Americas Conference on Information Systems, Dallas, TX.
Griffiths, A. & Preston, K. (1989) Paper presented at the National Association for Research in Science Teaching.
Hacker, D. et al. (2000) Test prediction and performance in a classroom context, Journal of Educational Psychology, 92, 160-170.
Hackett, G. & Betz, N. (1989) An exploration of the mathematics self-efficacy/mathematics performance correspondence, Journal for Research in Mathematics Education, 20, 261-273.
Hofstein, A. (1988) Practical work and scientific investigation II. In Development and Dilemmas in Science Education, Chapter 10.
Krajcik, J. (1990) In Glynn, S., Yeany, R. & Britton, B. (Eds.), The Psychology of Learning Science. Hillsdale, NJ: Erlbaum.
Lavonen, J. & Laaksonen, S. (2009) Context of Teaching and Learning School Science in Finland: Reflections on PISA 2006 Results, Journal of Research in Science Teaching, 46(8), 922-944.
Lent, R. & Hackett, G. (1987) Career self-efficacy: Empirical status and future directions, Journal of Vocational Behavior, 30, 347-382.
  • 32. Jane Chapman PGCE Secondary 32 Lin, T.-J. And Tsai, C.-C. (2013) ‘An investigation of Taiwanese high school students’ science learning self-efficacy in relation to their conceptions of learning science’, Research in Science & Technological Education, 31(3), 308-323. Lin, T.-J. and Tsai, C.-C. (2013) ‘A multi-dimensional instrument for evaluating Taiwanese high school students’ science learning self-efficacy in relation to their approaches to learning science’, International Journal of Science and Mathematics Education 11, 1275-1301 Lin, T.-J., et al. (2013) ‘A Cross-Cultural Comparison of Singaporean and Taiwanese Eighth graders’ Science Learning Self-Efficacy from a Multidimensional Perspective.’ International Journal of Science Education 35: 1083–1109. Lind, D. Et al. (2002) Competency-based student self-assessment on a surgery rotation, Journal of Surgical Research; 105: 31–4. Liem, A. et al. (2008). The role of self-efficacy, task value, and achievement goals in predicting learning strategies, task disengagement, peer relationship, and achievement outcome. Contemporary Educational Psychology, 33, 486–512. Liu, M., et al. (2006) ‘Middle school students’ self-efficacy, attitude, and achievement in a computer enhanced problem-based learning environment. Journal of Interactive Learning Research, 17(3), 225-242. Lubben, F. et al. (1996) Contextualising science teaching in Swaziland: some student reaction. International Journal of Science Education, 18(3), 311-320. Lundeberg, M. et al. (1994) Highly confident but wrong: Gender differences and similarities in confidence judgments. Journal of Educational Psychology, 86, 114–121. OECD (2005) ‘Formative Assessment: Improving Learning in Secondary Classrooms’, Avaliable from: http://www.oecd.org/edu/ceri/35661078.pdf (Accessed 15 Apr 2014) OECD. (2007a). PISA 2006: Science Competencies for Tomorrow’s World, Volume 1: Analysis. Paris: OECD. OECD. (2007b). PISA 2006: Volume 2: Data. Paris: OECD. McEnery, J. 
& Blanchard, P. (1999) Validity of multiple ratings of business student performance in a management simulation, Human Resource Development Quarterly, 10(2), 155-172. Miller, T. & Geraci, L. (2011) ‘Unskilled but Aware: Reinterpreting Overconfidence in Low- Performing Students’, Journal of Experimental Psychology: Learning, Memory, and Cognition, 37(2), 502–506 Moffat, K. et al. (2004) First year medical student stress and coping in a problem-based learning medical curriculum, Medical Education, 38, 482–491. Novick, S. & Nussbaum, J. (1981) Pupils’ Understanding of the Particulate Nature of Matter: A Cross-Age Study, Science Education, 65(2), 87-196.
  • 33. Jane Chapman PGCE Secondary 33 Nakhleh, M. (1992) Why some students don’t learn chemistry: Chemical Misconceptions, 69(3), 191-195 Organisation for Economic Co-operation and Development (1999) ‘Measuring student knowledge and skills: A new framework for assessment. Paris: Author. Pajares, F. (1996) Self-Efficacy Beliefs in Academic Settings, Review of Educational Research, 66(4), 543-578. Pajares, F. (1996) Self-Efficacy Beliefs and Mathematical Problem-Solving of Gifted Students, Contemporary Educational Psychology, 21, 325-344. Rees, C. & Shepherd, M. (2005) Students’ and assessors’ attitudes towards students’ self- assessment of their personal and professional behaviours, Medical Education, 39, 30-39. Ross, J. A., et al. (2002) Self-Evaluation in grade 11 mathematics: Effects on achievement and student beliefs about ability. In D. McDougall (Ed.), OISE Papers on Mathematical Education. Toronto: University of Toronto Ross, J. A. (2006). The reliability, validity, and utility of self-assessment. Practical Assessment, Research, and Evaluation, 11(10), 1-13. Ross, J. et al (1998) Skills training versus action research in-service: impact on student attitudes to self-evaluation, Teaching and Teacher Education, 14 (5), 463–477 Ross. J. Et al. (1999) Effect of self-evaluation on narrative writing. Assessing Writing, 6(1), 107-132. Rust, C. et al. (2010) Improving Students' Learning by Developing their Understanding of Assessment Criteria and Processes, Assessment and Evaluation in Higher Education, 28(2), 147-164. Spiegelman, D. (2010) Commentary: Some remarks on the seminal 1904 paper of Charles Spearman ‘The Proof and Measurement of Association between Two Things’, International Journal of Epidemiology, 39(5), 1156-1159. Sungur, S. (2007) ‘Modelling the relationships among students’ motivational beliefs, metacognitive strategy use, and effort regulation’, Scandinavian Journal of Educational Research, 51, 315–326. Smith, M. L. 
(2006) ‘Multiple methods in education research’. In Green, J. L. et al. (Eds.), Handbook of complementary methods in education research, LEA, Mahwah, NJ, 457–475 Stamovlasis, D. Et al. (2005) A study of group interaction processes in learning lower secondary physics. Journal of Research in Science Teaching, 43(6), 556-576. Griggs, M. et al. (2013) The Responsive Classroom Approach and Fifth Grade Students' Math and Science Anxiety and Self-Efficacy, School Psychology Quarterly, 28(4), 360-373.
  • 34. Jane Chapman PGCE Secondary 34 Taber, K. (2013) Classroom-based Research and Evidence-based Practice: A Guide for Teachers, London: SAGE Publications. Topping, K. (2003) Self and peer assessment in school and university: reliability, validity and utility. In Segers, M., Dochy, F. & Cascallar, E. (Eds.) Optimising News Modes of Assessment: In search of qualities and standards, 55-87, Dordrecht, NL: Springer Netherlands Tsai, C.-C. (2003). Taiwanese science students’ and teachers’ perceptions of the laboratory learning environments: Exploring epistemological gaps. International Journal of Science Education, 25, 847–860. Tsai., et al. (2011) ‘Scientific epistemic beliefs, conceptions of learning science and self- efficacy of learning science among high school students’, Learning and Instruction, 21, 757- 769. Usher, E. L. & Pajares, F. (2006) ‘Sources of academic and self-regulatory efficacy beliefs of entering middle school students’, Contemporary Educational Psychology, 31, 125–141. Valijarvi, L. et al. (2007) The Finnish success in PISA—and some reasons behind it 2. Jyvaskyla: Institute for Educational Research Vygotsky, L. (1978) Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press. Weiner, B. (1972) Theories of motivation. Chicago: Markham. Wilson, E. (2013) School-based research: a guide for education students, London: SAGE Publications Zins, J. & Elias, M. (2006). Social and emotional learning. In G. G.Bear & K. M.Minke ( Eds.), Children’s needs III: Development, prevention, and intervention, 1– 13. Bethesda, MD: National Association of School Psychologists. Zimmerman, B. (1996) ‘Measuring and mismeasuring academic self-efficacy: Dimensions, problems, and misconceptions.’ Symposium presented at the Annual Meeting of the American Educational Research Association, New York.
Appendix 1. Prompt questions for interviews

Self-Assessment
- Did marking how confident you were in each question change your understanding?
- With self-assessment in the classroom, how do you feel about how other people might think or act towards you?
- How accurately do you feel you were able to assess yourselves?
- Do you prefer marking your own work, or having the teacher mark it? Why?
- When you were marking your test questions with red, amber or green, do you think you played it safe rather than taking a risk (i.e. put red or amber to be on the safe side), or do you think you took a risk and were over-confident (i.e. put a green)?
- How confident did you have to be, as a percentage, to award yourself a green for a question, as opposed to amber or red?

Self-Efficacy in School
- Does your confidence in subjects affect how high you set your goals or how much effort you put in?
- Does your confidence depend on how anxious you are in class?
- How confident do you generally feel in school? Is this affected by the people in your class? By the time of day? By different subjects?

Self-Efficacy in Science
- How confident do you generally feel in science lessons?
- Do you think self-confidence in science differs between genders? If so, why?
- When we do practical work in class, do you think your understanding of science changes?
- Does practical work alter your interest in science?
- Do you ever relate school science to your everyday lives?
- When you think about whether you know something or not, do you feel this is affected by how confident you are in science?
Appendix 2. Example Science Learning Self-Efficacy Questionnaire with Answers – James

Example Science Learning Self-Efficacy Questionnaire with Answers – Alicja

Appendix 3. Example Test Paper 1 with Answers and Confidence Indicators – James

Example Test Paper 1 with Answers and Confidence Indicators – Alicja

Appendix 4. Example Test Paper 2 with Answers and Confidence Indicators – James

Example Test Paper 2 with Answers and Confidence Indicators – Alicja