Measuring third year undergraduate nursing students' reflective
thinking skills and critical reflection self-efficacy following high
fidelity simulation: A pilot study
Naomi Tutticci a,*, Peter A. Lewis a, Fiona Coyer a,b
a School of Nursing, Queensland University of Technology, Victoria Park Road, Kelvin Grove, QLD 4059, Australia
b Department of Intensive Care Medicine, Royal Brisbane and Women's Hospital, Victoria Park Road, Kelvin Grove, QLD 4059, Australia
Article info
Article history:
Received 20 July 2015
Received in revised form
5 November 2015
Accepted 11 March 2016
Keywords:
Nursing
Self-efficacy
Reflection
High fidelity simulation
Abstract
Critical reflection underpins critical thinking, a highly desirable generic nursing graduate capability. To improve the likelihood of critical thinking transferring to clinical practice, reflective thinking needs to be measured within the learning space of simulation. This study was divided into two phases to address the reliability and validity of previously untested surveys. Phase One data were collected from individuals (n = 6) using a 'think aloud' approach and an expert panel to review content validity, and verbatim comment analysis was undertaken. The Reflective Thinking Instrument and Critical Reflection Self-Efficacy Visual Analogue Scale items were contextualised to simulation. The expert review confirmed that these instruments exhibited content validity. Phase Two data were collected through an online survey (n = 58). Cronbach's alpha measured internal consistency, which was demonstrated by all subscales and by the Instrument as a whole (.849). There was a small to medium positive correlation between critical reflection self-efficacy and general self-efficacy (r = .324, n = 56, p = .048). Participant responses regarding the simulation experience were positive. The findings demonstrate that the Reflective Thinking and Simulation Satisfaction survey is reliable; further development to establish its validity is recommended.
© 2016 Elsevier Ltd. All rights reserved.
1. Introduction
Critical reflection is a meta-cognitive skill, or the awareness of one's own cognitive functioning (Sobral, 2000), and is a key mechanism in the process of critical thinking (Forneris and Peden-McAlpine, 2007, p. 411). The identification and measurement of critical reflection in a pre-transitioning undergraduate is essential if critical thinking is to be an identifiable outcome of undergraduate nursing education. There is evidence to suggest that teaching strategies such as reflection, which require active engagement by the student, can be effective in developing and promoting critical thinking skills and dispositions (Gul et al., 2010). Reflective thinking, or reflection upon practice, is a key element of the undergraduate nursing curriculum and is valued by the profession; however, there is a dearth of instruments to measure reflective thinking in this context (Kember et al., 2000).
1.1. Literature review
Facione and Facione (1996, p. 133) stressed that critical thinking needs to be demonstrated by graduates, and that this demands constant metacognitive reflection on "what one is doing and why". Whilst critical thinking is considered a fundamental competency required of nurses and is a compulsory educational outcome of most undergraduate nursing programs (Gul et al., 2010), it is acknowledged as difficult to assess (Jones, 2009). Many educational courses aim to promote reflective thinking, or reflection upon practice; however, there is a scarcity of readily usable instruments to determine whether students engage in reflective thinking and, if so, to what extent (Kember et al., 2000).
* Corresponding author.
E-mail addresses: naomi.tutticci@qut.edu.au (N. Tutticci), p.lewis@qut.edu.au (P.A. Lewis), f.coyer@qut.edu.au (F. Coyer).
Contents lists available at ScienceDirect
Nurse Education in Practice
journal homepage: www.elsevier.com/nepr
http://dx.doi.org/10.1016/j.nepr.2016.03.001
1471-5953/© 2016 Elsevier Ltd. All rights reserved.
Nurse Education in Practice 18 (2016) 52–59

Kolb's (1984, p. 27) experiential learning theory requires the learner to become actively involved in the experience and to reflect on the experience during, as well as after, it (Clapper, 2010). In the simulation context, experiential learning is 'especially adaptable to adult learners; it gives the opportunity to see real consequences of one's actions, to feel the exhilaration of success and frustration of failure' (Gilley, 1990, p. 261). Kolb's theory provides a framework for simulation, where learners are able to apply their nursing knowledge to the care of a simulated patient within a safe environment, which leads to the improved acquisition of knowledge (Howard et al., 2011). Students' use of reflective thinking after any lived experience, whether clinical or simulated, should result in improved critical thinking, a more satisfied nurse, and, in the long run, better patient care (Sanford, 2010).
Chronister and Brown (2012) and Feingold et al. (2004) stated that simulation as a teaching strategy incorporates two of the seven principles for good practice in undergraduate education: active learning and prompt feedback (Chickering and Gamson, 1987), both of which can improve learning and, ultimately, learning satisfaction. These measures are evaluated in simulation contexts, but not from the perspective of identifying their contribution to final year nursing students' reflective thinking capabilities within high fidelity simulation (HFS).
Levett-Jones et al. (2011) found that student satisfaction helps to build self-confidence, which in turn helps students to develop skills and acquire knowledge. Jeffries and Rizzolo (2006) and Jarzemsky and McGrath (2008) reported higher levels of self-confidence following participation in clinical simulations. Self-efficacy is, therefore, particularly important to evaluate, as learning through simulation incorporates processes that support social cognitive theory: vicarious capability, cognition, self-regulation, and self-reflection (Sinclair and Ferguson, 2009). What is lacking in the literature is the measurement of self-belief about one's capacity to critically reflect, and of how the relationship between self-belief and critical reflection impacts on learner satisfaction and learning outcomes.
There is consensus in the literature that HFS is an effective teaching and learning methodology when best practice is adhered to (Cant and Cooper, 2010), an association related to clinical reasoning, namely critical thinking, clinical skill performance, and knowledge acquisition (Lapkin et al., 2010). The directions for future research on HFS are also agreed upon, and respond to the lack of unequivocal evidence on universal measures for documenting learning outcomes (Howard et al., 2011; Cant and Cooper, 2010), in particular for evaluating metacognitive skills such as reflection. The development of a valid and reliable survey to evaluate reflective practice, skills, and simulation learning experiences may ultimately improve the assessment of student performance (Lapkin et al., 2010) and facilitate the transfer of both generic skills and discipline-specific knowledge and skills into the workplace.
1.2. Research aims and questions
The aim of the pilot study was to determine the reliability and content validity of instruments to be used in a subsequent randomised controlled trial. The research questions were:
1) Do the Reflective Thinking Instrument and Critical Reflection Self-Efficacy Visual Analogue Scale (two instruments not previously tested for reliability and validity) demonstrate content validity?
2) What is the reliability of the Reflective Thinking and Simulation Survey (combined instruments)?
2. Research design
Multiple rationales for pilot studies exist, including refining the methodology for a larger study (Musil, 2006), evaluating the reliability of a research instrument or scale (Baird, 2000), and evaluating its validity for the population under study (Brink, 1988). The intent of this pilot study was to determine the reliability and validity of instruments either not previously tested, or not used together.
This study was divided into two phases. Phase One used individual 'think aloud' sessions and an expert panel to address the validity of previously untested instruments: the Reflective Thinking Instrument and the Critical Reflection Self-Efficacy Visual Analogue Scale (VAS). Phase Two used a post-test only design to address the reliability of all of the instruments used.
2.1. Setting
The setting for Phases One and Two was a metropolitan Australian university's school of nursing. The school is one of the largest schools of nursing in Australia (approximately 2500 students), offering a Bachelor of Nursing (BN) program and three double degrees (nursing/behavioural science, nursing/paramedic science, and nursing/public health).
High fidelity simulation laboratories are located onsite and use computerised full body manikins (METIman Nursing™ [CAE Healthcare] and SimMan 3G™ [Laerdal]) for the final clinical subjects' high fidelity simulation sessions. These sessions are conducted in a fully equipped clinical space with supporting technology and equipment to deliver nursing interventions expected within the scope of the scenario.
The study was undertaken between August and September 2014.
2.2. Participants
All BN students in their final year of study and enrolled in their
final clinical subject were eligible for study inclusion. Repeating
students were not excluded from the study.
2.3. Intervention
A standard high fidelity simulation for the final year clinical subject uses METIman Nursing™ and SimMan 3G™ positioned within replica clinical spaces. These spaces contain appropriate adjunctive technologies to support the simulated patients' clinical needs, for example intravenous infusion pumps, drains, single-lead and 12-lead electrocardiographs, and oxygen saturation and blood pressure monitoring providing real-time data.
For standard HFS sessions the manikins are programmed to provide realistic physiological responses to the facilitator's and students' actions. Conversations between the manikin and the student nurse occur through voice-over technology; that is, the facilitator speaks as the patient and as members of the multidisciplinary team (from a microphone in the control room). HFS participants carry out their assigned roles in response to a pre-programmed scenario, within this true-to-life setting.
All students enrolled in the final clinical subject are allocated to groups of nine members for a standard HFS. These HFS groups include one academic facilitator, four students who undertake the simulation, and four students who observe their peers undertaking the simulation. All nine are involved in the debrief, which is facilitated by the academic.
In addition to the standard high fidelity simulations (HFS) that
are offered in the final clinical subject, another five HFS were
offered to students who participated in Phase One of the pilot
study. These were delivered in the same way as a standard HFS
session is run for the final year clinical subject.
The scenario used in the specific Phase One HFS involved a terminally ill male patient with a history of pancreatic cancer, who presented to the emergency department with breakthrough abdominal pain. The patient had a nasogastric tube inserted by emergency staff and was transferred to the ward, the setting for the simulation. For high fidelity, the manikin was attached to a single-lead electrocardiograph, an oxygen saturation probe, and a blood pressure cuff. Each of these components provided real-time data to the simulation participants. The manikin was supported with intravenous fluids, which were infused via a hospital grade infusion pump.
The scenario used in Phase Two was part of the second semester curriculum for third year students and focused on the management of a male patient with chest trauma experiencing acute pain with a chest drain in situ. In this scenario the manikin was attached to intravenous fluids via a hospital grade infusion pump, and standard haemodynamic parameters were monitored and displayed on a monitor screen. Real-time data were accessible to the participants to assist them with clinical decision making. An Ocean™ Wet Suction Water Seal Drain was incorrectly positioned and not connected to wall suction, to provide additional high fidelity conditions by mimicking errors that can occur in the clinical environment.
2.4. Instruments
The Phase Two survey comprised five instruments, in addition to demographic questions and an open-ended question. Higher scores indicated more positive results for all instruments.
2.5. The reflective thinking instrument
This 16-item (five-point Likert scale) instrument assesses students' engagement in, and extent of, reflective thinking in professional preparation courses, in particular health science disciplines (Kember et al., 2000). It comprises four levels of reflection: habitual action, understanding, reflection, and critical reflection (Kember et al., 2000; Mann et al., 2009). Scores range from 16 to 80, with 80 indicating the highest possible score. Kember et al. (2000) demonstrated the instrument's reliability (α ranged from .621 to .757) for students enrolled in courses that include a professional practice component.
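As a rough illustration of how a 16-item, four-subscale Likert instrument of this kind is scored, the sketch below sums responses into subscale totals and a whole-scale total. The item-to-subscale mapping shown here is hypothetical; the actual allocation follows Kember et al. (2000).

```python
# Hypothetical item-to-subscale mapping (item indices 0-15); the real
# allocation follows Kember et al. (2000) and may differ.
SUBSCALES = {
    "habitual_action":     [0, 4, 8, 12],
    "understanding":       [1, 5, 9, 13],
    "reflection":          [2, 6, 10, 14],
    "critical_reflection": [3, 7, 11, 15],
}

def score_instrument(responses):
    """Sum 16 five-point Likert responses (1-5) into subscale and total scores.

    Totals range from 16 (all 1s) to 80 (all 5s), matching the score range
    reported for the Reflective Thinking Instrument.
    """
    if len(responses) != 16 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected 16 responses, each between 1 and 5")
    subscale_scores = {name: sum(responses[i] for i in items)
                       for name, items in SUBSCALES.items()}
    return subscale_scores, sum(responses)
```

Because each subscale is a plain sum of four items, the whole-scale score is simply the sum of the four subscale scores.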
2.6. Visual analogue scale (VAS)
Self-efficacy for critical reflection was measured using a single
item VAS (Kameg et al., 2009). The VAS was modified through
rewording of the original scale to ensure it measured the critical
reflection self-efficacy of third year undergraduate nursing stu-
dents. The VAS measured how confident participants felt about
their ability to critically reflect after clinical practice and was a
straight line from zero to 100 (zero represented ‘cannot do at all’
while 100 represented ‘highly certain can do’).
2.7. The general self-efficacy scale (GSES)
This instrument measures a global confidence in coping ability across a wide range of demanding or difficult situations, and reflects a broad and stable confidence in dealing effectively with rather diverse stressful situations (Rimm and Jerusalem, 1999, p. 330). Responses are made on a four-point Likert scale. The GSES consists of ten items and scores range from 10 to 40, with 40 indicating the highest possible score (Luszczynska et al., 2005; Schwarzer and Jerusalem, 1995).
2.8. The Educational Practices in Simulation Scale (EPSS)
This 16-item instrument uses a five-point Likert scale plus a 'not applicable' column (scored as a '6'), and is completed after the learner undertakes the simulation. The EPSS scores range from 16 to 96, with 96 indicating the highest possible score. The scale measures four educational practices (active learning, collaboration, diverse ways of learning, and high expectations) that are present in the simulation, to demonstrate the importance of each practice to the learner (Jeffries and Rizzolo, 2006; Swanson et al., 2011).
2.9. Satisfaction with Simulation Experience Scale (SSES)
This instrument consists of 18 items, requiring participants to rate their level of agreement with each statement on a five-point Likert scale. Items are grouped into three subscales: debrief and reflection, clinical reasoning, and clinical learning (Levett-Jones et al., 2011; Liaw et al., 2014). The SSES scores range from 18 to 90, with 90 indicating the highest possible score. The survey contained an open-ended question at the end to seek comment on the simulation experience.
2.10. Procedure
2.10.1. Phase One
The students were recruited to the individual 'think aloud' sessions (Fig. 1) through announcements, a three minute vodcast shown on the learning management system platform, and emails. The announcements were staggered over a five week period leading up to Phase One. The individual 'think aloud' sessions took place after the second HFS, and participants were asked to complete the Reflective Thinking Instrument and VAS. The participants were instructed to talk out loud about the questions whilst completing them on a laptop computer. Responses were audio-recorded.
Content validation of the Reflective Thinking Instrument and VAS by an expert panel followed on from the individual 'think aloud' sessions and subsequent instrument modification. The expert panel consisted of six academics (four nurse academics, one education academic, and an academic from Language and Learning Services) to ensure a sufficient level of control for chance agreements. Academics with expertise in HFS, reflective practice, and instrument and curriculum design were sought, along with an academic with expertise in international and indigenous students' learning needs. Panel members were provided with working definitions of the study's primary constructs, along with paper copies of the modified instruments. The review was requested within two weeks.
Fig. 1. Research design: Phases One and Two.
2.11. Phase Two
Students were recruited to Phase Two (Fig. 1) using the same
strategy as Phase One. Surveys were distributed electronically with
an online link using Key Survey v8.2. The survey remained open for
two weeks once the final clinical subject's simulations had been
completed.
2.12. Data analysis
2.12.1. Phase One
Verbatim comment analysis and grouping of similar responses were undertaken using the individual 'think aloud' transcripts. The expert panel rated each item of the modified Reflective Thinking Instrument and VAS using a four-point Likert scale. Comments were sought on each individual item to reduce question ambiguity and to provide new perspectives.
The content validity index (CVI) was applied to the modified Reflective Thinking Instrument and VAS, using both the scale-level CVI based on universal agreement (S-CVI/UA) and the average proportion of items rated 3 or 4 across the six experts (S-CVI/Ave). The S-CVI/UA is the proportion of items on an instrument that achieved a rating of 3 or 4 from all content experts, that is, universal agreement (Polit and Beck, 2006). The S-CVI/Ave should be at least .90, a higher value than is expected for the S-CVI/UA, because the former is much more liberal in its definition of congruence.
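Both scale-level indices can be computed directly from the panel's ratings. The following sketch uses illustrative ratings, not the study's data, and assumes every item is rated by every expert on the 4-point scale:

```python
def item_cvi(ratings):
    """I-CVI: proportion of experts rating an item 3 or 4 on the 4-point scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

def s_cvi_ua(all_ratings):
    """S-CVI/UA: proportion of items rated 3 or 4 by ALL experts."""
    universal = sum(1 for item in all_ratings if all(r >= 3 for r in item))
    return universal / len(all_ratings)

def s_cvi_ave(all_ratings):
    """S-CVI/Ave: mean of the item-level CVIs across all items."""
    return sum(item_cvi(item) for item in all_ratings) / len(all_ratings)

# Illustrative ratings for three items by six experts (not the study's data):
ratings = [
    [4, 4, 3, 4, 3, 4],  # all experts rate 3 or 4 -> universally congruent
    [4, 2, 4, 3, 4, 4],  # one expert rates below 3
    [3, 3, 4, 4, 4, 3],  # universally congruent
]
```

Because S-CVI/Ave averages item-level proportions while S-CVI/UA requires unanimity per item, S-CVI/Ave is always at least as high as S-CVI/UA, which is why a stricter threshold (.90) is applied to it.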
The modified Reflective Thinking Instrument and VAS were
combined with three other instruments into the ‘Reflective
Thinking and Simulation Survey’ and administered to a sample of
students enrolled in the third year final clinical subject. This survey
was used in a ‘post-test’ role, after the participants had completed a
simulation debrief.
2.13. Phase Two
Reliability was measured for each subscale and for the total scale using Cronbach's alpha. The alpha value was set at .70 to indicate sufficient internal consistency (Tavakol and Dennick, 2011). Subscale and full-scale alphas are both reported; however, the subscale alphas were used to determine internal consistency because of their measure of unidimensionality.
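For reference, Cronbach's alpha relates the sum of the item variances to the variance of the total score; a minimal computational sketch (not the study's analysis code) is:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)),
    where k is the number of items. Sample (ddof=1) variances are used.
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
```

Perfectly parallel items drive alpha to 1, while uncorrelated items push it toward 0, which is why alpha is read as a measure of how consistently a subscale's items tap one construct.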
Verbatim comments from the open-ended question in the Satisfaction with Simulation Experience Scale were allocated to groups based on the similarity of their content.
2.14. Ethical considerations
Ethical approval was obtained from the university's ethics
committee. All participants provided informed written consent for
Phase One and consent was implied with completion of the survey
in Phase Two.
3. Results
3.1. Phase One
Six students participated in the individual 'think aloud' sessions: five females and one male. Responses were gathered into three groups: 'thinking reflectively', 'simulation equates to clinical practice', and 'simulation specific'. A few responses referred to simulation directly; in most other instances it was referred to as 'clinical practice' (see Table 1).
As a result of these individual sessions, the instruments' questions were altered where necessary to refer to clinical practice as the context for the participants' reflective thinking. Participant responses indicated that simulation was considered a type of clinical practice. On the basis of the 'think aloud' sessions, no questions were removed from the instrument.
Individual CVI values for item two and the VAS were below the recognised acceptable level of .78 for six raters; consequently, this information was used to direct scrutiny towards item two and the VAS for content modification. Ten of the 17 items (.59) were rated as universally congruent by all six experts. The average proportion rated congruent across the experts was .90 (the recommended minimum for the S-CVI/Ave is .90).
No items were rejected by the expert panel. Comments by
expert panel members led to eight changes to the instruments’
items. The changes are outlined as follows:
1. Language used in items was individualised i.e. from ‘us’ to ‘me’,
‘you’ to ‘I’.
2. Items focusing on ‘understanding’ were expanded to include
‘application’.
3. Items were contextualised to nursing tasks instead of ‘actions’.
4. ‘Clinical practice’ was included in items to contextualise the
‘where’ (i.e. simulation).
5. ’Lecturer’ was replaced with ‘teacher’ to expand contexts of
learning to include simulation.
6. ‘Handout’ material was changed to ‘lecture/tutorial’ material to broaden the applicability of the item to the simulation learning experience.
7. Wording for the VAS was altered to plain English and contextualised to clinical practice.
8. Instructions on how to complete the VAS were altered to facilitate easier use and to provide a stronger link between the measured concept and the instrument.
3.2. Phase Two
Fifty-eight (n = 58) third year undergraduate students from a cohort of 231 (25%) participated in Phase Two. The majority of the sample were female (84.5%, 49/56) and domestic students (81.0%, 47/57) (see Table 2).
Internal consistency of the Reflective Thinking Instrument was measured for each subscale and for the instrument as a whole (Table 3). Alpha values across the four subscales (.608–.808) and for the whole instrument (.849) demonstrated good internal consistency of the instrument and each subscale.
The relationship between critical reflection self-efficacy (as measured by the Critical Reflection Self-Efficacy Visual Analogue Scale) and general self-efficacy (as measured by the General Self-Efficacy Scale) was investigated to determine reliability, using the Pearson product–moment correlation coefficient. Preliminary analysis was performed to ensure no violation of the assumptions of normality, linearity, and homoscedasticity. There was a small to medium positive correlation between the two variables (r = .324, n = 56, p = .048), with high levels of critical reflection self-efficacy associated with high levels of general self-efficacy.
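The product–moment coefficient reported above is the covariance of the paired scores divided by the product of their standard deviations; a minimal sketch, with illustrative data rather than the study's scores, is:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Covariance numerator and the two standard-deviation terms share the
    # same (n - 1) factor, so it cancels and is omitted throughout.
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)
```

A value of .324, as reported here, sits in the small-to-medium band of Cohen's conventional effect-size thresholds for r.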
3.3. Analysis of comments from the Satisfaction with Simulation Experience Scale
Of the 58 participants, 22 (38%) commented on their simulation experience as part of the Satisfaction with Simulation Experience Scale. The responses were collated and allocated into three distinct groups: simulation design, simulation experience, and simulation implementation. This allowed for inferences to be made about the characteristics and meaning of the participants' responses. The participant response groups are reported in Table 4 and supported by verbatim quotes.
Overall, the participants' responses about the simulation experience were positive. They described the HFS as 'very helpful', 'excellent', and 'educational', recognising its positive impact on learning, engagement, and preparation for clinical placement. The most recognisable comments regarding the 'simulation experience' (Table 4, simulation experience) centred on HFS as a valuable learning experience, especially from a practical skill/clinical perspective. This was supported by the 'simulation implementation' group (Table 4, simulation implementation), which included comments about the benefits of increasing the frequency of simulation within the curriculum, particularly before clinical placement. The belief was expressed that, with more regular simulations, greater insight could be gained into different clinical issues.
It was recognised that some participants did not value the experience positively; rather, they saw it as a university strategy to increase hours of study in place of clinical practicum hours (hospital-based placement).
Table 1
Phase One data comments and evidence: 'reflective thinking and simulation'.

Theme: Thinking reflectively
- Q14. In this course you have to continually think about material you are being taught.
  'Agree with reservation – it helps us critically think.' (Participant 3)
- Q7. I like to think over what I have been doing and consider alternative ways of doing it.
  'Maybe too much – so I definitely agree, I'm an overthinker.' (Participant 5)
- Q8. I often reflect on my actions to see whether I could have improved on what I did.
  'I don't do it much, I have to do it – agree with reservation.' (Participant 4)
- Q3. I sometimes question the way others do something and try to think of a better way.
  'Agree with reservation – I don't always question people and often people are doing the right thing, so there is no reason to. Sometimes I don't realise people are doing the wrong thing. I will evaluate and think of better options if I do pick someone up on something that's not completely right.' (Participant 6)

Theme: Simulation equates to clinical practice
- Q13. If I follow what the lecturer says, I do not have to think too much in this course.
  'Lecturers teaching the right things to be a safe, competent nurse so if you don't have to think too much – definitely disagree – you always have to think, like I said (when I got) my learning contract, it taught me so much – you think about everything you do, no exceptions.' (Participant 6)
- Q8. This course has challenged some of my firmly held ideas.
  'Disagree with reservation – try to change beliefs about how you care for patients.' (Participant 1)
- Q6. To pass this course you need to understand the content.
  'Definitely agree – I think with any practical, like clinical nursing, you need to understand what you are doing and the theory behind it.' (Participant 2)

Theme: Simulation specific
- Visual Analogue Scale
  'In regards to simulation maybe no, I don't think I did that really well. If I had more resources would I be more self-efficacious (sic)?' (Participant 1)
- Q3. I sometimes question the way others do something and try to think of a better way.
  'Agree with reservation – not in the process of simulation, but if something going on is awful …' (Participant 1)
Table 2
Phase Two third year undergraduate nurse characteristics.

Characteristic                              Number (%)
Gender
  Male                                      7 (12.1%)
  Female                                    49 (84.5%)
Student status
  Domestic                                  47 (81.0%)
  International                             10 (17.2%)
Age
  15–25                                     25 (43.1%)
  26–35                                     21 (36.2%)
  36–45                                     10 (17.2%)
  46–55                                     2 (3.4%)
  56–65                                     0 (0%)
  66+                                       0 (0%)
Previously completed a degree
  Yes                                       21 (36.2%)
  No                                        37 (63.8%)
Entry pathway
  3-year programme                          22 (37.9%)
  Graduate Entry programme                  24 (41.4%)
  Enrolled Nurse Articulation programme     4 (6.9%)
  Double Degree programme                   7 (12.1%)
Grade point average
  0–3                                       0 (0%)
  3–4                                       2 (3.4%)
  4–5                                       11 (19.0%)
  5–6                                       28 (48.3%)
  6–7                                       17 (29.3%)
'Simulation design' (Table 4, simulation design) continued this constructive dialogue, as it focused on the lack of variety in the scenarios provided and the negative impact of the reduced length of the debrief on learning.
4. Discussion
The inclusion and measurement of reflective thinking within undergraduate nursing curricula facilitates student acquisition of essential meta-teaching and metacognitive skills (McNamara, 1990; Noffke and Brennan, 1988). In this study, two instruments, the Reflective Thinking Instrument and the Critical Reflection Self-Efficacy VAS, were tested for content validity and reliability, and were found to consist of relevant content and to demonstrate internal consistency. This finding enabled the combination of these two instruments with other valid and reliable instruments (measuring self-efficacy, educational practices associated with simulation, and simulation satisfaction) into one survey, which was piloted on third year nursing undergraduates during their final clinical subject. This result has enabled the measurement of reflection, self-efficacy, learning, and satisfaction experiences to be progressed within the context of high fidelity simulation, an outcome not previously attained.
This evaluation of cognitive and learning processes, experiences, and outcomes is a previously under-researched phenomenon in the context of simulation and preparation for professional practice. A continuing theme in the nursing and medical literature related to the use of simulation is the question of the transferability of competence (cognitive, affective, and psychomotor skills) from a simulated setting to a real clinical setting (Feingold et al., 2004;
Table 3
Reflective Thinking and Simulation Survey scale reliability.

Scale / Subscale                                                  Cronbach's alpha
Reflective Thinking
  Habitual Action                                                 .701
  Understanding (a)                                               .682
  Reflection (b)                                                  .608
  Critical Reflection                                             .808
  Whole Scale                                                     .849
General Self-Efficacy Scale (c)                                   .754
Educational Practices in Simulation Scale – Presence of Educational Practices
  Active Learning (d)                                             .794
  Collaboration                                                   .835
  Diverse Ways of Learning                                        .847
  High Expectations (e)                                           .618
  Whole Scale                                                     .856
Educational Practices in Simulation Scale – Importance of Educational Practices
  Active Learning                                                 .873
  Collaboration                                                   .880
  Diverse Ways of Learning (f)                                    .647
  High Expectations                                               .854
  Whole Scale                                                     .863
Satisfaction with Simulation Experience
  Debrief and Reflection                                          .931
  Clinical Reasoning                                              .884
  Clinical Learning                                               .839
  Whole Scale                                                     .948

(a) Understanding: there were no negative values in the inter-item correlation matrix and no corrected item–total correlations were less than .3.
(b) Reflection: item three had a corrected item–total correlation of less than .3 (.128). Removing item three improved Cronbach's alpha from .608 to .722.
(c) GSES: items two and eight had a negative value of −.053 in the inter-item correlation matrix, and the corrected item–total correlation for item two was less than .3 (.133). Item two was retained.
(d) EPSS, active learning: there were no negative values; item two had a low corrected item–total correlation (.070).
(e) EPSS, high expectations: the mean inter-item correlation was .447, suggesting a moderate correlation. There were negative values in the inter-item correlation matrix, and the corrected item–total correlations for items two, 15, and 16 were less than .3. Item two was retained.
(f) EPSS, diverse ways of learning: the mean inter-item correlation was .478.
Table 4
Qualitative data groups and evidence: 'satisfaction with high fidelity simulation'.

Simulation design
- 'As the sim was based around communication having four students in the room made it too hard to be able to do tasks as there was not really enough for everyone to do, two or three would have been better.' (Participant 13)
- 'We need a variety of scenarios.' (Participant 46)
- 'The debrief could possibly be longer with regard to discussion of the content.' (Participant 18)

Simulation experience
- 'Educational and eye opening to palliative care and effective communication.' (Participant 51)
- 'The simulation experience is an excellent learning tool for students to practice the skills and concepts in a "real" environment without any actual risk to the patient.' (Participant 47)
- 'I find them nerve-wracking, but a good experience.' (Participant 18)

Simulation implementation
- 'Overall I believe simulation is very helpful, especially before off campus placement.' (Participant 53)
- 'I find Sim to be the most valuable learning experience and would love to do more.' (Participant 6)
- '…please have at least 10 sim sessions which can give some insight into different clinical issues.' (Participant 50)
- 'There were some technical issues (volume too low), which would have made the experience more useful if corrected.' (Participant 16)
N. Tutticci et al. / Nurse Education in Practice 18 (2016) 52–59
Murray et al., 2008; Swenty and Eggleston, 2011). Cioffi (2001) and
Lasater (2007) were emphatic that simulation does not replace
clinical experience, but it can extend the subject matter to equip
learners with some skills that can be directly transferred into the
‘real’ clinical setting.
Participants were satisfied with simulation as a valuable
learning experience and inferred it was a type of clinical practice.
This association between high fidelity simulation and clinical
practice could promote student reflective thinking and assist with
the transfer of knowledge (Daley and Campbell, 2013, p. 2), further
enhancing the development of generic graduate capabilities and
improved transition to practice. Examining this association by measuring reflective thinking and evaluating the simulation experience may well lead to a greater understanding of students' reflective capabilities and their simulation experience, in addition to addressing the question of how effective simulation is as a type of clinical experience. Certainly, these findings demonstrate the applicability of the selected instruments, grouped into one survey, for a larger trial in the future.
The pilot study and survey instruments have several limitations. Item 14 in the Educational Practices in Simulation Scale used a slightly different Likert scale, offering five options instead of six. The scale was missing the option of ‘not applicable – the statement does not pertain to the simulation performed’, which could potentially create missing data. This item was retained in the analysis for reliability; however, for validation of the survey the item would need to return to its original form. A number of items from the unmodified instruments were found to have Cronbach's alpha values below .7. These items were retained in the established instruments to enable comparison of results between future similar studies using these instruments.
A small to medium correlation was found between two variables (the VAS and the General Self-Efficacy Scale), suggesting that the VAS is a reliable instrument. Testing the reliability of this instrument on a larger sample, or selecting and using another scale that is a closer substitute, would be recommended.
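The association reported earlier (r = .324, n = 56, p = .048) is a Pearson correlation interpreted against Cohen's conventional magnitude labels. A minimal sketch follows; the scores are simulated stand-ins, not the study data, and the instrument totals are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 56  # complete VAS/GSES pairs, as reported

gses = rng.normal(loc=30, scale=4, size=n)             # hypothetical GSES totals
vas = 0.3 * (gses - 30) + rng.normal(scale=4, size=n)  # hypothetical VAS scores, weakly related

r = float(np.corrcoef(vas, gses)[0, 1])
# Significance testing uses t = r * sqrt((n - 2) / (1 - r^2)) with df = n - 2:
t = r * np.sqrt((n - 2) / (1 - r ** 2))

def cohen_magnitude(r_abs):
    """Cohen's (1988) conventional labels for the size of |r|."""
    if r_abs < 0.3:
        return "small"
    if r_abs < 0.5:
        return "medium"
    return "large"
```

On these conventions, r = .324 sits just above the small/medium boundary at .3, which is why the paper describes it as a "small to medium" correlation.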
The ‘think aloud’ sessions were included in Phase One to ascertain the relevance of the reflective thinking instrument to the end users, that is, third year undergraduate nursing students participating in HFS. A small convenience sample of six participants (Phase One ‘think aloud’) does carry an inherent risk of bias. However, given the qualitative nature of this data collection method, a small sample was considered sufficient to obtain an account of the participants' experience interacting with this instrument.
5. Conclusion
This study served as a starting point in validating a new survey of reflective thinking and simulation satisfaction in undergraduate student nurses. The research findings demonstrated that the Reflective Thinking and Simulation Satisfaction survey is reliable. Students also verbalised their satisfaction with simulation as a valuable learning experience and inferred that it was a type of clinical practice. The study findings suggest that HFS is reasonably successful at imitating clinical experience and providing a satisfying learning experience, with certain limitations associated with simulation design and implementation. Further development of the Reflective Thinking and Simulation Satisfaction survey to establish validity is recommended to make it a viable survey.
Conflict of interest statement
There are no conflicts of interest.
Acknowledgements
We would like to thank the study participants for their time. Dr Naomi Malouf provided invaluable assistance in the implementation of this pilot study. Kylie Morris provided editorial assistance for this manuscript.
References
Baird, C., 2000. Taking the mystery out of research: the pilot study. Orthop. Nurs. 19 (2), 42–43. Retrieved from http://search.proquest.com/docview/195967475?accountid=13380.
Brink, P.J., 1988. The pilot study. West. J. Nurs. Res. 10, 237–238.
Cant, R., Cooper, S., 2010. Simulation-based learning in nurse education: systematic review. J. Adv. Nurs. 66 (1), 3–15. http://dx.doi.org/10.1111/j.1365-2648.2009.05240.x.
Chickering, A.W., Gamson, Z.F., 1987. Seven principles for good practice in undergraduate education. AAHE Bull. 3 (7). Retrieved from http://search.proquest.com/docview/63263975?accountid=13380.
Chronister, C., Brown, D., 2012. Comparison of simulation debriefing methods. Clin. Simul. Nurs. 8 (7), e281–e288. http://dx.doi.org/10.1016/j.ecns.2010.12.005.
Cioffi, J., 2001. Clinical simulations: development and validation. Nurse Educ. Today 21 (6), 477–486.
Clapper, T.C., 2010. Beyond Knowles: what those conducting simulations need to know about Adult Learning Theory. Clin. Simul. Nurs. 6 (1), e7–e14. http://dx.doi.org/10.1016/j.ecns.2009.07.003.
Daley, K.M., Campbell, S.H., 2013. Simulation-focused pedagogy for nursing education. In: Daley, K.M., Campbell, S.H. (Eds.), Simulation Scenarios for Nurse Educators: Making it Real, second ed. Springer Publishing Company, New York, pp. 1–8.
Facione, N.C., Facione, P.A., 1996. Externalizing the critical thinking in knowledge development and clinical judgment. Nurs. Outlook 44, 129–136.
Feingold, C., Calaluce, M., Kallen, M., 2004. Computerized patient model and simulated clinical experiences: evaluation with baccalaureate nursing students. J. Nurs. Educ. 43 (4), 156–163. Retrieved from http://search.proquest.com/docview/203944152?accountid=13380.
Forneris, S., Peden-McAlpine, C., 2007. Evaluation of a reflective learning intervention to improve critical thinking in novice nurses. J. Adv. Nurs. 57 (4), 410–421. http://dx.doi.org/10.1111/j.1365-2648.2007.04120.x.
Gilley, J.W., 1990. Demonstration and simulation. In: Galbraith, M.W. (Ed.), Adult Learning Methods: a Guide for Effective Instruction. Krieger, Malabar, FL, pp. 261–281.
Gul, R., Cassum, S., Ahmad, A., Khan, S., Saeed, T., Parpio, Y., 2010. Enhancement of critical thinking in curriculum design and delivery: a randomised controlled trial for educators. Procedia Soc. Behav. Sci. 2, 3219–3225.
Howard, V.M., Englert, N., Kameg, K., Perozzi, K., 2011. Integration of simulation across the undergraduate curriculum: student and faculty perspectives. Clin. Simul. Nurs. 7, e1–e10. http://dx.doi.org/10.1016/j.ecns.2009.10.004.
Jarzemsky, P.A., McGrath, J., 2008. Look before you leap: lessons learned when introducing clinical simulation. Nurse Educ. 33 (2), 90–95.
Jeffries, P.R., Rizzolo, M.A., 2006. Designing and Implementing Models for the Innovative Use of Simulation to Teach Nursing Care of Ill Adults and Children: a National, Multi-site, Multi-method Study. National League for Nursing, New York.
Jones, A., 2009. Generic attributes as espoused theory: the importance of context. High. Educ. 58 (2), 175–191.
Kameg, K., Mitchell, A., Clochesy, J., Howard, V., Suresky, J., 2009. Communication and human patient simulation in psychiatric nursing. Issues Ment. Health Nurs. 30 (8), 503–508. http://dx.doi.org/10.1080/01612840802601366.
Kember, D., Leung, D.P., Jones, A., Loke, A., McKay, J., Sinclair, K., Yeung, E., 2000. Development of a questionnaire to measure the level of reflective thinking. Assess. Eval. High. Educ. 25 (4), 381–395. http://dx.doi.org/10.1080/026029300449272.
Kolb, D., 1984. Experiential Learning: Experience as the Source of Learning and Development. Prentice Hall, Upper Saddle River, NJ.
Lapkin, S., Levett-Jones, T., Bellchambers, H., Fernandez, R., 2010. Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: a systematic review. Clin. Simul. Nurs. 6 (6), e207–e222. http://dx.doi.org/10.1016/j.ecns.2010.05.005.
Lasater, K., 2007. High-fidelity simulation and the development of clinical judgment: students' experiences. J. Nurs. Educ. 46 (6), 269–276. Retrieved from http://search.proquest.com/docview/203965988?accountid=13380.
Levett-Jones, T., McCoy, M., Lapkin, S., Noble, D., Hoffman, K., Dempsey, J., Arthur, C., Roche, J., 2011. The development and psychometric testing of the satisfaction with simulation experience scale. Nurse Educ. Today 31, 705–710.
Liaw, S.Y., Zhou, W.T., Lau, T.C., Siau, C., Chan, S.W., 2014. An interprofessional communication training using simulation to enhance safe care for a deteriorating patient. Nurse Educ. Today 34 (2), 259–264. http://dx.doi.org/10.1016/j.nedt.2013.02.019.
Luszczynska, A., Scholz, U., Schwarzer, R., 2005. The general self-efficacy scale: multicultural validation studies. J. Psychol. 139 (5), 259–264.
Mann, K., Gordon, J., MacLeod, A., 2009. Reflection and reflective practice in health professions education: a systematic review. Adv. Health Sci. Educ. 14, 595–621.
McNamara, D., 1990. Research on teachers' thinking: its contribution to educating student teachers to think critically. J. Educ. Teach. 16 (2), 147–160.
Murray, C., Grant, M.J., Howarth, M.L., Leigh, J., 2008. The use of simulation as a teaching and learning approach to support practice learning. Nurse Educ. Pract. 8 (1), 5–8.
Musil, C.M., 2006. Pilot study. Springer Publishing Company, New York. Retrieved from http://search.proquest.com/docview/189467802?accountid=13380.
Noffke, S., Brennan, M., 1988. The dimensions of reflection: a conceptual and contextual analysis. In: Annual Meeting of the AERA, New Orleans.
Polit, D.F., Beck, C.T., 2006. The content validity index: are you sure you know what's being reported? Critique and recommendations. Res. Nurs. Health 29, 489–497.
Rimm, H., Jerusalem, M., 1999. Adaptation and validation of an Estonian version of the general self-efficacy scale (ESES). Anxiety Stress Coping 12 (3), 329–345.
Sanford, P.G., 2010. Simulation in nursing education: a review of the research. Qual. Rep. 15 (4), 1006–1011. Retrieved from http://search.proquest.com/docview/757175213?accountid=13380.
Schwarzer, R., Jerusalem, M., 1995. Generalized self-efficacy scale. In: Weinman, J., Wright, S., Johnston, M. (Eds.), Measures in Health Psychology: a User's Portfolio. Causal and Control Beliefs. NFER-NELSON, Windsor, England, pp. 35–37.
Sinclair, B., Ferguson, K., 2009. Integrating simulated teaching/learning strategies in undergraduate nursing education. Int. J. Nurs. Educ. Scholarsh. 6 (1). http://dx.doi.org/10.2202/1548-923X.1676.
Sobral, D., 2000. An appraisal of medical students' reflection-in-learning. Med. Educ. 34 (3), 182–187.
Swanson, E.A., Nicholson, A.C., Boese, T.A., Cram, E., Stineman, A.M., Tew, K., 2011. Comparison of selected teaching strategies incorporating simulation and student outcomes. Clin. Simul. Nurs. 7 (3), e81–e90. http://dx.doi.org/10.1016/j.ecns.2009.12.011.
Swenty, C.F., Eggleston, B.M., 2011. The evaluation of simulation in a baccalaureate nursing program. Clin. Simul. Nurs. 7 (5), e181–e187. http://dx.doi.org/10.1016/j.ecns.2010.02.006.
Tavakol, M., Dennick, R., 2011. Making sense of Cronbach's alpha. Int. J. Med. Educ. 2, 53–55. Retrieved from http://search.proquest.com/docview/898889039?accountid=13380.
  • 1. Measuring third year undergraduate nursing students' reflective thinking skills and critical reflection self-efficacy following high fidelity simulation: A pilot study Naomi Tutticci a, * , Peter A. Lewis a , Fiona Coyer a, b a School of Nursing, Queensland University of Technology, Victoria Park Road, Kelvin Grove, QLD 4059, Australia b Department of Intensive Care Medicine, Royal Brisbane and Women's Hospital, Victoria Park Road, Kelvin Grove, QLD 4059, Australia a r t i c l e i n f o Article history: Received 20 July 2015 Received in revised form 5 November 2015 Accepted 11 March 2016 Keywords: Nursing Self-efficacy Reflection High fidelity simulation a b s t r a c t Critical reflection underpins critical thinking, a highly desirable generic nursing graduate capability. To improve the likelihood of critical thinking transferring to clinical practice, reflective thinking needs to be measured within the learning space of simulation. This study was divided into two phases to address the reliability and validity measures of previously untested surveys. Phase One data was collected from in- dividuals (n ¼ 6) using a ‘think aloud’ approach and an expert panel to review content validity, and verbatim comment analysis was undertaken. The Reflective Thinking Instrument and Critical Reflection Self-Efficacy Visual Analogue Scale items were contextualised to simulation. The expert review confirmed these instruments exhibited content validity. Phase Two data was collected through an online survey (n ¼ 58). Cronbach's alpha measured internal consistency and was demonstrated by all subscales and the Instrument as a whole (.849). There was a small to medium positive correlation between critical reflection self-efficacy and general self-efficacy (r ¼ .324, n ¼ 56, p ¼ .048). Participant responses were positive regarding the simulation experience. The research findings demonstrated that the Reflective Thinking and Simulation Satisfaction survey is reliable. 
Further development of this survey to establish validity is recommended to make it viable. © 2016 Elsevier Ltd. All rights reserved.

1. Introduction

Critical reflection is a meta-cognitive skill, or the awareness of one's own cognitive functioning (Sobral, 2000), and is a key mechanism in the process of critical thinking (Forneris and Peden-McAlpine, 2007, p. 411). The identification and measurement of critical reflection in a pre-transitioning undergraduate is essential if critical thinking is to be an identifiable outcome of undergraduate nursing education. There is evidence to suggest that teaching strategies such as reflection, which require active engagement by the student, can be effective in developing and promoting critical thinking skills and dispositions (Gul et al., 2010). Reflective thinking, or reflection upon practice, is a key element of the undergraduate nursing curriculum and is valued by the profession; however, there is a dearth of instruments to measure reflective thinking in this context (Kember et al., 2000).

1.1. Literature review

Facione and Facione (1996, p. 133) stressed that critical thinking needs to be demonstrated by graduates and that this demands constant metacognitive reflection on "what one is doing and why". Whilst critical thinking is considered to be a fundamental competency required of nurses and is a compulsory educational outcome of most undergraduate nursing programs (Gul et al., 2010), it is acknowledged as being difficult to assess (Jones, 2009). Many educational courses aim to promote reflective thinking, or reflection upon practice; however, there is a scarcity of readily usable instruments to determine if students engage in reflective thinking, and, if so, to what extent (Kember et al., 2000). Kolb's (1984, p. 27) experiential learning theory requires the learner to become actively involved in the experience and to reflect on the experience both during and after it (Clapper, 2010).
* Corresponding author. E-mail addresses: naomi.tutticci@qut.edu.au (N. Tutticci), p.lewis@qut.edu.au (P.A. Lewis), f.coyer@qut.edu.au (F. Coyer).
Nurse Education in Practice 18 (2016) 52–59. http://dx.doi.org/10.1016/j.nepr.2016.03.001

In the simulation context, experiential learning is 'especially adaptable to adult learners; it gives the opportunity to see real consequences of
one's actions, to feel the exhilaration of success and frustration of failure' (Gilley, 1990, p. 261). Kolb's theory provides a framework for simulation, where learners are able to apply their nursing knowledge to the care of a simulated patient within a safe environment, which leads to the improved acquisition of knowledge (Howard et al., 2011). Students' use of reflective thinking after any lived experience, whether clinical or simulated, should result in improved critical thinking, a more satisfied nurse, and, in the long run, better patient care (Sanford, 2010).

Chronister and Brown (2012) and Feingold et al. (2004) stated that simulation as a teaching strategy incorporates two of the seven principles for good practice in undergraduate education: active learning and prompt feedback (Chickering and Gamson, 1987), both of which can improve learning and, ultimately, learning satisfaction. These measures are evaluated in simulation contexts, but not from the perspective of identifying their contribution to final year nursing students' reflective thinking capabilities within high fidelity simulation (HFS).

Levett-Jones et al. (2011) found that student satisfaction helps to build self-confidence, which in turn helps students to develop skills and acquire knowledge. Jefferies and Rizzolo (2006) and Jarzemsky and McGrath (2008) reported higher levels of self-confidence following participation in clinical simulations. Self-efficacy is, therefore, particularly important to evaluate, as learning through simulation incorporates processes that support social cognitive theory: vicarious capability, cognition, self-regulation, and self-reflection (Sinclair and Ferguson, 2009). What is lacking in the literature is the measurement of self-belief about one's capacity to critically reflect, and how the relationship between self-belief and critical reflection impacts on learner satisfaction and learning outcomes.
There is consensus in the literature that HFS is an effective teaching and learning methodology when adhering to best practice (Cant and Cooper, 2010), an association related to clinical reasoning, namely critical thinking, clinical skill performance, and knowledge acquisition (Lapkin et al., 2010). The directions for future research on HFS are also agreed upon, and respond to the lack of unequivocal evidence of universal measures of how to document learning outcomes (Howard et al., 2011; Cant and Cooper, 2010), in particular evaluating metacognitive skills such as reflection. The development of a valid and reliable survey to evaluate reflective practice, skills, and simulation learning experiences may ultimately improve the assessment of student performance (Lapkin et al., 2010) and facilitate the transfer of both generic skills and discipline specific knowledge and skills into the workplace.

1.2. Research aims and questions

The aim of the pilot study was to determine the reliability and content validity of instruments to be used in a subsequent randomised controlled trial. The research questions were:

1) Do the Reflective Thinking Instrument and Critical Reflection Self-Efficacy Visual Analogue Scale (two instruments not previously tested for reliability and validity) demonstrate content validity?
2) What is the reliability of the Reflective Thinking and Simulation Survey (combined instruments)?

2. Research design

Multiple rationales for pilot studies exist, including refining the methodology for a larger study (Musil, 2006), evaluating the reliability of a research instrument or scale (Baird, 2000), and evaluating their validity for the population under study (Brink, 1988). The intent of this pilot study was to determine the reliability and validity of instruments either not previously tested, or not used together.
This study was divided into two phases. Phase One used individual 'think aloud' sessions and an expert panel to address the validity measures of previously untested instruments: the Reflective Thinking Instrument and the Critical Reflection Self-Efficacy Visual Analogue Scale (VAS). Phase Two used a post-test only design to address the reliability measures of all of the instruments used.

2.1. Setting

The setting for Phases One and Two was a metropolitan Australian university's school of nursing. The school is one of the largest schools of nursing in Australia (approximately 2500 students), offering a Bachelor of Nursing (BN) program and three double degrees (nursing/behavioural science, nursing/paramedic science, and nursing/public health). High fidelity simulation laboratories are located onsite and use computerised full body manikins (METIman Nursing™ [CAE Healthcare] and SimMan 3G™ [Laerdal]) for the final clinical subjects' high fidelity simulation sessions. These sessions are conducted in a fully equipped clinical space with supporting technology and equipment to deliver nursing interventions expected within the scope of the scenario. The study was undertaken between August and September 2014.

2.2. Participants

All BN students in their final year of study and enrolled in their final clinical subject were eligible for study inclusion. Repeating students were not excluded from the study.

2.3. Intervention

A standard high fidelity simulation for the final year clinical subject uses METIman Nursing™ and SimMan 3G™ positioned within replica clinical spaces. These spaces contain appropriate adjunctive technologies to support the simulated patients' clinical needs, for example: intravenous infusion pumps, drains, single lead and 12 lead electrocardiograph, and oxygen saturation and blood pressure monitoring providing real time data.
For standard HFS sessions the manikins are programmed to provide realistic physiological responses to the facilitator's and students' actions. Conversations between the manikin and the student nurse occur through voice-over technology; that is, the facilitator speaks as the patient and members of the multidisciplinary team (from a microphone in the control room). HFS participants carry out their assigned roles in response to a pre-programmed scenario, within this true-to-life setting.

All students enrolled in the final clinical subject are allocated to groups consisting of nine members for a standard HFS. These HFS groups include: one academic facilitator, four students who undertake the simulation, and four students who observe their peers undertaking the simulation. All nine are involved in the debrief, which is facilitated by the academic.

In addition to the standard high fidelity simulations (HFS) that are offered in the final clinical subject, another five HFS were offered to students who participated in Phase One of the pilot study. These were delivered in the same way as a standard HFS session is run for the final year clinical subject. The scenario used in the specific Phase One HFS involved a terminally ill male patient with a history of pancreatic cancer, who presented to the emergency department with breakthrough
abdominal pain. The patient had a nasogastric tube inserted by emergency staff and was transferred to the ward, the setting for the simulation. For high fidelity the manikin was attached to a single lead electrocardiograph, an oxygen saturation probe, and a blood pressure cuff. Each of these components was providing real time data to the simulant participants. The manikin was supported with intravenous fluids which were being infused via a hospital grade infusion pump.

The scenario used in Phase Two was part of the second semester curricula for third year students and focused on the management of a male patient with chest trauma experiencing acute pain with a chest drain in situ. In this scenario the manikin was attached to intravenous fluids via a hospital grade infusion pump, and standard haemodynamic parameters were being monitored and displayed visually on a monitor screen. Real time data was accessible to the participants to assist them with clinical decision making. An Ocean™ Wet Suction Water Seal Drain was incorrectly positioned and not connected to wall suction, to provide additional high fidelity conditions in terms of mimicking errors that can occur in the clinical environment.

2.4. Instruments

The Phase Two survey was comprised of five instruments, in addition to demographic questions and an open-ended question. Higher scores indicated more positive results for all instruments.

2.5. The reflective thinking instrument

This 16-item (5 point Likert scale) instrument assesses students' engagement in, and extent of, reflective thinking in professional preparation courses, in particular health science disciplines (Kember et al., 2000). It is comprised of four levels of reflection: habitual action, understanding, reflection, and critical reflection (Kember et al., 2000; Mann et al., 2009). Scores range from 16 to 80, with 80 indicating the highest possible score. Kember et al.
(2000) demonstrated the instrument's reliability (α ranged from .621 to .757) for students enrolled in courses that include a professional practice component.

2.6. Visual analogue scale (VAS)

Self-efficacy for critical reflection was measured using a single item VAS (Kameg et al., 2009). The VAS was modified through rewording of the original scale to ensure it measured the critical reflection self-efficacy of third year undergraduate nursing students. The VAS measured how confident participants felt about their ability to critically reflect after clinical practice and was a straight line from zero to 100 (zero represented 'cannot do at all' while 100 represented 'highly certain can do').

2.7. The general self-efficacy scale (GSES)

This instrument measures a global confidence in coping ability across a wide range of demanding or difficult situations, and reflects a broad and stable confidence in dealing effectively with rather diverse stressful situations (Rimm and Jerusalem, 1999, p. 330). Responses are made on a four-point Likert scale. The GSES consists of ten items and scores range from 10 to 40, with 40 indicating the highest possible score (Luszczynska et al., 2005; Schwarzer and Jerusalem, 1995).

2.8. The Educational Practices in Simulation Scale (EPSS)

This 16-item instrument uses a five-point Likert scale and a not applicable column (scored as a '6'), and is completed after the learner undertakes the simulation. The EPSS scores range from 16 to 96, with 96 indicating the highest possible score. The scale measures four educational practices (active learning, collaboration, diverse ways of learning, and high expectations), which are present in the simulation to demonstrate the importance of each practice to the learner (Jefferies and Rizzolo, 2006; Swanson et al., 2011).

2.9.
Satisfaction with Simulation Experience Scale (SSES)

This instrument consists of 18 items, requiring participants to rate their level of agreement with each statement on a five-point Likert scale. Items are grouped into three subscales: debrief and reflection, clinical reasoning, and clinical learning (Levett-Jones et al., 2011; Liaw et al., 2014). The SSES scores range from 18 to 90, with 90 indicating the highest possible score. The survey contained an open ended question at the end to seek comment on the simulation experience.

2.10. Procedure

2.10.1. Phase One

The students were recruited to the individual 'think aloud' sessions (Fig. 1) through announcements, a three minute vodcast shown on the learning management system platform, and emails. The announcements were staggered over a five week period leading up to Phase One. The individual 'think aloud' sessions took place after the second HFS, and participants were asked to complete the Reflective Thinking Instrument and VAS. The participants were instructed to talk out loud about the questions whilst completing them on a laptop computer. Responses were audio-recorded.

Fig. 1. Research design – Phase One and Two.

Content validation of the Reflective Thinking and VAS instruments by an expert panel followed on from the individual 'think aloud' sessions and subsequent instrument modification. The expert panel consisted of six academics (four nurse academics, one
education academic, and an academic from Language and Learning Services) to ensure a sufficient level of control for chance agreements. Academics with expertise in HFS, reflective practice, and instrument and curriculum design were sought, along with an academic with expertise in international and indigenous students' learning needs. Panel members were provided with working definitions of the study's primary constructs, along with paper copies of the modified instruments. The review was requested within two weeks.

2.11. Phase Two

Students were recruited to Phase Two (Fig. 1) using the same strategy as Phase One. Surveys were distributed electronically with an online link using Key Survey v8.2. The survey remained open for two weeks once the final clinical subject's simulations had been completed.

2.12. Data analysis

2.12.1. Phase One

Verbatim comment analysis and grouping of similar responses was undertaken using the individual 'think aloud' transcripts. The expert panel rated each item of the modified Reflective Thinking Instrument and VAS using a four-point Likert scale. Comments were sought on each individual item to reduce question ambiguity and to provide new perspectives. The content validity index (CVI) was applied to the modified Reflective Thinking Instrument and VAS, reporting both the scale level CVI based on universal agreement (S-CVI/UA), that is, the proportion of items rated 3 or 4 by all content experts (Polit and Beck, 2006), and the average proportion of items rated 3 or 4 across the six experts (S-CVI/Ave). The S-CVI/Ave should be at least .90, a higher threshold than for the S-CVI/UA, because the average index is much more liberal in its definition of congruence.

The modified Reflective Thinking Instrument and VAS were combined with three other instruments into the 'Reflective Thinking and Simulation Survey' and administered to a sample of students enrolled in the third year final clinical subject.
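As an illustration, the item-level and scale-level CVI quantities described above can be computed in a few lines of Python. The ratings matrix below is hypothetical, not the study's expert-panel data; only the formulas (proportion of experts rating an item 3 or 4, universal agreement across items, and the average item-level index, following Polit and Beck, 2006) are taken from the text.

```python
# Sketch of the content validity index (CVI) calculations described above.
# Illustrative data: rows are six experts, columns are instrument items,
# each rating on the 4-point relevance scale.
ratings = [
    [4, 3, 4, 2, 4],  # expert 1
    [3, 4, 4, 3, 4],  # expert 2
    [4, 4, 3, 4, 3],  # expert 3
    [4, 3, 4, 4, 4],  # expert 4
    [3, 4, 4, 2, 4],  # expert 5
    [4, 4, 3, 4, 4],  # expert 6
]

n_experts = len(ratings)
n_items = len(ratings[0])

# I-CVI: per item, the proportion of experts rating it 3 or 4 (relevant).
i_cvi = [
    sum(1 for expert in ratings if expert[item] >= 3) / n_experts
    for item in range(n_items)
]

# S-CVI/UA: proportion of items rated relevant by ALL experts
# (universal agreement).
s_cvi_ua = sum(1 for v in i_cvi if v == 1.0) / n_items

# S-CVI/Ave: the average I-CVI across items; the more liberal index,
# with .90 or higher generally taken as acceptable.
s_cvi_ave = sum(i_cvi) / n_items

print(i_cvi)
print(s_cvi_ua)
print(s_cvi_ave)
```

With these invented ratings, one of the five items falls short of universal agreement, so the S-CVI/UA is lower than the S-CVI/Ave, mirroring the relationship between the two indices reported in the results.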
This survey was used in a 'post-test' role, after the participants had completed a simulation debrief.

2.13. Phase Two

Reliability was measured for each subscale and total scale using Cronbach's alpha. The alpha value was set at .70 to indicate sufficient internal consistency (Tavakol and Dennick, 2011). Subscale and full scale alphas are reported; however, alphas for subscales were used to determine internal consistency because of their measure of unidimensionality. Verbatim comments from the open ended question in the Satisfaction with Simulation Experience Scale were allocated to groups based on the similarity of their content.

2.14. Ethical considerations

Ethical approval was obtained from the university's ethics committee. All participants provided informed written consent for Phase One, and consent was implied with completion of the survey in Phase Two.

3. Results

3.1. Phase One

Six students participated in the individual 'think aloud' sessions: five females and one male. Responses were gathered into three groups: 'thinking reflectively', 'simulation equates to clinical practice', and 'simulation specific'. A few responses referred to simulation directly; in most other instances it was referred to as 'clinical practice' (see Table 1). As a result of these individual sessions, the instruments' questions were altered where necessary to refer to clinical practice as the context for the participants' reflective thinking. Participant responses indicated that simulation was considered a type of clinical practice. Results from participants' 'think aloud' sessions determined that no questions were removed from the instrument.

Individual CVI values for item two and the VAS were below the recognised acceptable level of .78 for six raters; consequently, this information was used to direct scrutiny towards item two and the VAS for content modification. Ten of the 17 items (.59) were rated as universally congruent by all six experts.
The average proportion rated across the experts was .90 (the ideal S-CVI/Ave is .90 or higher). No items were rejected by the expert panel. Comments by expert panel members led to eight changes to the instruments' items. The changes are outlined as follows:

1. Language used in items was individualised, i.e. from 'us' to 'me', 'you' to 'I'.
2. Items focusing on 'understanding' were expanded to include 'application'.
3. Items were contextualised to nursing tasks instead of 'actions'.
4. 'Clinical practice' was included in items to contextualise the 'where' (i.e. simulation).
5. 'Lecturer' was replaced with 'teacher' to expand contexts of learning to include simulation.
6. 'Handout' material was changed to 'lecture/tutorial' material for assessments to broaden the applicability of the item for the learning experience simulation.
7. Wording for the VAS was altered to plain English and contextualised to clinical practice.
8. Instructions on how to complete the VAS were altered to facilitate easier use and provide stronger linking between the measured concept and the instrument.

3.2. Phase Two

Fifty-eight (n = 58) third year undergraduate students from a cohort of 231 (25%) participated in Phase Two. The majority of the sample were female (84.5%, 49/56) and domestic students (81.0%, 47/57) (see Table 2). Internal consistency of the Reflective Thinking Instrument was measured for each subscale and the instrument as a whole (Table 3). It was acceptable across the four subscales (.608–.808) and the whole instrument (.849), demonstrating good internal consistency of the instrument and each subscale.

The relationship between critical reflection self-efficacy (as measured by the Critical Reflection Self-Efficacy Visual Analogue Scale) and general self-efficacy (as measured by the General Self-Efficacy Scale) was investigated to determine reliability (using the Pearson product-moment correlation coefficient).
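For readers unfamiliar with the two statistics used in the Phase Two analysis, a minimal sketch of Cronbach's alpha (checked against the .70 threshold noted in the data analysis section) and the Pearson product-moment correlation is given below. The response data are invented for illustration and bear no relationship to the study's results.

```python
# Sketch of the Phase Two reliability statistics with hypothetical data.

def cronbach_alpha(items):
    """items: one inner list of scores per item, aligned by respondent."""
    k = len(items)            # number of items in the subscale
    n = len(items[0])         # number of respondents

    def variance(xs):         # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(variance(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / variance(totals))

def pearson_r(x, y):
    """Pearson product-moment correlation between two score vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Four hypothetical 5-point Likert items answered by six respondents.
subscale = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [3, 5, 4, 4, 3, 5],
    [4, 4, 3, 4, 2, 5],
]
alpha = cronbach_alpha(subscale)
print(alpha, alpha >= 0.70)   # internal consistency vs the .70 threshold

# Hypothetical VAS (0-100) and GSES (10-40) scores for six respondents.
vas = [62, 80, 45, 70, 38, 85]
gses = [28, 33, 24, 30, 22, 35]
print(pearson_r(vas, gses))
```

The alpha formula compares the sum of item variances against the variance of the respondents' total scores, so items that vary together inflate alpha; this is why the authors lean on subscale alphas as the indicator of unidimensional internal consistency.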
Preliminary analysis was performed to ensure no violation of the assumptions of normality, linearity, and homoscedasticity. There was a small to medium positive correlation between the two variables (r = .324, n = 56, p = .048), with high levels of critical reflection self-efficacy
associated with high levels of general self-efficacy.

3.3. Analysis of comments from the Satisfaction with Simulation Experience Scale

Of the 58 participants, 22 (38%) commented on their simulation experience as part of the Satisfaction with Simulation Experience Scale. The responses were collated and allocated into three distinct groups: simulation design, simulation experience, and simulation implementation. This allowed for inferences to be made about the characteristics and meaning of the participants' responses. The participant response groups are reported in Table 4 and supported by verbatim quotes.

Overall, the participants' responses about the simulation experience were positive. They described the HFS as 'very helpful', 'excellent' and 'educational', recognition of its positive impact on learning, engagement, and preparation for clinical placement. The most recognisable comments regarding the 'simulation experience' (Table 4, simulation experience) centred on HFS as a valuable learning experience, especially from a practical skill/clinical perspective. This was supported by the 'simulation implementation' group (Table 4, simulation implementation), which included comments about the benefits of increasing the frequency of simulation within the curriculum, particularly before clinical placement. The belief was expressed that with more regular simulations, greater insight could be gained into different clinical issues. It was recognised that some participants did not value the experience positively; rather, they saw it as a university strategy to increase hours of study in place of clinical practicum hours (hospital-based placement). 'Simulation design' (Table 4, simulation design) continued this constructive dialogue, as it focused on the lack of variety of scenarios provided and the negative impact of the reduced length of debrief on learning.

Table 1. Phase One data comments and evidence: 'reflective thinking and simulation'.

Theme: Thinking Reflectively
- Q14. In this course you have to continually think about material you are being taught. 'Agree with reservation – it helps us critically think'. (Participant 3)
- Q7. I like to think over what I have been doing and consider alternative ways of doing it. 'Maybe too much – so I definitely agree, I'm an overthinker'. (Participant 5)
- Q8. I often reflect on my actions to see whether I could have improved on what I did. 'I don't do it much, I have to do it – agree with reservation'. (Participant 4)
- Q3. I sometimes question the way others do something and try to think of a better way. 'Agree with reservation – I don't always question people and often people are doing the right thing, so there is no reason to. Sometimes I don't realise people are doing the wrong thing. I will evaluate and think of better options if I do pick someone up on something that's not completely right'. (Participant 6)

Theme: Simulation equates to clinical practice
- Q13. If I follow what the lecturer says, I do not have to think too much in this course. 'Lecturers teaching the right things to be a safe, competent nurse so if you don't have to think too much – definitely disagree – you always have to think, like I said (when I got) my learning contract, it taught me so much – you think about everything you do, no exceptions'. (Participant 6)
- Q8. This course has challenged some of my firmly held ideas. 'Disagree with reservation – try to change beliefs about how you care for patients'. (Participant 1)
- Q6. To pass this course you need to understand the content. 'Definitely agree – I think with any practical – like clinical nursing you need to understand what you are doing and the theory behind it'. (Participant 2)

Theme: Simulation specific
- Visual Analogue Scale. 'In regards to simulation maybe no, I don't think I did that really well. If I had more resources would I be more self-efficacious (sic)?' (Participant 1)
- Q3. I sometimes question the way others do something and try to think of a better way. 'Agree with reservation – not in the process of simulation, but if something going on is awful …' (Participant 1)

Table 2. Phase Two 3rd year undergraduate nurse characteristics. Number (%).

Gender: Male 7 (12.1%); Female 49 (84.5%)
Student status: Domestic 47 (81.0%); International 10 (17.2%)
Age: 15–25: 25 (43.1%); 26–35: 21 (36.2%); 36–45: 10 (17.2%); 46–55: 2 (3.4%); 56–65: 0 (0%); 66+: 0 (0%)
Previously completed a degree: Yes 21 (36.2%); No 37 (63.8%)
Entry pathway: 3yr programme 22 (37.9%); Graduate Entry programme 24 (41.4%); Enrolled Nurse Articulation programme 4 (6.9%); Double Degree programme 7 (12.1%)
Grade Point Average: 0–3: 0 (0%); 3–4: 2 (3.4%); 4–5: 11 (19.0%); 5–6: 28 (48.3%); 6–7: 17 (29.3%)

4. Discussion

The inclusion and measurement of reflective thinking within undergraduate nursing curriculums facilitates student acquisition of essential meta-teaching and metacognitive skills (McNamara, 1990; Noffke and Brennan, 1988). In this study, two instruments, the Reflective Thinking Instrument and the Critical Reflection Self-Efficacy VAS, were tested for content validity and reliability, and were found to consist of relevant content and to demonstrate internal consistency. This finding enabled the combination of these two instruments with other valid and reliable instruments (measuring self-efficacy, education practices associated with simulation, and simulation satisfaction) into one survey, which was piloted on third year nursing undergraduates during their final clinical subject. This result has enabled the measurement of reflection, self-efficacy, learning, and satisfaction experiences to be progressed within the context of high fidelity simulation, an outcome not previously attained. This evaluation of cognitive and learning processes, experiences, and outcomes is a previously under-researched phenomenon in the context of simulation and preparation for professional practice.

A continuing theme in the nursing and medical literature related to the use of simulation is the question regarding the transferability of competence (cognitive, affective, and psychomotor skills) from a simulated setting to a real clinical setting (Feingold et al., 2004; Murray et al., 2008; Swenty and Eggleston, 2011).

Table 3. Reflective thinking and simulation survey scale reliability (Cronbach's alpha).

Reflective Thinking: Habitual Action .701; Understanding .682 (a); Reflection .608 (b); Critical Reflection .808; Whole Scale .849
General Self-Efficacy Scale: .754 (c)
Educational Practices in Simulation Scale, Presence of Educational Practices: Active Learning .794 (d); Collaboration .835; Diverse Ways of Learning .847; High Expectations .618 (e); Whole Scale .856
Educational Practices in Simulation Scale, Importance of Educational Practice: Active Learning .873; Collaboration .880; Diverse Ways of Learning .647 (f); High Expectations .854; Whole Scale .863
Satisfaction with Simulation Experience: Debrief and Reflection .931; Clinical Reasoning .884; Clinical Learning .839; Whole Scale .948

Notes:
(a) Understanding: there were no negative values in the inter-item correlation matrix and no values were less than .3 for the corrected item-total correlation.
(b) Reflection: item three had a corrected item-total correlation of less than .3 (.128). Item three (subscale: reflection) was removed to improve Cronbach's alpha from .608 to .722.
(c) GSES: items two and eight had a negative value of -.053 in the inter-item correlation matrix and the corrected item-total correlation for item two was less than .3 (.133). Item two was retained.
(d) EPSS, active learning: there were no negative values. Item two had a low value of .070 for the corrected item-total correlation.
(e) EPSS, high expectations: the mean inter-item correlation value is .447, suggesting a moderate correlation. There were negative values in the inter-item correlation matrix and the corrected item-total correlation for items two, 15 and 16 was less than .3. Item two was retained.
(f) EPSS, diverse ways of learning: the mean inter-item correlation value is .478.

Table 4. Qualitative data groups and evidence: 'satisfaction with high fidelity simulation'.

Simulation Design
- 'As the sim was based around communication having four students in the room made it too hard to be able to do tasks as there was not really enough for everyone to do, two or three would have been better.' (Participant 13)
- 'We need a variety of scenarios.' (Participant 46)
- 'The debrief could possibly be longer with regard to discussion of the content.' (Participant 18)

Simulation Experience
- 'Educational and eye opening to palliative care and effective communication.' (Participant 51)
- 'The simulation experience is an excellent learning tool for students to practice the skills and concepts in a "real" environment without any actual risk to the patient.' (Participant 47)
- 'I find them nerve-wracking, but a good experience.' (Participant 18)

Simulation Implementation
- 'Overall I believe simulation is very helpful, especially before off campus placement.' (Participant 53)
- 'I find Sim to be the most valuable learning experience and would love to do more.' (Participant 6)
- '..please have at least 10 sim sessions which can give some insight into different clinical issues.' (Participant 50)
- 'There were some technical issues (volume too low), which would have made the experience more useful if corrected.' (Participant 16)

Cioffi (2001) and Lasater (2007) were emphatic that simulation does not replace clinical experience, but it can extend the subject matter to equip learners with some skills that can be directly transferred into the 'real' clinical setting. Participants were satisfied with simulation as a valuable learning experience and inferred it was a type of clinical practice. This association between high fidelity simulation and clinical practice could promote student reflective thinking and assist with the transfer of knowledge (Daley and Campbell, 2013, p. 2), further enhancing the development of generic graduate capabilities and improving transition to practice. Examining this association by measuring reflective thinking and evaluating the simulation experience may well lead to a greater understanding of students' reflective capabilities and their simulation experience, in addition to addressing the question of how effective simulation is as a type of clinical experience. Certainly these findings demonstrate the applicability of the selected instruments, grouped into one survey, for a larger trial in the future.

The pilot study and survey instruments have several limitations. Item 14 in the Educational Practices in Simulation Scale used a slightly different Likert scale, offering five options instead of six: the scale was missing the option of 'not applicable – the statement does not pertain to the simulation performed', which could potentially create missing data. This question was retained in the analysis for reliability; however, for validation of the survey the item would need to return to its original form. A number of items from the unmodified instruments were found to have Cronbach's alphas below .70. These items were retained in the established instruments to enable comparison of results between future similar studies using these instruments.
A small to medium correlation was found to exist between two variables (the VAS and the General Self-Efficacy Scale), suggesting preliminary support for the VAS. Testing this instrument on a larger sample, or selecting and using another scale that is a closer substitute, would be recommended. The 'think aloud' sessions were included in Phase One to ascertain the relevance of the reflective thinking instrument to the end users, that is, third year undergraduate nursing students participating in HFS. A small convenience sample of six participants (Phase One 'think aloud') does carry an inherent risk of bias. Due to the qualitative nature of this data collection method, a small sample was considered sufficient to obtain an account of the participants' experience interacting with this instrument.

5. Conclusion

This study served as a starting point in validating a new survey of reflective thinking and simulation satisfaction in undergraduate student nurses. The research findings demonstrated that the Reflective Thinking and Simulation Satisfaction survey is reliable. Students also verbalised their satisfaction with simulation as a valuable learning experience and inferred that it was a type of clinical practice. The study findings suggest that HFS is reasonably successful at imitating clinical experience and providing a satisfying learning experience, with certain limitations associated with simulation design and implementation. Further development of the Reflective Thinking and Simulation Survey to establish validity is recommended to make this a viable survey.

Conflict of interest statement

There are no conflicts of interest.

Acknowledgements

We would like to thank the study participants for their time. Dr Naomi Malouf provided invaluable assistance in the implementation of this pilot study. Kylie Morris provided editorial assistance for this manuscript.

References

Baird, C., 2000. Taking the mystery out of research: the pilot study. Orthop.
Nurs. 19 (2), 42–43. Retrieved from http://search.proquest.com/docview/195967475?accountid=13380.
Brink, P.J., 1988. The pilot study. West. J. Nurs. Res. 10, 237–238.
Cant, R., Cooper, S., 2010. Simulation-based learning in nurse education: systematic review. J. Adv. Nurs. 66 (1), 3–15. http://dx.doi.org/10.1111/j.1365-2648.2009.05240.
Chickering, A.W., Gamson, Z.F., 1987. Seven principles for good practice in undergraduate education. AAHE Bull. 3 (7). Retrieved from http://search.proquest.com/docview/63263975?accountid=13380.
Chronister, C., Brown, D., 2012. Comparison of simulation debriefing methods. Clin. Simul. Nurs. 8 (7), e281–e288. http://dx.doi.org/10.1016/j.ecns.2010.12.005.
Cioffi, J., 2001. Clinical simulations: development and validation. Nurse Educ. Today 21 (6), 477–486.
Clapper, T.C., 2010. Beyond Knowles: what those conducting simulations need to know about Adult Learning Theory. Clin. Simul. Nurs. 6 (1), e7–e14. http://dx.doi.org/10.1016/j.ecns.2009.07.003.
Daley, K.M., Campbell, S.H., 2013. Simulation-focused pedagogy for nursing education. In: Daley, K.M., Campbell, S.H. (Eds.), Simulation Scenarios for Nurse Educators: Making it Real, second ed. Springer Publishing Company, New York, pp. 1–8.
Facione, N.C., Facione, P.A., 1996. Externalizing the critical thinking in knowledge development and clinical judgment. Nurs. Outlook 44, 129–136.
Feingold, C., Calaluce, M., Kallen, M., 2004. Computerized patient model and simulated clinical experiences: evaluation with baccalaureate nursing students. J. Nurs. Educ. 43 (4), 156–163. Retrieved from http://search.proquest.com/docview/203944152?accountid=13380.
Forneris, S., Peden-McAlpine, C., 2007. Evaluation of a reflective learning intervention to improve critical thinking in novice nurses. J. Adv. Nurs. 57 (4), 410–421. http://dx.doi.org/10.1111/j.1365-2648.2007.04120.
Gilley, J.W., 1990. Demonstration and simulation. In: Galbraith, M.W.
(Ed.), Adult Learning Methods: a Guide for Effective Instruction. Krieger, Malabar, FL, pp. 261–281.
Gul, R., Cassum, S., Ahmad, A., Khan, S., Saeed, T., Parpio, Y., 2010. Enhancement of critical thinking in curriculum design and delivery: a randomised controlled trial for educators. Procedia Soc. Behav. Sci. 2, 3219–3225.
Howard, V.M., Englert, N., Kameg, K., Perozzi, K., 2011. Integration of simulation across the undergraduate curriculum: student and faculty perspectives. Clin. Simul. Nurs. 7, e1–e10. http://dx.doi.org/10.1016/j.ecns.2009.10.004.
Jarzemsky, P.A., McGrath, J., 2008. Look before you leap: lessons learned when introducing clinical simulation. Nurse Educ. 33 (2), 90–95.
Jeffries, P.R., Rizzolo, M.A., 2006. Designing and Implementing Models for the Innovative Use of Simulation to Teach Nursing Care of Ill Adults and Children: a National, Multi-site, Multi-method Study. National League for Nursing, New York.
Jones, A., 2009. Generic attributes as espoused theory: the importance of context. High. Educ. 58 (2), 175–191.
Kameg, K., Mitchell, A., Clochesy, J., Howard, V., Suresky, J., 2009. Communication and human patient simulation in psychiatric nursing. Issues Ment. Health Nurs. 30 (8), 503–508. http://dx.doi.org/10.1080/01612840802601366.
Kember, D., Leung, D.P., Jones, A., Loke, A., McKay, J., Sinclair, K., Yeung, E., 2000. Development of a questionnaire to measure the level of reflective thinking. Assess. Eval. High. Educ. 25 (4), 381–395. http://dx.doi.org/10.1080/026029300449272.
Kolb, D., 1984. Experiential Learning: Experience as the Source of Learning and Development. Prentice Hall, Upper Saddle River, NJ.
Lapkin, S., Levett-Jones, T., Bellchambers, H., Fernandez, R., 2010. Effectiveness of patient simulation manikins in teaching clinical reasoning skills to undergraduate nursing students: a systematic review. Clin. Simul. Nurs. 6 (6), e207–e222. http://dx.doi.org/10.1016/j.ecns.2010.05.005.
Lasater, K., 2007.
High-fidelity simulation and the development of clinical judgment: students' experiences. J. Nurs. Educ. 46 (6), 269–276. Retrieved from http://search.proquest.com/docview/203965988?accountid=13380.
Levett-Jones, T., McCoy, M., Lapkin, S., Noble, D., Hoffman, K., Dempsey, J., Arthur, C., Roche, J., 2011. The development and psychometric testing of the satisfaction with simulation experience scale. Nurse Educ. Today 31, 705–710.
Liaw, S.Y., Zhou, W.T., Lau, T.C., Siau, C., Chan, S.W., 2014. An interprofessional communication training using simulation to enhance safe care for a deteriorating patient. Nurse Educ. Today 34 (2), 259–264. http://dx.doi.org/10.1016/j.nedt.2013.02.019.
Luszczynska, A., Scholz, U., Schwarzer, R., 2005. The general self-efficacy scale: multicultural validation studies. J. Psychol. 139 (5), 259–264. http://dx.doi.org/
10.1016/j.nedt.2013.02.019.
Mann, K., Gordon, J., MacLeod, A., 2009. Reflection and reflective practice in health professions education: a systematic review. Adv. Health Sci. Educ. 14, 595–621.
McNamara, D., 1990. Research on teachers' thinking: its contribution to educating student teachers to think critically. J. Educ. Teach. 16 (2), 147–160.
Murray, C., Grant, M.J., Howarth, M.L., Leigh, J., 2008. The use of simulation as a teaching and learning approach to support practice learning. Nurse Educ. Pract. 8 (1), 5–8.
Musil, C.M., 2006. Pilot study. Springer Publishing Company, New York. Retrieved from http://search.proquest.com/docview/189467802?accountid=13380.
Noffke, S., Brennan, M., 1988. The dimensions of reflection: a conceptual and contextual analysis. In: Annual Meeting of the AERA, New Orleans.
Polit, D.F., Beck, C.T., 2006. The content validity index: are you sure you know what's being reported? Critique and recommendations. Res. Nurs. Health 29, 489–497.
Rimm, H., Jerusalem, M., 1999. Adaptation and validation of an Estonian version of the general self-efficacy scale (ESES). Anxiety Stress Coping 12 (3), 329–345.
Sanford, P.G., 2010. Simulation in nursing education: a review of the research. Qual. Rep. 15 (4), 1006–1011. Retrieved from http://search.proquest.com/docview/757175213?accountid=13380.
Schwarzer, R., Jerusalem, M., 1995. Generalized self-efficacy scale. In: Weinman, J., Wright, S., Johnston, M. (Eds.), Measures in Health Psychology: a User's Portfolio. Causal and Control Beliefs. NFER-NELSON, Windsor, England, pp. 35–37.
Sinclair, B., Ferguson, K., 2009. Integrating simulated teaching/learning strategies in undergraduate nursing education. Int. J. Nurs. Educ. Scholarsh. 6 (1). http://dx.doi.org/10.2202/1548-923X.1676.
Sobral, D., 2000. An appraisal of medical students' reflection-in-learning. Med. Educ. 34 (3), 182–187.
Swanson, E.A., Nicholson, A.C., Boese, T.A., Cram, E., Stineman, A.M., Tew, K., 2011.
Comparison of selected teaching strategies incorporating simulation and student outcomes. Clin. Simul. Nurs. 7 (3), e81–e90. http://dx.doi.org/10.1016/j.ecns.2009.12.011.
Swenty, C.F., Eggleston, B.M., 2011. The evaluation of simulation in a baccalaureate nursing program. Clin. Simul. Nurs. 7 (5), e181–e187. http://dx.doi.org/10.1016/j.ecns.2010.02.006.
Tavakol, M., Dennick, R., 2011. Making sense of Cronbach's alpha. Int. J. Med. Educ. 2, 53–55. Retrieved from http://search.proquest.com/docview/898889039?accountid=13380.