European Journal of Engineering Education, 2014
http://dx.doi.org/10.1080/03043797.2014.895709
A coding scheme for analysing problem-solving processes of
first-year engineering students
Sarah J. Grigg and Lisa C. Benson∗
Engineering and Science Education, Clemson University, Clemson, SC, USA
(Received 28 January 2014; accepted 6 February 2014)
This study describes the development and structure of a coding scheme for analysing solutions to
well-structured problems in terms of cognitive processes and problem-solving deficiencies for first-year
engineering students. A task analysis approach was used to assess students’ problem solutions using the hierarchical structure of a theoretical framework from mathematics research. The coding scheme comprises
54 codes within the categories of knowledge access, knowledge generation, self-management, conceptual
errors, mechanical errors, management errors, approach strategies and solution accuracy, and was demon-
strated to be both dependable and credible for analysing problems typical of topics in first-year engineering
courses. The problem-solving processes were evaluated in terms of time, process elements, errors commit-
ted and self-corrected errors. Therefore, problem-solving performance can be analysed in terms of both
accuracy and efficiency of processing, pinpointing areas meriting further study from a cognitive perspective,
and for documenting processes for research purposes.
Keywords: problem-solving skills; cognitive processes; coding scheme; first-year engineering
1. Introduction
While engineers once worked almost exclusively in their specialised field of study, the practice of
engineering is changing in the wake of a rapidly changing global economy. Companies are faced
with new challenges that require integration of knowledge from various domains and are often
under tight time constraints to find solutions (National Academy of Engineering 2004). Therefore, proficiency in problem-solving is highly valuable as industry leaders look to engineers to tackle
problems involving such constraints as technological change (Jablokow 2007), market globali-
sation and resource sustainability (Rugarcia et al. 2000). The National Academy of Engineering
describes the necessary attributes of the engineer of 2020 as having ingenuity, problem-solving
capabilities, scientific insight, creativity, determination, leadership abilities, conscience, vision
and curiosity (National Academy of Engineering 2004).
To prepare for problem-solving in the workplace, students must develop conceptual and procedural frameworks that they can use to solve real-world problems, which are often complex, with conflicting goals and undefined system constraints (Jonassen et al. 2006). However, students
must first build an engineering knowledge base and develop process skills used in the applica-
tion of knowledge, such as problem-solving and self-assessment (Woods et al. 2000). Because
of the importance of problem-solving skills, educators should strive to help students obtain the
knowledge resources and develop skills required for problem-solving success.
∗Corresponding author. Email: lbenson@clemson.edu
© 2014 SEFI
The purpose of this research is to establish a standardised method for analysing student problem-
solving processes that can be used in the evaluation of problem-solving performances. This paper
details the methodology used to develop a structured scheme for coding the solution processes of
first-year students solving engineering problems independently and presents the coding scheme
as a credible instrument for assessing first-year engineering students’problem-solving skills using
a mixed model methodology (Tashakkori and Teddlie 1998).
2. Literature review
Much research has been conducted on problem-solving from a variety of perspectives. This review
of relevant literature describes various models that have been proposed to explain the problem-
solving process, factors that have been shown to impact problem-solving success in the educational
problem-solving context and analysis tools that have been used by other researchers in the study
of problem-solving.
2.1. Problem-solving models
Several theoretical frameworks describe problem-solving in contexts as diverse as insight in creativity (Wallas 1926), heuristics in mathematics (Polya 1957) and gaming strategies in chess (Simon and Simon 1978). Wallas’ model describes creative problem-solving in four
stages: (1) preparation, (2) incubation, (3) inspiration and (4) verification (Wallas 1926). The first
widely accepted problem-solving methodology is credited to George Polya, who describes the
act of problem-solving in four steps: (1) understanding the problem, (2) devising a plan, (3) car-
rying out the plan and (4) looking back or reviewing (Polya 1957). However, like other heuristic
models, the implication that problem-solving is a linear process that can be memorised is flawed;
problem-solvers may iteratively transition back to previous steps (Wilson et al. 1993).
A more recent model depicts problem-solving as a seven-stage cycle that emphasises the itera-
tive nature of the cycle (Pretz et al. 2003). The stages include: (1) recognise/identify the problem,
(2) define and represent the problem mentally, (3) develop a solution strategy, (4) organise knowl-
edge about the problem, (5) allocate resources for solving the problem, (6) monitor progress
towards the goals and (7) evaluate the solution for accuracy. While this structure gives a more
complete view of the stages of problem-solving, in practice, there is much variability in how
people approach the problem and how well each of the stages is completed, if at all (Wilson
et al. 1993).
The stages listed above are based on Sternberg’s Triarchic Theory of Human Intelligence that
breaks analytical intelligence, the form of intelligence utilised in problem-solving, into three com-
ponents: meta-components, performance components and knowledge acquisition components.
Meta-components (metacognition) are higher level executive functions that consist of planning,
monitoring and evaluating the problem-solving process. Performance components are the cogni-
tive processes that perform operations such as making calculations, comparing data or encoding
information that are used to generate new knowledge. Knowledge acquisition components are the
processes used to gain or store new knowledge (Sternberg 1985).
2.2. Factors influencing problem-solving success
Research in problem types and strategies has shown that characteristics of the problem such
as the complexity or structure of the problem (Jonassen and Hung 2008), the person such as
prior experiences (Kirton 2003) and reasoning skills (Jonassen and Hung 2008), the process such
as cognitive and metacognitive actions (Sternberg 1985; Greeno and Riley 1987) and strategies
(Nickerson 1994), and the environment such as the social context (Woods et al. 2000) all influence
problem-solving performance. In the search for behaviours that promote proficiency in problem-
solving, much research has focused on classifying variations in performance between expert and
novice problem-solvers (Hutchinson 1988), presumably because expert problem solutions exhibit
more successful application of problem-solving skills. Experts have been shown to be up to four
times faster at determining a solution than novices, even though experts also take time to pause
between retrieving equations or chunks of information (Chi et al. 1981) and spend more time than
novices in the problem representation phase of the problem-solving process (Pretz et al. 2003).
Experts also exhibit dramatically different approaches to solving the problem (Chi et al. 1981)
and organise their information differently than novices, displaying larger chunking of information
than novices (Larkin et al. 1980).
While expert strategies are linked with higher level performances, methods used by experts
to solve problems are not necessarily transferable to novices due to cognitive requirements of
these strategies. For example, cognitive overload has been noted as a factor in some of the major
hindrances to achieving problem-solving proficiency, including the inability to solve the prob-
lem without acquiring more information, lack of awareness of performance errors and resistance
to changing a selected method or representation (Wang and Chiew 2010). First-year engineer-
ing students, who have limited experience solving even well-defined problems, tend to access
information stored in memory in a piecemeal rather than systematic or organised fashion (Sweller
1988). Inefficient mapping of information requires more cognitive effort when attempting to iden-
tify relevant information needed to solve problems. This can force students into a state of cognitive overload, which not only may limit their ability to complete the problem, but also reduces the amount
of cognitive capacity available for metacognitive tasks, such as assessing the reasonableness of
their answers (Wang and Chiew 2010; Sweller 1988). While strategies such as problem decomposition or subgoaling can be used to alleviate some of the cognitive demand required to solve a problem (Nickerson 1994), problem-solvers can become too reliant on strategies or use them inappropriately, leading to a decrement in performance (Matlin 2001). Some of the best ways of improving performance are promoting heightened awareness of performance errors, identifying the source of those errors and personalising instruction to address problem areas (Stigler and Hiebert 2009).
2.3. Analysis tools in prior research
While several independent coding schemes have been developed to analyse problem-solving,
most have addressed written work in conjunction with a think-aloud (Artzt and Armour-Thomas
1992; Litzinger et al. 2010; Wong et al. 2002; Weston et al. 2001). These coding schemes are
tailored to analyse the students’ verbal expressions of their work, not necessarily the elements
explicitly contained in the artefact itself, i.e. the students’ actual problem solution by which they
communicate their problem-solving competencies in the classroom. In addition, studies have
shown that thinking out loud while working on a problem can alter a student’s thought process
(Ericsson and Simon 1998).
Coding schemes for assessing students’ think-alouds are not readily applicable to the assess-
ment of the written problem solutions. Yet, written data are rich in many ways, and analysing
tasks explicitly enacted under authentic problem-solving conditions can reveal strategies or errors
that occur organically and that may impact problem-solving success. As an alternative to cur-
rent methods of assessing problem-solving processes, we utilise a method that has been utilised
in human factors research in the analysis of both physical and cognitive processes: task analysis.
Task analysis methods originated with the work of Gilbreth and Taylor (Gilbreth 1914; Taylor
1911), whose work-study approaches were traditionally used to evaluate and improve the effi-
ciency of workers (Stammers and Shepherd 1990). The definition of task analysis has been
broadened to include the qualitative assessment of humans interacting with a system or pro-
cess to understand how to better match the demands of the task to the capabilities of the human
(Wickens et al. 1998). The subcomponent tasks obtained from task analyses, referred to as ele-
ments, often serve as inputs for other forms of data analysis including error analysis and process
charting techniques (Stanton et al. 2005). Research in mathematics education describes the impor-
tance of error analysis as providing the opportunity to diagnose learning difficulties and develop
criteria for differentiating education, so that instructors can tailor education to individual students
to improve their performance and understanding (Radatz 1980). While there is no consensus on
what constitutes an element, typically they are defined as discrete, measurable and repeatable
units of activity, and it is at the user’s discretion to assign elements that are appropriately sized
for their intended analysis (Stammers and Shepherd 1990).
3. Research purpose
This research establishes a standard method for analysing engineering student problem-solving
processes that can be used in the evaluation of problem-solving performances in a way that is
not context dependent. By analysing the processes and strategies used by first-year engineering
students during problem-solving attempts and measuring their association with errors committed,
researchers can identify which behaviours to emphasise and which to discourage to promote
problem-solving proficiency.
Analysing multiple students’ problem-solving attempts enables the identification of variations
in processes and the evaluation of problem-solving performances. However, to enable comparisons of processes over time and across contexts, such as when measuring skills development, analysis
must be conducted using consistent methods and common, dependable instruments that are free
from problem-specific features. The development and testing of such an analysis method is an
arduous process, one that requires a critical perspective and iterative refinement. The development
process and coding scheme are presented in this manuscript.
4. Research methods
A task analysis approach was utilised to develop a taxonomy of component and subcomponent
tasks used in the completion of problem solutions, along with a taxonomy of errors and strategies
used by first-year engineering students. Together, these components comprise a coding scheme
that can be used to analyse the problem-solving process and assess performance. Measures of
rater agreement were used to assess dependability, generalizability and repeatability of the coding
scheme by testing multiple solutions from the same problem, different problems and with a new
rater, respectively. Credibility of the coding scheme was assessed by comparing summarised
coding data to results found in previous research on problem-solving tasks.
4.1. Educational environment and participants
Problem solutions were collected from first-year engineering students in an introductory engineer-
ing course taught in a studio setting using active learning techniques. Studio classrooms differ from traditional lecture halls, which typically have rows of stationary seats, in that students sit with a small number of peers (6–8) at tables, a layout that facilitates group interaction.
Learning environments that facilitate student interactions (i.e. studio
environments) are effective in achieving student-centred and inquiry-based learning, which are
approaches for building students’ problem-solving and laboratory skills (Prince and Felder 2006).
An example of this type of learning environment is the Student-Centred Active Learning Environments for Undergraduate Programs (SCALE-UP) approach, which has been applied in science,
engineering, technology and mathematics courses (Oliver-Hoyo and Beichner 2004). Research on
the outcomes of SCALE-UP in engineering and mathematics showed that students participating
in SCALE-UP classrooms exhibited higher levels of efficacy with the course material (Benson
et al. 2010). There was evidence of improved academic performance, conceptual understand-
ing and skills development in students participating in SCALE-UP classrooms compared with
traditional lecture-based instruction. Affective outcomes associated with SCALE-UP classrooms
include higher attendance, completion and retention rates as well as more favourable perceptions
of the course overall (Beichner et al. 2007; Benson et al. 2010). While students were regularly
encouraged to work with their peers on in-class activities, students completed the problems in
this study independently. Students in the course sections taught by a member of the research team
were invited to participate in this study.
Students’ solutions to problems were collected for three problems completed throughout the
semester, with data collected for four semesters. Perceived mental workload was assessed by
administering the NASA-TLX, a validated measure of task load (Farmer and Brownson 2003;
Hart 2006), which students completed following the completion of each problem for two of the
four semesters. Data from these surveys were used to establish the relative difficulty of the three
problems (Grigg and Benson 2012a, 2012b, 2012c).
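For context, a brief sketch of how NASA-TLX scores on a 0–100 scale are conventionally computed is shown below; the pairwise-comparison weighting is part of the standard instrument (Hart 2006), not a procedure reported in this study, and the example numbers are illustrative.

```python
# Minimal sketch of standard NASA-TLX scoring (not a detail reported in
# this study). Each of the six subscales is rated 0-100; weights come from
# 15 pairwise comparisons in which the respondent picks the more
# workload-relevant subscale of each pair.

SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def raw_tlx(ratings: dict[str, float]) -> float:
    """Unweighted ('raw') TLX: the mean of the six subscale ratings."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

def weighted_tlx(ratings: dict[str, float], tally: dict[str, int]) -> float:
    """Weighted TLX: each rating is weighted by how often its subscale was
    chosen across the 15 pairwise comparisons (weights sum to 15)."""
    assert sum(tally.values()) == 15, "pairwise-comparison tallies must sum to 15"
    return sum(ratings[s] * tally[s] for s in SUBSCALES) / 15

# Illustrative ratings and comparison tallies for one respondent.
ratings = {"mental": 70, "physical": 10, "temporal": 55,
           "performance": 40, "effort": 60, "frustration": 35}
tally = {"mental": 5, "physical": 0, "temporal": 3,
         "performance": 2, "effort": 4, "frustration": 1}
print(raw_tlx(ratings), weighted_tlx(ratings, tally))  # 45.0 and 58.0
```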
Data from one problem from the first semester were utilised for the initial development of the
coding scheme (n = 24). Subsequent tests for generalizability and refinements were made using
solutions from all three problems from the first semester (n = 68). Finally, the original data-
set from the first problem was used to check the credibility of the coding scheme (n = 24) by
comparing results obtained to those obtained in previous research efforts. Coding of subsequent semesters’ data demonstrated that the coding scheme was adequate for a larger data-set (n = 187).
4.2. Engineering problems
Problems were chosen based on characteristics that would ensure moderate problem difficulty for
students in a first-year engineering classroom, who are building their engineering knowledge base
and process skills. The chosen problems (shown in Figure A1 in Appendix 1) struck a balance of
being well structured enough to limit the cognitive load on the students while remaining challenging and providing multiple perspectives on solving the problem, in accordance with the guidelines for problem-based learning (Jonassen and Hung 2008). All problems (1) had a constrained context, including pre-defined elements (problem inputs), (2) allowed the use of multiple procedures or algorithms and (3) had a single correct answer (Jonassen 2004). Three problems were selected
that reflected different types of well-structured problems, as defined by Jonassen (2010). One
originated from the course textbook (Stephan et al. 2010) and two were developed by the project
team. All three problems were story problems, in which the student is presented with a narrative
that embeds the values needed to obtain the final answer (Jonassen 2010). Problem 1 involved a
multi-stage solar energy conversion system and required calculation of the efficiency of one stage
given input and output values for the other stages (Stephan et al. 2010). Problem 2 required students
to solve for values of components in a given electrical circuit. This problem, developed by the
project team, also contained a Rule-Using/Rule Induction portion (a problem having one correct
solution but multiple rules governing the process (Jonassen 2010)), where students were asked
to determine an equivalent circuit based on a set of given constraints. Problem 3 involved total
pressure calculations and required students to solve for values within the system and to convert between different unit systems. The average perceived mental workload score on the NASA-TLX survey was highest for the solar efficiency problem (47.04 out of 100), followed by the resistor
problem (43.54 out of 100) and the total pressure problem (41.93 out of 100) (Grigg and Benson
2012a, 2012b, 2012c). These three problems are included in Appendix 1.
4.3. Tablet PC technology and data collection software
To capture problem-solving processes for analysis, students wrote their solution attempts on a
Tablet PC using custom-designed software called MuseInk (Bowman and Benson 2011; Grigg
and Benson 2011). The software allows students to work problems on a Tablet PC, and stores the
digital Ink so that it can be replayed, annotated and exported as data to a database for analysis.
Students work through problems much as they would with pen and paper, with the added benefit
of having electronic access to their work, while researchers are provided with a comprehensive
expression of the problem-solving attempt from beginning to end including work that was erased.
The MuseInk software differs from other tablet-based instructional applications, such as DyKnow
and Classroom Presenter, which allow insertion of pen-based feedback into student work in
real time but cannot archive data in a way that is useful for research purposes (DyKnow 2012;
University of Washington). Other instructional applications, such as CogSketch (Forbus et al.
and BeSocratic (Bryfczynski et al. 2012), can provide students with automated feedback and guidance in an interactive mode; however, they do not have the capability of importing a coding scheme or directly inserting codes into students’ work.
4.4. Coding scheme development
The coding scheme development process progressed through 10 steps. These steps are illustrated
in Figure 1. The results section details the findings of the testing and refinement stage.
Figure 1. Depiction of the steps to develop a coding scheme for qualitative data analysis.
Figure 2. Summary of coding scheme elements (tasks, errors, strategies and accuracy).
5. Results and discussion
A robust coding scheme for the analysis of problem-solving skills of first-year engineering stu-
dents was developed, and the methodology used to develop and test this scheme is described in the following section. Using this coding scheme, students’ solutions were analysed
based on actions taken as a result of cognitive and metacognitive processes, errors, approach
strategies and solution accuracy.
5.1. Coding scheme
The primary result of this investigation is the coding scheme itself, which is summarised in
Figure 2. The following sections describe the major categories of codes and their significance
to the analysis of problem-solving performance. A more thorough description of each code is
included in Appendix 2.
5.1.1. Process element codes
Process elements are the backbone of the coding scheme and describe what the student is doing
as depicted in the problem solution. For codes related to process elements, the basic structure set
forth in the coding scheme by Wong et al. was used as an a priori framework. The framework was
originally used to code students’ videotaped activity while studying mathematical materials during
a concurrent think-aloud to assess processing differences in students based on self-explanation
training versus a control group that did not receive self-explanation training (Wong et al. 2002).
Similar to Sternberg’s (1985) theoretical framework, this coding scheme separated elements into
categories of knowledge access (KA), knowledge generation (KG) and self-management (SM).
• KA codes describe instances of the student retrieving knowledge not explicitly stated in the problem statement.
• KG codes describe instances of transforming bits of information to form new connections or relationships.
• SM codes describe tasks related to assessing the current state of problem-solving activity. SM comprises the subcategories of planning, monitoring, revising and evaluating.
The Wong et al. coding scheme provided the structure needed to distinguish between instances of
retrieval of information from cognitive resources (KA), cognitive elements (KG) and metacog-
nitive elements (self-management). We segmented the self-management category according
to elements of planning, monitoring, evaluating and revising the solution in accordance with
Hartman’s (2001) definition of the executive management aspects of metacognition.
5.1.2. Error codes
Error codes indicate instances where an incorrect element first occurs; errors always occur in
conjunction with a process element. For codes relating to errors, we utilised a structure derived
from error detection literature in accounting, where it is common to classify errors as conceptual
and mechanical errors (Owhoso et al. 2002; Ramsay 1994). A category for management errors was added to capture errors in metacognitive processes.
• Conceptual errors describe instances of misunderstanding of the problem and/or underlying fundamental concepts.
• Mechanical errors describe instances of operation errors, such as calculation errors.
• Management errors describe instances of problems managing information, including identification of given information, transcribing values or erasing correct work.
With this error-coding structure, errors related to students’ understanding of engineering concepts
and their computational skills can be identified separately, allowing researchers to pinpoint hin-
drances to learning. For example, the case in which a student does not use the efficiency equation
properly because s/he did not understand the concept of efficiency is a much different case than
if the student erred in using the equation because s/he has difficulty manipulating equations due
to weak algebra skills or inattention to details.
5.1.3. Strategy codes
Strategy codes are one-time-use codes that describe the overall approach taken to solve the
problem. For strategy codes, we utilised a subset of strategies that appeared most applicable
to story problems from the compilation described in ‘Thinking and Problem Solving’ (Nickerson
1994). The subset was refined from the broader list of strategies identified by Nickerson over the
course of reviewing multiple problems completed by different students with different academic
backgrounds. Specifically, six strategies appeared most frequently within the sample of problem
solutions. These include:
(1) Problem Decomposition (segmentation) – which involves breaking down a complex problem
to ease analysis (Nickerson 1994).
(2) Clustering (chunking) – which involves grouping similar information into larger units (Chi
et al. 1981).
(3) Means-End Analysis – which involves beginning with the identification of a goal state and
the current state followed by the problem-solver making efforts to reduce the gap between
states (Nickerson 1994).
(4) Forward Chaining – which is similar to Means-End Analysis but involves a direct path between the current and goal states (Nickerson 1994).
Some solutions could also be classified according to an apparent lack of strategy.
(5) Plug-and-chug – which involved inserting given values into an equation and producing an
answer without necessarily understanding the reasons for doing so (Wankat 1999).
(6) Guess-and-check – which is a slightly more sophisticated approach where the problem-
solver checks that the values inserted into an equation yield the correct units or checks for reasonableness of the solution (Wankat 1999).
Plug-and-chug and guess-and-check strategies are considered beginner-level strategies. Problem
decomposition and means-end analysis strategies are considered intermediate-level strategies
while clustering and forward chaining are considered advanced strategies. These strategies can
be used as a proxy measure of ‘level of expertise’ as has been used in much problem-solving
research in the past.
5.1.4. Solution accuracy codes
Solution accuracy codes describe the accuracy of the final answer and are used once in each
solution. While standard answer states of ‘Correct’ and ‘Incorrect’ could be used to describe the
accuracy of the problem solution, two additional codes were included to describe solutions for
a more fine-grained analysis of accuracy. ‘Correct but Missing/Incorrect units’ is a subset of
‘Correct’ in which the numerical value of the solution is correct, but the units are missing or the
answer is given in a unit other than what was required, such as 120 seconds when the correct
answer should have been reported in minutes. ‘Incomplete’ is a subset of ‘Incorrect’ indicating
that no final answer was obtained.
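To make the structure of the scheme concrete, the sketch below shows one way a coded solution could be represented for analysis; it is a minimal illustration in Python, and the type and field names are assumptions for exposition, not the published scheme’s or MuseInk’s data format.

```python
# Illustrative representation of a coded solution (names are hypothetical,
# not the published scheme's data format). Each applied code carries its
# category and a timestamp so that frequency and timing analyses are possible.
from dataclasses import dataclass, field
from enum import Enum

class Category(Enum):
    KNOWLEDGE_ACCESS = "KA"        # retrieving knowledge not in the problem statement
    KNOWLEDGE_GENERATION = "KG"    # transforming information into new relationships
    SELF_MANAGEMENT = "SM"         # planning, monitoring, revising, evaluating
    CONCEPTUAL_ERROR = "CE"
    MECHANICAL_ERROR = "ME"
    MANAGEMENT_ERROR = "MGE"

@dataclass
class AppliedCode:
    code: str            # e.g. "Identify equation"
    category: Category
    time_s: float        # seconds into the solution replay

@dataclass
class CodedSolution:
    strategy: str                            # one-time-use code, e.g. "Chunking"
    accuracy: str                            # e.g. "Correct", "Incomplete"
    codes: list[AppliedCode] = field(default_factory=list)

    def count(self, category: Category) -> int:
        """Number of applied codes in one category (e.g. KG codes per solution)."""
        return sum(1 for c in self.codes if c.category is category)
```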
5.2. Assessment of dependability, generalizability and repeatability
Measures of rater agreement were used to assess dependability, generalizability and repeatability of the coding scheme by testing multiple solutions from the same problem, different problems
and coding with a new rater, respectively. Coded solutions were assessed on three criteria: (1)
code agreement (i.e. Did all coders associate this particular code with the element?), (2) code
frequency (i.e. Did all coders code the same number of elements?) and (3) code timing (i.e. Were
elements coded by coders consistently at the same point within a solution?).
Inter-rater reliability was calculated based on overall agreement rates for all coded elements, as
shown in calculation (1), and adjusted overall agreement rate, which accounts for discrepancies
due to missing data by measuring agreement only for elements coded by all coders, as shown in
calculation (2) (Gwet 2010). ‘Agreement’ was defined as an instance where a code was identified
by all coders. ‘Missing data’ was defined as a code that one coder applied but another did not,
which may be considered an important omission, depending on how the analysis is used to answer
the research question.
\[
\text{Overall Agreement Rate} = \frac{\text{number of ratings in agreement}}{\text{total number of ratings}}, \tag{1}
\]
\[
\text{Adjusted Overall Agreement Rate} = \frac{\text{number of ratings in agreement}}{\text{total number of ratings} - \text{number of ratings with missing data}}. \tag{2}
\]
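As a minimal illustration of calculations (1) and (2), the sketch below computes both rates for two coders. Representing each solution element as the pair of codes the two coders applied, with None marking an element one coder did not code, is an assumption made for exposition rather than the study’s actual data format.

```python
# Sketch of calculations (1) and (2). Each entry pairs the code the two
# coders applied to one solution element; None means that coder did not
# code the element at all ("missing data").
from typing import Optional

def agreement_rates(pairs: list[tuple[Optional[str], Optional[str]]]) -> tuple[float, float]:
    total = len(pairs)
    in_agreement = sum(1 for a, b in pairs if a is not None and a == b)
    missing = sum(1 for a, b in pairs if a is None or b is None)
    overall = in_agreement / total                  # calculation (1)
    adjusted = in_agreement / (total - missing)     # calculation (2)
    return overall, adjusted

# Illustrative element-by-element comparison of two coders.
pairs = [("Identify equation", "Identify equation"),
         ("Erase work", None),                      # only coder A coded this element
         ("Document math", "Plug values in equation"),
         ("Identify final answer", "Identify final answer")]
print(agreement_rates(pairs))  # (0.5, 0.6666...)
```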
Overall agreement answers the question, ‘To what degree are the codes assigned to a solution the same across raters?’ This approach examines the degree to which coders can both identify and code elements precisely. Overall agreement rates are used to determine the ability of coders to identify elements and complete the coding process consistently. Adjusted overall agreement (dropping missing data from the analysis) answers the question, ‘Given a set of identified elements, do two coders code elements similarly?’ Instances with missing data may be dropped to examine the consistency of the application of the codes themselves, but this is not as good a measure of coder
performance. For example, an instance where a coder fails to identify an element and neglects
to apply a code signals a need for better coder training. In contrast, an instance of disagreement
about what to code a jointly identified element indicates the need for additional clarity around
the meaning and interpretation of the codes, which is of larger concern in the coding scheme
development process.
5.2.1. Assessing dependability of the coding protocol
Dependability of the coding protocol was evaluated using data from three solutions of the same
problem (the efficiency problem). Inter-rater reliability was calculated by examining overall agree-
ment. Initial results showed an overall agreement rate of 55%, with individual agreement rates of
77%, 55% and 42% for the three solutions. Reviewing the instances of disagreements was instru-
mental in identifying inconsistencies in applying codes and revealing missing problem features
that were not captured in the initial coding scheme.
Two more iterations of revisions were conducted before reaching a satisfactory level of inter-
rater reliability. Initial inter-rater reliability for the second round was ‘Moderate’ with overall
agreement lower than the initial round of coding (for the three solutions, inter-rater agreement
was 73%, 40% and 25%, for an overall agreement rate of 41%). This round of coding revealed
that, with the addition of new codes and reconfiguring code categories, there was confusion with
the use of specific codes as well as the frequency of assigning codes. The coding protocol was
clarified and documented in the codebook. To improve the inter-rater reliability of coders, a round-
robin review cycle was implemented for the next round of coding, in which each coded problem
was reviewed by a second coder as an internal check of adherence to proper coding procedures.
The third round of coding showed great improvement of overall agreement rates with an ‘almost
perfect’ rating. Across the three solutions, agreement rates were 100%, 96% and 85%, for an
overall agreement rate of 92%.
As is shown in the second round of assessment, not all refinements result in automatic improvements (Table 1). This iterative cycle of testing was crucial in refining the coding scheme to a high quality that others can depend on for their own assessments.
5.2.2. Establishing generalizability of coding scheme
Generalizability was evaluated by analysing solutions from three different problems with different
contexts and features. All three problems were different in terms of the context, but were similar
Table 1. Results of iterative refinement on overall coder agreement scores for the efficiency problem.

Round                                                                  Overall agreement (%)   Solution 1 (%)   Solution 2 (%)   Solution 3 (%)
Round 1: Initial coding scheme; one problem tested (three solutions)   55                      77               55               42
Round 2: New codes added; one problem tested (three solutions)         41                      73               40               25
Round 3: Added review cycle; one problem tested (three solutions)      92                      100              96               85
Table 2. Inter-rater reliability for two original raters calculated as overall agreement and adjusted overall agreement (missing data removed) with Cohen’s Kappa coefficients.

                  Retain missing codes            Drop missing codes
Problem   Overall agreement (%)   κ       Adjusted overall agreement (%)   κ
1         88                      0.86    98                               0.98
2         74                      0.72    90                               0.89
3         74                      0.72    97                               0.96
Average   78.7                    0.77    95.0                             0.94
Table 3. Inter-rater reliability for two original coders plus the new coder calculated as overall agreement and adjusted overall agreement.

                  Retain missing codes            Drop missing codes
Problem   Overall agreement (%)   κ       Adjusted overall agreement (%)   κ
1         70                      0.72    98                               0.97
2         62                      0.65    94                               0.94
3         38                      0.47    94                               0.93
Average   57                      0.61    95                               0.95
in terms of processes required to solve them. For example, problem-finding tasks were limited
since all problems were defined for the students. Also, no physical construction was required, as
all problems could be solved using mathematical means. This iteration of coding was important
to ensure that the coding scheme was robust enough to be used for a variety of problems within
engineering contexts.
Both overall agreement and adjusted overall agreement (removing instances of missing data) were calculated along with Cohen’s Kappa coefficients for both measures of agreement. Two of the three original coders conducted this round of coding. The overall agreement rate was ‘substantial’ (0.769) and the adjusted overall agreement rate was ‘almost perfect’ (0.942) (Gwet 2010), indicating
that when an element was coded by both coders, it was very likely that they assigned the same
code. A summary of results is given in Table 2.
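For reference, the sketch below computes Cohen’s Kappa for two coders in the standard way, kappa = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement and p_e the agreement expected by chance; the category labels in the example are illustrative only.

```python
# Minimal sketch of Cohen's kappa for two coders rating the same elements.
# kappa = (p_o - p_e) / (1 - p_e), with p_e derived from each coder's
# marginal code frequencies.
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Illustrative category labels for six jointly identified elements.
a = ["KA", "KG", "KG", "SM", "KG", "SM"]
b = ["KA", "KG", "SM", "SM", "KG", "KG"]
print(round(cohens_kappa(a, b), 3))  # 0.455
```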
5.2.3. Assessing repeatability of coding scheme with new rater
To assess whether the coding protocol was repeatable, another member who had no previous
involvement in the coding scheme development was brought in to code solutions. Inter-rater agreement was acceptable. Again, both overall agreement and adjusted
overall agreement (removing instances of missing data) were calculated along with Cohen’s
Kappa coefficients for both measures of agreement. As given in Table 3, inter-rater reliability
decreased with the addition of the new coder, but remained ‘substantial’ for problems 1 and 2
and ‘almost perfect’ when adjusted to remove data points with missing ratings. In problem 3,
there was a sizeable portion that was not coded because the student iterated through the same
element repeatedly; these iterations were not captured by the new coder, resulting in only ‘fair’
agreement. For the efficiency problem from which the coding scheme was initially developed,
the agreement rates were 70% (98% adjusted). The agreement rate was 62% (94% adjusted) for
the circuit problem and 38% (94% adjusted) for the pressure problem, leading to an overall agreement
rate of 57% (95% adjusted). Overall, Cohen’s Kappa coefficients were 0.614 (0.948 adjusted)
for a ‘substantial’ level of inter-rater reliability and ‘near perfect’ level on adjusted scores. These
inter-rater reliability measures were encouraging, showing that the coding scheme is robust and
detailed enough to achieve high reliability between raters. By the end of scheme development
Table 4. Average number of codes by approach strategy.

Code frequencies by strategy group

Strategy          Sample   Average number   Time to completion   Average number   Average number   Average number   Average number of
group             size     of codes         (minutes)            of KA codes      of KG codes      of SM codes      answer codes
Plug-and-chug     2        10               5.1                  1                3                2.5              1
Guess-and-check   3        33.3             20.7                 1.7              13.7             8                1
Segmentation      15       31.2             17.7                 3.4              8.3              9.1              1.6
Chunking          4        21.3             14.3                 2.5              5.8              5.8              1.5
Table 5. Average number of errors by expertise level (as indicated by strategy).

Error code frequencies by strategy group

Strategy          Sample   Average number of   Average number of   Average number of   Probability of
group             size     conceptual errors   mechanical errors   management errors   success
Plug-and-chug     2        1                   0                   1                   0
Guess-and-check   3        4                   2                   4.7                 0
Segmentation      15       5.3                 1                   4.8                 0.5
Chunking          4        3.5                 2                   3.3                 0.6
and training, coders were consistently assigning the same codes, though there remained some
confusion for the new coder on when to assign a code (i.e. to code each instance of a task, even
when the tasks were iteratively written and erased).
5.2.4. Establishing credibility of coding scheme
The data from the analysis of the first problem (the efficiency problem) were used to ensure that
the results obtained are in line with those obtained by others when examining problem-solving
with respect to expertise (Chi et al. 1981). Tables 4 and 5 summarise the data used to make this
comparison.
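The following sketch illustrates the kind of per-strategy aggregation that underlies Tables 4 and 5, assuming per-solution code counts are available in a pandas DataFrame; the column names and values are illustrative stand-ins, not the study’s data.

```python
# Sketch of the aggregation behind Tables 4 and 5: average code counts,
# completion time and success rate by strategy group. Column names are
# illustrative; the values are made-up stand-ins, not study data.
import pandas as pd

solutions = pd.DataFrame({
    "strategy": ["Plug-and-chug", "Segmentation", "Segmentation", "Chunking"],
    "minutes":  [5.2, 18.1, 17.3, 14.5],
    "ka_codes": [1, 3, 4, 2],
    "kg_codes": [3, 8, 9, 6],
    "sm_codes": [2, 9, 10, 6],
    "errors":   [2, 11, 12, 9],
    "correct":  [0, 1, 0, 1],   # 1 = correct final answer
})

summary = solutions.groupby("strategy").agg(
    n=("strategy", "size"),
    avg_minutes=("minutes", "mean"),
    avg_ka=("ka_codes", "mean"),
    avg_kg=("kg_codes", "mean"),
    avg_sm=("sm_codes", "mean"),
    avg_errors=("errors", "mean"),
    p_success=("correct", "mean"),
)
print(summary)
```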
Solutions were divided into four groups based on their approach strategy as a proxy of expertise level as evidenced in each solution. (The remaining two strategies were not found in this sample of solutions.) Results indicate that those who used a plug-and-chug strategy were not successful but had the fewest codes, the fewest errors and the shortest time to
completion. These results can be explained by limited awareness of performance problems. The
other two novice groups (guess-and-check and segmentation groups) mirrored results identified in
the previous literature (Chi et al. 1988) as characteristic of novice performance, including longer
time to completion, more errors and a lower probability of success than the more expert level
of performance (chunking group). Our results indicated faster completion times for more expert performance, though the difference between the expert group and the more novice performance groups was more moderate than the fourfold difference observed in the research by Chi et al. (Average completion times for the ‘novice’ groups, guess-and-check and segmentation, were 20.72 minutes and 17.71 minutes, respectively, compared with 14.32 minutes for the ‘expert’ performance group, chunking.) Our research supports the claim that novices commit more
errors. Guess-and-check and segmentation groups committed an average of 10.67 and 11.1 errors
respectively, compared with the chunking group, with an average of 8.83 errors. This indicates that the coding scheme provides a reasonable assessment of problem-solving performance, as reflected by the relative expertise of the students.
5.3. Solution assessment
Students’ solutions were analysed based on the actions they took as a result of their cognitive and metacognitive processes, errors, approach strategies and solution accuracy. Examples of step-by-step coding of a student’s solution to the efficiency problem are shown in Appendix 3. Using this coding scheme to evaluate problem solutions enabled several different types of analyses. Comparisons
were made to identify key tasks that were associated with successful problem solutions (Grigg
and Benson 2012a, 2012b, 2012c), to identify tasks and errors associated with extreme levels
of perceived mental workload (Grigg and Benson 2012a, 2012b, 2012c) and to evaluate the effects
of prior academic experiences on process variations (Grigg and Benson 2012a, 2012b, 2012c).
This research indicated that planning activities had positive benefits on student problem-solving
success, while the use of visual representations was associated with lower mental workload dur-
ing problem-solving attempts. Conceptual errors had the highest association with unsuccessful
solutions; yet having pre-engineering experience did not have a significant impact on success-
fully solving problems. Mechanical errors were also significantly associated with unsuccessful
solutions. Having previously completed a calculus course was significantly related to successful
problem-solving, even though none of the problems required calculus to solve them.
6. Implications for research and instruction
While the coding scheme was developed and tested using a set of well-defined story problems typical of a first-year engineering course, it is likely transferable to the study of other types of problems
because of the general nature of the categories of processes, errors and strategies. Problem-solving
activities within various contexts require similar cognitive and metacognitive processes, evoke
similar types of errors and utilise similar approach strategies. However, the generalizability of the
coding scheme to problems beyond well-defined story problems within the first-year engineering
context is an area for future and ongoing research. Much work has gone into adapting critical ele-
ments of the coding scheme into an instructional tool that quantitatively assesses problem-solving
performance for both well-defined and more open-ended problems. The outcome of this research
is an assessment that supports teaching successful problem-solving strategies (Grigg et al. 2013).
The assessment tool maps the presented coding scheme to a problem-solving cycle (Pretz et al.
2003) and allows instructors to provide individualised feedback to students on their problem-
solving proficiency by summarising processes and errors committed as well as what stages of
the problem-solving cycle presented the biggest challenges. To date, this assessment tool has
been used to assess problem solutions in three different undergraduate engineering courses as
part of ongoing testing and refinement. The major advantage of the assessment tool over tradi-
tional grading methods is that it can provide personalised feedback to students about their level of
problem-solving proficiency as well as pinpoint skill deficiencies that need attention. It is believed
that this personalised feedback will improve students’ awareness of their cognitive processes and
their problem-solving performance. Because the criteria are consistent for each problem, student
performance could be tracked over time. This would be enhanced by a streamlined process for
digitising assessment data, which is currently under development. Once a large set of solution
assessments has been collected for a variety of problem types across several courses, regres-
sion models will be used to determine combinations of tasks that contribute to more successful
problem-solving performance.
Acknowledgements
The authors thank Michelle Cook, Catherine McGough, Jennifer Parham-Mocello, David Bowman and Roy Pargas for
their assistance with data collection and development of the coding scheme.
Funding
This work was supported through a grant from the National Science Foundation [EEC 0935163].
References
Artzt, Alice F., and Eleanor Armour-Thomas. 1992. “Development of a Cognitive-Metacognitive Framework for Protocol
Analysis of Mathematical Problem Solving in Small Groups.” Cognition and Instruction 9 (2): 137–175.
Beichner, R., J. Saul, D. Abbott, J. Morse, D. Deardorff, R. Allain, S. Bonham, M. Dancy, and J. Risley. 2007. “Student-Centered Activities for Large Enrollment Undergraduate Programs (SCALE-UP) Project.” In Research-Based Reform of University Physics Vol. 1, edited by E. F. Redish and P. J. Cooney, 1–42. College Park, MD: American Association
of Physics Teachers. http://www.per-central.org/document/ServeFile.cfm?ID=4517
Benson, L. C., M. K. Orr, S. B. Biggers, W. F. Moss, M. W. Ohland, and S. D. Schiff. 2010. “Student-Centered Active,
Cooperative Learning in Engineering.” International Journal of Engineering Education 26 (5): 1097–1110.
Bowman, D., and L. Benson. 2011. “MuseInk: Seeing and Hearing a Freshman Engineering Student Think.” Paper read
at 2010 ASEE annual conference, Louisville, KY.
Bryfczynski, S., R. P. Pargas, M. Cooper, and M. Klymkowsky. 2012. “Analyzing and visualizing student work with
BeSocratic.” Paper read at proceedings of the 50th annual southeast regional conference, at ACM, Tuscaloosa, AL.
Chi, Michelene T., Paul J. Feltovich, and Robert Glaser. 1981. “Categorization and Representation of Physics Problems
by Experts and Novices.” Cognitive Science 5 (2): 121–152.
Chi, Michelene T., Robert Glaser, and Marshall J. Farr. 1988. The Nature of Expertise. Hillsdale, NJ: L. Erlbaum Associates.
DyKnow. 2012. “The DyKnow Advantage.” Accessed December 17, 2012. http://www.dyknow.com.
Ericsson, K. Anders, and Herbert A. Simon. 1998. “How to Study Thinking in Everyday Life: Contrasting Think-Aloud
Protocols with Descriptions and Explanations of Thinking.” Mind, Culture, and Activity 5 (3): 178–186.
Farmer, Eric, and Adam Brownson. 2003. Review of Workload Measurement, Analysis and Interpretation Methods.
Technical Report CARE-Integra-TRS-130-02-WP2, EUROCONTROL.
Forbus, K., J. Usher, A. Lovett, K. Lockwood, and J. Wetzel. 2008. “CogSketch: Open-domain Sketch Understanding for
Cognitive Science Research and for Education.” Paper read at proceedings of the fifth eurographics workshop on
sketch-based interfaces and modeling, Annecy, France.
Gilbreth, Lillian. 1914. The Psychology of Management. New York, NY: Sturgis & Walton.
Greeno, James G., and Mary S. Riley. 1987. “Processes and Development of Understanding.” In Metacognition, Moti-
vation, and Understanding, edited by F. E. Weinert and R. H. Kluwe, 289–313. Hillsdale, NJ: Lawrence Erlbaum
Associates.
Grigg, S., and L. Benson. 2011. “Work in Progress: Robust Engineering Problems for the Study of Problem Solving
Strategies.” Proceedings of the Frontiers in Education conference, Rapid City, SD.
Grigg, Sarah J., and Lisa C. Benson. 2012a. “Effects of Student Strategies on Successful Problem Solving.” ASEE annual
conference, San Antonio, TX.
Grigg, Sarah J., and Lisa C. Benson. 2012b. “How does Academic Preparation Influence How Engineering Students Solve Problems?” Paper read at Frontiers in Education conference, Seattle, Washington.
Grigg, Sarah J., and Lisa C. Benson. 2012c. “Using the NASA-TLX to Assess First Year Engineering Problem Difficulty.”
Paper read at industrial and systems engineering research conference, Orlando, FL.
Grigg, S., J. Van Dyken, L. Benson, and B. Morkos. 2013. “Process Analysis as a Feedback Tool for Development of
Engineering Problem Solving Skills.” Paper read at 2013 ASEE annual conference, Atlanta, GA.
Gwet, Kilem L. 2010. Handbook of Inter-Rater Reliability: The Definitive Guide to Measuring the Extent of Agreement
upon Multiple Raters. 2nd ed. Gaithersburg, MD: Advanced Analytics, LLC.
Hart, Sandra G. 2006. “NASA-Task Load Index (NASA-TLX); 20 Years Later.” Paper read at human factors and
ergonomics society annual meeting proceedings, San Francisco, CA.
Hartman, Hope J. 2001. “Developing Students’ Metacognitive Knowledge and Skills.” In Metacognition in Learning
and Instruction: Theory, Research and Practice, edited by H. J. Hartman, 33–68. Dordrecht: Kluwer Academic
Publishers.
Hutchinson, Sally A. 1988. “Education and Grounded Theory.” In Qualitative Research in Education: Focus and Methods,
edited by R. R. Sherman and R. B. Webb. New York: Falmer Press.
Jablokow, Kathryn W. 2007. “Engineers as Problem-Solving Leaders: Embracing the Humanities.” IEEE Technology and
Society Magazine 26 (4): 29–35.
Jonassen, David H. 2004. Learning to Solve Problems. San Francisco, CA: Pfeiffer.
Jonassen, David H. 2010. “Research Issues in Problem Solving.” Paper read at 11th international conference on education
research, Seoul, Korea.
Jonassen, David H., and Woei Hung. 2008. “All Problems are not Equal: Implications for Problem-Based Learning.”
The Interdisciplinary Journal of Problem-Based Learning 2 (2): 6–28.
Jonassen, David, Johannes Strobel, and Chwee Beng Lee. 2006. “Everyday Problem Solving in Engineering: Lessons for
Engineering Educators.” Journal of Engineering Education 95 (2): 139–151.
Kirton, M. J. 2003. Adaption-Innovation in the Context of Diversity and Change. Hove: Routledge.
Larkin, Jill, John McDermott, Dorothea P. Simon, and Herbert A. Simon. 1980. “Expert and Novice Performance in
Solving Physics Problems.” Science 208 (4450): 1335–1342.
Litzinger, Thomas A., Peggy Van Meter, Carla M. Firetto, Lucas J. Passmore, Christine B. Masters, Stephen R. Turns,
Gary L. Gray, Francesco Costanzo, and Sarah E. Zappe. 2010. “A Cognitive Study of Problem Solving in Statics.”
Journal of Engineering Education 99 (4): 337–353.
Matlin, Margaret W., ed. 2001. Cognition. 5th ed. Hoboken, NJ: John Wiley & Sons.
National Academy of Engineering. 2004. The Engineer of 2020: Visions of Engineering in the New Century. Washington,
DC: National Academies Press.
Nickerson, Raymond S. 1994. “The Teaching and Thinking of Problem Solving.” In Thinking and Problem Solving, edited
by R. J. Sternberg. San Diego, CA: Academic Press.
Oliver-Hoyo, M., and R. Beichner. 2004. “SCALE-UP: Bringing Inquiry-Guided Methods to Large Enrollment Courses.”
In Teaching and Learning through Inquiry: A Guidebook for Institutions and Instructors, edited by V. S. Lee, 51–70.
Sterling, VA: Stylus Publishing.
Owhoso, Vincent E., William F. Messier, and John G. Lynch Jr. 2002. “Error Detection by Industry-Specialised Teams
During Sequential Audit Review.” Journal of Accounting Research 40 (3): 883–900.
Polya, George. 1957. How to Solve It. Garden City, New York: Doubleday.
Pretz, Jean E., Adam J. Naples, and Robert J. Sternberg. 2003. “Recognizing, Defining, and Representing Problems.” In
The Psychology of Problem Solving, edited by J. E. Davidson and R. J. Sternberg, 3–30. Cambridge, UK: Cambridge
University Press.
Prince, M., and R.M. Felder. 2006. “Inductive Teaching and Learning Methods: Definitions, Comparisons, and Research
Bases.” Journal of Engineering Education 95 (2): 123–138.
Radatz, Hendrik. 1980. “Students’ Errors in the Mathematical Learning Environment.” For the Learning of Mathematics
1 (1): 16–20.
Ramsay, Robert J. 1994. “Senior/Manager Difference in Audit Workpaper Review Performance.” Journal of Accounting
Research 32 (1): 127–135.
Rugarcia, Armando, Richard M. Felder, Donald R. Woods, and James E. Stice. 2000. “The Future of Engineering Education
I. A Vision for a New Century.” Chemical Engineering Education 34 (1): 16–25.
Simon, Dorothea P., and Herbert A. Simon. 1978. “Individual Differences in Solving Physics Problems.” In Children’s
Thinking: What Develops? edited by R. S. Siegler, 325–348. Hillsdale, NJ: Lawrence Erlbaum.
Stammers, Robert B., andAndrew Shepherd. 1990. “TaskAnalysis.” In Evaluation of Human Work, edited by J. R. Wilson
and E. N. Corlett, 144–168. Philadelphia, PA: Taylor & Francis.
Stanton, Neville A., Paul M. Salmon, Guy H. Walker, Chris Baber, and Daniel P. Jenkins. 2005. Human Factors Methods:
A Practical Guide for Engineering and Design. Burlington, VT: Ashgate.
Stephan, E. A., W. J. Park, B. L. Sill, D. R. Bowman, and M. H. Ohland. 2010. Thinking Like an Engineer: An Active
Learning Approach. Upper Saddle River, NJ: Pearson Education.
Sternberg, Robert J. 1985. Beyond IQ: A Triarchic Theory of Human Intelligence. New York: Cambridge University Press.
Stigler, J. W., and J. Hiebert. 2009. The Teaching Gap: Best Ideas from the World’s Teachers for Improving Education in
the Classroom. New York, NY: Free Press.
Sweller, John. 1988. “Cognitive Load during Problem Solving: Effects on Learning.” Cognitive Science 12 (2): 257–285.
Tashakkori, Abbas, and Charles Teddlie. 1998. Mixed Methodology. Vol. 46. Applied Social Research Methods Series.
Thousand Oaks, CA: Sage Publications.
Taylor, F. W. 1911. The Principles of Scientific Management. New York, NY: Harper & Brothers.
University of Washington. “Classroom Presenter 3.” Accessed December 17, 2012. http://classroompresenter.cs.
washington.edu/.
Wallas, Graham. 1926. The Art of Thought. London: J. Cape.
Wang, Yingxu, and Vincent Chiew. 2010. “On the Cognitive Process of Human Problem Solving.” Cognitive Systems
Research 11 (1): 81–92.
Wankat, Phillip C. 1999. “Reflective Analysis of Student Learning in a Sophomore Engineering Course.” Journal of
Engineering Education 88 (2): 195–203.
Weston, Cynthia, Terry Gandell, Jacinthe Beauchamp, Lynn McAlpine, Carol Wiseman, and Cathy Beauchamp. 2001.
“Analyzing Interview Data: The Development and Evolution of a Coding System.” Qualitative Sociology 24 (3):
381–400.
Wickens, C. D., S. E. Gordon, and Y. Liu. 1998. An Introduction to Human Factors Engineering. New York, NY: Longman.
Wilson, James W., Maria L. Fernandez, and Nelda Hadaway. 1993. “Mathematical Problem Solving.” Chapter 4 In
Research Ideas for the Classroom: High School Mathematics, edited by P. S. Wilson. New York, NY: MacMillan.
http://www.recsam.edu.my/Mathematical_Problem_Solving.pdf
Wong, Regina M. F., Michael J. Lawson, and John Keeves. 2002. “The Effects of Self-Explanation Training on Students’
Problem Solving in High-School Mathematics.” Learning and Instruction 12 (1): 233–262.
Woods, Donald R., Richard M. Felder, Armando Rugarcia, and James E. Stice. 2000. “The Future of Engineering Education
III. Developing Critical Skills.” Chemical Engineering Education 34 (2): 108–117.
About the authors
Sarah J. Grigg is a Lecturer in General Engineering at Clemson University. Her research focuses on process improvement
and error mitigation across various contexts, including engineering education, healthcare and transportation. She received
her PhD, M.S. and B.S. degrees in Industrial Engineering, as well as an MBA, from Clemson University.
Lisa C. Benson is an Associate Professor of Engineering and Science Education at Clemson University, with a joint
appointment in Bioengineering. Her research focuses on how student motivation affects learning experiences. Her projects
involve assessing and studying the interaction of motivation, problem-solving strategies and knowledge transfer. Other
projects in the Benson group include utilising Tablet PCs to enhance and assess learning, implementing student-centred
active learning, and incorporating engineering into secondary science and mathematics classrooms. Her education includes
a B.S. in Bioengineering from the University of Vermont and M.S. and PhD in Bioengineering from Clemson University.
Appendix 1. Engineering problems under analysis.
Figure A1. (a) The efficiency problem, reprinted with permission from Pearson Education; (b) the resistor problem and
(c) the pressure problem, written by members of the research team. (The problem statements appear as images in the
original and are not reproduced here.)
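As context for the efficiency problem, it may help to recall the general relation for staged energy conversion systems; this is standard physics stated here for reference, since the specific stage values are given only in the problem image. For stages in series, the overall efficiency is the product of the stage efficiencies, so a single unknown stage efficiency can be isolated:

\eta_{\mathrm{overall}} = \eta_1 \, \eta_2 \, \eta_3 \quad \Longrightarrow \quad \eta_2 = \frac{\eta_{\mathrm{overall}}}{\eta_1 \, \eta_3}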
Appendix 2. Description of coding scheme elements
Knowledge access
  Identify equation: Equation with variables, no values
  Implicit equation identification: No formal equation shown; values inserted initially
  Identified assumption: Explicit statement of an assumption or self-imposed constraint
  Identify prior knowledge: Identifying outside knowledge used to solve the problem
  Identify conversion factor: Explicit listing of the conversion factor(s) used
  Use conversion factor: Ex: 1 ft = 12 in, so 4 ft => 48 in

Knowledge generation
  Draw a picture/diagram: Flow diagram, schematic, sketch, Venn diagram, etc.
  Make a table: Organising similar data in lists
  Relate variables: Assigning relationships in the system, showing connections, inserting known values in a diagram
  Manipulate equation: Solving an equation for another variable
  Derive units: Ex: 4 ft × (12 in/1 ft) = 48 in
  Plug values in equation: Inserting given or derived values
  Document math: Documentation of mathematical calculations
  Solve intermediate value: Obtaining a sub-answer

Self-management: Planning
  Restate problem: Summarising in phrases or sentences
  Identify known value: Defining variables by given values from the problem statement
  Identify unknown value: Explicitly identifying what is being solved for
  Identify constraint: Information from the problem statement (Ex: only one of each type of resistor)

Self-management: Revising
  Labelling/renaming: Clarifying documentation, relabelling variables
  Erase work: Transition or correction (not fixing penmanship)
  Abandon process/start over: Completely changing gears

Self-management: Evaluating
  Check accuracy: Plugging the answer back in and checking
  Identify final answer: Boxed/underlined/circled answer

Self-management: Monitoring
  Identify error: Correcting or erasing a previous error

Conceptual errors
  Incorrectly relate variables: Ex: P1,out = P2,in; P2,out = P3,in
  Misuse governing equation: Error in the equation (Ex: flipped variables or sign)
  Incorrect visual/graphic representation: Misrepresenting underlying concepts
  Incorrect assumptions: Placing or misusing constraints on the system, or assumptions not given in the problem statement

Mechanical errors
  Incorrectly manipulate equation: Error in algebraic manipulation
  Incorrect calculation: Entering numbers into the calculator incorrectly
  Incorrect unit derivation: Error in deriving units

Management errors
  Incorrect known value: Inserting the wrong number for a variable
  Incorrect unknown value: Solving for the wrong variable
  Ignored problem constraints: Not conforming to constraints given in the problem statement
  Irrelevant information: Using values that are not given and not needed
  Inconsistent transcription: Correct information rewritten incorrectly (miscopy)
  Inconsistent units: Mismatched units in a calculation (Ex: mixing English and SI units in one equation)
  Incorrect unit assignment: Labelling units on a value incorrectly (arbitrarily, with no other documentation)
  Using incorrectly generated information: Using an incorrect equation or value calculated in a previous part of the problem
  Missing units throughout: No (or few) units used in calculations throughout
  Erasing correct work: Correcting a ‘mistake’ that is not actually wrong

Strategies
  Plug-and-chug: Plugging numbers into equations without understanding why
  Guess-and-check: Trying values and seeing which give good answers
  Work backwards: Choosing steps based on the known solution
  Utilise a similar problem: Referring to or working from an example in the book or notes
  Segmentation: Discovering or acknowledging multiple parts to the problem (a.k.a. problem decomposition or subgoaling)
  Chunking: Collapsing multiple parts into one step
  Means-ends analysis: Working to minimise differences between the end goal and the starting point
  Forward chaining: Planning out a path to solve the problem
  Specialisation/extreme cases: Considering abstract or extreme forms of the problem

Solution accuracy
  Correct answer: Correctly calculating the final answer
  Correct but missing/incorrect units: Correct value with no or incorrect units
  Incorrect answer: Solving for the wrong variable, skipped steps
  Incomplete: No final answer produced
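For researchers who wish to tabulate coded solutions programmatically, the category/code hierarchy above maps naturally onto a simple lookup structure. The following Python sketch is illustrative only and is not part of the published instrument: the category and code names are taken from the table above (a subset, for brevity), and tally_by_category is a hypothetical helper that assumes each coded solution has been exported as a flat list of code names.

from collections import Counter

# Illustrative subset of the coding scheme (names taken from Appendix 2).
CODING_SCHEME = {
    "Knowledge access": ["Identify equation", "Identified assumption",
                         "Identify conversion factor", "Use conversion factor"],
    "Knowledge generation": ["Draw a picture/diagram", "Manipulate equation",
                             "Plug values in equation", "Solve intermediate value"],
    "Self-management": ["Restate problem", "Identify known value",
                        "Check accuracy", "Identify error"],
    "Conceptual errors": ["Misuse governing equation", "Incorrect assumptions"],
    "Mechanical errors": ["Incorrect calculation", "Incorrect unit derivation"],
    "Management errors": ["Inconsistent units", "Erasing correct work"],
}

# Invert the hierarchy so each code points back to its category.
CODE_TO_CATEGORY = {code: category
                    for category, codes in CODING_SCHEME.items()
                    for code in codes}

def tally_by_category(coded_events):
    # Count how many coded events fall in each category of the scheme.
    return Counter(CODE_TO_CATEGORY[code]
                   for code in coded_events if code in CODE_TO_CATEGORY)

# Example: one student's solution expressed as an ordered list of codes.
events = ["Restate problem", "Identify equation", "Plug values in equation",
          "Incorrect calculation", "Identify error", "Check accuracy"]
print(tally_by_category(events))  # e.g. Counter({'Self-management': 3, ...})

Keeping the scheme in one declarative structure like this makes it straightforward to extend with the remaining codes, or to compute the counts of process elements, errors committed and self-corrected errors described in the article.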
Appendix 3. Example of coded student work: step-by-step coding of solutions
These snapshots, taken at different points in time throughout the solution, show the student’s work and the associated
codes that were inserted at the relevant points in the work (the snapshot images are not reproduced here).
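Because each code is anchored at a particular point in the replayable solution, a coded attempt can also be treated as a time-ordered event stream, supporting the time-based measures discussed in the article. The Python fragment below is a hypothetical illustration of that idea rather than the project’s actual export format: it assumes each event is a (seconds elapsed, code) pair and attributes to each code the interval until the next coded event.

# Hypothetical export of one coded attempt: (seconds elapsed, code) pairs.
events = [(0, "Restate problem"), (35, "Identify equation"),
          (60, "Plug values in equation"), (140, "Incorrect calculation"),
          (170, "Identify error"), (205, "Identify final answer")]

# Attribute to each code the time until the next coded event; the final
# event closes the attempt and accrues no duration in this simple model.
durations = {}
for (t0, code), (t1, _next) in zip(events, events[1:]):
    durations[code] = durations.get(code, 0) + (t1 - t0)

for code, seconds in durations.items():
    print(f"{code}: {seconds} s")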
  • 1. This article was downloaded by: [ Lisa Besnon] On: 19 March 2014, At: 08: 07 Publisher: Taylor & Francis Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK European Journal of Engineering Education Publication details, including instructions for authors and subscription information: http:/ / www.tandfonline.com/ loi/ ceee20 A coding scheme for analysing problem-solving processes of first-year engineering students Sarah J. Grigg a & Lisa C. Benson a a Engineering and Science Education, Clemson University, Clemson, SC, USA Published online: 17 Mar 2014. To cite this article: Sarah J. Grigg & Lisa C. Benson (2014): A coding scheme for analysing problem- solving processes of first-year engineering students, European Journal of Engineering Education, DOI: 10.1080/ 03043797.2014.895709 To link to this article: http:/ / dx.doi.org/ 10.1080/ 03043797.2014.895709 PLEASE SCROLL DOWN FOR ARTICLE Taylor & Francis makes every effort to ensure the accuracy of all the information (the “Content”) contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content. This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http: / / www.tandfonline.com/ page/ terms- and-conditions
  • 2. European Journal of Engineering Education, 2014 http://dx.doi.org/10.1080/03043797.2014.895709 A coding scheme for analysing problem-solving processes of first-year engineering students Sarah J. Grigg and Lisa C. Benson∗ Engineering and Science Education, Clemson University, Clemson, SC, USA (Received 28 January 2014; accepted 6 February 2014) This study describes the development and structure of a coding scheme for analysing solutions to well-structured problems in terms of cognitive processes and problem-solving deficiencies for first-year engineeringstudents.Ataskanalysisapproachwasusedtoassessstudents’problemsolutionsusingthehier- archical structure from a theoretical framework from mathematics research. The coding scheme comprises 54 codes within the categories of knowledge access, knowledge generation, self-management, conceptual errors, mechanical errors, management errors, approach strategies and solution accuracy, and was demon- strated to be both dependable and credible for analysing problems typical of topics in first-year engineering courses. The problem-solving processes were evaluated in terms of time, process elements, errors commit- ted and self-corrected errors. Therefore, problem-solving performance can be analysed in terms of both accuracy and efficiency of processing, pinpointing areas meriting further study from a cognitive perspective, and for documenting processes for research purposes. Keywords: problem-solving skills; cognitive processes; coding scheme; first-year engineering 1. Introduction While engineers once worked almost exclusively in their specialised field of study, the practice of engineering is changing in the wake of rapidly changing global economy. Companies are faced with new challenges that require integration of knowledge from various domains and are often under tight time constraints to find solutions (NationalAcademy of Engineering 2004). Therefore, proficiency in problem-solving is highly valuable as industry leaders looks to engineers to tackle problems involving such constraints as technological change (Jablokow 2007), market globali- sation and resource sustainability (Rugarcia et al. 2000). The National Academy of Engineering describes the necessary attributes of the engineer of 2020 as having ingenuity, problem-solving capabilities, scientific insight, creativity, determination, leadership abilities, conscience, vision and curiosity (National Academy of Engineering 2004). To prepare for problem-solving in the workplace, students must develop conceptual and pro- cedural frameworks that they can use to solve real world problems that are often complex, have conflicting goals and undefined system constraints (Jonassen et al. 2006). However, students must first build an engineering knowledge base and develop process skills used in the applica- tion of knowledge, such as problem-solving and self-assessment (Woods et al. 2000). Because of the importance of problem-solving skills, educators should strive to help students obtain the knowledge resources and develop skills required for problem-solving success. ∗Corresponding author. Email: lbenson@clemson.edu © 2014 SEFI Downloaded by [Lisa Besnon] at 08:07 19 March 2014
  • 3. 2 S.J. Grigg and L.C. Benson The purpose of this research is to establish a standardised method for analysing student problem- solving processes that can be used in the evaluation of problem-solving performances. This paper details the methodology used to develop a structured scheme for coding the solution processes of first-year students solving engineering problems independently and presents the coding scheme as a credible instrument for assessing first-year engineering students’problem-solving skills using a mixed model methodology (Tashakkori and Teddlie 1998). 2. Literature review Much research has been conducted on problem-solving from a variety of perspectives. This review of relevant literature describes various models that have been proposed to explain the problem- solvingprocess,factorsthathavebeenshowntoimpactproblem-solvingsuccessintheeducational problem-solving context and analysis tools that have been used by other researchers in the study of problem-solving. 2.1. Problem-solving models Several theoretical frameworks describe problem-solving in contexts as diverse as explaining insights in creativity (Wallas 1926), to heuristics in mathematics (Polya 1957) and gaming strate- gies in chess (Simon and Simon 1978). Wallas’ model describes creative problem-solving in four stages: (1) preparation, (2) incubation, (3) inspiration and (4) verification (Wallas 1926). The first widely accepted problem-solving methodology is credited to George Polya, who describes the act of problem-solving in four steps: (1) understanding the problem, (2) devising a plan, (3) car- rying out the plan and (4) looking back or reviewing (Polya 1957). However, like other heuristic models, the implication that problem-solving is a linear process that can be memorised is flawed; problem-solvers may iteratively transition back to previous steps (Wilson et al. 1993). A more recent model depicts problem-solving as a seven-stage cycle that emphasises the itera- tive nature of the cycle (Pretz et al. 2003). The stages include: (1) recognise/identify the problem, (2) define and represent the problem mentally, (3) develop a solution strategy, (4) organise knowl- edge about the problem, (5) allocate resources for solving the problem, (6) monitor progress towards the goals and (7) evaluate the solution for accuracy. While this structure gives a more complete view of the stages of problem-solving, in practice, there is much variability in how people approach the problem and how well each of the stages are completed, if at all (Wilson et al. 1993). The stages listed above are based on Sternberg’s Triarchic Theory of Human Intelligence that breaks analytical intelligence, the form of intelligence utilised in problem-solving, into three com- ponents: meta-components, performance components and knowledge acquisition components. Meta-components (metacognition) are higher level executive functions that consist of planning, monitoring and evaluating the problem-solving process. Performance components are the cogni- tive processes that perform operations such as making calculations, comparing data or encoding information that are used to generate new knowledge. Knowledge acquisition components are the processes used to gain or store new knowledge (Sternberg 1985). 2.2. 
Factors influencing problem-solving success Research in problem types and strategies has shown that characteristics of the problem such as the complexity or structure of the problem (Jonassen and Hung 2008), the person such as prior experiences (Kirton 2003) and reasoning skills (Jonassen and Hung 2008), the process such Downloaded by [Lisa Besnon] at 08:07 19 March 2014
  • 4. European Journal of Engineering Education 3 as cognitive and metacognitive actions (Sternberg 1985; Greeno and Riley 1987) and strategies (Nickerson 1994), and the environment such as the social context (Woods et al. 2000) all influence problem-solving performance. In the search of behaviours that promote proficiency in problem- solving, much research has focused on classifying variations in performance between expert and novice problem-solvers (Hutchinson 1988) presumably because expert problem solutions exhibit more successful application of problem-solving skills. Experts have been shown to be up to four times faster at determining a solution than novices, even though experts also take time to pause between retrieving equations or chunks of information (Chi et al. 1981) and spend more time than novices in the problem representation phase of the problem-solving process (Pretz et al. 2003). Experts also exhibit dramatically different approaches to solving the problem (Chi et al. 1981) and organise their information differently than novices, displaying larger chunking of information than novices (Larkin et al. 1980). While expert strategies are linked with higher level performances, methods used by experts to solve problems are not necessarily transferable to novices due to cognitive requirements of these strategies. For example, cognitive overload has been noted as a factor in some of the major hindrances to achieving problem-solving proficiency, including the inability to solve the prob- lem without acquiring more information, lack of awareness of performance errors and resistance to changing a selected method or representation (Wang and Chiew 2010). First-year engineer- ing students, who have limited experience solving even well-defined problems, tend to access information stored in memory in a piecemeal rather than systematic or organised fashion (Sweller 1988). Inefficient mapping of information requires more cognitive effort when attempting to iden- tify relevant information needed to solve problems. This can force students into a state of cognitive overload, which may limit their ability to complete the problem at all, but also reduces the amount of cognitive capacity available for metacognitive tasks, such as assessing the reasonableness of their answers (Wang and Chiew 2010; Sweller 1988). While strategies such as problem decom- position or subgoaling can be used to alleviate some of the cognitive demand required to solve a problem (Nickerson 1994); problem-solvers can become too reliant on strategies or use them inappropriately, leading to a decrement in performance (Matlin 2001). Some of the best ways of improving performance is through promoting heightened awareness of performance errors, identifying the source of those errors and personalising instruction in addressing problem areas (Stigler and Hiebert 2009). 2.3. Analysis tools in prior research While several independent coding schemes have been developed to analyse problem-solving, most have addressed written work in conjunction with a think-aloud (Artzt and Armour-Thomas 1992; Litzinger et al. 2010; Wong et al. 2002; Weston et al. 2001). These coding schemes are tailored to analyse the students’ verbal expressions of their work, not necessarily the elements explicitly contained in the artefact itself, i.e. the students’ actual problem solution by which they communicate their problem-solving competencies in the classroom. 
In addition, studies have shown that thinking out loud while working on a problem can alter a student’s thought process (Ericsson and Simon 1998). Coding schemes for assessing students’ think-alouds are not readily applicable to the assess- ment of the written problem solutions. Yet, written data are rich in many ways, and analysing tasks explicitly enacted under authentic problem-solving conditions can reveal strategies or errors that occur organically and that may impact problem-solving success. As an alternative to cur- rent methods of assessing problem-solving processes, we utilise a method that has been utilised in human factors research in the analysis of both physical and cognitive processes: task analysis. Task analysis methods originated with the work of Gilbreth and Taylor (Gilbreth 1914, Taylor Downloaded by [Lisa Besnon] at 08:07 19 March 2014
  • 5. 4 S.J. Grigg and L.C. Benson 1911), whose work-study approaches were traditionally used to evaluate and improve the effi- ciency of workers (Stammers and Shepherd 1990). The definition of task analysis has been broadened to include the qualitative assessment of humans interacting with a system or pro- cess to understand how to better match the demands of the task to the capabilities of the human (Wickens et al. 1998). The subcomponent tasks obtained from task analyses, referred to as ele- ments, often serve as inputs for other forms of data analysis including error analysis and process charting techniques (Stanton et al. 2005). Research in mathematics education describes the impor- tance of error analysis as providing the opportunity to diagnose learning difficulties and develop criteria for differentiating education, so that instructors can tailor education to individual students to improve their performance and understanding (Radatz 1980). While there is no consensus on what constitutes an element, typically they are defined as discrete, measureable and repeatable units of activity, and it is at the user’s discretion to assign elements that are appropriately sized for their intended analysis (Stammers and Shepherd 1990). 3. Research purpose This research establishes a standard method for analysing engineering student problem-solving processes that can be used in the evaluation of problem-solving performances in a way that is not context dependent. By analysing the processes and strategies used by first-year engineering students during problem-solving attempts and measuring their association with errors committed, researchers can identify which behaviours to emphasise and which to discourage to promote problem-solving proficiency. Analysing multiple students’ problem-solving attempts enables the identification of variations inprocessesandtheevaluationofproblem-solvingperformances.However,toenablecomparisons of processes over time and across contexts, such as when measuring skills development analysis must be conducted using consistent methods and common, dependable instruments that are free from problem-specific features. The development and testing of such an analysis method is an arduous process, one that requires a critical perspective and iterative refinement. The development process and coding scheme are presented in this manuscript. 4. Research methods A task analysis approach was utilised to develop a taxonomy of component and subcomponent tasks used in the completion of problem solutions, along with a taxonomy of errors and strategies used by first-year engineering students. Together, these components comprise a coding scheme that can be used to analyse the problem-solving process and assess performance. Measures of rater agreement were used to assess dependability, generalizability and repeatability of the coding scheme by testing multiple solutions from the same problem, different problems and with a new rater, respectively. Credibility of the coding scheme was assessed by comparing summarised coding data to results found in previous research on problem-solving tasks. 4.1. Educational environment and participants Problem solutions were collected from first-year engineering students in an introductory engineer- ing course taught in a studio setting using active learning techniques. 
Studio classrooms differ from traditional lecture halls in that students sit with a small number of peers (6–8) at tables, which facilitates group interaction unlike traditional lecture halls, which utilise rows of seats that Downloaded by [Lisa Besnon] at 08:07 19 March 2014
  • 6. European Journal of Engineering Education 5 are quite often stationary. Learning environments that facilitate student interactions (i.e. studio environments) are effective in achieving student-centred and inquiry-based learning, which are approaches for building students’problem-solving and laboratory skills (Prince and Felder 2006). An example of this type of learning environment is the Student-CentredActive Learning Environ- ments for Undergraduate Programs (SCALE-UP) approach, which has been applied in science, engineering, technology and mathematics courses (Oliver-Hoyo and Beichner 2004). Research on the outcomes of SCALE-UP in engineering and mathematics showed that students participating in SCALE-UP classrooms exhibited higher levels of efficacy with the course material (Benson et al. 2010). There was evidence of improved academic performance, conceptual understand- ing and skills development in students participating in SCALE-UP classrooms compared with traditional lecture-based instruction. Affective outcomes associated with SCALE-UP classrooms include higher attendance, completion and retention rates as well as more favourable perceptions of the course overall (Beichner et al. 2007; Benson et al. 2010). While students were regularly encouraged to work with their peers on in-class activities, students completed the problems in this study independently. Students in the course sections taught by a member of the research team were invited to participate in this study. Students’ solutions to problems were collected for three problems completed throughout the semester, with data collected for four semesters. Perceived mental workload was assessed by administering the NASA-TLX, a validated measure of task load (Farmer and Brownson 2003, Hart 2006), which students completed following the completion of each problem for two of the four semesters. Data from these surveys were used to establish the relative difficulty of the three problems (Grigg and Benson, 2012a, 2012b, 2012c). Data from one problem from the first semester was utilised for the initial development of the coding scheme (n = 24). Subsequent tests for generalizability and refinements were made using solutions from all three problems from the first semester (n = 68). Finally, the original data- set from the first problem was used to check the credibility of the coding scheme (n = 24) by comparing results obtained to those obtained in previous research efforts. Coding of subsequent semester’s data proved that the coding scheme was adequate among a larger data-set (n = 187). 4.2. Engineering problems Problems were chosen based on characteristics that would ensure moderate problem difficulty for students in a first-year engineering classroom, who are building their engineering knowledge base and process skills. The chosen problems (shown in Figure A1 in Appendix 1) struck a balance of being well structured enough to limit the cognitive load on the students but remain challenging and provide multiple perspectives to solving the problem in accordance with the guidelines for problem-based learning (Jonassen and Hung 2008). All problems had (1) a constrained context, including pre-defined elements (problem inputs), (2) allowed the use of multiple procedures or algorithms and (3) had a single correct answer (Jonassen 2004). Three problems were selected that reflected different types of well-structured problems, as defined by Jonassen (2010). One originated from the course textbook (Stephan et al. 
2010) and two were developed by the project team. All three problems were story problems, in which the student is presented with a narrative that embeds the values needed to obtain the final answer (Jonassen 2010). Problem 1 involved a multi-stage solar energy conversion system and required calculation of the efficiency of one stage given input and output values for the other stages (Stephan et al. 2010). Problem 2 required students to solve for values of components in a given electrical circuit. This problem, developed by the project team, also contained a Rule-Using/Rule Induction portion (a problem having one correct solution but multiple rules governing the process (Jonassen 2010)), where students were asked to determine an equivalent circuit based on a set of given constraints. Problem 3 involved total Downloaded by [Lisa Besnon] at 08:07 19 March 2014
  • 7. 6 S.J. Grigg and L.C. Benson pressure calculations and required students to solve for values within the system, and conversion between different unit systems. Average perceived mental workload scores on the NASA-TLX survey was highest for the solar efficiency problem (47.04 out of 100), followed by the resistor problem (43.54 out of 100) and the total pressure problem (41.93 out of 100) (Grigg and Benson 2012a, 2012b, 2012c). These three problems are included in Appendix 1. 4.3. Tablet PC technology and data collection software To capture problem-solving processes for analysis, students wrote their solution attempts on a Tablet PC using a custom-designed software called MuseInk (Bowman and Benson 2011, Grigg and Benson 2011). The software allows students to work problems on a Tablet PC, and stores the digital Ink so that it can be replayed, annotated and exported as data to a database for analysis. Students work through problems much as they would with pen and paper, with the added benefit of having electronic access to their work, while researchers are provided with a comprehensive expression of the problem-solving attempt from beginning to end including work that was erased. The MuseInk software differs from other tablet-based instructional applications, such as DyKnow and Classroom Presenter, which allow insertion of pen-based feedback into student work in real time but cannot archive data in a way that is useful for research purposes (DyKnow 2012; University of Washington). Other instructional applications, such as CogSketch (Forbus et al. 2008) and BeSocratic (Bryfczynski et al. 2012), can provide students with automated feedback and guidance in an interactive mode, however they do not have the capability of importing a coding scheme or directly inserting codes into students’ work. 4.4. Coding scheme development The coding scheme development process progressed through 10 steps. These steps are illustrated in Figure 1. The results section details the findings of the testing and refinement stage. Figure 1. Depiction of the steps to develop a coding scheme for qualitative data analysis. Downloaded by [Lisa Besnon] at 08:07 19 March 2014
5. Results and discussion

A robust coding scheme for the analysis of problem-solving skills of first-year engineering students was developed; the methodology used to develop and test it is described in the following sections. Using this coding scheme, students' solutions were analysed based on actions taken as a result of cognitive and metacognitive processes, errors, approach strategies and solution accuracy.

5.1. Coding scheme

The primary result of this investigation is the coding scheme itself, which is summarised in Figure 2. The following sections describe the major categories of codes and their significance to the analysis of problem-solving performance. A more thorough description of each code is included in Appendix 2.

Figure 2. Summary of coding scheme elements (tasks, errors, strategies and accuracy).

5.1.1. Process element codes

Process elements are the backbone of the coding scheme and describe what the student is doing, as depicted in the problem solution. For codes related to process elements, the basic structure set forth in the coding scheme by Wong et al. (2002) was used as an a priori framework. The framework was originally used to code students' videotaped activity while studying mathematical materials during a concurrent think-aloud, to assess processing differences between students who received self-explanation training and a control group that did not. Similar to Sternberg's (1985) theoretical framework, this coding scheme separated elements into categories of knowledge access (KA), knowledge generation (KG) and self-management (SM).
• KA codes describe instances of the student retrieving knowledge not explicitly stated in the problem statement.
• KG codes describe instances of transforming bits of information to form new connections or relationships.
• SM codes describe tasks related to assessing the current state of problem-solving activity. SM comprises the subcategories of planning, monitoring, revising and evaluating.

The Wong et al. coding scheme provided the structure needed to distinguish between instances of retrieval of information from cognitive resources (KA), cognitive elements (KG) and metacognitive elements (self-management). We segmented the self-management category according to elements of planning, monitoring, evaluating and revising the solution, in accordance with Hartman's (2001) definition of the executive management aspects of metacognition.

5.1.2. Error codes

Error codes indicate instances where an incorrect element first occurs; errors always occur in conjunction with a process element. For codes relating to errors, we utilised a structure derived from the error detection literature in accounting, where it is common to classify errors as conceptual and mechanical (Owhoso et al. 2002; Ramsay 1994). A category for management errors was added to capture errors in metacognitive processes.

• Conceptual errors describe instances of misunderstanding of the problem and/or underlying fundamental concepts.
• Mechanical errors describe instances of operational errors, such as calculation errors.
• Management errors describe instances of problems managing information, including identification of given information, transcribing values or erasing correct work.

With this error-coding structure, errors related to students' understanding of engineering concepts and errors related to their computational skills can be identified separately, allowing researchers to pinpoint hindrances to learning. For example, the case in which a student does not use the efficiency equation properly because s/he did not understand the concept of efficiency is a much different case than if the student erred in using the equation because s/he has difficulty manipulating equations due to weak algebra skills or inattention to detail.

5.1.3. Strategy codes

Strategy codes are one-time-use codes that describe the overall approach taken to solve the problem. For strategy codes, we utilised a subset of strategies that appeared most applicable to story problems from the compilation described in 'Thinking and Problem Solving' (Nickerson 1994). The subset was refined from the broader list of strategies identified by Nickerson over the course of reviewing multiple problems completed by different students with different academic backgrounds. Specifically, six strategies appeared most frequently within the sample of problem solutions:

(1) Problem Decomposition (segmentation) – which involves breaking down a complex problem to ease analysis (Nickerson 1994).
(2) Clustering (chunking) – which involves grouping similar information into larger units (Chi et al. 1981).
(3) Means-End Analysis – which involves beginning with the identification of a goal state and the current state, followed by the problem-solver making efforts to reduce the gap between states (Nickerson 1994).
(4) Forward Chaining – which is similar to Means-End Analysis but involves a direct path between the current and goal states (Nickerson 1994).

Some problems could also be classified according to an apparent lack of strategy:

(5) Plug-and-chug – which involves inserting given values into an equation and producing an answer without necessarily understanding the reasons for doing so (Wankat 1999).
(6) Guess-and-check – which is a slightly more sophisticated approach, where the problem-solver checks that the values inserted into an equation yield the correct units or checks the reasonableness of the solution (Wankat 1999).

Plug-and-chug and guess-and-check are considered beginner-level strategies. Problem decomposition and means-end analysis are considered intermediate-level strategies, while clustering and forward chaining are considered advanced strategies. These strategies can be used as a proxy measure of 'level of expertise', as has been done in much problem-solving research in the past.

5.1.4. Solution accuracy codes

Solution accuracy codes describe the accuracy of the final answer and are used once in each solution. While the standard answer states of 'Correct' and 'Incorrect' could be used to describe the accuracy of the problem solution, two additional codes were included to allow a more fine-grained analysis of accuracy. 'Correct but Missing/Incorrect units' is a subset of 'Correct' in which the numerical value of the solution is correct but the units are missing, or the answer is given in a unit other than what was required, such as 120 seconds when the correct answer should have been reported in minutes. 'Incomplete' is a subset of 'Incorrect' indicating that no final answer was obtained.

5.2. Assessment of dependability, generalizability and repeatability

Measures of rater agreement were used to assess the dependability, generalizability and repeatability of the coding scheme by testing multiple solutions from the same problem, solutions from different problems, and coding with a new rater, respectively. Coded solutions were assessed on three criteria: (1) code agreement (i.e. Did all coders associate this particular code with the element?), (2) code frequency (i.e. Did all coders code the same number of elements?) and (3) code timing (i.e. Were elements coded consistently at the same point within a solution?). Inter-rater reliability was calculated based on the overall agreement rate for all coded elements, as shown in Equation (1), and the adjusted overall agreement rate, which accounts for discrepancies due to missing data by measuring agreement only for elements coded by all coders, as shown in Equation (2) (Gwet 2010). 'Agreement' was defined as an instance where a code was identified by all coders. 'Missing data' was defined as a code that one coder applied but another did not, which may be considered an important omission, depending on how the analysis is used to answer the research question.

Overall Agreement Rate = (number of ratings in agreement) / (total number of ratings),  (1)

Adjusted Overall Agreement Rate = (number of ratings in agreement) / (total number of ratings − number of ratings with missing data).  (2)
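As a concrete illustration of Equations (1) and (2), the sketch below computes both rates for two coders, assuming each coder's output has been reduced to a mapping from element identifiers to assigned codes; this representation is an assumption for illustration, not the format used in the study.

```python
def agreement_rates(coder_a: dict, coder_b: dict) -> tuple[float, float]:
    """Overall and adjusted overall agreement rates for two coders.

    coder_a and coder_b map element identifiers to assigned codes; an
    element present for only one coder counts as missing data.
    Illustrative sketch of Equations (1) and (2), not the study's code.
    """
    rated = coder_a.keys() | coder_b.keys()                # every rated element
    shared = coder_a.keys() & coder_b.keys()               # rated by both coders
    agree = sum(coder_a[e] == coder_b[e] for e in shared)  # same code applied
    missing = len(rated) - len(shared)                     # one-sided ratings
    overall = agree / len(rated)                           # Equation (1)
    adjusted = agree / (len(rated) - missing)              # Equation (2)
    return overall, adjusted

# Coder B never coded element "e3", so it counts as missing data:
a = {"e1": "KA", "e2": "KG", "e3": "SM"}
b = {"e1": "KA", "e2": "SM"}
print(agreement_rates(a, b))  # (0.333..., 0.5)
```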
Overall agreement answers the question, 'To what degree are the codes assigned to a solution the same across raters?' This approach examines the degree to which coders can both identify and code elements precisely. Overall agreement rates are used to determine the ability of coders to identify elements and complete the coding process consistently. Adjusted overall agreement (dropping missing data from the analysis) answers the question, 'Given a set of identified elements, do two coders code elements similarly?' Instances with missing data may be dropped to examine the consistency of the application of the codes themselves, but this is not as good a measure of coder performance. For example, an instance where a coder fails to identify an element and neglects to apply a code signals a need for better coder training. In contrast, an instance of disagreement about what to code a jointly identified element indicates the need for additional clarity around the meaning and interpretation of the codes, which is of larger concern in the coding scheme development process.

5.2.1. Assessing dependability of the coding protocol

Dependability of the coding protocol was evaluated using data from three solutions of the same problem (the efficiency problem). Inter-rater reliability was calculated by examining overall agreement. Initial results showed an overall agreement rate of 55%, with individual agreement rates of 77%, 55% and 42% for the three solutions. Reviewing the instances of disagreement was instrumental in identifying inconsistencies in applying codes and revealing missing problem features that were not captured in the initial coding scheme.

Two more iterations of revisions were conducted before reaching a satisfactory level of inter-rater reliability. Initial inter-rater reliability for the second round was 'Moderate', with overall agreement lower than in the initial round of coding (for the three solutions, inter-rater agreement was 73%, 40% and 25%, for an overall agreement rate of 41%). This round of coding revealed that, with the addition of new codes and the reconfiguration of code categories, there was confusion over the use of specific codes as well as the frequency of assigning codes. The coding protocol was clarified and documented in the codebook. To improve inter-rater reliability, a round-robin review cycle was implemented for the next round of coding, in which each coded problem was reviewed by a second coder as an internal check of adherence to proper coding procedures. The third round of coding showed great improvement in overall agreement rates, with an 'almost perfect' rating. Across the three solutions, agreement rates were 100%, 96% and 85%, for an overall agreement rate of 92%.

As shown in the second round of assessment, not all refinements result in automatic improvements (Table 1). This iterative cycle of testing was crucial in refining the coding scheme to a level of quality that others can depend on for their own assessments.

Table 1. Results of iterative refinement on overall coder agreement scores for the efficiency problem.
                                           Overall agreement (%)   Solution 1 (%)   Solution 2 (%)   Solution 3 (%)
Round 1: Initial coding scheme;
  one problem tested (three solutions)             55                    77               55               42
Round 2: New codes added;
  one problem tested (three solutions)             41                    73               40               25
Round 3: Added review cycle;
  one problem tested (three solutions)             92                   100               96               85

5.2.2. Establishing generalizability of coding scheme

Generalizability was evaluated by analysing solutions from three different problems with different contexts and features. All three problems differed in context but were similar in terms of the processes required to solve them. For example, problem-finding tasks were limited, since all problems were defined for the students. Also, no physical construction was required, as all problems could be solved using mathematical means. This iteration of coding was important to ensure that the coding scheme was robust enough to be used for a variety of problems within engineering contexts.
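The subsections below report Cohen's Kappa alongside the raw agreement rates. Kappa discounts the agreement two coders would reach by chance; a minimal sketch of the calculation, reusing the hypothetical element-to-code mapping from the earlier sketch, is given below.

```python
from collections import Counter

def cohens_kappa(coder_a: dict, coder_b: dict) -> float:
    """Cohen's Kappa over elements coded by both coders.

    Illustrative sketch using the same hypothetical element -> code
    mapping as before; not the study's analysis code.
    """
    shared = coder_a.keys() & coder_b.keys()
    n = len(shared)
    p_o = sum(coder_a[e] == coder_b[e] for e in shared) / n  # observed agreement
    freq_a = Counter(coder_a[e] for e in shared)
    freq_b = Counter(coder_b[e] for e in shared)
    # Chance agreement: probability both coders assign code k, summed over codes.
    p_e = sum(freq_a[k] * freq_b[k] for k in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

a = {"e1": "KA", "e2": "KG", "e3": "SM", "e4": "KG"}
b = {"e1": "KA", "e2": "KG", "e3": "KG", "e4": "KG"}
print(round(cohens_kappa(a, b), 2))  # 0.56: agreement beyond chance
```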
Both overall agreement and adjusted overall agreement (removing instances of missing data) were calculated, along with Cohen's Kappa coefficients for both measures of agreement. Two of the three original coders conducted this round of coding. The overall agreement rate was 'substantial' (0.769) and the adjusted overall agreement rate was 'almost perfect' (0.942) (Gwet 2010), indicating that when an element was coded by both coders, it was very likely that they assigned the same code. A summary of results is given in Table 2.

Table 2. Inter-rater reliability for the two original raters, calculated as overall agreement and adjusted overall agreement (missing data removed), with Cohen's Kappa coefficients.

             Retain missing codes              Drop missing codes
Problem    Overall agreement (%)    κ      Adjusted overall agreement (%)    κ
1                  88             0.86                  98                 0.98
2                  74             0.72                  90                 0.89
3                  74             0.72                  97                 0.96
Average            78.7           0.77                  95.0               0.94

5.2.3. Assessing repeatability of coding scheme with new rater

To assess whether the coding protocol was repeatable, another member who had no previous involvement in the coding scheme development was brought in to code solutions. Inter-rater agreement showed that agreement was acceptable. Again, both overall agreement and adjusted overall agreement (removing instances of missing data) were calculated, along with Cohen's Kappa coefficients for both measures of agreement. As given in Table 3, inter-rater reliability decreased with the addition of the new coder, but remained 'substantial' for problems 1 and 2 and 'almost perfect' when adjusted to remove data points with missing ratings. In problem 3, there was a sizeable portion that was not coded because the student iterated through the same element repeatedly; these iterations were not captured by the new coder, resulting in only 'fair' agreement. For the efficiency problem, from which the coding scheme was initially developed, the agreement rate was 70% (98% adjusted). The agreement rate was 62% (94% adjusted) for the circuit problem and 38% (94% adjusted) for the pressure problem, leading to an overall agreement rate of 57% (95% adjusted). Overall, Cohen's Kappa coefficients were 0.614 (0.948 adjusted), for a 'substantial' level of inter-rater reliability and a 'near perfect' level on adjusted scores.

Table 3. Inter-rater reliability for the two original coders plus the new coder, calculated as overall agreement and adjusted overall agreement.

             Retain missing codes              Drop missing codes
Problem    Overall agreement (%)    κ      Adjusted overall agreement (%)    κ
1                  70             0.72                  98                 0.97
2                  62             0.65                  94                 0.94
3                  38             0.47                  94                 0.93
Average            57             0.61                  95                 0.95

These inter-rater reliability measures were encouraging, showing that the coding scheme is robust and detailed enough to achieve high reliability between raters. By the end of scheme development
and training, coders were consistently assigning the same codes, though there remained some confusion for the new coder about when to assign a code (i.e. to code each instance of a task, even when the tasks were iteratively written and erased).

5.2.4. Establishing credibility of coding scheme

The data from the analysis of the first problem (the efficiency problem) were used to ensure that the results obtained are in line with those obtained by others when examining problem solving with respect to expertise (Chi et al. 1981). Tables 4 and 5 summarise the data used to make this comparison. Solutions were divided into four groups based on their approach strategy, as a proxy of the expertise level evidenced in each particular solution. (The remaining two strategies were not found in this sample of solutions.)

Table 4. Average number of codes by approach strategy.

Strategy          Sample   Average      Time to       Average      Average      Average      Average number
group             size     number of    completion    number of    number of    number of    of answer
                           codes        (minutes)     KA codes     KG codes     SM codes     codes
Plug-and-chug      2       10            5.1          1             3           2.5          1
Guess-and-check    3       33.3         20.7          1.7          13.7         8            1
Segmentation      15       31.2         17.7          3.4           8.3         9.1          1.6
Chunking           4       21.3         14.3          2.5           5.8         5.8          1.5

Table 5. Average number of errors by expertise level (as indicated by strategy).

Strategy          Sample   Average number of    Average number of    Average number of    Probability of
group             size     conceptual errors    mechanical errors    management errors    success
Plug-and-chug      2       1                    0                    1                    0
Guess-and-check    3       4                    2                    4.7                  0
Segmentation      15       5.3                  1                    4.8                  0.5
Chunking           4       3.5                  2                    3.3                  0.6

Results indicate that those who used a plug-and-chug strategy were not successful, but had the fewest codes, the fewest errors and the shortest time to completion. These results can be explained by limited awareness of performance problems. The other two novice groups (the guess-and-check and segmentation groups) mirrored results identified in the previous literature (Chi et al. 1988) as characteristic of novice performance, including longer time to completion, more errors and a lower probability of success than the more expert level of performance (the chunking group). Our results indicated faster completion times for more expert performance, though they showed a more moderate difference between the expert and the more novice performance groups than what was observed in the research by Chi et al., namely four times faster. (Average completion times for the 'novice' groups, guess-and-check and segmentation, were 20.72 minutes and 17.71 minutes, respectively, compared with 14.32 minutes for the 'expert' performance group, chunking.) Our research supports the claim that novices commit more errors: the guess-and-check and segmentation groups committed an average of 10.67 and 11.1 errors, respectively, compared with an average of 8.83 errors for the chunking group. This indicates that the coding scheme provides a reasonable assessment of problem-solving performance, as indicated by the relative expertise of the students.
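Summaries such as Tables 4 and 5 are group-wise averages over per-solution tallies. The sketch below shows that aggregation in pandas; the rows and column names are invented for illustration and are not the study's data.

```python
import pandas as pd

# Invented per-solution tallies, loosely echoing Tables 4 and 5.
solutions = pd.DataFrame([
    {"strategy": "Plug-and-chug",   "minutes":  5.1, "errors":  2, "correct": 0},
    {"strategy": "Guess-and-check", "minutes": 20.7, "errors": 11, "correct": 0},
    {"strategy": "Segmentation",    "minutes": 17.7, "errors": 11, "correct": 1},
    {"strategy": "Chunking",        "minutes": 14.3, "errors":  9, "correct": 1},
])

# Mean completion time, mean error count and success rate per strategy group.
summary = solutions.groupby("strategy").agg(
    mean_minutes=("minutes", "mean"),
    mean_errors=("errors", "mean"),
    success_rate=("correct", "mean"),
)
print(summary)
```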
5.3. Solution assessment

Students' solutions were analysed based on the actions they took as a result of their cognitive and metacognitive processes, errors, approach strategies and solution accuracy. An example of step-by-step coding of a student's solution to the efficiency problem is shown in Appendix 3. Using this coding scheme to evaluate problem solutions enabled several different types of analyses. Comparisons were made to identify key tasks associated with successful problem solutions, to identify tasks and errors associated with extreme levels of perceived mental workload, and to evaluate the effects of prior academic experiences on process variations (Grigg and Benson 2012a, 2012b, 2012c). This research indicated that planning activities had positive benefits for students' problem-solving success, while the use of visual representations was associated with lower mental workload during problem-solving attempts. Conceptual errors had the highest association with unsuccessful solutions, yet having pre-engineering experience did not have a significant impact on successfully solving problems. Mechanical errors were also significantly associated with unsuccessful solutions. Having previously completed a calculus course was significantly related to successful problem solving, even though none of the problems required calculus to solve them.

6. Implications for research and instruction

While the coding scheme was developed and tested using a set of well-defined story problems typical of a first-year engineering course, it is likely transferable to the study of other types of problems because of the general nature of the categories of processes, errors and strategies. Problem-solving activities within various contexts require similar cognitive and metacognitive processes, evoke similar types of errors and utilise similar approach strategies. However, the generalizability of the coding scheme to problems beyond well-defined story problems within the first-year engineering context is an area for future and ongoing research. Much work has gone into adapting critical elements of the coding scheme into an instructional tool that quantitatively assesses problem-solving performance for both well-defined and more open-ended problems. The outcome of this research is an assessment tool that supports teaching successful problem-solving strategies (Grigg et al. 2013). The assessment tool maps the presented coding scheme to a problem-solving cycle (Pretz et al. 2003) and allows instructors to provide individualised feedback to students on their problem-solving proficiency by summarising the processes completed and errors committed, as well as which stages of the problem-solving cycle presented the biggest challenges. To date, this assessment tool has been used to assess problem solutions in three different undergraduate engineering courses as part of ongoing testing and refinement. The major advantage of the assessment tool over traditional grading methods is that it can provide personalised feedback to students about their level of problem-solving proficiency as well as pinpoint skill deficiencies that need attention. It is believed that this personalised feedback will improve students' awareness of their cognitive processes and their problem-solving performance. Because the criteria are consistent for each problem, student performance could be tracked over time. Tracking would be enhanced by a streamlined process for digitising assessment data, which is currently under development. Once a large set of solution assessments has been collected for a variety of problem types across several courses, regression models will be used to determine combinations of tasks that contribute to more successful problem-solving performance.
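The planned regression analysis could, for instance, relate per-solution code counts to solution success. The sketch below is a hypothetical illustration with invented data and feature names; it is not the authors' model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented feature matrix: per-solution counts of KA codes, KG codes,
# SM codes and conceptual errors (columns are illustrative only).
X = np.array([
    [1,  3, 2, 1],
    [2, 14, 8, 4],
    [3,  8, 9, 5],
    [3,  6, 6, 2],
    [2,  7, 7, 1],
    [1, 12, 4, 5],
])
y = np.array([0, 0, 1, 1, 1, 0])  # 1 = correct final answer

model = LogisticRegression().fit(X, y)
print(model.coef_)  # coefficient signs hint at which task counts track success
```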
Acknowledgements

The authors thank Michelle Cook, Catherine McGough, Jennifer Parham-Mocello, David Bowman and Roy Pargas for their assistance with data collection and development of the coding scheme.
Funding

This work was supported through a grant from the National Science Foundation [EEC 0935163].

References

Artzt, Alice F., and Eleanor Armour-Thomas. 1992. “Development of a Cognitive-Metacognitive Framework for Protocol Analysis of Mathematical Problem Solving in Small Groups.” Cognition and Instruction 9 (2): 137–175.
Beichner, R., J. Saul, D. Abbott, J. Morse, D. Deardorff, R. Allain, S. Bonham, M. Dancy, and J. Risley. 2007. “Student-Centered Activities for Large Enrollment Undergraduate Programs (SCALE-UP) Project.” In Research-Based Reform of University Physics Vol. 1, edited by E. F. Redish and P. J. Cooney, 1–42. College Park, MD: American Association of Physics Teachers. http://www.per-central.org/document/ServeFile.cfm?ID=4517
Benson, L. C., M. K. Orr, S. B. Biggers, W. F. Moss, M. W. Ohland, and S. D. Schiff. 2010. “Student-Centered Active, Cooperative Learning in Engineering.” International Journal of Engineering Education 26 (5): 1097–1110.
Bowman, D., and L. Benson. 2011. “MuseInk: Seeing and Hearing a Freshman Engineering Student Think.” Paper read at 2010 ASEE annual conference, Louisville, KY.
Bryfczynski, S., R. P. Pargas, M. Cooper, and M. Klymkowsky. 2012. “Analyzing and Visualizing Student Work with BeSocratic.” Paper read at proceedings of the 50th annual southeast regional conference, ACM, Tuscaloosa, AL.
Chi, Michelene T., Paul J. Feltovich, and Robert Glaser. 1981. “Categorization and Representation of Physics Problems by Experts and Novices.” Cognitive Science 5 (2): 121–152.
Chi, Michelene T., Robert Glaser, and Marshall J. Farr. 1988. The Nature of Expertise. Hillsdale, NJ: L. Erlbaum Associates.
DyKnow. 2012. “The DyKnow Advantage.” Accessed December 17, 2012. http://www.dyknow.com.
Ericsson, K. Anders, and Herbert A. Simon. 1998. “How to Study Thinking in Everyday Life: Contrasting Think-Aloud Protocols with Descriptions and Explanations of Thinking.” Mind, Culture, and Activity 5 (3): 178–186.
Farmer, Eric, and Adam Brownson. 2003. Review of Workload Measurement, Analysis and Interpretation Methods. Technical Report CARE-Integra-TRS-130-02-WP2, EUROCONTROL.
Forbus, K., J. Usher, A. Lovett, K. Lockwood, and J. Wetzel. 2008. “CogSketch: Open-Domain Sketch Understanding for Cognitive Science Research and for Education.” Paper read at proceedings of the fifth eurographics workshop on sketch-based interfaces and modeling, Annecy, France.
Gilbreth, Lillian. 1914. The Psychology of Management. New York, NY: Sturgis & Walton.
Greeno, James G., and Mary S. Riley. 1987. “Processes and Development of Understanding.” In Metacognition, Motivation, and Understanding, edited by F. E. Weinert and R. H. Kluwe, 289–313. Hillsdale, NJ: Lawrence Erlbaum Associates.
Grigg, S., and L. Benson. 2011. “Work in Progress: Robust Engineering Problems for the Study of Problem Solving Strategies.” Proceedings of the Frontiers in Education conference, Rapid City, SD.
Grigg, Sarah J., and Lisa C. Benson. 2012a. “Effects of Student Strategies on Successful Problem Solving.” ASEE annual conference, San Antonio, TX.
Grigg, Sarah J., and Lisa C. Benson. 2012b. “How Does Academic Preparation Influence How Engineering Students Solve Problems?” Paper read at Frontiers in Education conference, Seattle, WA.
Grigg, Sarah J., and Lisa C. Benson. 2012c. “Using the NASA-TLX to Assess First-Year Engineering Problem Difficulty.” Paper read at industrial and systems engineering research conference, Orlando, FL.
Grigg, S., J. Van Dyken, L. Benson, and B. Morkos. 2013.
“Process Analysis as a Feedback Tool for Development of Engineering Problem Solving Skills.” Paper read at 2013 ASEE annual conference, Atlanta, GA.
Gwet, Kilem L. 2010. Handbook of Inter-Rater Reliability: The Definitive Guide to Measuring the Extent of Agreement upon Multiple Raters. 2nd ed. Gaithersburg, MD: Advanced Analytics, LLC.
Hart, Sandra G. 2006. “NASA-Task Load Index (NASA-TLX); 20 Years Later.” Paper read at human factors and ergonomics society annual meeting proceedings, San Francisco, CA.
Hartman, Hope J. 2001. “Developing Students’ Metacognitive Knowledge and Skills.” In Metacognition in Learning and Instruction: Theory, Research and Practice, edited by H. J. Hartman, 33–68. Dordrecht: Kluwer Academic Publishers.
Hutchinson, Sally A. 1988. “Education and Grounded Theory.” In Qualitative Research in Education: Focus and Methods, edited by R. R. Sherman and R. B. Webb. New York: Falmer Press.
Jablokow, Kathryn W. 2007. “Engineers as Problem-Solving Leaders: Embracing the Humanities.” IEEE Technology and Society Magazine 26 (4): 29–35.
Jonassen, David H. 2004. Learning to Solve Problems. San Francisco, CA: Pfeiffer.
Jonassen, David H. 2010. “Research Issues in Problem Solving.” Paper read at 11th international conference on education research, Seoul, Korea.
Jonassen, David H., and Woei Hung. 2008. “All Problems are not Equal: Implications for Problem-Based Learning.” The Interdisciplinary Journal of Problem-Based Learning 2 (2): 6–28.
Jonassen, David, Johannes Strobel, and Chwee Beng Lee. 2006. “Everyday Problem Solving in Engineering: Lessons for Engineering Educators.” Journal of Engineering Education 95 (2): 139–151.
Kirton, M. J. 2003. Adaption-Innovation in the Context of Diversity and Change. Hove: Routledge.
Larkin, Jill, John McDermott, Dorothea P. Simon, and Herbert A. Simon. 1980. “Expert and Novice Performance in Solving Physics Problems.” Science 208 (4450): 1335–1342.
Litzinger, Thomas A., Peggy Van Meter, Carla M. Firetto, Lucas J. Passmore, Christine B. Masters, Stephen R. Turns, Gary L. Gray, Francesco Costanzo, and Sarah E. Zappe. 2010. “A Cognitive Study of Problem Solving in Statics.” Journal of Engineering Education 99 (4): 337–353.
Matlin, Margaret W., ed. 2001. Cognition. 5th ed. Hoboken, NJ: John Wiley & Sons.
National Academy of Engineering. 2004. The Engineer of 2020: Visions of Engineering in the New Century. Washington, DC: National Academies Press.
Nickerson, Raymond S. 1994. “The Teaching and Thinking of Problem Solving.” In Thinking and Problem Solving, edited by R. J. Sternberg. San Diego, CA: Academic Press.
Oliver-Hoyo, M., and R. Beichner. 2004. “SCALE-UP: Bringing Inquiry-Guided Methods to Large Enrollment Courses.” In Teaching and Learning through Inquiry: A Guidebook for Institutions and Instructors, edited by V. S. Lee, 51–70. Sterling, VA: Stylus Publishing.
Owhoso, Vincent E., William F. Messier, and John G. Lynch Jr. 2002. “Error Detection by Industry-Specialised Teams During Sequential Audit Review.” Journal of Accounting Research 40 (3): 883–900.
Polya, George. 1957. How to Solve It. Garden City, New York: Doubleday.
Pretz, Jean E., Adam J. Naples, and Robert J. Sternberg. 2003. “Recognizing, Defining, and Representing Problems.” In The Psychology of Problem Solving, edited by J. E. Davidson and R. J. Sternberg, 3–30. Cambridge, UK: Cambridge University Press.
Prince, M., and R. M. Felder. 2006. “Inductive Teaching and Learning Methods: Definitions, Comparisons, and Research Bases.” Journal of Engineering Education 95 (2): 123–138.
Radatz, Hendrik. 1980. “Students’ Errors in the Mathematical Learning Environment.” For the Learning of Mathematics 1 (1): 16–20.
Ramsay, Robert J. 1994. “Senior/Manager Difference in Audit Workpaper Review Performance.” Journal of Accounting Research 32 (1): 127–135.
Rugarcia, Armando, Richard M. Felder, Donald R. Woods, and James E. Stice. 2000. “The Future of Engineering Education I. A Vision for a New Century.” Chemical Engineering Education 34 (1): 16–25.
Simon, Dorothea P., and Herbert A. Simon. 1978. “Individual Differences in Solving Physics Problems.” In Children’s Thinking: What Develops? edited by R. S. Siegler, 325–348. NJ: Lawrence Erlbaum.
Stammers, Robert B., and Andrew Shepherd. 1990. “Task Analysis.” In Evaluation of Human Work, edited by J. R. Wilson and E. N. Corlett, 144–168. Philadelphia, PA: Taylor & Francis.
Stanton, Neville A., Paul M. Salmon, Guy H. Walker, Chris Baber, and Daniel P. Jenkins. 2005. Human Factors Methods: A Practical Guide for Engineering and Design. Burlington, VT: Ashgate.
Stephan, E. A., W. J. Park, B. L. Sill, D. R. Bowman, and M. W. Ohland. 2010. Thinking Like an Engineer: An Active Learning Approach. Upper Saddle River, NJ: Pearson Education.
Sternberg, Robert J. 1985. Beyond IQ: A Triarchic Theory of Human Intelligence. New York: Cambridge University Press.
Stigler, J. W., and J. Hiebert. 2009. The Teaching Gap: Best Ideas from the World’s Teachers for Improving Education in the Classroom. New York, NY: Free Press.
Sweller, John. 1988. “Cognitive Load during Problem Solving: Effects on Learning.” Cognitive Science 12 (2): 257–285.
Tashakkori, Abbas, and Charles Teddlie. 1998. Mixed Methodology. Vol. 46. Applied Social Research Methods Series. Thousand Oaks, CA: Sage Publications.
Taylor, F. W. 1911. The Principles of Scientific Management. New York, NY: Harper & Brothers.
University of Washington.
“Classroom Presenter 3.” Accessed December 17, 2012. http://classroompresenter.cs.washington.edu/.
Wallas, Graham. 1926. The Art of Thought. London: J. Cape.
Wang, Yingxu, and Vincent Chiew. 2010. “On the Cognitive Process of Human Problem Solving.” Cognitive Systems Research 11 (1): 81–92.
Wankat, Phillip C. 1999. “Reflective Analysis of Student Learning in a Sophomore Engineering Course.” Journal of Engineering Education 88 (2): 195–203.
Weston, Cynthia, Terry Gandell, Jacinthe Beauchamp, Lynn McAlpine, Carol Wiseman, and Cathy Beauchamp. 2001. “Analyzing Interview Data: The Development and Evolution of a Coding System.” Qualitative Sociology 24 (3): 381–400.
Wickens, C. D., S. E. Gordon, and Y. Liu. 1998. An Introduction to Human Factors Engineering. New York, NY: Longman.
Wilson, James W., Maria L. Fernandez, and Nelda Hadaway. 1993. “Mathematical Problem Solving.” Chapter 4 in Research Ideas for the Classroom: High School Mathematics, edited by P. S. Wilson. New York, NY: MacMillan. http://www.recsam.edu.my/Mathematical_Problem_Solving.pdf
Wong, Regina M. F., Michael J. Lawson, and John Keeves. 2002. “The Effects of Self-Explanation Training on Students’ Problem Solving in High-School Mathematics.” Learning and Instruction 12 (1): 233–262.
Woods, Donald R., Richard M. Felder, Armando Rugarcia, and James E. Stice. 2000. “The Future of Engineering Education III. Developing Critical Skills.” Chemical Engineering Education 34 (2): 108–117.

About the authors

Sarah J. Grigg is a Lecturer in General Engineering at Clemson University. Her research focuses on process improvement and error mitigation across various contexts, including engineering education, healthcare and transportation. She received PhD, M.S. and B.S. degrees in Industrial Engineering and an MBA from Clemson University.
Lisa C. Benson is an Associate Professor of Engineering and Science Education at Clemson University, with a joint appointment in Bioengineering. Her research focuses on how student motivation affects learning experiences. Her projects involve assessing and studying the interaction of motivation, problem-solving strategies and knowledge transfer. Other projects in the Benson group include utilising Tablet PCs to enhance and assess learning, implementing student-centred active learning, and incorporating engineering into secondary science and mathematics classrooms. Her education includes a B.S. in Bioengineering from the University of Vermont and M.S. and PhD degrees in Bioengineering from Clemson University.

Appendix 1. Engineering problems under analysis

Figure A1. (a) The efficiency problem (reprinted with permission from Pearson Education); (b) the resistor problem and (c) the pressure problem, written by members of the research team.
Appendix 2. Description of coding scheme elements

Knowledge access
  Identify equation – Equation with variables, no values
  Implicit equation identification – No formal equation shown, values inserted initially
  Identified assumption – Explicit statement of assumption or self-imposed constraint
  Identify prior knowledge – Identifying outside knowledge to solve the problem
  Identify conversion factor – Explicit listing of conversion factor(s) used
  Use conversion factor – Ex: 1 ft = 12 in; 4 ft => 48 in

Knowledge generation
  Draw a picture/diagram – Flow diagram, schematic, sketch, Venn diagram, etc.
  Make a table – Organising similar data in lists
  Relate variables – Assigning relationships in the system, showing connections, inserting known values in a diagram
  Manipulate equation – Solving an equation for another variable
  Derive units – Ex: 4 ft × 12 in/1 ft = 48 in
  Plug values in equation – Inserting given or derived values
  Document math – Documentation of mathematical calculations
  Solve intermediate value – Getting a sub-answer

Self-management: planning
  Restate problem – Summarising in phrases or sentences
  Identify known value – Defining variables by given values from the problem statement
  Identify unknown value – Explicitly identifying what is being solved for
  Identify constraint – Information from the problem statement (Ex: only one of each type of resistor)

Self-management: revising
  Labelling/renaming – Clarifying documentation, relabelling variables
  Erase work – Transition or correction (not fixing penmanship)
  Abandon process/start over – Completely changing gears

Self-management: evaluating
  Check accuracy – Plugging the answer back in and checking
  Identify final answer – Boxed/underlined/circled answer

Self-management: monitoring
  Identify error – Correcting or erasing a previous error

Conceptual errors
  Incorrectly relate variables – Ex: P1out = P2in, P2out = P3in
  Misuse governing equation – Error in equation (Ex: flipped variables or sign)
  Incorrect visual/graphic representation – Misrepresenting underlying concepts
  Incorrect assumptions – Placing or misusing constraints on the system, or assumptions not given in the problem statement

Mechanical errors
  Incorrectly manipulate equation – Algebra problem
  Incorrect calculation – Plugging numbers into the calculator incorrectly
  Incorrect unit derivation – Error in deriving units

Management errors
  Incorrect known value – Inserting the wrong number for a variable
  Incorrect unknown value – Solving for the wrong variable
  Ignored problem constraints – Not conforming to constraints given in the problem statement
  Irrelevant information – Using values that are not given and not needed
  Inconsistent transcription – Correct information rewritten incorrectly (miscopy)
  Inconsistent units – Mismatch of units in a calculation (Ex: mixing English and SI units in an equation)
  Incorrect unit assignment – Labelling units on a value incorrectly (arbitrarily, with no other documentation)
  Using incorrectly generated information – Using an incorrect equation or value calculated in a previous part of the problem
  Missing units throughout – No (or few) units used in calculations throughout
  Erasing correct work – Correcting a 'mistake' that is not really wrong

Strategies
  Plug-and-chug – Plugging numbers into equations without understanding why
  Guess-and-check – Trying values and seeing what gives good answers
  Work backwards – Choosing steps based on the known solution
  Utilise a similar problem – Referring to or working from an example in a book or notes
  Segmentation – Discovering or acknowledging multiple parts to the problem (AKA problem decomposition or subgoaling)
  Chunking – Collapsing multiple parts into one step
  Means-ends analysis – Working to minimise differences between the end goal and the starting point
  Forward chaining – Planning out a path to solve the problem
  Specialisation/extreme cases – Considering abstract or extreme forms of the problem

Solution accuracy
  Correct answer – Correctly calculating the final answer
  Correct but missing/incorrect units – Correct value with no or incorrect units
  Incorrect answer – Solving for the wrong variable, skipped steps
  Incomplete – No final answer produced

Appendix 3. Example of coded student work: step-by-step coding of solutions
These snapshots at different points in time throughout the solution indicate the student's work and the associated codes that were inserted at relevant points in the work.