AN INSTRUMENT FOR ASSESSING WRITING-INTENSIVE
PHYSICS LABORATORY REPORTS
Andy Buffler, Saalih Allie and Loveness Kaunda
Department of Physics, University of Cape Town, Rondebosch, 7700
Tel: (021) 650 3339 Fax: (021) 650 3342 abuffler@physci.uct.ac.za
Margaret Inglis
Department of Second Language Studies, University of Natal - Pietermaritzburg
Published in: Proceedings of the 5th Annual Meeting of the Southern African Association for Research in
Mathematics and Science Education, Ed: M. Sanders, Johannesburg (1997), 85-91.
Abstract
This paper reports the development of an instrument for assessing writing-intensive laboratory
reports in undergraduate physics. Writing-intensive reports combine the aspects of laboratory
experimentation and written communication and involve a number of distinct facets which have to be
taken into account during assessment. At one level, there are features that pertain to the
technicalities and conventions of the science involved as well as the technical features of language
use, while at another level, there are features that pertain to the report as an exercise in meaningful
communication. In an attempt to take cognisance of all these aspects, the coherence of the report
was used as an important concept in developing the instrument. The instrument was also designed
so that it could be used by physics tutors, both staff and post-graduate students, who are generally
more familiar with assessing reports on the basis of the technical aspects of scientific procedure only,
but might judge the communicative aspects of the report by assessing only surface-level language
errors.
Introduction
Writing-intensive laboratory reports were first introduced into the first year physics curriculum at the
University of Cape Town as a form of providing language intervention for ESL students who came
from disadvantaged educational backgrounds. Makina-Kaunda and Allie (1994) identified the
laboratory practical as an appropriate context, not only for introducing the students to the
procedures of practical science, but also for developing communication skills by writing about the
laboratory activity. Recently, the physics department, acknowledging communication as being an
integral part of investigative and scientific activities (Allen and Widdowson 1978, Shih 1986, Baird
1988, Vermillion 1991), included writing-intensive reports as a component of the laboratory
curriculum for all first year physics students.
In general, it is widespread practice that learning objectives in academic settings are measured
primarily via the written word. This has been illustrated in a number of studies such as the Keller-
Cohen and Wolfe (1987) study done at the University of Michigan, and cited in Hamp-Lyons
(1990), where it is reported that “written work forms part of the course expectations in over 70% of
courses within the undergraduate curriculum while 97% of faculty surveyed agreed that skill in writing
is important or very important for college study” (Hamp-Lyons 1990 p. 71). At the same time
writing tasks are seen as being important in promoting the development and understanding of
scientific concepts and process skills. The importance of task-based activities in practical science in
developing scientific process skills is well documented by Woolnough (1991), Tamir (1991), Millar
(1991) and Gott and Mashiter (1991), as is the link between writing and concept development,
described in detail in Radloff and Samson (1993), Radloff (1994), Connolly (1989) and Widdowson
(1983). These views and research findings support the validity of the laboratory report-writing
task as a means through which mastery of concepts and process skills in experimental science can be
taught and assessed. Holistic assessment, which is discussed in detail in Bamberg (1984), Hamp-
Lyons (1993) and Allaei and Connor (1993), encourages the assessment of a piece of writing as a
whole rather than focusing on individual aspects. It is considered an appropriate approach for the
assessment of multiple traits in a piece of writing (Hamp-Lyons 1993, Allaei and Connor 1993) in
that important aspects of content or those of form are not arbitrarily ignored. However, none of the
holistic assessment instruments presented focuses on the assessment of laboratory reports. There
also does not appear to be any work reported on integrating laboratory practical work and
communication skills.
Coherence has been described as that quality in a piece of writing which enables a reader to sense a
flow in what is being communicated (Fahnestock 1983). We see the achievement of such a quality
as being dependent on the closeness of the relationship between the content and the expression of
that content. In the physics department at the University of Cape Town laboratory reports are
traditionally assessed by tutors who are postgraduate students in the department. The introduction of
writing-intensive reports posed a problem for the tutors since the traditional laboratory reports are
usually a short summary of activities in the laboratory and are graded largely on the basis of the
scientific content. Writing tends to be limited to a few terse sentences interspersed amongst tables,
graphs and diagrams. Loosely-conceptualised assessment criteria were found to be unhelpful to the
tutors in assessing the writing-intensive reports and, given the large number of first year students
involved, caused large variations in the quality of grading and feedback. An
acknowledgement of these problems led to the development of an instrument which was aimed at
providing a more detailed set of criteria which could be used by the “discipline-based” personnel to
assess the writing-intensive reports more objectively. Thus, once trained in the use of the instrument,
tutors would be able to give specific and meaningful feedback on the scientific content as well as the
communication and language aspects. This paper describes the process by which the instrument
was developed, taking into consideration theoretical issues surrounding assessment and learning.
It should be pointed out that the reports in question follow a well-structured format around the
following broad sections: introduction, method (of the experiment), data collection, data processing
and results, conclusions and discussion. The “introduction” section introduces the report, the
intended audience, relevant theory and the purpose of the task. The “method” section describes the
procedures followed, precautions taken and should include labelled diagrams of the apparatus used.
The “data” section usually presents tables of recorded data and many of the practical tasks require a
graph to be drawn. The appropriate calculations are presented in a “results” section which should
also refer to any graphs plotted and contain the appropriate data analysis. Finally, the “discussion”
or “conclusions” section should tie the whole report together by briefly referring to the original aims,
the results and the relevant conclusions. Discussion of statistical uncertainties and other sources of
experimental errors and recommendations could also be included in this section.
Theoretical Perspectives
Assessing student writing
A brief review of the literature on student writing and assessment reveals that assessing any
kind of written work is a complex exercise. This is especially so when several markers are involved,
since the markers may have different views as to what constitutes the essential assessment criteria for
a piece of writing. A large body of research into the assessment of student writing in various
disciplines has shown, for example, that when staff in a particular discipline assess student writing,
they respond primarily to the content while language teachers traditionally tend to respond to
surface-level features such as grammar, spelling and punctuation rather than to the subject matter
(see Zamel 1985, Swales 1978, Shih 1986, Hamp-Lyons 1993, White 1994). These differing
perceptions can lead to large disparities in assessment particularly in disciplines, such as in physics,
where numerical and graphical forms of communication are relied on extensively. The assessment
instrument we present in this paper attempts to create a balance between discipline-based and
language-based criteria as well as limiting opportunities for disparities and inconsistencies between
markers.
Formative assessment
Boud (1995), in his discussion of the complementary nature of assessment and learning, argues that
assessment should not only focus on measuring achievement, that is, giving a mark, but it should have
an impact on quality learning and teaching. The importance of ‘formative’ as opposed to only
‘summative’ assessment in promoting learning is further discussed by Brown, Race and Rust (1995).
Despite the different perceptions among staff and students with regard to what constitutes ‘good’
writing or ‘good’ feedback, there is general agreement about the fact that students will not
necessarily modify what they do according to the comments and changes made on their piece of
writing and that in some cases they are unable to interpret comments on either form or content
(Vaughan 1993, Fathman and Whalley 1990, Swales 1978). Simply conforming to the correct
structure of the report, for example, is sometimes viewed by students as being the main ingredient of
successful report writing (Kaunda 1995). It is argued that drawing attention to specific strategies for
accomplishing a task and to meaning-related errors and omissions may be more helpful than general
comments on surface-level features and other arbitrary comments (Zamel 1985, Fathman and
Whaley 1990). Our own experience has also been that specific feedback helps to focus students on
specific sections of the report where more detail, explanation or clarity is required. This leads to an
overall improvement in report-writing skills as well as to a better understanding of the concepts
involved in the experimental investigation (Kaunda 1995). However, this strategy is resource-
intensive and time consuming as copious comments have to be written on each student’s report.
The present instrument is designed to give students meaningful feedback in a standardised form,
while at the same time relieving the tutor from the frustration of having to repeat very similar
comments on each report.
Coherence and scientific discourse
Coherence is primarily a qualitative feature of writing and, as a vital component of holistic
assessment, is more appropriately described in holistic terms (Allaei and Connor 1993, Hamp-Lyons
1993, Bamberg 1984). Besides the content,
coherence is what gives a piece of writing its logical flow (Witte and Faigley 1981, Bamberg 1984,
Hubbard 1989, 1993) and enables the reader to construct an argument from what is being
communicated (Fahnestock 1983). In a report, this flow would be expected to result initially from
the ability of the writer to create a context which orientates the reader in such a way that he or she
can anticipate upcoming information and the way this information will be presented. Coherence
comes about by the ability of the writer to use both the structure of the report and the appropriate
linguistic, rhetorical and semantic markers so as to help carry the reader across sections of the report
in a way that makes the relationship between one section and another clear and also reduces
interference in the processing of the information conveyed. Imparting to students an understanding of
what makes a piece of writing coherent is one way of initiating them into the discourse of the
scientific community, and facilitates the development of scientific thinking (Marshall 1991, Parkhurst
1990, Johns 1993, Shih 1986).
The assessment instrument
Four researchers were involved in the development of the assessment instrument, namely, two
physicists and two applied linguists. In a series of meetings one of the first decisions was to agree on
who the “clients” for the instrument would be. Three levels of users were identified, namely, the
students, the markers (tutors) and the laboratory facilitators (lecturers). This identification helped in
determining the nature of the information which would be contained in the instrument as well as its
presentation. While acknowledging the importance of making assessment criteria explicit (Knight
1995), we were aware of the difficulty of trying to assess everything that one might notice in a
laboratory report especially when the laboratory task contains the two distinct aspects discussed
above. What was important in the process of generating criteria was to define carefully the dynamic
relationship between the scientific content and the features pertaining to communication and language.
The final instrument therefore required that both of these aspects be assessed objectively and
consistently but not at the expense of each other.
Using a list of criteria based on the experience from previous marking, all four researchers marked a
sample of ‘good’, ‘average’ and ‘poor’ reports and noted additional criteria. A draft assessment
instrument was then used by a group of post-graduate students to mark a number of reports and
their experiences were recorded and included in the set of assessment criteria which formed part of
the final instrument. The final version of the assessment instrument consists of two parts, namely:
(a) The mark schedule (Appendix 1) which contains the assessment criteria on which the marks
awarded are based and,
(b) The coherence rating scale (Appendix 2) which provides criteria to the marker for assessing
the coherence of the report.
With reference to the mark schedule (Appendix 1) it can be seen that the assessment criteria have
been subdivided into two sections. Section A outlines those criteria relating to the scientific aspects
including the practical and processing skills related to the experiment. These are grouped in three
sub-sections: (a) method, (b) data collection, and (c) data processing (graphs and calculations).
Section B focuses on criteria related to achieving coherence in the report under the following sub-
headings: (a) the introduction and aim, (b) the discussion and interpretation of data, and (c)
conclusion and recommendations. A total of sixty marks is allocated for the report. Up to thirty-five
marks may be awarded for Section A and twenty-five marks for Section B. This relative weighting
was decided upon to suit our own needs and may be adjusted according to the teaching
context. The assessment schedule, bearing the personal details of the student, is returned to the
student together with the report that had been handed in previously. Apart from showing the final
mark, the tutor places crosses to indicate elements that are either absent from the report or are not
adequately presented. In this way, it is clearly indicated to the student how and where marks were
gained or lost and which areas need improvement. The schedules may be photocopied before
handing them back to students in order to keep track of common problems which can be addressed
at a later stage.
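As a rough illustration (not part of the published instrument), the record-keeping described above could be sketched as follows; the names and data are hypothetical, and Python is used purely to show how crossed items on photocopied schedules might be tallied to reveal common problems.

from collections import Counter

# Hypothetical record of one returned schedule: checklist item -> True if the
# tutor placed a cross against it (item absent or needing attention).
schedule_1 = {"diagram of apparatus with labels": True,
              "tables have suitable headings": False,
              "uncertainties calculated correctly": True}
schedule_2 = {"diagram of apparatus with labels": True,
              "tables have suitable headings": True,
              "uncertainties calculated correctly": False}

def common_problems(schedules):
    """Count how often each item was crossed across a batch of schedules."""
    counts = Counter()
    for schedule in schedules:
        counts.update(item for item, crossed in schedule.items() if crossed)
    return counts.most_common()

print(common_problems([schedule_1, schedule_2]))
# e.g. [('diagram of apparatus with labels', 2), ('tables have suitable headings', 1), ...]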
Marking Section A
Up to thirty-five marks may be awarded for Section A distributed as discussed below. Five marks
are allocated to the description of method. The method should be described clearly and explicitly,
indicating the sequence in which the procedure was followed. The student must pay attention to the
selection and exactness of the descriptive detail, precautions taken, and must make sure that the
description is not difficult to follow due to too many grammatical and spelling errors. The description
should be supported by a correctly labelled diagram of the apparatus. Ten marks have been
allocated to the collection of data and presentation of tables. Here, the marks would be awarded for
the presence of sufficient, appropriate data and how these are recorded in tables with the
appropriate significant figures. Tables must have suitable headings and columns of data should
indicate correct units. Twenty marks have been allocated to data processing; ten for graphs and ten
for calculations. All graphs should have appropriate scales, correctly plotted data points, properly
labelled axes with correct units and suitable headings. All calculations must be completed correctly,
using the appropriate equations and significant figures, and uncertainties must be handled correctly. After
assessing the aspects in a particular sub-section in the report, the tutor should turn to the student
mark sheet and put a cross against items that are absent from the report or need attention. This
serves to make criteria and expectations visible to the student. A mark for that sub-section is then
awarded. To avoid mark inflation and to allow for qualitative judgements by the tutor, the items are
not each worth one mark.
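The mark allocation just described can be summarised in a short sketch. This is a minimal illustration under the weightings quoted in the text (method 5, data collection 10, graphs 10, calculations 10, coherence 25, total 60); the names (SECTION_A_MAXIMA, total_mark) are hypothetical and not part of the instrument itself.

# Hypothetical sketch of the mark tally for the schedule in Appendix 1.
SECTION_A_MAXIMA = {"method": 5, "data_collection": 10, "graphs": 10, "calculations": 10}
SECTION_B_MAXIMUM = 25  # coherence (Section B)

def total_mark(section_a_marks, coherence_mark):
    """Combine the sub-section marks into the final mark out of 60."""
    for name, awarded in section_a_marks.items():
        if not 0 <= awarded <= SECTION_A_MAXIMA[name]:
            raise ValueError(f"{name}: mark {awarded} outside 0-{SECTION_A_MAXIMA[name]}")
    if not 0 <= coherence_mark <= SECTION_B_MAXIMUM:
        raise ValueError("coherence mark must lie between 0 and 25")
    return sum(section_a_marks.values()) + coherence_mark

# Example: 4 + 8 + 7 + 6 for Section A and 17 for coherence gives 42/60.
print(total_mark({"method": 4, "data_collection": 8, "graphs": 7, "calculations": 6}, 17))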
Marking Section B
While it is to be expected that the physics tutors are able to assess the identifiable elements of the
scientific aspects, it cannot be assumed that they would find it equally easy to mark the coherence of
the reports. Using insights gained from the various multiple-trait scoring systems in use, such as those
developed by Bamberg (1984), Hamp-Lyons (1993), Jacob et al. (1981), cited in Hamp-Lyons
(1993) and Allaei and Connor (1993), we developed a coherence scale of performance (Appendix 2) in
which we described the salient qualities which we identified as contributing to coherence in a
laboratory report.
The marks in the coherence scale (Appendix 2) are arranged in five bands: 0-4 (very poor), 5-9
(poor), 10-14 (satisfactory), 15-21 (very good) and 22-25 (excellent) following a normal
distribution curve of performance. The degree of coherence is reflected by the descriptors in each
band; these focus on elements in the introduction, discussion of results, conclusion and
recommendations, and on how information in one section is linked to that in another. To get full marks,
all the elements in the 22-25 band, which characterise good scientific reporting, must be fulfilled (see
Appendix 2).
The marker must be satisfied that, in the introduction, the writer has set an appropriate context for
the report by briefly discussing the theory on which the experiment is based. The writer must have
made explicit links between the aim of the report, the data collected and processed and the
conclusions and recommendations made. Since the degree of coherence in the report is reflected by
the descriptors in each band, coherence will be affected by the presence or absence of one or more of
the descriptors. Thus, the marker must make a decision about the appropriate band for each report.
This is followed by a further decision about the actual mark to be given within the band. These bands
allow the marker to easily decide upon a broad range of marks while also allowing some differences
in the quality of reports in the same band to be reflected in the final mark. The marker then should
quickly read through the whole report again to judge whether the writer has completed the overall
task, namely, that of using the experimental results to answer the problem that the report-writing task
is required to address. The final mark for Section B is recorded in the appropriate space on the
schedule and the marker will put a cross against those criteria of coherence which need further
attention by the student.
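The two-step decision described above (choose a band from the descriptors, then a mark within that band) can be sketched as follows. The band boundaries and labels are those given earlier in the paper (0-4 very poor, 5-9 poor, 10-14 satisfactory, 15-21 very good, 22-25 excellent); the function names are hypothetical.

# Hypothetical sketch of the band-then-mark decision for Section B (coherence).
COHERENCE_BANDS = {
    "excellent": (22, 25),
    "very good": (15, 21),
    "satisfactory": (10, 14),
    "poor": (5, 9),
    "very poor": (0, 4),
}

def mark_within_band(band, mark):
    """Check that the final coherence mark falls inside the band the marker chose."""
    low, high = COHERENCE_BANDS[band]
    if not low <= mark <= high:
        raise ValueError(f"mark {mark} lies outside the '{band}' band ({low}-{high})")
    return mark

def band_of(mark):
    """Recover the band label for a given coherence mark (useful for feedback)."""
    for label, (low, high) in COHERENCE_BANDS.items():
        if low <= mark <= high:
            return label
    raise ValueError("coherence marks run from 0 to 25")

# Example: a report judged 'very good' might be given 17/25.
print(mark_within_band("very good", 17), band_of(17))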
Evaluation
The instrument described in this paper is currently in use by tutors in the physics department at the
University of Cape Town. Based on their previous experience of assessment of traditional
laboratory reports, the first reaction of the tutors to the instrument was that marking would become
even more tedious and time-consuming. However, after using the instrument to mark the first few
reports, it was found that it did not take more time as it was no longer necessary to write detailed
comments on the report itself. Tutors who were more experienced at marking reports also found that
the instrument indeed verified their ‘gut’ feeling of the appropriate mark for a report while the novice
markers found the explicitness of the instrument helpful in deciding what was expected in a report of
this kind in terms of content and coherence. After the training and the initial use of the instrument to
mark reports, tutors found it easy to internalise the instructions for the use of the instrument and
became less dependent on it. This exercise also served to change the perceptions of the tutors about
what writing a laboratory report entails as well as the purpose of assessment. The quality of
laboratory reports which students produced as a result of receiving feedback in the form of the
assessment schedule was extremely encouraging and the general agreement on the part of the
students was that they preferred this worksheet-based assessment as they could immediately see
where they needed to improve and could ask the instructor for assistance on a particular aspect.
The students could also see the criteria on which their final marks were based. It should also be
pointed out that the students received a copy of both the assessment schedule and coherence scale
before their first practical which familiarised them with the expectations of the instructor. It is our
general impression that using this form of assessment has greatly enhanced the overall learning
experience of students in the physics laboratory.
Concluding remarks
In this paper, we have presented an instrument for assessing writing-intensive reports with scientific
content within the context of the physics laboratory practical. We have used insights from the
literature on writing and assessment to highlight the importance of developing communication skills
alongside other scientific process skills. The central concept around which the assessment
instrument was designed was that of coherence. We have described the process by which the
assessment instrument was developed and tested as well as how it can be used to provide meaningful
feedback on reports to both students and researchers. We believe that this form of assessment may
prove to be a useful way of assessing a range of writing-intensive tasks in different scientific
disciplines.
Acknowledgements
The research on which this paper is based was funded by the Foundation for Research and
Development (FRD). We would also like to thank the tutors who agreed to use the assessment
instrument and provide us with valuable feedback, in particular, Beth Ratering, Mark Marais, Trevor
Volkwyn, Dieter Geduld, Mirela Fetea, Rodney Morgan and David Brookes.
References
ALLEN, J. P. B. and WIDDOWSON, H. G. (1978). Teaching the communicative use of English.
In R. Mackay and A. Mountford (Eds), English for Specific Purposes: A Case Study Approach
(pp. 56-77). London, Longman.
ALLAEI, S. K. and CONNOR, U. (1993). Using performative assessment instruments with ESL
student writers. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic
Contexts (pp. 227-240). Norwood (NJ), Ablex.
BAIRD, D. C. (1988). Experimentation: An Introduction to Measurement Theory and
Experiment Design 2nd edition. Englewood Cliffs (NJ), Prentice Hall.
BAMBERG, B. (1984). Assessing coherence: A reanalysis of essays written for the National
Assessment of Educational Progress, 1969-1979. Research in the Teaching of English, 18 (3),
305-319.
BOUD, D. (1995). Assessment and learning: contradictory or complementary? In P. Knight (Ed.),
Assessment for Learning in Higher Education (pp. 35-48). London, Kogan Page.
BROWN, S., RACE, P. and RUST, C. (1995). Making assessment a positive experience. In P.
Knight (Ed.), Assessment for Learning in Higher Education (pp. 75-85) London, Kogan Page.
CHARLES, M. (1990). Responding to problems in written English using a student self-monitoring
technique. English Language Teaching Journal 44 (4), 286-293.
CONNOLLY, P. (1989). Writing and the ecology of learning. In P. Connolly and T. Vilardi (Eds),
Writing to Learn Mathematics and Science (pp. 1-14). New York, Teachers College Press.
FAHNESTOCK, J. (1983). Semantic and lexical coherence. College Composition and
Communication, 34 (4), 400-416.
GOTT, R. and MASHITER, J. (1991). Practical work in science - a task-based approach. In B.
Woolnough (Ed.), Practical Science : The Role and Reality of Practical Work in School Science
(pp. 53-66). Milton Keynes, Open University Press.
HAMP-LYONS, L. (1990). Second language writing: assessment issues in second language writing.
In B. Kroll (Ed.), Second Language Writing - Research Insights for the Classroom (pp. 69-87).
Cambridge, Cambridge University Press.
HAMP-LYONS, L. (1993a). Reconstructing academic writing proficiency. In L. Hamp-Lyons
(Ed.), Assessing Second Language Writing in Academic Contexts (pp. 127-153). Norwood
(NJ), Ablex.
HAMP-LYONS, L. (1993b). Scoring procedures for ESL contexts. In L. Hamp-Lyons (Ed.),
Assessing Second Language Writing in Academic Contexts (pp. 240-276). Norwood (NJ),
Ablex.
HUBBARD, E. H. (1989). Cohesion errors in the academic writing of second language users of
English. English Usage in South Africa, 20, 1-19.
HUBBARD, E. H. (1993). Some coherence correlates in expository writing. South African
Journal of Linguistics (Supplement), 15, 55-74.
JOHNS, A. (1993). Faculty assessment of ESL student literacy skills: Implications for writing
assessment. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic
Contexts (pp. 167-179). Norwood (NJ), Ablex.
KAUNDA, L. (1995). Exploring the relationship between language and learning in the numerate
sciences: the role of report-writing in physics. Internal report, University of Cape Town.
KNIGHT, P. (1995). Introduction. In P. Knight (Ed.), Assessment for learning in Higher
Education (pp. 13-23). London, Kogan Page.
MAKINA-KAUNDA, L. and ALLIE, S. (1994). A language intervention for science students:
laboratory report-writing in physics. In D. Adey, P. Steyn, N. Herman and G. Scholtz (Eds), State
of the Art in Higher Education, Volume 2 (pp. 53-61) Pretoria, UNISA.
MARSHALL, S. (1991). A genre-based approach to the teaching of report-writing. English for
Specific Purposes, 10, 3-13.
MILLAR, R. (1991). A means to an end: the role of processes in science education. In B.
Woolnough (Ed.), Practical Science : The Role and Reality of Practical Work in School Science
(pp. 43-52). Milton Keynes, Open University Press.
PARKHURST, C. (1990). The Composition process of science writers. English for Specific
Purposes, 9, 169-179.
RADLOFF, A. and SAMSON, J. (1993). Promoting Deep Learning: Using Academic Writing
to Change the Learner's Epistemological Stance. Paper presented at the 5th European
Association for Research on Learning and Instruction Conference, Aix-en-Provence, 1-13.
RADLOFF, A. (1994). Writing to Learn, Learning to write: Helping Academic Staff to
Support Student Writing in their Discipline. Workshop at the 13th International Seminar on Staff
and Educational Development, Cape Town, 2-9.
SHIH, M. (1986). Content-based approaches to teaching academic writing. TESOL Quarterly, 20
(4), 617-648.
SWALES, J. (1978). Writing 'Writing Scientific English'. In R. Mackay and A. Mountford (Eds),
English for Specific Purposes: A Case Study Approach (pp. 43-55). London, Longman.
TAMIR, P. (1991). Practical work in school science: an analysis of current practice. In B.
Woolnough (Ed.), Practical Science : The Role and Reality of Practical Work in School Science
(pp. 13-20). Milton Keynes, Open University Press.
VAUGHAN, C. (1993). Holistic assessment: what goes on in the rater's mind? In L. Hamp-Lyons
(Ed.), Assessing Second Language Writing in Academic Contexts (pp. 111-125). Norwood
(NJ), Ablex.
VERMILLION, R. E. (1991). Projects and Investigations: The Practice of Physics. New York,
Macmillan.
WHITE, E. M. (1994). Teaching and Assessing Writing. 2nd edition. San Francisco, Jossey-
Bass.
WIDDOWSON, H.G. (1983). New starts and different kinds of failure in learning how to write. In
A. Freedman, I. Pringle and J. Yalden (Eds), First Language / Second Language - Selected
Papers from the 1979 CCTE conference, Ottawa, Canada (pp. 34-47). London, Longman.
WITTE, S. P. and FAIGLEY, L. (1981). Coherence, cohesion and writing quality. College
Composition and Communication, 32 (2), 189-204.
WOOLNOUGH, B. (1991). Setting the scene. In B. Woolnough (Ed.), Practical Science : The
Role and Reality of Practical Work in School Science (pp. 3-9). Milton Keynes, Open University
Press.
ZAMEL, V. (1985). Responding to student writing. TESOL Quarterly, 19 (1), 79-101.
Appendix 1
University of Cape Town : Department of Physics
PRACTICAL REPORT ASSESSMENT SCHEDULE FOR : _________________
Name:__________________________ Course:__________ Day:_________ Group:_______
Partner(s):_________________________________________ Marked by: ________________
□ A cross indicates that an item needs attention!

Data collection and processing: 35 marks: ________

1. Method: 5 marks: ______
□ diagram of apparatus with labels
□ method described clearly and explicitly
□ method described in sequence of performance
□ not difficult to follow due to poor grammar/spelling
□ selection and exactness of detail
□ precautions discussed

2. Collection of data and tables: 10 marks: ______
□ all appropriate data collected
□ tables of measurements exist
□ data recorded correctly (significant figures)
□ tables have suitable headings
□ columns have correct units
□ sufficient data points
□ sufficient measurements for each data point

3. Data processing:

Graph: 10 marks: ______
□ appropriate variables plotted
□ points plotted correctly
□ graph has labelled axes with correct units
□ graph has a suitable heading
□ correctly drawn fit to data (straight line)
□ appropriate scale

Calculations: 10 marks: ______
□ measurements manipulated correctly
□ correct formulae written down
□ calculations completed clearly and correctly
□ correct formulae used to determine uncertainties
□ uncertainties calculated correctly
□ results quoted correctly
□ significant figures correct

Coherence of the report: 25 marks: ________

Title, Introduction and Aim:
□ date, headings and names
□ aim of report
□ context and audience specified
□ discussion of relevant theory

Discussion and interpretation of data:
□ reasoning of analysis easy to follow
□ line of argument and explanations explicit
□ grammar and spelling enhance argument
□ links to data clearly expressed

Conclusion and recommendations:
□ measured result related to the aim of the experiment
□ suitable discussion based on result
□ suitable discussion related to aim of the report
□ links between conclusion and aim clearly expressed
□ valid conclusions based on result and aim of report
□ discussion of possible systematic errors
□ valid recommendations made to enhance experiment
□ grammar and spelling enhance discussion

Total Mark: ______ / 60
Appendix 2
Coherence Scale for scoring a report based on a physics laboratory practical.
Mark bands: 22-25 (excellent), 15-21 (very good), 10-14 (satisfactory), 5-9 (poor), 0-4 (very poor).

General band descriptors:
22-25: The report has nearly all of these elements present.
15-21: One or two of the following diminish the report's coherence slightly.
10-14: Some of the following prevent the reader from integrating the report into a coherent piece of writing.
5-9: Many of the following prevent the reader from making sense of the report.
0-4: The report has no coherence because of most of the following.

Aim and Introduction
22-25: The aim of the report is clearly stated. The relevant theory is competently discussed. The audience is specified and orientated to the purpose of the report and to any relevant background information.
15-21: The aim is stated. Relevant theory is discussed (but may be somewhat brief, or too detailed). The audience is specified and orientated, but the writer could be more explicit about the context.
10-14: The aim is stated but may be too brief and/or inaccurate. The theory is mentioned (but may be too brief, irrelevant or inaccurate). The audience may not be specified and is not explicitly orientated towards the context.
5-9: The aim is inadequately stated. The theory is stated inadequately or not at all. The audience is unspecified; little or no mention of the context.
0-4: The aim is not stated. No mention of theory. No mention of audience or context.

Conclusion, Discussion of results and Recommendations
22-25: The results of the experiment are clearly discussed and interpreted in relation to the aim of the report. A definite conclusion, clearly supported by the final results, is stated and clearly related to the aim of the report. The significance of the final results is thoroughly discussed. Recommendations are made and clearly linked to the method of the experiment and/or the aim of the report.
15-21: The results of the experiment are discussed and interpreted in relation to the aim of the report. An accurate conclusion is stated and draws on the final results. The significance of the final results is stated. Recommendations are made and linked to the aim of the report.
10-14: The results are mentioned with little discussion in relation to the aim of the report. A conclusion is poorly stated or may be incorrect, and may not refer to the results or to the aim of the report. The significance of the final results is poorly stated and may be inaccurate. Recommendations are made but link poorly to the aim of the report.
5-9: The results are discussed inadequately, with little or no reference to the aim of the report. A conclusion may not be stated. The significance of the final results may not be stated. Irrelevant recommendations are made which do not link at all to the aim of the report.
0-4: No discussion of results. No conclusion stated. No statement of the significance of the final results. No recommendations made.

Overall presentation
22-25: Sentences within and across sections are linked together effectively with the use of cohesive ties. The reading process is uninterrupted as there are few grammar or spelling errors. The report is neat and well-structured, with appropriate headings.
15-21: Sentences within and across sections are linked together with the use of some cohesive ties. The reading process is easy, although occasional grammar and spelling mistakes may intervene. The structure of the report is clear, although one or two headings may be missing or misplaced.
10-14: Sentences within and across sections are not well linked because of little use of cohesive ties. The reading process is frequently interrupted by grammar and spelling errors. The structure of the report is not clear because many headings may be missing or not prominently positioned.
5-9: Very few cohesive ties provide any linking of sentences within or across sections. The reading process is continually interrupted by numerous grammar and spelling errors. Little attempt to structure the report; untidy presentation.
0-4: No links created between sentences. Excessive grammar and spelling errors make the reading task almost impossible. No obvious structure to the report; untidy presentation.
 
Grant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy ConsultingGrant Readiness 101 TechSoup and Remy Consulting
Grant Readiness 101 TechSoup and Remy Consulting
 
Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104Nutritional Needs Presentation - HLTH 104
Nutritional Needs Presentation - HLTH 104
 
Introduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The BasicsIntroduction to Nonprofit Accounting: The Basics
Introduction to Nonprofit Accounting: The Basics
 
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17  How to Extend Models Using Mixin ClassesMixin Classes in Odoo 17  How to Extend Models Using Mixin Classes
Mixin Classes in Odoo 17 How to Extend Models Using Mixin Classes
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
CĂłdigo Creativo y Arte de Software | Unidad 1
CĂłdigo Creativo y Arte de Software | Unidad 1CĂłdigo Creativo y Arte de Software | Unidad 1
CĂłdigo Creativo y Arte de Software | Unidad 1
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 

An Instrument for Assessing Writing-Intensive Physics Laboratory Reports

Developing scientific process skills through task-based practical work is well documented by Woolnough (1991), Tamir (1991), Millar (1991) and Gott and Mashiter (1991), and the link between writing and concept development is described in detail in Radloff and Samson (1993), Radloff (1994), Connolly (1989) and Widdowson (1983). These views and research findings support the validity of the laboratory report-writing task as a means through which mastery of concepts and process skills in experimental science can be taught and assessed.

Holistic assessment, discussed in detail in Bamberg (1984), Hamp-Lyons (1993) and Allaei and Connor (1993), encourages the assessment of a piece of writing as a whole rather than a focus on individual aspects. It is considered an appropriate approach for assessing multiple traits in a piece of writing (Hamp-Lyons 1993, Allaei and Connor 1993) in that important aspects of content or of form are not arbitrarily ignored. However, none of the holistic assessment instruments presented in this literature focuses on the assessment of laboratory reports, and there does not appear to be any reported work on integrating laboratory practical work with communication skills. Coherence has been described as that quality in a piece of writing which enables a reader to sense a flow in what is being communicated (Fahnestock 1983). We see the achievement of such a quality as depending on the closeness of the relationship between the content and the expression of that content.

In the physics department at the University of Cape Town, laboratory reports are traditionally assessed by tutors who are postgraduate students in the department. The introduction of writing-intensive reports posed a problem for these tutors, since traditional laboratory reports are usually a short summary of activities in the laboratory, graded largely on the basis of the scientific content; writing tends to be limited to a few terse sentences interspersed amongst tables, graphs and diagrams. Loosely conceptualised assessment criteria proved unhelpful to the tutors in assessing the writing-intensive reports and caused large variations in the quality of grading and feedback, particularly given the large numbers of first-year students involved. Acknowledging these problems led to the development of an instrument aimed at providing a more detailed set of criteria which discipline-based personnel could use to assess the writing-intensive reports more objectively. Thus, once trained in the use of the instrument, tutors would be able to give specific and meaningful feedback on the scientific content as well as on the communication and language aspects. This paper describes the process by which the instrument was developed, taking into consideration theoretical issues surrounding assessment and learning.

It should be pointed out that the reports in question follow a well-structured format around the following broad sections: introduction, method (of the experiment), data collection, data processing and results, and conclusions and discussion. The introduction sets out the report, the intended audience, the relevant theory and the purpose of the task. The method section describes the procedures followed and the precautions taken, and should include labelled diagrams of the apparatus used. The data section usually presents tables of recorded data, and many of the practical tasks require a graph to be drawn. The appropriate calculations are presented in a results section, which should also refer to any graphs plotted and contain the appropriate data analysis. Finally, the discussion or conclusions section should tie the whole report together by briefly referring to the original aims, the results and the relevant conclusions. Discussion of statistical uncertainties and other sources of experimental error, together with recommendations, may also be included in this section.
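To make this structure concrete, it can be written down as data and used to check a draft for missing sections. The following sketch is purely illustrative: the dictionary keys and element lists paraphrase the description above, and the function name is a hypothetical choice of ours, not something defined by the instrument.

```python
# Illustrative sketch only: the broad report structure described above held
# as a checklist. Section and element names are paraphrased; nothing here is
# part of the published instrument itself.

REPORT_STRUCTURE = {
    "introduction": ["audience", "relevant theory", "purpose of the task"],
    "method": ["procedure in sequence", "precautions", "labelled diagram of apparatus"],
    "data collection": ["tables of recorded data", "graph (where required)"],
    "data processing and results": ["calculations", "reference to graphs", "data analysis"],
    "conclusions and discussion": ["link back to aims", "results", "conclusions",
                                   "uncertainties and errors", "recommendations"],
}

def missing_sections(report_sections):
    """Return the required sections absent from a draft report."""
    present = {s.lower() for s in report_sections}
    return [s for s in REPORT_STRUCTURE if s not in present]

if __name__ == "__main__":
    draft = ["Introduction", "Method", "Data collection"]
    print(missing_sections(draft))
    # -> ['data processing and results', 'conclusions and discussion']
```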
Theoretical Perspectives

Assessing student writing

A brief review of the literature on student writing and assessment reveals that assessing any kind of written work is a complex exercise, especially when several markers are involved, since the markers may hold different views as to what constitutes the essential assessment criteria for a piece of writing. A large body of research into the assessment of student writing in various disciplines has shown, for example, that when staff in a particular discipline assess student writing they respond primarily to the content, while language teachers traditionally tend to respond to surface-level features such as grammar, spelling and punctuation rather than to the subject matter (see Zamel 1985, Swales 1978, Shih 1986, Hamp-Lyons 1993, White 1994). These differing perceptions can lead to large disparities in assessment, particularly in disciplines such as physics where numerical and graphical forms of communication are relied on extensively. The assessment instrument we present in this paper attempts to strike a balance between discipline-based and language-based criteria, as well as limiting opportunities for disparities and inconsistencies between markers.

Formative assessment

Boud (1995), in his discussion of the complementary nature of assessment and learning, argues that assessment should not only focus on measuring achievement, that is, giving a mark, but should also have an impact on quality learning and teaching. The importance of 'formative' as opposed to only 'summative' assessment in promoting learning is further discussed by Brown, Race and Rust (1995). Despite the different perceptions among staff and students with regard to what constitutes 'good' writing or 'good' feedback, there is general agreement that students will not necessarily modify what they do in response to the comments and changes made on their piece of writing, and that in some cases they are unable to interpret comments on either form or content (Vaughan 1993, Fathman and Whalley 1990, Swales 1978). Simply conforming to the correct structure of the report, for example, is sometimes viewed by students as the main ingredient of successful report writing (Kaunda 1995). It is argued that drawing attention to specific strategies for accomplishing a task, and to meaning-related errors and omissions, may be more helpful than general comments on surface-level features and other arbitrary remarks (Zamel 1985, Fathman and Whalley 1990). Our own experience has also been that specific feedback helps to focus students on the particular sections of the report where more detail, explanation or clarity is required. This leads to an overall improvement in report-writing skills as well as to a better understanding of the concepts involved in the experimental investigation (Kaunda 1995). However, this strategy is resource-intensive and time-consuming, as copious comments have to be written on each student's report. The present instrument is designed to give students meaningful feedback in a standardised form, while at the same time relieving the tutor of the frustration of having to repeat very similar comments on each report.

Coherence and scientific discourse

Coherence is primarily a qualitative feature of writing and, as a vital component of holistic assessment, is more appropriately described in holistic terms (Allaei and Connor 1993, Hamp-Lyons 1993, Bamberg 1984). Besides the content, coherence is what gives a piece of writing its logical flow (Witte and Faigley 1981, Bamberg 1984, Hubbard 1989, 1993) and enables the reader to construct an argument from what is being communicated (Fahnestock 1983).
In a report, this flow would be expected to result initially from the ability of the writer to create a context which orientates readers in such a way that they can anticipate upcoming information and the way that information will be presented. Coherence then comes about through the writer's use of both the structure of the report and appropriate linguistic, rhetorical and semantic markers, which carry the reader across sections of the report in a way that makes the relationship between one section and another clear and reduces interference in the processing of the information conveyed. Imparting to students an understanding of what makes a piece of writing coherent is one way of initiating them into the discourse of the scientific community, and facilitates the development of scientific thinking (Marshall 1991, Parkhurst 1990, Johns 1993, Shih 1986).

The assessment instrument

Four researchers were involved in the development of the assessment instrument: two physicists and two applied linguists. In a series of meetings, one of the first decisions was to agree on who the "clients" for the instrument would be. Three levels of users were identified, namely the students, the markers (tutors) and the laboratory facilitators (lecturers). This identification helped in determining the nature of the information which would be contained in the instrument as well as its presentation. While acknowledging the importance of making assessment criteria explicit (Knight 1995), we were aware of the difficulty of trying to assess everything that one might notice in a laboratory report, especially when the laboratory task contains the two distinct aspects discussed above. What was important in the process of generating criteria was to define carefully the dynamic relationship between the scientific content and the features pertaining to communication and language. The final instrument therefore required that both of these aspects be assessed objectively and consistently, but not at the expense of each other. Using a list of criteria based on experience from previous marking, all four researchers marked a sample of 'good', 'average' and 'poor' reports and noted additional criteria. A draft assessment instrument was then used by a group of postgraduate students to mark a number of reports, and their experiences were recorded and incorporated into the set of assessment criteria which formed part of the final instrument.

The final version of the assessment instrument consists of two parts: (a) the mark schedule (Appendix 1), which contains the assessment criteria on which the marks awarded are based, and (b) the coherence rating scale (Appendix 2), which provides criteria to the marker for assessing the coherence of the report. With reference to the mark schedule (Appendix 1), it can be seen that the assessment criteria have been subdivided into two sections. Section A outlines those criteria relating to the scientific aspects, including the practical and processing skills related to the experiment; these are grouped in three sub-sections: (a) method, (b) data collection, and (c) data processing (graphs and calculations). Section B focuses on criteria related to achieving coherence in the report under the following sub-headings: (a) the introduction and aim, (b) the discussion and interpretation of data, and (c) conclusion and recommendations. A total of sixty marks is allocated for the report: up to thirty-five marks may be awarded for Section A and twenty-five marks for Section B. This relative weighting was decided upon to suit our own needs and may be adjusted according to the teaching context.
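The 35/25 split between Section A and Section B, and the fact that it may be rescaled for other teaching contexts, can be captured in a few lines of code. The sketch below is not part of the published instrument; the sub-section names, function names and rescaling helper are our own illustrative choices, and it simply shows how sub-section marks could be totalled while keeping the report total at sixty.

```python
# Illustrative sketch: totalling the mark schedule described above.
# The default maxima follow the paper (Section A = 35, Section B = 25,
# total = 60); the names and the rescaling helper are hypothetical.

SECTION_A_MAX = {"method": 5, "data collection": 10, "graph": 10, "calculations": 10}
SECTION_B_MAX = 25  # coherence, marked from the band scale in Appendix 2

def total_mark(section_a_marks, section_b_mark):
    """Sum sub-section marks, checking that no maximum is exceeded."""
    for item, mark in section_a_marks.items():
        if not 0 <= mark <= SECTION_A_MAX[item]:
            raise ValueError(f"{item}: {mark} outside 0..{SECTION_A_MAX[item]}")
    if not 0 <= section_b_mark <= SECTION_B_MAX:
        raise ValueError("Section B mark outside 0..25")
    return sum(section_a_marks.values()) + section_b_mark

def rescale(mark, old_total=60, new_total=100):
    """Re-weight a report total for a context that uses a different maximum."""
    return round(mark * new_total / old_total, 1)

if __name__ == "__main__":
    a = {"method": 4, "data collection": 8, "graph": 7, "calculations": 9}
    m = total_mark(a, section_b_mark=17)
    print(m, "/ 60  ->", rescale(m), "/ 100")
```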
The assessment schedule, bearing the personal details of the student, is returned to the student together with the report that was handed in previously. Apart from showing the final mark, the tutor places crosses against elements that are either absent from the report or not adequately presented. In this way it is clearly indicated to the student how and where marks were gained or lost, and which areas need improvement. The schedules may be photocopied before being handed back to students in order to keep track of common problems, which can then be addressed at a later stage.

Marking Section A

Up to thirty-five marks may be awarded for Section A, distributed as discussed below. Five marks are allocated to the description of the method. The method should be described clearly and explicitly, indicating the sequence in which the procedure was followed. The student must pay attention to the selection and exactness of the descriptive detail and to the precautions taken, and must ensure that the description is not difficult to follow because of grammatical and spelling errors. The description should be supported by a correctly labelled diagram of the apparatus.

Ten marks are allocated to the collection of data and the presentation of tables. Here marks are awarded for the presence of sufficient, appropriate data and for how these are recorded in tables with the appropriate significant figures. Tables must have suitable headings and columns of data should indicate the correct units.

Twenty marks are allocated to data processing: ten for graphs and ten for calculations. All graphs should have appropriate scales, correctly plotted data points, properly labelled axes with correct units and suitable headings. All calculations must be completed correctly using the appropriate equations and significant figures, and uncertainties must be handled correctly.

After assessing the aspects in a particular sub-section of the report, the tutor turns to the student mark sheet and puts a cross against items that are absent from the report or need attention; this serves to make the criteria and expectations visible to the student. A mark for that sub-section is then awarded. To avoid mark inflation and to allow for qualitative judgements by the tutor, the items are not worth one mark each.
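As an illustration of what "a correctly drawn fit to data" and "uncertainties calculated correctly" involve for a typical first-year task, the sketch below fits a straight line to measured points by least squares and propagates the scatter of the data into standard uncertainties on the slope and intercept. It is a generic example of the kind of analysis a report might contain, not code that forms part of the instrument, and the sample data are invented for illustration.

```python
# Generic example of a straight-line least-squares fit with uncertainties,
# of the kind expected in the "data processing" part of a report.
from math import sqrt

def fit_line(x, y):
    """Fit y = m*x + c by least squares; return m, c and their standard uncertainties."""
    n = len(x)
    if n < 3:
        raise ValueError("need at least three points to estimate uncertainties")
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    m = sxy / sxx
    c = ybar - m * xbar
    # residual variance with n - 2 degrees of freedom
    s2 = sum((yi - (m * xi + c)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
    u_m = sqrt(s2 / sxx)
    u_c = sqrt(s2 * (1.0 / n + xbar ** 2 / sxx))
    return m, u_m, c, u_c

if __name__ == "__main__":
    # e.g. extension of a spring (m) against applied load (N); invented data
    load = [0.49, 0.98, 1.47, 1.96, 2.45]
    ext = [0.021, 0.039, 0.062, 0.080, 0.099]
    m, u_m, c, u_c = fit_line(load, ext)
    print(f"slope = {m:.4f} +/- {u_m:.4f} m/N")
    print(f"intercept = {c:.4f} +/- {u_c:.4f} m")
```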
Marking Section B

While it is to be expected that the physics tutors are able to assess the identifiable elements of the scientific aspects, it cannot be assumed that they will find it equally easy to mark the coherence of reports. Using insights gained from the various multiple-trait scoring systems in use, such as those developed by Bamberg (1984), Hamp-Lyons (1993), Jacob et al. (1981), cited in Hamp-Lyons (1993), and Allaei and Connor (1993), we developed a coherence scale (Appendix 2) in which we describe the salient qualities which we identified as contributing to coherence in a laboratory report. The marks in the coherence scale are arranged in five bands: 0-4 (very poor), 5-9 (poor), 10-14 (satisfactory), 15-21 (very good) and 22-25 (excellent), following a normal distribution of performance. The degree of coherence is reflected by the descriptors in each band, which focus on elements in the introduction, the discussion of results, the conclusion and the recommendations, and on how information in one section is linked to that in another.

To obtain full marks, all the elements in the 22-25 band, which characterise good scientific reporting, must be fulfilled (see Appendix 2). The marker must be satisfied that, in the introduction, the writer has set an appropriate context for the report by briefly discussing the theory on which the experiment is based. The writer must also have made explicit links between the aim of the report, the data collected and processed, and the conclusions and recommendations made. Since the degree of coherence in the report is reflected by the descriptors in each band, the coherence mark will be affected by the presence or absence of one or more of the descriptors. The marker must therefore decide on the appropriate band for each report, and then make a further decision about the actual mark to be given within the band. The bands allow the marker to settle quickly on a broad range of marks, while still allowing differences in the quality of reports within the same band to be reflected in the final mark. The marker should then read quickly through the whole report again to judge whether the writer has completed the overall task, namely that of using the experimental results to answer the problem that the report-writing task is required to address. The final mark for Section B is recorded in the appropriate space on the schedule, and the marker puts a cross against those criteria of coherence which need further attention from the student.
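The marker's two-step decision, first choosing a band and then settling on an actual mark within it, can be pictured with a small helper. The band limits below are taken from the scale in Appendix 2; the function name and the idea of expressing the finer within-band judgement as a fraction are our own illustrative assumptions, not part of the instrument.

```python
# Illustrative only: the marker's band-then-mark decision for Section B.
# Band limits follow Appendix 2; the within-band "quality" fraction is a
# hypothetical way of expressing the marker's finer judgement.

COHERENCE_BANDS = {
    "excellent":    (22, 25),
    "very good":    (15, 21),
    "satisfactory": (10, 14),
    "poor":         (5, 9),
    "very poor":    (0, 4),
}

def section_b_mark(band, quality):
    """Convert a chosen band and a 0..1 within-band judgement to a mark."""
    lo, hi = COHERENCE_BANDS[band]
    if not 0.0 <= quality <= 1.0:
        raise ValueError("quality must lie between 0 and 1")
    return lo + round(quality * (hi - lo))

if __name__ == "__main__":
    print(section_b_mark("very good", 0.5))     # 18
    print(section_b_mark("satisfactory", 1.0))  # 14
```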
Evaluation

The instrument described in this paper is currently in use by tutors in the physics department at the University of Cape Town. Given their previous experience of assessing traditional laboratory reports, the tutors' first reaction to the instrument was that marking would become even more tedious and time-consuming. However, after using the instrument to mark the first few reports, they found that it did not take more time, as it was no longer necessary to write detailed comments on the report itself. Tutors who were more experienced at marking reports also found that the instrument verified their 'gut feeling' of the appropriate mark for a report, while novice markers found the explicitness of the instrument helpful in deciding what was expected of a report of this kind in terms of content and coherence. After training and initial use of the instrument, tutors found it easy to internalise the instructions for its use and became less dependent on it. The exercise also served to change the tutors' perceptions of what writing a laboratory report entails, as well as of the purpose of assessment.

The quality of the laboratory reports which students produced after receiving feedback in the form of the assessment schedule was extremely encouraging, and the general view among the students was that they preferred this worksheet-based assessment because they could immediately see where they needed to improve and could ask the instructor for assistance on a particular aspect. The students could also see the criteria on which their final marks were based. It should also be pointed out that the students received a copy of both the assessment schedule and the coherence scale before their first practical, which familiarised them with the expectations of the instructor. It is our general impression that using this form of assessment has greatly enhanced the overall learning experience of students in the physics laboratory.

Concluding remarks

In this paper we have presented an instrument for assessing writing-intensive reports with scientific content within the context of the physics laboratory practical. We have used insights from the literature on writing and assessment to highlight the importance of developing communication skills alongside other scientific process skills. The central concept around which the assessment instrument was designed was that of coherence. We have described the process by which the assessment instrument was developed and tested, as well as how it can be used to provide meaningful feedback on reports to both students and researchers. We believe that this form of assessment may prove to be a useful way of assessing a range of writing-intensive tasks in different scientific disciplines.
Acknowledgements

The research on which this paper is based was funded by the Foundation for Research and Development (FRD). We would also like to thank the tutors who agreed to use the assessment instrument and provide us with valuable feedback, in particular Beth Ratering, Mark Marais, Trevor Volkwyn, Dieter Geduld, Mirela Fetea, Rodney Morgan and David Brookes.

References

ALLAEI, S. K. and CONNOR, U. (1993). Using performative assessment instruments with ESL student writers. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic Contexts (pp. 227-240). Norwood (NJ), Ablex.
ALLEN, J. P. B. and WIDDOWSON, H. G. (1978). Teaching the communicative use of English. In R. Mackay and A. Mountford (Eds), English for Specific Purposes: A Case Study Approach (pp. 56-77). London, Longman.
BAIRD, D. C. (1988). Experimentation: An Introduction to Measurement Theory and Experiment Design, 2nd edition. Englewood Cliffs (NJ), Prentice Hall.
BAMBERG, B. (1984). Assessing coherence: A reanalysis of essays written for the National Assessment of Educational Progress, 1969-1979. Research in the Teaching of English, 18 (3), 305-319.
BOUD, D. (1995). Assessment and learning: contradictory or complementary? In P. Knight (Ed.), Assessment for Learning in Higher Education (pp. 35-48). London, Kogan Page.
BROWN, S., RACE, P. and RUST, C. (1995). Making assessment a positive experience. In P. Knight (Ed.), Assessment for Learning in Higher Education (pp. 75-85). London, Kogan Page.
CHARLES, M. (1990). Responding to problems in written English using a student self-monitoring technique. English Language Teaching Journal, 44 (4), 286-293.
CONNOLLY, P. (1989). Writing and the ecology of learning. In P. Connolly and T. Vilardi (Eds), Writing to Learn Mathematics and Science (pp. 1-14). New York, Teachers College Press.
FAHNESTOCK, J. (1983). Semantic and lexical coherence. College Composition and Communication, 34 (4), 400-416.
GOTT, R. and MASHITER, J. (1991). Practical work in science - a task-based approach. In B. Woolnough (Ed.), Practical Science: The Role and Reality of Practical Work in School Science (pp. 53-66). Milton Keynes, Open University Press.
HAMP-LYONS, L. (1990). Second language writing: assessment issues in second language writing. In B. Kroll (Ed.), Second Language Writing - Research Insights for the Classroom (pp. 69-87). Cambridge, Cambridge University Press.
HAMP-LYONS, L. (1993a). Reconstructing academic writing proficiency. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic Contexts (pp. 127-153). Norwood (NJ), Ablex.
HAMP-LYONS, L. (1993b). Scoring procedures for ESL contexts. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic Contexts (pp. 240-276). Norwood (NJ), Ablex.
HUBBARD, E. H. (1989). Cohesion errors in the academic writing of second language users of English. English Usage in South Africa, 20, 1-19.
HUBBARD, E. H. (1993). Some coherence correlates in expository writing. South African Journal of Linguistics (Supplement), 15, 55-74.
JOHNS, A. (1993). Faculty assessment of ESL student literacy skills: Implications for writing assessment. In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic Contexts (pp. 167-179). Norwood (NJ), Ablex.
KAUNDA, L. (1995). Exploring the relationship between language and learning in the numerate sciences: the role of report-writing in physics. Internal report, University of Cape Town.
KNIGHT, P. (1995). Introduction. In P. Knight (Ed.), Assessment for Learning in Higher Education (pp. 13-23). London, Kogan Page.
MAKINA-KAUNDA, L. and ALLIE, S. (1994). A language intervention for science students: laboratory report-writing in physics. In D. Adey, P. Steyn, N. Herman and G. Scholtz (Eds), State of the Art in Higher Education, Volume 2 (pp. 53-61). Pretoria, UNISA.
MARSHALL, S. (1991). A genre-based approach to the teaching of report-writing. English for Specific Purposes, 10, 3-13.
MILLAR, R. (1991). A means to an end: the role of processes in science education. In B. Woolnough (Ed.), Practical Science: The Role and Reality of Practical Work in School Science (pp. 43-52). Milton Keynes, Open University Press.
PARKHURST, C. (1990). The composition process of science writers. English for Specific Purposes, 9, 169-179.
RADLOFF, A. (1994). Writing to learn, learning to write: helping academic staff to support student writing in their discipline. Workshop at the 13th International Seminar on Staff and Educational Development, Cape Town, 2-9.
RADLOFF, A. and SAMSON, J. (1993). Promoting deep learning: using academic writing to change the learner's epistemological stance. Paper presented at the 5th European Association for Research on Learning and Instruction Conference, Aix-en-Provence, 1-13.
SHIH, M. (1986). Content-based approaches to teaching academic writing. TESOL Quarterly, 20 (4), 617-648.
SWALES, R. (1978). Writing 'Writing Scientific English'. In R. Mackay and A. Mountford (Eds), English for Specific Purposes: A Case Study Approach (pp. 43-55). London, Longman.
TAMIR, P. (1991). Practical work in school science: an analysis of current practice. In B. Woolnough (Ed.), Practical Science: The Role and Reality of Practical Work in School Science (pp. 13-20). Milton Keynes, Open University Press.
VAUGHAN, C. (1993). Holistic assessment: what goes on in the rater's mind? In L. Hamp-Lyons (Ed.), Assessing Second Language Writing in Academic Contexts (pp. 111-125). Norwood (NJ), Ablex.
VERMILLION, R. E. (1991). Projects and Investigations: The Practice of Physics. New York, Macmillan.
WHITE, E. M. (1994). Teaching and Assessing Writing, 2nd edition. San Francisco, Jossey-Bass.
WIDDOWSON, H. G. (1983). New starts and different kinds of failure in learning how to write. In A. Freedman, I. Pringle and J. Yalden (Eds), First Language / Second Language - Selected Papers from the 1979 CCTE Conference, Ottawa, Canada (pp. 34-47). London, Longman.
WITTE, S. P. and FAIGLEY, L. (1981). Coherence, cohesion and writing quality. College Composition and Communication, 32 (2), 189-204.
WOOLNOUGH, B. (1991). Setting the scene. In B. Woolnough (Ed.), Practical Science: The Role and Reality of Practical Work in School Science (pp. 3-9). Milton Keynes, Open University Press.
ZAMEL, V. (1985). Responding to student writing. TESOL Quarterly, 19 (1), 79-101.
Appendix 1

University of Cape Town: Department of Physics
PRACTICAL REPORT ASSESSMENT SCHEDULE FOR: _________________

Name: ______________________  Course: ________  Day: ________  Group: ______
Partner(s): ____________________________________  Marked by: ______________

A cross indicates that the item needs attention.

Data collection and processing: 35 marks: ______

1. Method: 5 marks: ______
   [ ] diagram of apparatus with labels
   [ ] method described clearly and explicitly
   [ ] method described in sequence of performance
   [ ] not difficult to follow due to poor grammar/spelling
   [ ] selection and exactness of detail
   [ ] precautions discussed

2. Collection of data and tables: 10 marks: ______
   [ ] all appropriate data collected
   [ ] tables of measurements exist
   [ ] data recorded correctly (significant figures)
   [ ] tables have suitable headings
   [ ] columns have correct units
   [ ] sufficient data points
   [ ] sufficient measurements for each data point

3. Data processing
   Graph: 10 marks: ______
   [ ] appropriate variables plotted
   [ ] points plotted correctly
   [ ] graph has labelled axes with correct units
   [ ] graph has a suitable heading
   [ ] correctly drawn fit to data (straight line)
   [ ] appropriate scale

   Calculations: 10 marks: ______
   [ ] measurements manipulated correctly
   [ ] correct formulae written down
   [ ] calculations completed clearly and correctly
   [ ] correct formulae used to determine uncertainties
   [ ] uncertainties calculated correctly
   [ ] results quoted correctly
   [ ] significant figures correct

Coherence of the report: 25 marks: ______

Title, introduction and aim:
   [ ] date, headings and names
   [ ] aim of report
   [ ] context and audience specified
   [ ] discussion of relevant theory

Discussion and interpretation of data:
   [ ] reasoning of analysis easy to follow
   [ ] line of argument and explanations explicit
   [ ] grammar and spelling enhance argument
   [ ] links to data clearly expressed

Conclusion and recommendations:
   [ ] measured result related to the aim of the experiment
   [ ] suitable discussion based on result
   [ ] suitable discussion related to aim of the report
   [ ] links between conclusion and aim clearly expressed
   [ ] valid conclusions based on result and aim of report
   [ ] discussion of possible systematic errors
   [ ] valid recommendations made to enhance experiment
   [ ] grammar and spelling enhance discussion

Total mark: ______ / 60
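Because the schedule is essentially a checklist with marks attached, it is straightforward to hold it as data and print a per-student feedback sheet with crosses against the items that need attention. The sketch below encodes only a small fragment of Appendix 1 and is purely illustrative; the data layout and function names are ours, not part of the instrument.

```python
# Illustrative fragment of the Appendix 1 schedule held as data, used to
# print a feedback sheet with crosses ("X") against items needing attention.

SCHEDULE_FRAGMENT = {
    "Method (5 marks)": [
        "diagram of apparatus with labels",
        "method described clearly and explicitly",
        "precautions discussed",
    ],
    "Graph (10 marks)": [
        "points plotted correctly",
        "correctly drawn fit to data (straight line)",
        "appropriate scale",
    ],
}

def print_feedback(student, crosses, marks):
    """Print a simple feedback sheet; `crosses` lists the items to flag."""
    print(f"Feedback for {student}")
    for section, items in SCHEDULE_FRAGMENT.items():
        print(f"\n{section}: {marks.get(section, '__')}")
        for item in items:
            flag = "X" if item in crosses else " "
            print(f"  [{flag}] {item}")

if __name__ == "__main__":
    print_feedback(
        "A. Student",
        crosses={"precautions discussed", "appropriate scale"},
        marks={"Method (5 marks)": 4, "Graph (10 marks)": 8},
    )
```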
Appendix 2

Coherence scale for scoring a report based on a physics laboratory practical. The five mark bands are 22-25, 15-21, 10-14, 5-9 and 0-4. In the 22-25 band the report has nearly all of the listed elements present; in the 15-21 band one or two of the weaker descriptors diminish the report's coherence slightly; in the 10-14 band some of them prevent the reader from integrating the report into a coherent piece of writing; in the 5-9 band many of them prevent the reader from making sense of the report; and in the 0-4 band the report has no coherence because most of them apply. The descriptors for each element, from the highest band to the lowest, are as follows.

Aim and introduction

Aim: the aim of the report is clearly stated / the aim is stated / the aim is stated but may be too brief and/or inaccurate / the aim is inadequately stated / the aim is not stated.

Theory: the relevant theory is competently discussed / relevant theory is discussed (but may be somewhat brief, or too detailed) / the theory is mentioned (but may be too brief, irrelevant or inaccurate) / the theory is stated inadequately or not at all / no mention of theory.

Audience and context: the audience is specified and orientated to the purpose of the report and to any relevant background information / the audience is specified and orientated, but the writer could be more explicit about the context / the audience may not be specified and is not explicitly orientated towards the context / the audience is unspecified, with little or no mention of the context / no mention of audience or context.

Conclusion, discussion of results and recommendations

Results: the results of the experiment are clearly discussed and interpreted in relation to the aim of the report / the results are discussed and interpreted in relation to the aim of the report / the results are mentioned with little discussion in relation to the aim of the report / the results are discussed inadequately, with little or no reference to the aim of the report / no discussion of results.

Conclusion: a definite conclusion, clearly supported by the final results, is stated and clearly related to the aim of the report / an accurate conclusion is stated and draws on the final results / a conclusion is poorly stated or may be incorrect, and may not refer to the results or to the aim of the report / a conclusion may not be stated / no conclusion stated.

Significance of the final results: thoroughly discussed / stated / poorly stated and possibly inaccurate / may not be stated / no statement of the significance of the final results.

Recommendations: recommendations are made and clearly linked to the method of the experiment and/or the aim of the report / recommendations are made and linked to the aim of the report / recommendations are made but link poorly to the aim of the report / irrelevant recommendations are made which do not link to the aim of the report / no recommendations made.

Cohesive ties: sentences within and across sections are linked together effectively with the use of cohesive ties / sentences within and across sections are linked together with the use of some cohesive ties / sentences within and across sections are not well linked because of little use of cohesive ties / very few cohesive ties provide any linking of sentences within or across sections / no links created between sentences.

Overall presentation

Grammar and spelling: the reading process is uninterrupted as there are few grammar or spelling errors / the reading process is easy although occasional grammar and spelling mistakes may intervene / the reading process is frequently interrupted by grammar and spelling errors / the reading process is continually interrupted by numerous grammar and spelling errors / excessive grammar and spelling errors make the reading task almost impossible.

Structure: the report is neat and well-structured, with appropriate headings / the structure of the report is clear, although one or two headings may be missing or misplaced / the structure of the report is not clear because many headings may be missing or are not prominently positioned / little attempt to structure the report, untidy presentation / no obvious structure to the report, untidy presentation.