APPLIED RESEARCH SUMMARY
• Investigates the effectiveness of information design principles and feedback-based usability testing in developing clinical questionnaires
• Finds that a form developed using information design principles collected significantly more data than did a control form
A Study of the Effectiveness
of Information Design
Principles Applied to Clinical
Research Questionnaires
BEVERLY B. ZIMMERMAN AND JESSICA R. SCHULTZ
INTRODUCTION
Medical professionals must develop effective methods for collecting, filing, storing, and abstracting data about their clients. This data is
used for monitoring clinical standards of care
and intervention, and for clinical management and re-
search. Increasingly, researchers depend on computer-gen-
erated health surveys or questionnaires to obtain this med-
ical data from patients. But such forms are not necessarily
well designed, a problem that can result in potential loss of
data or incorrect information. A possible solution to this
problem is to apply information design principles to the
data-collection forms; however, medical researchers and
computer technicians usually have no experience in de-
signing questionnaires or in evaluating whether patients
can understand and use the data-gathering forms.
Researchers may not appreciate the difficulty patients
have in filling out data-collection forms. Krosnick (1991)
pointed out that readers go through a complex cognitive
process to provide an answer to each question on a survey
or questionnaire:
1. Readers must interpret the meaning and intent of
the question.
2. They must recall relevant information and inte-
grate that information into a single answer.
3. They must translate that answer into an appropri-
ate response on the form.
When readers are motivated to perform these neces-
sary cognitive tasks in a careful and objective manner, they
are said to be “optimizing.” However, when the cognitive
task of completing a questionnaire becomes too demand-
ing, respondents may shift their response strategy to be less
diligent in any or all parts of the response process and
provide a satisfactory answer rather than an optimal one.
This behavior, called satisficing (the term is borrowed from
Simon 1957), may involve spending less time thinking
about the meaning of a question, being careless in recalling
and integrating information, and being less precise in se-
lecting a response. Krosnick and others (1996) concluded
that three factors increase the likelihood that a person
responding to questions may begin to satisfice:
• The greater the difficulty of the task
• The lower the ability of the person reading the questionnaire
• The lower the person's motivation to optimize
Dillman (1978) argued that getting a response to a
survey or questionnaire must be viewed as a special case of
social exchange. Unlike economic exchange, where peo-
ple exchange money for goods and services, the obliga-
tions created by a questionnaire are unspecified and may
vary widely. Whether or not people complete a question-
naire or survey depends on their perception of the rewards
Manuscript received 18 August 1999; revised 4 November 1999;
accepted 10 November 1999.
Second Quarter 2000 • TechnicalCOMMUNICATION 177
they expect to receive as a result. Thus, if respondents
believe that the cost in terms of time, inconvenience, or
mental exertion exceeds the perceived reward, such as
potential benefits and services, they will fail to complete
the survey.
Researchers at the American Institutes for Research
found that poorly written forms were overwhelming to
many readers who decided not to deal with them even
when it meant giving up something they deserved and
needed (Rose 1981). In a later study, Bagin and Rose (1991)
analyzed nearly 4,000 responses to an article in Modern
Maturity about bad forms and reported that aside from tax
forms, the largest number of complaints—over half of the
responses—were about healthcare forms. Included among
the complaints were the following:
• Forms too complicated (reported by 48% of the respondents)
• Instructions unclear (47%)
• Form too long (30%)
• Type too small (28%)
• Not enough room for answers (25%)
• Words too difficult (23%)
• Information requested too personal (21%)
The researchers concluded that bad forms and question-
naires cost companies, hospitals, and doctors untold dol-
lars in lost benefits and services.
MacNealy (1994) also found that poorly designed in-
formation-gathering forms drove up operating costs for
businesses and other organizations. Recent computeriza-
tion of medical forms has added to the problem. Powsner,
Wyatt, and Wright (1998) report that although “poorly de-
signed medical charting and data collection programs
waste time, hamper data retrieval, and compromise the
accuracy of record keeping,” computerization of medical
records will become more common.
If poorly written forms are costly and can cause prob-
lems for readers, why aren’t forms better designed? The
answer may be threefold:
• Not all factors of the design or design process are within the designer's control.
• Those who create the forms may not be knowledgeable about or experienced with principles of information design.
• Designers may inadequately analyze the audiences who will use their forms.
Couper and Groves (1996) note that some factors such
as the respondent’s environment, demographics, psycho-
logical predisposition, and ability are outside the control of
the information designer. However, they point out, recent
research in survey methods and in cognitive and social
psychology has shown that some factors are within the
designer’s control. For example, research has shown that
small changes in question wording, format, and ordering
can increase questionnaire responses (Braverman 1996;
Krosnick and others 1996; Salant and Dillman 1994) and
confidentiality assurances can significantly increase re-
sponse rates when the questionnaires ask about sensitive
items such as health-related behavior (Lee 1993). Thus,
Couper and Groves argue, some form design aspects can
influence the cooperation of respondents.
Unfortunately, as Duffy (1981) argued, there is a gap
between research and practice in document design.
Schriver (1989), in her review of recent research on the
subject, concluded that the field still has “an impoverished
and scattered literature” and there are few summaries of
research in the field. Furthermore, she maintained, because
information design is an eclectic field, borrowing from
other disciplines, much of the research is conducted in
widely disparate fields such as rhetoric, computer-human
interaction, social psychology, computer technologies, dis-
course analysis, and graphic design, to name only a few. It
is no surprise, therefore, that many of the people who
create documents such as forms and questionnaires have
no training or experience in document or information de-
sign.
In addition, Schriver (1997) found that creators of doc-
uments such as forms and questionnaires often inade-
quately analyze the audiences who will use their docu-
ments. In a study of teenagers’ attitudes toward drug
education literature, she concluded that the designers had
little understanding of the needs and expectations of their
audience because they failed to obtain direct input from
the people for whom the survey questions were intended.
Schriver’s study showed that without proper usability test-
ing through reader input, even well-intentioned informa-
tion designers may produce documents that fail to be used
by their intended readers.
THE RESEARCH STUDY
The Comprehensive Breast Cancer Program at the H. Lee
Moffitt Cancer Center and Research Institute at the Univer-
sity of South Florida has for the past 10 years used simple
printed forms to collect data on breast cancer patients.
These forms were filled out by doctors and nurses, and
then the data was manually entered into a computer data-
base. The resulting database currently contains 2,000 pa-
tient files, and information from the database is used not
only for improving patient care but also for expanding
research (Reintgen and others 1996). Recently, the center
has become the training site for lymphatic mapping world-
wide and has been named as one of the top 10 breast
cancer programs nationally. As a result, the number of
patients has doubled in the past 3 years, and this influx of
patients has overloaded the database system. In addition,
guidelines issued by the U.S. federal government specify-
ing new information requirements for billing patients made
it imperative for the center to modify its data collection
process.
For these reasons, officials at the center wanted to
redesign their forms so that
• Patients could complete them prior to their arrival at the breast cancer center.
• The forms could request more information (patient demographics, history of present illness, medications, past medical history, family history, and review of systems).
• The forms would permit automated data entry through the use of optical character recognition.
Because the cancer center was redesigning their forms,
we saw this as an opportunity to determine the effective-
ness of information design principles as applied to a clin-
ical questionnaire. The purpose of our study, therefore,
was to determine whether designers using principles culled
from survey research and from usability testing involving
feedback-driven audience analysis could improve the ef-
fectiveness of the new health questionnaires and thus im-
prove the efficiency of the data collection process used at
that time. Specifically, we sought to determine whether
information designers could create properly designed
health questionnaires to increase the amount of data col-
lected and the accuracy of the data in the Breast Cancer
Lymphatic Mapping Database. The hypothesis to be tested
was that health questionnaires that had been designed by
information designers would reduce the number of errors
caused by incomplete data spaces or improper data entry.
METHODOLOGY
This study compared the effectiveness of two indepen-
dently designed computer-generated health question-
naires—a questionnaire created using information design
principles and feedback-driven audience analysis, and a
questionnaire designed by a computer systems analyst us-
ing the default settings of the design software. The staff of
the Comprehensive Breast Cancer Program at the H. Lee
Moffitt Cancer Center and Research Institute under the
direction of Charles E. Cox, MD, professor of surgery and
director of the program, determined the general specifica-
tions for both questionnaires to be used by the team of
physicians, nurses, and researchers at the Comprehensive
Breast Cancer Program. The systems analyst assigned to
this project created what we call the Control form, and
Brigham Young University (BYU) researchers created what
we call the Study form. Design specifications were the
same for both questionnaires; however, while designing
the Study form, the BYU researchers also conducted us-
ability testing and feedback-driven audience analysis using
members of the form’s target audience.
The audience for the new form was new patients at the
H. Lee Moffitt Cancer Center and Research Institute. While
the majority of the breast cancer patients are women aged
50 and older, there have been male breast cancer patients
as well as patients as young as 21 treated at the Cancer
Center. Although the patient normally completed the pa-
tient clinical history form, the audience also included
spouses and other family members of the patient because
the form gathers information on the medical history of
family members. Additional education and socio-economic
factors were not specified because people from all back-
grounds get breast cancer.
Both the Control and Study forms were generated
using Teleform Elite Version 5.0, a Windows 95 and Win-
dows NT program designed by Cardiff Software, Inc. Tele-
form Elite creates data collecting forms optimized for fax-
ing or scanning into a computer database. Teleform can
also export the data for use by other applications. Although
the user’s manual for Teleform Elite Version 5.0 provides
information on forms design, it does so in terms of how to
add data fields to a form rather than tutoring the user on
fundamentals of forms design. The manual also provides
instructions on how to access a library of templates for
commonly used data entry fields; however, it doesn’t dis-
cuss the information design principles used to create these
templates.
Both forms requested the same patient information;
that is, they both contained the same number and types of
fields. However, for the Study form, designers identified
each aspect of the data collection process that might affect
the quality or quantity of responses, and applied insights
gathered from survey research, questionnaire design, and
usability testing. Following the general guidelines outlined
by Schriver (1997), we performed three stages of usability
testing in creating the Study form:
1. Evaluating and testing of the original question-
naire
2. Creation, evaluation, and testing of a revised
questionnaire
3. Evaluation and testing of a final questionnaire
However, to ensure that we were implementing appropri-
ate design elements, we conducted two additional stages of
usability testing on the Study form, for a total of five stages.
Once they had been created, both the Control forms
and the Study forms were randomly assigned to new pa-
tients entering the Comprehensive Breast Cancer Center for
new patient evaluations. Approximately 200 patients used
the two forms over a 4-month period; that is, approxi-
mately 100 patients received the Study form, and 100 pa-
tients received the Control form. Nurses at the clinic dis-
tributed the questionnaire randomly in unmarked
envelopes and observed the following protocol: they could
not assist the patient in any way in completing the ques-
tionnaire; they could not verify the answers that patients
filled in, nor could they fill in or change any answers on the
questionnaires; they could only complete sections of the
questionnaire marked “For Office Use Only.” In the final
count, we obtained 98 Study forms and 92 Control forms.
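The random distribution protocol can be sketched in code. The following is an illustrative model only, not the study's procedure; the patient identifiers and function names are invented for the example:

```python
import random

def assign_forms(patients, seed=None):
    """Randomly assign each new patient either the Control or the
    Study form, mirroring the unmarked-envelope distribution used
    by the clinic nurses (hypothetical sketch)."""
    rng = random.Random(seed)
    return {patient: rng.choice(["Control", "Study"]) for patient in patients}

# Roughly 200 new patients over the 4-month collection period
assignments = assign_forms([f"patient_{i:03d}" for i in range(200)], seed=1)
counts = {form: list(assignments.values()).count(form)
          for form in ("Control", "Study")}
```

Each patient independently has an even chance of receiving either form, which is why the final tallies (98 Study, 92 Control) were close to, but not exactly, a 100/100 split.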
As the questionnaires were returned, the forms were
verified to determine whether or not the patient answered
each question and placed his or her answer to that specific
question in the appropriate box. To maintain the privacy of
the patients who completed the forms, all the identifying
information was marked out of the questionnaires before
they were given to us for analysis.
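A minimal sketch of that per-question verification pass might look like the following; the verification in the study was done by hand, and the field names here are hypothetical, chosen only to illustrate the check:

```python
def verify_form(responses, expected_fields):
    """Return, for each expected field, whether the patient supplied
    an answer in the appropriate box. A blank string or a missing
    field counts as unanswered. (Illustrative only.)"""
    return {field: responses.get(field) not in (None, "")
            for field in expected_fields}

# Hypothetical returned form with one field left blank
returned = {"date_of_birth": "04/12/1948", "height_inches": ""}
status = verify_form(returned, ["date_of_birth", "height_inches", "medications"])
answered = sum(status.values())  # 1 of the 3 expected fields completed
```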
In the middle of the implementation and data collect-
ing phase of the study, the administrative staff at the H. Lee
Moffitt Cancer Center requested that page numbers be
added to the Control version of the questionnaire. The lack
of page numbers was causing problems when the nurses
faxed the form and verified the data before it was entered
in the database, as well as when they scanned the ques-
tionnaire in the patient’s chart. Although we felt this design
change could compromise the research study, we reluc-
tantly agreed to allow page numbers to be added to the
Control form of the questionnaire because we did not want
to jeopardize the patient’s entire record in the database.
USABILITY TESTING PROTOCOL
Following general guidelines outlined by Schriver (1997),
we performed three stages of usability testing in creating
the Study form:
1. Evaluating and testing the original questionnaire
2. Creating, evaluating, and testing the revised ques-
tionnaire
3. Evaluating and testing the final Study form
Stage 1. Evaluating and testing
the original questionnaire
Schriver argues that the first phase of usability testing
should involve the evaluation of the form by both experts
and users. In the first stage of our usability testing, we
examined the original questionnaire, analyzing its features
and predicting problems we believed users would encoun-
ter in completing the form (see Figure 1).
Then we characterized the questionnaire’s potential
problems by completing an Expert Evaluation Form, a
series of questions based on information design principles
from the work of Wright and Barnard (1975), Dillman
(1978), Wright and Haybittle (1979a, 1979b, 1979c), Wright
(1983), Salant and Dillman (1994), Schriver (1997), and
Kostelnick and Roberts (1998), as illustrated in Table 1.
The problems we identified in the original form in-
cluded the lack of directions for completing the question-
naire, inclusion of technical terms or jargon that only a
medical practitioner would understand, inconsistent orga-
nization of the questions, and poor grouping of questions
about the respondent’s family interspersed throughout the
questionnaire. These problems were consistent with com-
plaints identified in studies by Bagin and Rose (1991) dis-
cussed previously. In addition, we noted specific proce-
dures that might cause problems, including expecting
patients to calculate their height in inches and determine
the size of a palpable breast mass using the metric system.
Next, two volunteers completed the original question-
naire and responded in terms of the difficulties they expe-
rienced in filling out the form. Volunteers for this initial
session of usability testing were purposely selected be-
cause they had no training in information design. To par-
ticipate in our usability testing, each potential test partici-
pant had to be a nonpregnant female, be 21 years old or
older, speak English as her first or second language, and be
a potential candidate for breast cancer in the present or
future. Because the Moffitt Cancer Center provides service
to a high number of Spanish-speaking patients, our pool of
usability testers included some individuals who spoke En-
glish as their second language but whose first language was
Spanish. As part of the usability testing procedure, we
created a Consent to Be a Usability Testing Research Sub-
ject form that participants signed. The consent form was
also translated into Spanish, although all the usability test-
ing was performed in English and the questionnaires were
in English.
We audiotaped the users’ comments and noted prob-
lems the volunteers experienced in completing the form.
After completing the original form, volunteers completed a
User Evaluation Form, rating the questionnaire in terms of
its wording, design, and ease of use, using a scale of 1 to 5,
with 5 being the highest rating.
The problems identified by the usability volunteers
included unfamiliar terminology, insufficient directions on
how to complete the form, insufficient space for writing
answers, and insufficient lines for writing answers when
the question required multiple answers, such as multiple
pregnancies or more than one major surgery.
Stage 2. Creating, evaluating, and
testing the revised questionnaire
In the next stage of usability testing, designers revised the
original form using data from the expert evaluation analy-
Figure 1. Sample page from the original clinical questionnaire.
TABLE 1. CRITERIA USED TO EVALUATE THE ORIGINAL QUESTIONNAIRE DURING THE EXPERT EVALUATION

Content
  Is the questionnaire appropriate for the specific user?
  Is there a specific purpose for the questionnaire and does it meet that purpose?
  Do the questions ask for appropriate information?
  Are the questions comprehensive enough to obtain the necessary information?
  Is there an explanation of the purpose of the questionnaire and of how to complete the overall questionnaire and each section of the questionnaire?
  Is there an explanation of what to do if the user experiences difficulty and what to do once the user has completed the questionnaire?
  Is each category meaningful?
  Is each question necessary?

Organization
  Does the form follow a consistent format?
  Are questions about similar information clumped or grouped together?
  Are the questions organized in a logical order?
  Is important information asked for in a logical sequence?
  Is the overall form broken down into manageable subsections or questions?
  Does the form require too much writing?
  Are the spaces for the answers grouped close to the questions?
  Is there enough room to write the answers?

Style
  Does the form have a professional tone without talking down to the audience?
  Is the form free of jargon and acronyms?
  Does the form use appropriate grammar, spelling, and punctuation?
  Are the sentences simple rather than complex?
  Are medical terms accurate?
  Are definitions or examples provided for difficult medical terms?
  Are the examples accurate and the explanations precise?

Visual Design
  Does the form look professional?
  Does the form look simple to complete, or is it intimidating?
  Can users find important information using headings, columns, lists, or tables?
  Are page numbers and heading numbers accessible and easy to use?
  Are icons and symbols consistent?
  Are typographic cues uniform at each level of information?
  Are there too many navigational cues, such as bolding and reverse type?
  Are the instructions and/or graphics attractive and professional looking?
  Are the illustrations and/or graphics clear and accurate?
  Is the typeface legible? Are line lengths consistent?
sis, user feedback, and their training in information de-
sign. Designers added general directions to the form, as
well as directions indicating where patients could skip ques-
tions that weren’t applicable to them, such as questions directed
to cigarette smokers. Directions for completing individual ques-
tions were italicized to set them apart from answers and exam-
ples.
Users had complained that many words in the original
form were too technical for them to understand. While
some might argue that wording problems might be consid-
ered language problems rather than design issues, the
research shows that a form’s use of technical terms affects
its overall effectiveness. To simplify the terminology in the
revised form, we used less technical terms or provided
definitions in parentheses. For example, definitions were
provided for terms such as bilateral, oophorectomy, and
fibrocystic disease.
The two usability testers indicated that being confined
to writing in a limited space required that they omit perti-
nent information from their medical history. For example,
the original questionnaire provided space for only six chil-
dren, four medical problems, and three surgeries. Although
designers added more columns and rows for additional
answers, they were limited by constraints imposed by the
form design software, which required that all responses for
a given field be on the same page. Although the additional
columns and rows added to the overall length of the new
form, they did not add to the total number of fields on the
form.
Finally, as much as possible given the constraints of the
software, designers arranged the questions so that the pa-
tient’s personal medical history was grouped together
rather than being combined with questions about family
medical history.
After the form was revised, two volunteers read the
revised form. Volunteers for these sessions were randomly
selected from a pool of potential participants who fit the
audience profile for this form. Again, we audiotaped the
users’ comments. The testers also completed a question-
naire asking them to rate the revised form in terms of its
wording, design, and usability, using a scale of 1–5, with 5
being the highest rating.
This usability testing revealed problems in the revised
version of the form. We discovered that in trying to help the
reader understand what to do with the questions and
where to put the answers, we provided too many direc-
tions for the questions, which made the participant less
likely to read them at all. The users tended to skip the
directions and examples and attempted to complete the
questions on their own. After failing in their attempt to
answer the question on their own, they read the directions
and became even more confused. As a result, we again
revised the form, omitting extensive directions and exam-
ples from these sections and including one-sentence in-
structions. In subsequent tests, usability volunteers were
able to accurately complete the questions.
We made other changes to the revised form. To correct a
problem of emphasis, we bolded the key phrase in each
question. To overcome a date format problem, we explained
the required format of the date in parentheses. To solve the
problem some patients had of not being able to calculate their
height in inches, we added directions and an example of how
to complete the calculation. To help patients who might not
know measurements in millimeters or inches, we added com-
parisons (such as “1 mm = the size of a pen tip”) to help
readers determine the diameter of a breast mass.
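The height calculation that tripped up patients is simple arithmetic; the worked example added to the form walked patients through a conversion like this (a sketch with invented numbers, not the form's actual wording):

```python
def height_in_inches(feet, inches):
    """Convert a height expressed in feet and inches to total
    inches, as the Study form's worked example showed patients."""
    return feet * 12 + inches

total = height_in_inches(5, 4)  # 5 ft 4 in is 64 inches
```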
We often found that we had to compromise among the
needs of the users, the general design guidelines, and the
constraints of the software. For example, although research
(Wright and Haybittle 1979a, 1979b, 1979c) has shown that
answers are less legible in character-segmented spaces, we
had to use this feature to allow for optical character recogni-
tion. And although users asked for more space for multiple
responses, the computer program required that all items that
referred to a specific field of the database be on the same
page. Therefore, we could allow only for as many answers as
would fit on the remaining space of a page. And, although
users preferred a 12-point font, we had to keep the font
smaller to avoid an excessive number of pages.
Stage 3. Evaluating and
testing the final Study form
After the final Study form had been completed, two volun-
teers read the revised form. Again, researchers audiotaped the
users’ comments and measured the speed and accuracy with
which the volunteers completed the form. Volunteers for this
final session were randomly selected from a pool of potential
participants who fit the audience profile for this form.
By this stage of usability testing, the Study form had
undergone several revisions to simplify it and to accom-
modate it to the needs of the prospective audience. There
were still some small problems that the volunteers had with
the form; however, as explained previously, these prob-
lems were minor and could not be corrected because of the
parameters of the computer program and the database that
collected the information from the form.
ANALYSIS OF THE FORMS
As part of the study, we conducted three different analyses:
• A features or visual analysis of the overall design differences between the two forms
• A statistical analysis using a matched sample t test of the data gathered from the forms
• A field analysis of the performance differences between each field in the forms
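The matched-sample t test pairs the two forms' results on the same field and tests whether the field-by-field differences are reliably nonzero. A sketch of the computation follows, using invented per-field completion rates rather than the study's actual data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(control, study):
    """Matched-sample (paired) t statistic over per-field values:
    t = mean(d) / (stdev(d) / sqrt(n)), where d is the field-by-field
    difference between the Study and Control forms."""
    diffs = [s - c for c, s in zip(control, study)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Invented completion proportions for five matched fields
control_rates = [0.80, 0.75, 0.90, 0.60, 0.70]
study_rates = [0.92, 0.88, 0.95, 0.78, 0.85]
t = paired_t(control_rates, study_rates)  # positive t favors the Study form
```

The pairing matters: because both forms request the same fields, each field serves as its own control, and the test gains power over an unpaired comparison of overall means.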
Features analysis
Using the work of Schriver (1997) and Kostelnick and
Roberts (1998), we completed both a features analysis and a
visual analysis of both the Control form and the Study form
that revealed many similarities and many differences be-
tween the two forms (see Figure 2 and Figure 3). Both
forms were 8.5 by 11 inches (21.59 × 27.94 cm) in portrait
orientation. Both forms used character-segmented spaces
and bubbles for responses to optimize optical character
recognition. Both forms were subject to the constraints of
Figure 2. A sample page of the Control form.
the Teleform Elite Designer software that did not allow for
easily changing the spacing between lines (leading) or the
spacing between letters (kerning), or for easily aligning the
answer boxes and bubbles. The result was that both forms
had a narrow left margin and a very ragged right margin.
The Control form was 13 pages long. The font was
Courier, the default for the Teleform Elite Designer pro-
gram. The type size for questions and headings was incon-
sistent, with some pages set in 10-point type and others in
12-point. The type size used for responses was 10 points.
Both uppercase and lowercase letters were used, and the
treatment was either bold or plain text. Headings were
bolded and underlined. No page numbers were provided.
No general instructions were given for completing the
questionnaire, and no instructions were given for complet-
ing each section of the form. The format for questions
included both complete sentences and phrases (for exam-
ple, “Date First mammogram”). Four terms were defined;
one example was provided (helping patients determine a
dosage according to the color of the pill). There were
inconsistencies in capitalization (for example, “Family His-
tory of benign Breast Conditions”), and irregular spacing of
some headings resulted in truncated words (for example,
“Famil Membe”).
Figure 3. A sample page of the Study form.
Multiple typos were present (for example, “Nunmber,”
“Estorgen,” “Previuosly,” and “Followup”). Responses were
ordered with the N/A (non-applicable) response listed last.
The Study form was 20 pages long. The font was Times
New Roman. The type size was consistent, with all base
text in 10-point and all headings in 12-point throughout the
questionnaire. Both uppercase and lowercase letters were
used, and the text was either bold or plain text. Headings
were bolded, and four levels of headings were used. Page
numbers were provided in the lower right-hand corner of
each page. General instructions were given for completing
the questionnaire (see Figure 4), and instructions for com-
pleting each section were given in italics. The format for
questions consisted of complete sentences. Ten terms were
defined; eight examples were provided (including how to
fill in answer boxes and bubbles, how to enter a date
format, how to calculate height in inches, and how to
determine the size of a breast mass). There were no incon-
Figure 4. The instructions for the Study form.
sistencies in capitalization, no typos, and no truncated
headings. Responses were ordered with the non-applicable
response listed first.
Based on our analysis of the forms, we predicted that
the differences between the two forms that would most
affect patients’ behavior were the length of the form; the
use of directions to help the patient fill out the question-
naire; the level of the language as indicated by complete
sentences, definitions of terms, and additional examples;
the absence of errors and inconsistencies; and the presence
of a hierarchy of structured headings. A summary of the
design differences appears in Table 2.
Although we believed the ordering of the responses
would enable users to complete the form more quickly—
because they could skip non-applicable questions without
reading through the entire list of responses—we had no
way to test our assumption. Ideally, we would have liked to
conduct further observational studies with patients as they
completed the actual forms, to determine whether there
was a significant difference in the speed with which respondents completed the forms. However, the confidential
nature of the information that was gathered and the risks to
participants as a result of the effects of anxiety and stress on
their medical condition prevented us from doing so.
TABLE 2. SUMMARY OF DESIGN DIFFERENCES BETWEEN THE CONTROL AND STUDY FORMS

Design Characteristic       Control Form                 Study Form
Typeface                    Courier                      Times New Roman
Type size                   10 pt and 12 pt              10 pt body, 12 pt headings
Case                        Upper/lower                  Upper/lower
Headings                    Bold, underlined, 1 level    Bold, 4 levels
Length in pages             13                           20
Inclusion of page numbers   No                           Yes; bottom right
General instructions        Not included                 Included
Section instructions        Not included                 Included
Number of defined terms     4                            10
Number of examples          1                            8
Format of questions         Sentences and phrases        Sentences only
Spelling errors             4 spelling errors            None
Inconsistencies             Truncation of words,         None
                            capitalization, spacing
Order of responses          N/A placed last              N/A placed first
Statistical analysis
Although the Teleform Elite software was designed to collect data in a format that tracks the responses, the program could not individually record and track responses by field. To determine the questionnaire’s efficiency, it was imperative that the program be able to do this type of item
analysis because a comparison of responses by field was
the key to the statistical analysis of the study. Because the
Verification program could not simply analyze whether a
field was marked or not, we had to manually score each of
the forms.
In performing this analysis, we determined whether or
not a response was filled in for each field. Thus, for each
field, there were five scoring categories.
1. Filled in, Correctly—that is, the patient properly
wrote an answer in the appropriate field.
2. Filled in, Incorrectly—that is, the patient wrote an
answer in the appropriate field, but in an improper format.
For example, listing the height in feet rather than in inches.
3. Filled in, Shouldn’t Have—that is, the patient wrote an answer in a field where previous question responses indicated that the question should not have been answered. For example, patients responded that they were nonsmokers but then completed the field asking for the number of cigarettes smoked daily.
4. Not Filled in, Should Have—that is, the patient
didn’t write an answer in a field where previous question
responses indicated that this question should have been
answered. For example, if patients responded that they
had been pregnant but didn’t complete the field asking
how many times they had been pregnant.
5. Not Filled in, Shouldn’t Have—that is, the patient
didn’t write an answer in a field where previous question
responses indicated that the question should not have
been answered.
The scoring of each field was not based on whether the
answer was accurate, but rather on whether the respondent
completed the field in accordance with the requirements
for that field.
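The five categories amount to a small decision table over two observed facts (was the field filled in, and did earlier answers require it) plus a format check. A minimal sketch in Python, with names of our own choosing; the study itself scored the forms by hand, not with code:

```python
from enum import Enum

class Score(Enum):
    FILLED_CORRECTLY = 1
    FILLED_INCORRECTLY = 2
    FILLED_SHOULDNT_HAVE = 3
    NOT_FILLED_SHOULD_HAVE = 4
    NOT_FILLED_SHOULDNT_HAVE = 5

def score_field(filled, required, format_ok=True):
    """Classify one field given whether it was filled in, whether earlier
    answers required it, and whether the entry used the proper format."""
    if filled and required:
        return Score.FILLED_CORRECTLY if format_ok else Score.FILLED_INCORRECTLY
    if filled and not required:
        return Score.FILLED_SHOULDNT_HAVE
    if not filled and required:
        return Score.NOT_FILLED_SHOULD_HAVE
    return Score.NOT_FILLED_SHOULDNT_HAVE

# A nonsmoker who nonetheless filled in "cigarettes per day":
print(score_field(filled=True, required=False))  # Score.FILLED_SHOULDNT_HAVE
```

Note that accuracy of the answer never enters the classification, only conformance with the field’s requirements, as described above.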
A statistical analysis of the study data, a matched-sample t test, was conducted by estimating the difference between two population means with matched pairs. Each of the 143 fields in the 98 Study forms could be linked to or paired with a corresponding field in the 92 Control forms; therefore, the two sets of data were treated as dependent samples. First, we obtained an adjusted total for each of the fields in both the Control form and the Study form. The adjusted total consisted of the total number of potential responses for each field minus the number of non-applicable responses for that field.
Figure 5. The number and percent of fields for which the forms performed similarly, the Control form performed better, or the Study form performed better.
TABLE 3. FIELD NAMES AND PERCENT DIFFERENCES FOR SAMPLE FIELDS

Form      Name of Field   Percent Difference
Control   Agestop         28.30
          Agestart        20.45
          Ageabort        11.54
          Termabor        11.54
          Date82           5.48
          Miscarr          3.88
          Fhxbrsca         2.20
          Aborts           0.55
Study     Oraldose        66.19
          Oralbran        60.43
          Data1722        46.12
          Data1724        42.86
          Height          42.99
Figure 6a. Study form version of fields Oralbran and Oraldose.
Figure 6b. Control form version of fields Oralbran and Oraldose.
Then we obtained the percentage of completed responses for each field in both the Control form and the Study form by dividing the number of completed responses by the adjusted total. Next, we obtained the set of difference scores by subtracting the percentage of completed responses for the Control form from the percentage of completed responses for the Study form for each field.
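With made-up counts (not the study’s data), the adjusted-total arithmetic looks like this; the difference is taken as Study minus Control, so that a positive value favors the Study form, consistent with the reported interval:

```python
def completion_percentage(completed, potential, non_applicable):
    # Adjusted total = potential responses minus non-applicable responses
    adjusted = potential - non_applicable
    return 100.0 * completed / adjusted

# Hypothetical field: of 92 Control forms, 12 were N/A and 60 completed it;
# of 98 Study forms, 14 were N/A and 76 completed it.
control_pct = completion_percentage(60, 92, 12)   # 75.0
study_pct = completion_percentage(76, 98, 14)     # ~90.5
difference = study_pct - control_pct              # positive favors the Study form
```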
One could think of our data as being organized into three columns: the first column being the Study scores on the questionnaire items, the second being the Control scores on the same items, and the third being the difference (D) between the two. The null hypothesis would be that the mean of the D scores (third column) equals zero. In our case, however, the mean of the difference scores was 18.13, with a variance of 326.48 and a standard deviation of 18.01. Using a 95% confidence level (α = 0.05), the result was 18.13 ± (1.96)(1.51), or 15.2 ≤ D ≤ 21.1.
Our statistical analysis allowed us to draw the following conclusion: we can estimate, with 95% confidence, that the mean difference in completion rates between the Control form and the Study form falls within the interval from 15.2 to 21.1 percentage points. Since all the values within that interval are positive, the Study form was 15.2 to 21.1 percentage points better at obtaining patient information than the Control form.
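As a check, the reported interval can be reproduced from the summary statistics alone (n = 143 paired fields, mean difference 18.13, standard deviation 18.01, z = 1.96):

```python
import math

n = 143          # paired fields
mean_d = 18.13   # mean of the difference scores
sd_d = 18.01     # standard deviation of the difference scores

se = sd_d / math.sqrt(n)   # standard error of the mean difference, ~1.51
margin = 1.96 * se         # 95% margin of error
low, high = mean_d - margin, mean_d + margin
print(f"{low:.1f} <= D <= {high:.1f}")  # 15.2 <= D <= 21.1
```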
Field analysis
Each version of the questionnaire contained a total of 143
fields. A statistical item analysis of the differences between
the forms for each field showed the following results:
᭜ In 4 percent of the fields (6 fields), patients completed the Control and Study forms equally well.
᭜ In 6 percent of the fields (8 fields), patients completed the Control form better than the Study form.
᭜ In 90 percent of the fields (129 fields), patients completed the Study form better than the Control form.
Figure 5 summarizes the results of the field analysis. The largest differences where the Control form of the questionnaire performed better than the Study form occurred in the Smoking History section of the questionnaire (fields Agestop and Agestart). Table 3 shows the eight fields where the Control form outperformed the Study form and the differences for each field.
Figure 7a. Study form version of fields Data1722 and Data1724.
Figure 7b. Control form version of fields Data1722 and Data1724.
Interestingly, the designs of both forms were similar, including the structure of the answer boxes; however, the Control form elliptically stated the information being requested (“Age Started Smoking”) while the Study form asked, “If you are a smoker, at what age did you start smoking?” One possible explanation for the Control form’s better performance in these fields is that patients didn’t remember the age at which they started or stopped smoking, or that they started and stopped smoking throughout their lives and therefore could not answer with a specific age.
The Control form also appreciably outperformed the Study form in fields where the design of the two forms was similar but the Study form again asked for specifics on sensitive subjects—abortion and miscarriage (fields Ageabort and Termabor). Again, the design of these fields was the same in both versions of the form; however, the questions were phrased differently in the two versions. We assume that in this instance, patients preferred not to reveal specific information and provided only limited answers.
In the remaining four fields where the Control form
outperformed the Study form, the differences are slight and
could be attributed to either patient or study error.
Although the Study form outperformed the Control form in 129 fields, we will discuss only five representative fields (see Table 3).
Fields Oralbran and Oraldose asked patients to provide the brand name and dosage of their oral contraceptive (see Figure 6a and Figure 6b).
The Study form attempted to compensate for a seemingly insurmountable problem: the large number of brands of oral contraceptives. Using 10 years of data from patients who had listed their brand of oral contraceptive, we were able to identify the nine most popular brands, as well as the most popular doses. Realizing that some patients may not have used a popular brand or may have used multiple brands, we also included such choices as “Multiple brands,” “Not specified,” and “Other.” In providing a list of choices, we attempted to avoid the possibility that many patients would not answer the question because they couldn’t remember the name of their oral contraceptive, thereby eliminating “lack of recall” as a possible reason for failing to answer.
The Control form allowed the patient to write in the name and dosage of the oral contraceptive but didn’t leave enough space for the complete name. Furthermore, the dosage answer field did not provide for a dosage such as “1/20” or “1.5 FE 28 days” (the two most widely used dosages).
Figure 8a. Study form version of the field Height, with calculation example.
Figure 8b. Control form version of the field Height.
Other fields where the Study form outperformed the Control form were those asking patients for specific information, such as durations, dates, doses, and quantities. For example, in the fields asking about alcohol consumption (Data1722, Data1724, and Data1726), the Control form failed to provide a “None” answer even though the database itself required an answer (see Figure 7a and Figure 7b). Thus, for patients who answered “No” to the preceding question asking whether or not they consumed alcohol, the answer “Less than 1” isn’t the same as “None.” As a result, patients may have left the fields blank. In the Study form, a “None” answer was provided for these sections, along with directions indicating to the patient that the field required a response.
The Study design outperformed the Control form by 42.99 percentage points in the Height field (see Figure 8a and Figure 8b). The database required that patients enter their height in inches. In the U.S., however, most people think of their height in feet and inches; therefore, they might not understand how to compute their height in inches. To compensate for this problem, the Study form provided directions and an example explaining how to calculate height in inches.
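The conversion the Study form’s example taught is simply feet × 12 + inches; a sketch with figures of our own choosing, not the form’s:

```python
def height_in_inches(feet, inches):
    # 1 foot = 12 inches
    return feet * 12 + inches

# A patient who is 5 feet 4 inches tall should enter 64.
print(height_in_inches(5, 4))  # 64
```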
DISCUSSION
This study supports the hypothesis that applying principles of information design and the results of user-feedback-based usability testing can improve medical questionnaires. We believe that the design strategies we implemented as a result of our usability testing helped us identify specific problems patients had with the form and allowed us to make it more effective and understandable to the user. As a result, we were able to improve the accuracy of the database by collecting significantly more information about the patient than would otherwise have been collected.
As a result of our study, we became aware of several constraints that currently hinder information designers working with medical forms. One of those constraints is a lack of understanding by the medical community of information design’s importance in medical research. For example, the Moffitt Cancer Center’s Scientific Review Committee denied our original research protocol because “the Committee felt that the aims of this project were, in reality, more consistent with those of a quality assurance study than a scientific study.” Although we complied with the committee’s recommendation “to revise the procedures in such a manner that [the] project [could] be conducted without the necessity of either Moffitt Scientific Review or IRB approval,” we believe the committee’s review exemplifies a belief that the design of a form makes no difference in the treatment of patients and has no bearing on medical research.
This belief also surfaced during the study. Members of the nursing staff experienced problems when scanning patient data from the Control version into the database because the Control form lacked page numbers. Administrators insisted that page numbers be added to the Control form, arguing that the missing page numbers were a matter of careful editing rather than of document design principles. We reluctantly agreed to add page numbers to the Control form during the data-collection phase to protect valuable patient records, despite feeling that the change dismissed the value of the document design principles used in creating the Study form and could have jeopardized the entire study.
We believe that the success of the design used in this study should be considered in terms of its potential to improve patients’ overall healthcare; it also raises an ethical question about the practice of making broad generalizations regarding healthcare based on incomplete data. We believe that the creation of effective data-collection forms is more than a simple design or editing issue; it is a fundamental research issue.
Our efforts to create effective forms were generally misunderstood by nurses and technicians, who expressed some frustration over the additional demands on their time and resources to collect patient information. They also questioned the role of information designers as part of the medical support team because, in their experience, health questionnaires had been designed according to the designer’s personal preferences rather than according to forms-design criteria generally accepted by information designers.
Despite the results of this study, neither of the forms was used beyond the scope of this study, nor were those results considered when designing new forms. Before all the data could be analyzed and conclusions drawn, physicians at Moffitt began work on a larger study and developed entirely new forms using the same computer program and the same systems analyst who had designed the Control version of this study’s questionnaire. As far as we know, the new forms were created without using document design principles and without consulting potential users of the forms.
Our experience paralleled that described by Russek and others (1997) in the field of physical therapy. They found that therapists and staff trying to establish a clinical database felt data collection was too inconvenient in a busy clinic. We believe that more research must be done to show that the value added to the medical profession by information designers outweighs the costs of additional time and training for medical staff. We applaud recent efforts in the field of nursing informatics and public healthcare management to educate students and practitioners in the use and value of computer-based patient records. At a “summit” meeting in 1997, the Computer-based Patient
Records Institute, the American Health Information Management Association, the American Medical Informatics
Association, the Center for Healthcare Information Management, the Medical Library Association, and other organizations recommended a comprehensive framework for medical research, including the adoption of information management standards and coordination of terminology and data set development.
Finally, our study brought to light constraints imposed on the design of medical forms by database and forms-design software and emphasized the need for information designers to play a significant part in the design of computer-based patient record systems. We hope that information designers will also play a role in providing design training for programmers and technicians who will use these systems.
ACKNOWLEDGMENTS
Our thanks to the Society for Technical Communication for providing the research grant that made this study possible, and to the medical staff at the H. Lee Moffitt Cancer Center and Research Institute for their cooperation.
REFERENCES
Bagin, C. B., and A. M. Rose. 1991. “Worst forms unearthed:
Deciphering bureaucratic gobbledygook.” Modern maturity
(February-March):64–66.
Braverman, M. T. 1996. “Sources of survey error: Implications for
evaluation studies.” In Advances in survey research, ed. M. T.
Braverman and J. K. Slater. San Francisco, CA: Jossey-Bass,
pp. 17–28.
Couper, M. P., and R. M. Groves. 1996. “Household-level
determinants of survey nonresponse.” In Advances in survey
research, ed. M.T. Braverman and J. K. Slater. San Francisco,
CA: Jossey-Bass, pp. 63–70.
Dillman, D. A. 1978. Mail and telephone surveys: The total design
method. New York, NY: John Wiley & Sons.
Duffy, T. M. 1981. “Organizing and utilizing document design
options.” Information design journal 2:256–266.
Kostelnick, C., and D. D. Roberts. 1998. Designing visual
language: Strategies for professional communicators. Boston,
MA: Allyn and Bacon.
Krosnick, J. A. 1991. “Response strategies for coping with the
cognitive demands of attitude measures in surveys.” Applied
cognitive psychology 5:213–236.
Krosnick, J. A., S. Narayan, and W. R. Smith. 1996. “Satisficing
in surveys: Initial evidence.” In Advances in survey research, ed.
M. T. Braverman and J. K. Slater. San Francisco, CA: Jossey-Bass, pp. 29–44.
Lee, R. M. 1993. Doing research on sensitive topics. Newbury
Park, CA: Sage.
MacNealy, M. S. 1994. “Designing information-gathering forms.”
Proceedings of the STC Annual Conference. Arlington, VA:
Society for Technical Communication, pp. 440–442.
Powsner, S. M., J. C. Wyatt, and P. Wright. 1998. “Opportunities for
and challenges of computerisation.” The lancet 14:1617–1623.
Reintgen, D., J. King, and C. Cox. 1996. “Computer database for
melanoma registry: A clinical management and research tool to
monitor outcomes and ensure continuous quality improvement.”
Surgical clinics of North America 76, no 6:1273–1285.
Rose, A. M. 1981. “Problems in public documents.” Information
design journal 2:179–196.
Russek, L., M. Wooden, S. Ekedahl, and A. Bush. 1997.
“Attitudes toward standardized data collection.” Physical therapy
77, no. 7:714–729.
Salant, P., and D. A. Dillman. 1994. How to conduct your own
survey. New York, NY: John Wiley & Sons.
Schriver, K. A. 1989. “Document design from 1980 to 1989:
Challenges that remain.” Technical communication 36:316–331.
Schriver, K. A. 1997. Dynamics of document design. New York,
NY: John Wiley & Sons.
Simon, H. A. 1957. Models of man. New York, NY: John Wiley &
Sons.
Wright, P. 1983. “Informed design for forms.” In Information
design: The design and evaluation of signs and printed material,
eds. R. Easterby and H. Zwaga. Chichester, UK: John Wiley &
Sons, pp. 545–577.
Wright, P., and P. Barnard. 1975. “‘Just fill in this form’—A review
for designers.” Applied ergonomics 6, no. 4:213–220.
Wright, P., and J. Haybittle. 1979a. “Design of forms for clinical
trials (1).” British medical journal 2:529–530.
Wright, P., and J. Haybittle. 1979b. “Design of forms for clinical
trials (2).” British medical journal 2:590–592.
Wright, P., and J. Haybittle. 1979c. “Design of forms for clinical
trials (3).” British medical journal 2:650–651.
BEVERLY B. ZIMMERMAN is an assistant professor of En-
glish at Brigham Young University in Provo, UT, where she spe-
cializes in technical communication and instructional design. A
senior member of STC, she is faculty advisor of the BYU student
chapter. Contact information: beverly_zimmerman@byu.edu
JESSICA R. SCHULTZ is university photographer at
Brigham Young University-Hawaii Campus. She received an
MA in English at Brigham Young University, emphasizing
rhetoric and document design. Contact information:
jrs@email.byu.edu
AN OVERVIEW OF QUANTITATIVE AND QUALITATIVE DATA COLLECTION METHODS 5. DATA C...
 
Complete the following assignments using excel and the following t
Complete the following assignments using excel and the following tComplete the following assignments using excel and the following t
Complete the following assignments using excel and the following t
 
Running head RESEARCH .docx
Running head RESEARCH                                          .docxRunning head RESEARCH                                          .docx
Running head RESEARCH .docx
 
Running Head RESEARCH .docx
Running Head RESEARCH                                            .docxRunning Head RESEARCH                                            .docx
Running Head RESEARCH .docx
 
Pandemic Preparedness Results and Recommendations.pdf
Pandemic Preparedness Results and Recommendations.pdfPandemic Preparedness Results and Recommendations.pdf
Pandemic Preparedness Results and Recommendations.pdf
 
Examples Of Research Essays
Examples Of Research EssaysExamples Of Research Essays
Examples Of Research Essays
 
Rm sem-3
Rm sem-3Rm sem-3
Rm sem-3
 
research proposal 2
research proposal 2research proposal 2
research proposal 2
 
Translating Evidence into Practice Data Collection Assignment.pdf
Translating Evidence into Practice Data Collection Assignment.pdfTranslating Evidence into Practice Data Collection Assignment.pdf
Translating Evidence into Practice Data Collection Assignment.pdf
 
Admission To Medical School International Perspectives
Admission To Medical School  International PerspectivesAdmission To Medical School  International Perspectives
Admission To Medical School International Perspectives
 
httpfmx.sagepub.comField Methods DOI 10.117715258.docx
httpfmx.sagepub.comField Methods DOI 10.117715258.docxhttpfmx.sagepub.comField Methods DOI 10.117715258.docx
httpfmx.sagepub.comField Methods DOI 10.117715258.docx
 

Recently uploaded

Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibitjbellavia9
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.christianmathematics
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.pptRamjanShidvankar
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docxPoojaSen20
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxheathfieldcps1
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...Poonam Aher Patil
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...Nguyen Thanh Tu Collection
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfAdmir Softic
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxAreebaZafar22
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...christianmathematics
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptxMaritesTamaniVerdade
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxVishalSingh1417
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxnegromaestrong
 

Recently uploaded (20)

Sociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning ExhibitSociology 101 Demonstration of Learning Exhibit
Sociology 101 Demonstration of Learning Exhibit
 
This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.This PowerPoint helps students to consider the concept of infinity.
This PowerPoint helps students to consider the concept of infinity.
 
Application orientated numerical on hev.ppt
Application orientated numerical on hev.pptApplication orientated numerical on hev.ppt
Application orientated numerical on hev.ppt
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
psychiatric nursing HISTORY COLLECTION .docx
psychiatric  nursing HISTORY  COLLECTION  .docxpsychiatric  nursing HISTORY  COLLECTION  .docx
psychiatric nursing HISTORY COLLECTION .docx
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
General Principles of Intellectual Property: Concepts of Intellectual Proper...
General Principles of Intellectual Property: Concepts of Intellectual  Proper...General Principles of Intellectual Property: Concepts of Intellectual  Proper...
General Principles of Intellectual Property: Concepts of Intellectual Proper...
 
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
TỔNG ÔN TẬP THI VÀO LỚP 10 MÔN TIẾNG ANH NĂM HỌC 2023 - 2024 CÓ ĐÁP ÁN (NGỮ Â...
 
Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024Mehran University Newsletter Vol-X, Issue-I, 2024
Mehran University Newsletter Vol-X, Issue-I, 2024
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
ICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptxICT Role in 21st Century Education & its Challenges.pptx
ICT Role in 21st Century Education & its Challenges.pptx
 
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
Explore beautiful and ugly buildings. Mathematics helps us create beautiful d...
 
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
2024-NATIONAL-LEARNING-CAMP-AND-OTHER.pptx
 
Unit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptxUnit-IV; Professional Sales Representative (PSR).pptx
Unit-IV; Professional Sales Representative (PSR).pptx
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
Seal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptxSeal of Good Local Governance (SGLG) 2024Final.pptx
Seal of Good Local Governance (SGLG) 2024Final.pptx
 

Zimmerman and Schultz 2000

ing, respondents may shift their response strategy to be less diligent in any or all parts of the response process and provide a satisfactory answer rather than an optimal one. This behavior, called satisficing (the term is borrowed from Simon 1957), may involve spending less time thinking about the meaning of a question, being careless in recalling and integrating information, and being less precise in selecting a response. Krosnick and others (1996) concluded that three factors increase the likelihood that a person responding to questions may begin to satisfice:

᭜ The greater the difficulty of the task
᭜ The lower the ability of the person reading the questionnaire
᭜ The lower the person's motivation to optimize

Dillman (1978) argued that getting a response to a survey or questionnaire must be viewed as a special case of social exchange. Unlike economic exchange, where people exchange money for goods and services, the obligations created by a questionnaire are unspecified and may vary widely. Whether or not people complete a questionnaire or survey depends on their perception of the rewards

Manuscript received 18 August 1999; revised 4 November 1999; accepted 10 November 1999.

Technical Communication, Second Quarter 2000
they expect to receive as a result. Thus, if respondents believe that the cost in terms of time, inconvenience, or mental exertion exceeds the perceived reward, such as potential benefits and services, they will fail to complete the survey.

Researchers at the American Institutes for Research found that poorly written forms were overwhelming to many readers, who decided not to deal with them even when it meant giving up something they deserved and needed (Rose 1981). In a later study, Bagin and Rose (1991) analyzed nearly 4,000 responses to an article in Modern Maturity about bad forms and reported that, aside from tax forms, the largest number of complaints (over half of the responses) concerned healthcare forms. The complaints included the following:

᭜ Forms too complicated (reported by 48% of the respondents)
᭜ Instructions unclear (47%)
᭜ Form too long (30%)
᭜ Type too small (28%)
᭜ Not enough room for answers (25%)
᭜ Words too difficult (23%)
᭜ Information requested too personal (21%)

The researchers concluded that bad forms and questionnaires cost companies, hospitals, and doctors untold dollars in lost benefits and services.

MacNealy (1994) also found that poorly designed information-gathering forms drove up operating costs for businesses and other organizations. Recent computerization of medical forms has added to the problem. Powsner, Wyatt, and Wright (1998) report that although "poorly designed medical charting and data collection programs waste time, hamper data retrieval, and compromise the accuracy of record keeping," computerization of medical records will become more common.

If poorly written forms are costly and can cause problems for readers, why aren't forms better designed? The answer may be threefold:

᭜ Not all factors of the design or design process are within the designer's control.
᭜ Those who create the forms may not be knowledgeable about or experienced with principles of information design.
᭜ Designers may inadequately analyze the audiences who will use their forms.

Couper and Groves (1996) note that some factors, such as the respondent's environment, demographics, psychological predisposition, and ability, are outside the control of the information designer. However, they point out, recent research in survey methods and in cognitive and social psychology has shown that some factors are within the designer's control. For example, research has shown that small changes in question wording, format, and ordering can increase questionnaire responses (Braverman 1996; Krosnick and others 1996; Salant and Dillman 1994), and confidentiality assurances can significantly increase response rates when questionnaires ask about sensitive items such as health-related behavior (Lee 1993). Thus, Couper and Groves argue, some aspects of form design can influence the cooperation of respondents.

Unfortunately, as Duffy (1981) argued, there is a gap between research and practice in document design. Schriver (1989), in her review of recent research on the subject, concluded that the field still has "an impoverished and scattered literature" and that there are few summaries of research in the field. Furthermore, she maintained, because information design is an eclectic field that borrows from other disciplines, much of the research is conducted in widely disparate fields such as rhetoric, computer-human interaction, social psychology, computer technologies, discourse analysis, and graphic design, to name only a few. It is no surprise, therefore, that many of the people who create documents such as forms and questionnaires have no training or experience in document or information design.

In addition, Schriver (1997) found that creators of documents such as forms and questionnaires often inadequately analyze the audiences who will use their documents.
In a study of teenagers' attitudes toward drug education literature, she concluded that the designers had little understanding of the needs and expectations of their audience because they failed to obtain direct input from the people for whom the survey questions were intended. Schriver's study showed that without proper usability testing through reader input, even well-intentioned information designers may produce documents that fail to be used by their intended readers.

THE RESEARCH STUDY

The Comprehensive Breast Cancer Program at the H. Lee Moffitt Cancer Center and Research Institute at the University of South Florida has for the past 10 years used simple printed forms to collect data on breast cancer patients. These forms were filled out by doctors and nurses, and then the data was manually entered into a computer database. The resulting database currently contains 2,000 patient files, and information from the database is used not
only for improving patient care but also for expanding research (Reintgen and others 1996). Recently, the center has become the training site for lymphatic mapping worldwide and has been named one of the top 10 breast cancer programs nationally. As a result, the number of patients has doubled in the past 3 years, and this influx of patients has overloaded the database system. In addition, guidelines issued by the U.S. federal government specifying new information requirements for billing patients made it imperative for the center to modify its data-collection process. For these reasons, officials at the center wanted to redesign their forms so that

᭜ Patients could complete them prior to their arrival at the breast cancer center.
᭜ The forms could request more information (patient demographics, history of present illness, medications, past medical history, family history, and review of systems).
᭜ The forms would permit automated data entry through the use of optical character recognition.

Because the cancer center was redesigning its forms, we saw an opportunity to determine the effectiveness of information design principles as applied to a clinical questionnaire. The purpose of our study, therefore, was to determine whether designers using principles culled from survey research and from usability testing involving feedback-driven audience analysis could improve the effectiveness of the new health questionnaires and thus improve the efficiency of the data-collection process used at that time. Specifically, we sought to determine whether information designers could create properly designed health questionnaires that would increase the amount and accuracy of the data collected in the Breast Cancer Lymphatic Mapping Database. The hypothesis to be tested was that health questionnaires designed by information designers would reduce the number of errors caused by incomplete data spaces or improper data entry.
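The hypothesis above amounts to comparing error (or completion) rates between two versions of a form. As a rough illustration only, not a reconstruction of the authors' analysis, such a comparison can be run as a two-proportion z-test; the sample sizes (98 Study forms, 92 Control forms) come from the study, but the completion counts below are hypothetical:

```python
import math

def two_proportion_z(completed_a, n_a, completed_b, n_b):
    """z statistic for the difference between two completion proportions."""
    p_a, p_b = completed_a / n_a, completed_b / n_b
    pooled = (completed_a + completed_b) / (n_a + n_b)  # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: of 98 Study forms, 90 had a given question answered
# in the correct box; of 92 Control forms, 70 did.
z = two_proportion_z(90, 98, 70, 92)
print(f"z = {z:.2f}")  # |z| > 1.96 would indicate significance at the 5% level
```

A z above 1.96 would let a researcher reject the null hypothesis that the two forms collect data equally well; with real data, one such comparison could be run per question or per form section.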
METHODOLOGY

This study compared the effectiveness of two independently designed computer-generated health questionnaires: one created using information design principles and feedback-driven audience analysis, and one designed by a computer systems analyst using the default settings of the design software. The staff of the Comprehensive Breast Cancer Program at the H. Lee Moffitt Cancer Center and Research Institute, under the direction of Charles E. Cox, MD, professor of surgery and director of the program, determined the general specifications for both questionnaires, which would be used by the team of physicians, nurses, and researchers at the Comprehensive Breast Cancer Program. The systems analyst assigned to this project created what we call the Control form, and Brigham Young University (BYU) researchers created what we call the Study form. Design specifications were the same for both questionnaires; however, while designing the Study form, the BYU researchers also conducted usability testing and feedback-driven audience analysis using members of the form's target audience.

The audience for the new form was new patients at the H. Lee Moffitt Cancer Center and Research Institute. While the majority of breast cancer patients are women aged 50 and older, male breast cancer patients, as well as patients as young as 21, have been treated at the Cancer Center. Although the patient normally completed the patient clinical history form, the audience also included spouses and other family members of the patient because the form gathers information on the medical history of family members. Additional education and socioeconomic factors were not specified because people from all backgrounds get breast cancer.

Both the Control and Study forms were generated using Teleform Elite Version 5.0, a Windows 95 and Windows NT program designed by Cardiff Software, Inc.
Teleform Elite creates data-collection forms optimized for faxing or scanning into a computer database. Teleform can also export the data for use by other applications. Although the user's manual for Teleform Elite Version 5.0 provides information on forms design, it does so in terms of how to add data fields to a form rather than tutoring the user on the fundamentals of forms design. The manual also provides instructions on how to access a library of templates for commonly used data entry fields; however, it does not discuss the information design principles used to create those templates.

Both forms requested the same patient information; that is, they both contained the same number and types of fields. However, for the Study form, designers identified each aspect of the data collection process that might affect the quality or quantity of responses, and applied insights
gathered from survey research, questionnaire design, and usability testing. Following the general guidelines outlined by Schriver (1997), we performed three stages of usability testing in creating the Study form:

1. Evaluating and testing the original questionnaire
2. Creating, evaluating, and testing a revised questionnaire
3. Evaluating and testing a final questionnaire

However, to ensure that we were implementing appropriate design elements, we conducted two additional stages of usability testing on the Study form, for a total of five stages.

Once they had been created, the Control forms and the Study forms were randomly assigned to new patients entering the Comprehensive Breast Cancer Center for new patient evaluations. Approximately 200 patients used the two forms over a 4-month period; that is, approximately 100 patients received the Study form, and 100 patients received the Control form. Nurses at the clinic distributed the questionnaires randomly in unmarked envelopes and observed the following protocol: they could not assist the patient in any way in completing the questionnaire; they could not verify the answers that patients filled in, nor could they fill in or change any answers on the questionnaires; and they could complete only the sections of the questionnaire marked "For Office Use Only." In the final count, we obtained 98 Study forms and 92 Control forms.

As the questionnaires were returned, the forms were verified to determine whether the patient answered each question and placed his or her answer to that specific question in the appropriate box. To maintain the privacy of the patients who completed the forms, all identifying information was marked out of the questionnaires before they were given to us for analysis.

In the middle of the implementation and data-collection phase of the study, the administrative staff at the H.
Lee Moffitt Cancer Center requested that page numbers be added to the Control version of the questionnaire. The lack of page numbers was causing problems when the nurses faxed the form and verified the data before it was entered in the database, as well as when they scanned the questionnaire into the patient's chart. Although we felt this design change could compromise the research study, we reluctantly agreed to allow page numbers to be added to the Control form because we did not want to jeopardize the patient's entire record in the database.

USABILITY TESTING PROTOCOL

Following the general guidelines outlined by Schriver (1997), we performed three stages of usability testing in creating the Study form:

1. Evaluating and testing the original questionnaire
2. Creating, evaluating, and testing the revised questionnaire
3. Evaluating and testing the final Study form

Stage 1. Evaluating and testing the original questionnaire

Schriver argues that the first phase of usability testing should involve the evaluation of the form by both experts and users. In the first stage of our usability testing, we examined the original questionnaire, analyzing its features and predicting problems we believed users would encounter in completing the form (see Figure 1).

Then we characterized the questionnaire's potential problems by completing an Expert Evaluation Form, a series of questions based on information design principles from the work of Wright and Barnard (1975), Dillman (1978), Wright and Haybittle (1979a, 1979b, 1979c), Wright (1983), Salant and Dillman (1994), Schriver (1997), and Kostelnick and Roberts (1998), illustrated in Table 1.
The problems we identified in the original form included the lack of directions for completing the questionnaire, inclusion of technical terms or jargon that only a medical practitioner would understand, inconsistent organization of the questions, and poor grouping of questions about the respondent's family, which were interspersed throughout the questionnaire. These problems were consistent with complaints identified in the studies by Bagin and Rose (1991) discussed previously. In addition, we noted specific procedures that might cause problems, including expecting patients to calculate their height in inches and to determine the size of a palpable breast mass using the metric system.

Next, two volunteers completed the original questionnaire and responded in terms of the difficulties they experienced in filling out the form. Volunteers for this initial session of usability testing were purposely selected because they had no training in information design. To participate in our usability testing, each potential test participant had to be a nonpregnant female, be 21 years old or older, speak English as her first or second language, and be a potential candidate for breast cancer in the present or future. Because the Moffitt Cancer Center serves a large number of Spanish-speaking patients, our pool of usability testers included some individuals who spoke English as their second language but whose first language was Spanish. As part of the usability testing procedure, we
created a Consent to Be a Usability Testing Research Subject form that participants signed. The consent form was also translated into Spanish, although all the usability testing was performed in English and the questionnaires were in English.

We audiotaped the users' comments and noted problems the volunteers experienced in completing the form. After completing the original form, volunteers completed a User Evaluation Form, rating the questionnaire in terms of its wording, design, and ease of use on a scale of 1 to 5, with 5 being the highest rating.

The problems identified by the usability volunteers included unfamiliar terminology, insufficient directions on how to complete the form, insufficient space for writing answers, and insufficient lines for writing answers when a question required multiple answers, such as multiple pregnancies or more than one major surgery.

Stage 2. Creating, evaluating, and testing the revised questionnaire

In the next stage of usability testing, designers revised the original form using data from the expert evaluation analysis.

Figure 1. Sample page from the original clinical questionnaire.
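The 1-to-5 ratings collected on the User Evaluation Form lend themselves to simple per-criterion summaries. A minimal sketch with invented responses (the field names and the scores are ours, not the study's):

```python
from statistics import mean

# Hypothetical User Evaluation Form data: each volunteer rates the
# questionnaire's wording, design, and ease of use on a 1-5 scale (5 = best).
responses = [
    {"wording": 2, "design": 3, "ease_of_use": 2},
    {"wording": 3, "design": 2, "ease_of_use": 3},
]

# Report the mean rating for each evaluation criterion.
for criterion in ("wording", "design", "ease_of_use"):
    scores = [r[criterion] for r in responses]
    print(f"{criterion}: mean {mean(scores):.1f} (n={len(scores)})")
```

With only two volunteers per round, as in Stage 1 here, such means are descriptive guides for revision rather than statistics; the qualitative comments carry most of the weight.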
  • 6. TABLE 1. CRITERIA USED TO EVALUATE THE ORIGINAL QUESTIONNAIRE DURING THE EXPERT EVALUATION Criteria Evaluation Questions Content Is the questionnaire appropriate for the specific user? Is there a specific purpose for the questionnaire and does it meet that purpose? Do the questions ask for appropriate information? Are the questions comprehensive enough to obtain the necessary information? Is there an explanation of the purpose of the questionnaire and of how to complete the overall questionnaire and each section of the questionnaire? Is there an explanation of what to do if the user experiences difficulty and what to do once the user has completed the questionnaire? Is each category meaningful? Is each question necessary? Organization Does the form follow a consistent format? Are questions about similar information clumped or grouped together? Are the questions organized in a logical order? Is important information asked for in a logical sequence? Is the overall form broken down into manageable subsections or questions? Does the form require too much writing? Are the spaces for the answers grouped close to the questions? Is there enough room to write the answers? Style Does the form have a professional tone without talking down to the audience? Is the form free of jargon and acronyms? Does the form use appropriate grammar, spelling, and punctuation? Are the sentences simple rather than complex? Are medical terms accurate? Are definitions or examples provided for difficult medical terms? Are the examples accurate and the explanations precise? Visual Design Does the form look professional? Does the form look simple to complete, or is it intimidating? Can users find important information using headings, columns, lists, or tables? Are page number and heading numbers accessible and easy to use? Are icons and symbols consistent? Are typographic cues uniform at each level of information? Are there too many navigational cues such as bolding, reverse type? 
Are the instructions and/or graphics attractive and professional looking? Are the illustrations and/or graphics clear and accurate? Is the typeface legible? Are line lengths consistent? APPLIED RESEARCH Information Design and Clinical Research Zimmerman and Schultz 182 TechnicalCOMMUNICATION • Second Quarter 2000
  • 7. sis, user feedback, and their training in information de- sign. Designers added general directions to the form, as well as directions indicating where patients could skip ques- tions that weren’t applicable to them, such as questions directed to cigarette smokers. Directions for completing individual ques- tions were italicized to set them apart from answers and exam- ples. Users had complained that many words in the original form were too technical for them to understand. While some might argue that wording problems might be consid- ered language problems rather than design issues, the research shows that a form’s use of technical terms affects its overall effectiveness. To simplify the terminology in the revised form, we used less technical terms or provided definitions in parentheses. For example, definitions were provided for terms such as bilateral, oopherectomy, and fibrocystic disease. The two usability testers indicated that being confined to writing in a limited space required that they omit perti- nent information from their medical history. For example, the original questionnaire provided space for only six chil- dren, four medical problems, and three surgeries. Although designers added more columns and rows for additional answers, they were limited by constraints imposed by the form design software, which required that all responses for a given field be on the same page. Although the additional columns and rows added to the overall length of the new form, it did not add to the total number of fields on the form. Finally, as much as possible given the constraints of the software, designers arranged the questions so that the pa- tient’s personal medical history was grouped together rather than being combined with questions about family medical history. After the form was revised, two volunteers read the revised form. 
Volunteers for these sessions were randomly selected from a pool of potential participants who fit the audience profile for this form. Again, we audiotaped the users' comments. The testers also completed a questionnaire asking them to rate the revised form in terms of its wording, design, and usability on a scale of 1 to 5, with 5 being the highest rating.

This usability testing revealed problems in the revised version of the form. We discovered that in trying to help readers understand what to do with the questions and where to put the answers, we had provided so many directions that participants were less likely to read them at all. The users tended to skip the directions and examples and attempted to complete the questions on their own. After failing to answer a question on their own, they read the directions and became even more confused. As a result, we again revised the form, omitting the extensive directions and examples from these sections and including one-sentence instructions. In subsequent tests, usability volunteers were able to complete the questions accurately.

We made other changes to the revised form. To correct a problem of emphasis, we bolded the key phrase in each question. To overcome a date format problem, we explained the required format of the date in parentheses. To solve the problem some patients had of not being able to calculate their height in inches, we added directions and an example of how to complete the calculation. To help patients who might not know measurements in millimeters or inches, we added comparisons (such as "1 mm = the size of a pen tip") to help readers determine the diameter of a breast mass.

We often found that we had to compromise among the needs of the users, the general design guidelines, and the constraints of the software.
For example, although research (Wright and Haybittle 1979a, 1979b, 1979c) has shown that answers are less legible in character-segmented spaces, we had to use this feature to allow for optical character recognition. And although users asked for more space for multiple responses, the computer program required that all items that referred to a specific field of the database be on the same page; therefore, we could allow only as many answers as would fit in the remaining space on a page. And although users preferred a 12-point font, we had to keep the font smaller to avoid an excessive number of pages.

Stage 3. Evaluating and testing the final Study form

After the final Study form had been completed, two volunteers read the revised form. Again, researchers audiotaped the users' comments and measured the speed and accuracy with which the volunteers completed the form. Volunteers for this final session were randomly selected from a pool of potential participants who fit the audience profile for this form. By this stage of usability testing, the Study form had undergone several revisions to simplify it and to accommodate it to the needs of the prospective audience. The volunteers still had some small problems with the form; however, as explained previously, these problems were minor and could not be corrected because of the parameters of the computer program and the database that
collected the information from the form.

ANALYSIS OF THE FORMS

As part of the study, we conducted three different analyses:
᭜ A features or visual analysis of the overall design differences between the two forms
᭜ A statistical analysis, using a matched sample t test, of the data gathered from the forms
᭜ A field analysis of the performance differences between the forms for each field

Features analysis

Using the work of Schriver (1997) and Kostelnick and Roberts (1998), we completed a features analysis and a visual analysis of both the Control form and the Study form, which revealed many similarities and many differences between the two forms (see Figure 2 and Figure 3).

Figure 2. A sample page of the Control form.

Both forms were 8.5 by 11 inches (21.59 × 27.94 cm) in portrait orientation. Both forms used character-segmented spaces and bubbles for responses to optimize optical character recognition. Both forms were subject to the constraints of
the Teleform Elite Designer software, which did not allow for easily changing the spacing between lines (leading) or the spacing between letters (kerning), or for easily aligning the answer boxes and bubbles. The result was that both forms had a narrow left margin and a very ragged right margin.

The Control form was 13 pages long. The font was Courier, the default for the Teleform Elite Designer program. The type size for questions and headings was inconsistent, with some pages set in 10-point type and others in 12-point. The type size used for responses was 10 points. Both uppercase and lowercase letters were used, and the treatment was either bold or plain text. Headings were bolded and underlined. No page numbers were provided. No general instructions were given for completing the questionnaire, and no instructions were given for completing each section of the form. The format for questions included both complete sentences and phrases (for example, "Date First mammogram"). Four terms were defined; one example was provided (helping patients determine a dosage according to the color of the pill). There were inconsistencies in capitalization (for example, "Family History of benign Breast Conditions"), and irregular spacing of some headings resulted in truncated words (for example, "Famil Membe").

Figure 3. A sample page of the Study form.
Multiple typos were present (for example, "Nunmber," "Estorgen," "Previuosly," and "Followup"). Responses were ordered with the N/A (non-applicable) response listed last.

The Study form was 20 pages long. The font was Times New Roman. The type size was consistent, with all base text in 10-point and all headings in 12-point throughout the questionnaire. Both uppercase and lowercase letters were used, and the text was either bold or plain. Headings were bolded, and four levels of headings were used. Page numbers were provided in the lower right-hand corner of each page. General instructions were given for completing the questionnaire (see Figure 4), and instructions for completing each section were given in italics. The format for questions consisted of complete sentences. Ten terms were defined; eight examples were provided (including how to fill in answer boxes and bubbles, how to enter a date format, how to calculate height in inches, and how to determine the size of a breast mass). There were no inconsistencies in capitalization, no typos, and no truncated headings. Responses were ordered with the non-applicable response listed first.

Figure 4. The instructions for the Study form.

Based on our analysis of the forms, we predicted that the differences between the two forms that would most affect patients' behavior were the length of the form; the use of directions to help the patient fill out the questionnaire; the level of the language, as indicated by complete sentences, definitions of terms, and additional examples; the absence of errors and inconsistencies; and the presence of a hierarchy of structured headings. A summary of the design differences appears in Table 2.

Although we believed the ordering of the responses would enable users to complete the form more quickly, because they could skip non-applicable questions without reading through the entire list of responses, we had no way to test our assumption. Ideally, we would have liked to conduct further observational studies with patients as they completed the actual forms, to determine whether there was a significant difference in the speed with which respondents completed the forms. However, the confidential nature of the information that was gathered, and the risks to participants from the effects of anxiety and stress on their medical condition, prevented us from doing so.

TABLE 2.
SUMMARY OF DESIGN DIFFERENCES BETWEEN THE CONTROL AND STUDY FORMS

Design Characteristic       Control Form               Study Form
Typeface                    Courier                    Times New Roman
Type size                   10 pt and 12 pt            10 pt body, 12 pt headings
Case                        Upper/lower                Upper/lower
Headings                    Bold, underlined; 1 level  Bold; 4 levels
Length in pages             13                         20
Inclusion of page numbers   No                         Yes; bottom right
General instructions        Not included               Included
Section instructions        Not included               Included
Number of defined terms     4                          10
Number of examples          1                          8
Format of questions         Sentences and phrases      Sentences only
Spelling errors             4 spelling errors          None
Truncation of words         Present                    None
Inconsistencies             Capitalization, spacing    None
Order of responses          N/A placed last            N/A placed first
Statistical analysis

Although the Teleform Elite software was designed to collect data in a format that tracks the responses, the program could not individually record and track responses by field. To determine the questionnaire's efficiency, it was imperative that the program be able to do this type of item analysis, because a comparison of responses by field was the key to the statistical analysis of the study. Because the Verification program could not simply analyze whether a field was marked or not, we had to score each of the forms manually. In performing this analysis, we determined whether or not a response was filled in for each field. Thus, for each field, there were five scoring categories:

1. Filled in, Correctly: the patient properly wrote an answer in the appropriate field.
2. Filled in, Incorrectly: the patient wrote an answer in the appropriate field, but in an improper format; for example, listing height in feet rather than in inches.
3. Filled in, Shouldn't Have: the patient wrote an answer in a field where previous responses indicated that the question should not have been answered; for example, patients who responded that they were nonsmokers but then completed the field asking for the number of cigarettes smoked daily.
4. Not Filled in, Should Have: the patient didn't write an answer in a field where previous responses indicated that the question should have been answered; for example, patients who responded that they had been pregnant but didn't complete the field asking how many times they had been pregnant.
5. Not Filled in, Shouldn't Have: the patient didn't write an answer in a field where previous responses indicated that the question should not have been answered.

The scoring of each field was not based on whether the answer was accurate, but rather on whether the respondent completed the field in accordance with the requirements for that field.
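The five-way rubric is mechanical enough to restate as code. The sketch below is purely illustrative: the study scored every form by hand, and the function name and boolean inputs are ours, not part of the study's procedure.

```python
# Illustrative encoding of the five scoring categories described above.
# The study itself scored each field manually; this only restates the rubric.

def score_field(filled_in: bool, should_be_filled: bool, correct_format: bool = True) -> str:
    """Classify one field response against the form's requirements."""
    if filled_in and should_be_filled:
        return "Filled in, Correctly" if correct_format else "Filled in, Incorrectly"
    if filled_in and not should_be_filled:
        return "Filled in, Shouldn't Have"
    if not filled_in and should_be_filled:
        return "Not Filled in, Should Have"
    return "Not Filled in, Shouldn't Have"

# A nonsmoker who nonetheless answered "cigarettes smoked daily":
print(score_field(filled_in=True, should_be_filled=False))
# → Filled in, Shouldn't Have
```

Note that the rubric scores form-filling behavior, not factual accuracy, which is why the sketch never looks at the answer's content.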
A statistical analysis of the study data, a matched sample t test, was conducted by estimating the difference between two population means with matched pairs. Each of the 143 fields in the 98 Study forms could be linked to, or paired with, a corresponding field in the 92 Control forms; therefore, the two sets of data were treated as dependent samples. First, we obtained an adjusted total for each of the fields in both the Control form and the Study form. The adjusted total consisted of the total number of potential responses for each field minus the number of non-applicable responses for that field.

Figure 5. The number and percent of fields for which the forms performed similarly, the Control form performed better, or the Study form performed better.

TABLE 3. FIELD NAMES AND PERCENT DIFFERENCES FOR SAMPLE FIELDS

Form      Name of Field   Percent Difference
Control   Agestop         28.30
          Agestart        20.45
          Ageabort        11.54
          Termabor        11.54
          Date82           5.48
          Miscarr          3.88
          Fhxbrsca         2.20
          Aborts           0.55
Study     Oraldose        66.19
          Oralbran        60.43
          Data1722        46.12
          Data1724        42.86
          Height          42.99
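The adjusted-total and completion-percentage bookkeeping works out as in the sketch below. Only the formula comes from the text; the counts for the hypothetical field are invented for illustration and are not the study's data.

```python
# Completion percentage for one field, per the adjusted-total rule above.
# All counts here are made-up illustrations, not the study's actual data.

def completion_pct(potential: int, not_applicable: int, completed: int) -> float:
    """Completed responses as a percent of the adjusted total
    (potential responses minus non-applicable responses)."""
    adjusted_total = potential - not_applicable
    return 100.0 * completed / adjusted_total

# Hypothetical field: 98 Study forms (10 N/A) vs. 92 Control forms (9 N/A)
study_pct = completion_pct(98, 10, 80)    # 80 of 88 applicable respondents
control_pct = completion_pct(92, 9, 60)   # 60 of 83 applicable respondents
difference = study_pct - control_pct      # positive means the Study form did better
print(round(difference, 2))
# → 18.62
```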
Figure 6a. Study form version of fields Oralbran and Oraldose.
Figure 6b. Control form version of fields Oralbran and Oraldose.
Then we obtained the percentage of completed responses for each field in both the Control form and the Study form by dividing the number of completed responses by the adjusted total. Next, we obtained the set of difference scores by subtracting the percentage of completed responses for the Control form from the percentage of completed responses for the Study form for each field.

One could think of our data as being organized into three columns: the first column being the Study scores on the questionnaire items, the second column being the Control scores on the same questionnaire items, and the third column being the difference (D) between the two previous columns. The null hypothesis would be that the mean of the D scores (third column) equals zero. In our case, however, the mean of the difference scores was 18.13, with a variance of 326.48 and a standard deviation of 18.01. Using a significance level of 0.05, the result was 18.13 ± (1.96)(1.51), or 15.2 ≤ D ≤ 21.1.

Our statistical analysis allowed us to make the following conclusion: we can estimate, with 95% confidence, that the difference between the fields for the Control form and the Study form falls within the interval from 15.2 to 21.1 percentage points. Since all the values within that interval are positive, the Study form was 15.2 to 21.1% better at obtaining patient information than the Control form.

Field analysis

Each version of the questionnaire contained a total of 143 fields. A statistical item analysis of the differences between the forms for each field showed the following results:
᭜ In 4 percent of the fields (6 fields), patients completed the Control and Study forms equally well.
᭜ In 6 percent of the fields (8 fields), patients completed the Control form better than the Study form.
᭜ In 90 percent of the fields (129 fields), patients completed the Study form better than the Control form.

Figure 5 summarizes the results of the field analysis. The largest differences where the Control form of the questionnaire performed better than the Study form occurred in the Smoking History section of the questionnaire (fields Agestop and Agestart). Table 3 shows the eight fields where the Control form outperformed the Study form and the differences for each field.

Figure 7a. Study form version of fields Data1722 and Data1724.
Figure 7b. Control form version of fields Data1722 and Data1724.
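The 95% interval reported in the statistical analysis can be reproduced from the summary statistics alone (143 paired fields, mean difference 18.13, standard deviation 18.01). The sketch below is an arithmetic check using the same normal critical value the article uses, not the study's actual computation.

```python
import math

# Reproduce the matched-pairs 95% interval from the reported summary
# statistics: n paired fields, and the mean and standard deviation of the
# per-field difference scores (in percentage points).
n = 143
mean_d = 18.13
sd_d = 18.01

se = sd_d / math.sqrt(n)                 # standard error of the mean difference
z = 1.96                                 # normal critical value for 95% confidence
low, high = mean_d - z * se, mean_d + z * se

print(f"SE = {se:.2f}; 95% CI = [{low:.1f}, {high:.1f}]")
# → SE = 1.51; 95% CI = [15.2, 21.1]
```

With n = 143, a t critical value (about 1.98) would widen the interval only slightly, so the normal approximation is reasonable here.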
Interestingly, the designs of both forms were similar, including the structure of the answer boxes; however, the Control form elliptically stated the information being requested ("Age Started Smoking"), while the Study form asked, "If you are a smoker, at what age did you start smoking?" A possible explanation of why the Control form outperformed the Study form in these fields could be that patients didn't remember the age at which they started or stopped smoking, or they may have started and stopped smoking throughout their lives and therefore could not answer with a specific age.

The Control form also appreciably outperformed the Study form in fields where the design of the two forms was similar but the Study form again asked for specifics on sensitive subjects: abortion and miscarriage (fields Ageabort and Termabor). Again, the design of these fields was the same in both versions of the forms; however, the questions were phrased differently in the two versions. We assume that in this instance patients preferred not to reveal specific information and provided only limited answers. In the remaining four fields where the Control form outperformed the Study form, the differences are slight and could be attributed to either patient or study error.

Although the Study form outperformed the Control form in 129 fields, we will discuss only five representative fields where the Study form outperformed the Control form (see Table 3). Fields Oralbran and Oraldose asked patients to provide the brand name and dosage of their oral contraceptive (see Figure 6a and Figure 6b).

The Study form attempted to compensate for a seemingly insurmountable problem: the large number of brands of oral contraceptives. Using 10 years of data from patients who had listed their brand of oral contraceptive, we were able to identify the nine most popular brands of oral contraceptives, as well as the most popular doses of oral contraceptives.
Realizing that some patients may not have used a popular brand or may have used multiple brands, we also included such choices as "Multiple brands," "Not specified," and "Other." In providing a list of choices, we attempted to avoid the possibility that many patients would not answer the question because they couldn't remember the name of their oral contraceptive, thereby eliminating "lack of recall" as a possible reason for failing to answer. The Control form allowed the patient to write in the name and dosage of the oral contraceptive but didn't leave enough space for the patient to write in the complete name. Furthermore, the dosage answer field did not provide for a dosage such as "1/20" or "1.5 FE 28 days" (the two most widely used dosages).

Figure 8a. Study form version of the field Height, with calculation example.
Figure 8b. Control form version of the field Height.
Other fields where the Study form outperformed the Control form were those asking patients for specific information, such as durations, dates, doses, and quantities. For example, in the fields asking about alcohol consumption (Data1722, Data1724, and Data1726), the Control form failed to provide a "None" answer even though the database itself required an answer (see Figure 7a and Figure 7b). Thus, for those patients who answered "No" to the preceding question asking whether or not they consumed alcohol, the answer "Less than 1" isn't the same as "None." As a result, patients may have left the fields blank. In the Study form, a "None" answer was provided for these sections, as well as directions indicating to the patient that the field required a response.

The Study design outperformed the Control form by 42.99% in the Height field (see Figure 8a and Figure 8b). The database required that patients enter their height in inches. In the U.S., however, most people think of their height in feet and inches; therefore, they might not understand how to compute their height in inches. To compensate for this problem, the Study form provided directions and an example explaining how to calculate height in inches.

DISCUSSION

This study supports the hypothesis that applying principles of information design and the results of user-feedback-based usability testing can improve medical questionnaires. We believe that the design strategies we implemented as a result of our usability testing helped us identify specific problems patients had with the form and allowed us to make it more effective and understandable to the user. As a result, we were able to improve the accuracy of the database by collecting significantly more information about the patient than would otherwise have been collected.

As a result of our study, we became aware of several constraints that currently hinder information designers working with medical forms.
One of those constraints is a lack of understanding by the medical community of information design's importance in medical research. For example, the Moffitt Cancer Center's Scientific Review Committee denied our original research protocol because "the Committee felt that the aims of this project were, in reality, more consistent with those of a quality assurance study than a scientific study." Although we complied with the committee's recommendation "to revise the procedures in such a manner that [the] project [could] be conducted without the necessity of either Moffitt Scientific Review or IRB approval," we believe the committee's review exemplifies a belief that the design of a form makes no difference in the treatment of patients and has no bearing on medical research.

This belief also surfaced during the study. Members of the nursing staff experienced problems when scanning patient data from the Control version into the database because the Control form lacked page numbers. Administrators insisted that page numbers be added to the Control form, arguing that the missing page numbers were not a matter of applying document design principles but simply a matter of more careful editing. We reluctantly agreed to add page numbers to the Control form during the data collection phase to protect valuable patient records, despite feeling that the change dismissed the value of the document design principles used in creating the Study form and could have jeopardized the entire study.

We believe that the success of the design used in this study should be considered in terms of its potential to add to patients' overall personal healthcare. Our experience also raises an ethical question about the practice of making broad generalizations regarding healthcare based on incomplete data. We believe that the creation of effective data-collection forms is more than a simple design or editing issue; it is a fundamental research issue.
Our efforts to create effective forms were generally misunderstood by nurses and technicians, who expressed some frustration over the additional demands on their time and resources to collect patient information. They also questioned the role of information designers as part of the medical support team because, in the nurses' and technicians' experience, health questionnaires had been designed according to the designer's preferences rather than according to forms design criteria generally accepted by information designers.

Despite the results of this study, neither of the forms was used beyond the scope of this study, nor were its results considered when designing new forms. Before all the data could be analyzed and conclusions drawn, physicians at Moffitt began work on a larger study and developed entirely new forms using the same computer program and the same systems analyst who had designed the Control version of this study's questionnaire. As far as we know, the new forms were created without using document design principles and without consulting potential users of the forms.

Our experience paralleled that described by Russek and others (1997) in the field of physical therapy. They found that therapists and staff trying to establish a clinical database felt data collection was too inconvenient in a busy clinic. We believe that more research must be done to show that the value added to the medical profession by information designers outweighs the costs of additional time and training for medical staff. We applaud recent efforts in the field of nursing informatics and public healthcare management to educate students and practitioners in the use and value of computer-based patient records. At a "summit" meeting in 1997, the Computer-based Patient Records Institute, the American Health Information Management Association, the American Medical Informatics Association, the Center for Healthcare Information Management, the Medical Library Association, and other organizations recommended a comprehensive framework for medical research, including the adoption of information management standards and coordination of terminology and data set development.

Finally, our study brought to light constraints imposed on the design of medical forms by database and forms design software, and it emphasizes the need for information designers to play a significant part in the design of computer-based patient record systems. We hope that information designers will also play a role in providing design training for programmers and technicians who will use these systems.

ACKNOWLEDGMENTS

Our thanks to the Society for Technical Communication for providing the research grant that made this study possible, and to the medical staff at the H. Lee Moffitt Cancer Center and Research Institute for their cooperation.

REFERENCES

Bagin, C. B., and A. M. Rose. 1991. "Worst forms unearthed: Deciphering bureaucratic gobbledygook." Modern maturity (February-March): 64–66.

Braverman, M. T. 1996. "Sources of survey error: Implications for evaluation studies." In Advances in survey research, ed. M. T. Braverman and J. K. Slater. San Francisco, CA: Jossey-Bass, pp. 17–28.

Couper, M. P., and R. M. Groves. 1996. "Household-level determinants of survey nonresponse." In Advances in survey research, ed. M. T. Braverman and J. K. Slater. San Francisco, CA: Jossey-Bass, pp. 63–70.

Dillman, D. A. 1978. Mail and telephone surveys: The total design method. New York, NY: John Wiley & Sons.

Duffy, T. M. 1981. "Organizing and utilizing document design options." Information design journal 2:256–266.

Kostelnick, C., and D. D. Roberts. 1998. Designing visual language: Strategies for professional communicators. Boston, MA: Allyn and Bacon.

Krosnick, J. A. 1991.
"Response strategies for coping with the cognitive demands of attitude measures in surveys." Applied cognitive psychology 5:213–236.

Krosnick, J. A., S. Narayan, and W. R. Smith. 1996. "Satisficing in surveys: Initial evidence." In Advances in survey research, ed. M. T. Braverman and J. K. Slater. San Francisco, CA: Jossey-Bass, pp. 29–44.

Lee, R. M. 1993. Doing research on sensitive topics. Newbury Park, CA: Sage.

MacNealy, M. S. 1994. "Designing information-gathering forms." Proceedings of the STC Annual Conference. Arlington, VA: Society for Technical Communication, pp. 440–442.

Powsner, S. M., J. C. Wyatt, and P. Wright. 1998. "Opportunities for and challenges of computerisation." The lancet 14:1617–1623.

Reintgen, D., J. King, and C. Cox. 1996. "Computer database for melanoma registry: A clinical management and research tool to monitor outcomes and ensure continuous quality improvement." Surgical clinics of North America 76, no. 6:1273–1285.

Rose, A. M. 1981. "Problems in public documents." Information design journal 2:179–196.

Russek, L., M. Wooden, S. Ekedahl, and A. Bush. 1997. "Attitudes toward standardized data collection." Physical therapy 77, no. 7:714–729.

Salant, P., and D. A. Dillman. 1994. How to conduct your own survey. New York, NY: John Wiley & Sons.

Schriver, K. A. 1989. "Document design from 1980 to 1989: Challenges that remain." Technical communication 36:316–331.

Schriver, K. A. 1997. Dynamics of document design. New York, NY: John Wiley & Sons.

Simon, H. A. 1957. Models of man. New York, NY: John Wiley & Sons.

Wright, P. 1983. "Informed design for forms." In Information design: The design and evaluation of signs and printed material, eds. R. Easterby and H. Zwaga. Chichester, UK: John Wiley & Sons, pp. 545–577.

Wright, P., and P. Barnard. 1975. "'Just fill in this form'—A review for designers." Applied ergonomics 6, no. 4:213–220.

Wright, P., and J. Haybittle. 1979a. "Design of forms for clinical trials (1)." British medical journal 2:529–530.
Wright, P., and J. Haybittle. 1979b. "Design of forms for clinical trials (2)." British medical journal 2:590–592.

Wright, P., and J. Haybittle. 1979c. "Design of forms for clinical trials (3)." British medical journal 2:650–651.
BEVERLY B. ZIMMERMAN is an assistant professor of English at Brigham Young University in Provo, UT, where she specializes in technical communication and instructional design. A senior member of STC, she is faculty advisor of the BYU student chapter. Contact information: beverly_zimmerman@byu.edu

JESSICA R. SCHULTZ is university photographer at Brigham Young University-Hawaii Campus. She received an MA in English at Brigham Young University, emphasizing rhetoric and document design. Contact information: jrs@email.byu.edu