Running head: SCHOOL VIOLENCE MEASUREMENT ISSUES 1
An Overview of Measurement Issues in School Violence and School Safety Research
Jill D. Sharkey, Erin Dowdy, Jennifer Twyford, and Michael J. Furlong
Version 1.3.11— Final, in press
Jill D. Sharkey, Ph.D., is an Academic Coordinator in the Department of Counseling, Clinical,
and School Psychology at the University of California, Santa Barbara.
<jsharkey@education.ucsb.edu>
Erin Dowdy, Ph.D., is an Assistant Professor in the Department of Counseling, Clinical, and
School Psychology at the University of California, Santa Barbara.
<edowdy@education.ucsb.edu>
Jennifer Twyford, Ed.S., is a Doctoral Candidate in the Department of Counseling, Clinical, and
School Psychology at the University of California, Santa Barbara.
<jtwyford@education.ucsb.edu>
Michael J. Furlong, Ph.D., is a Professor in the Department of Counseling, Clinical, and School
Psychology, University of California, Santa Barbara. <mfurlong@education.ucsb.edu>
Chapter submitted for publication in:
The Handbook of School Violence and School Safety:
International Research and Practice
Edited by Shane R. Jimerson, Amanda B. Nickerson, Matthew J. Mayer, & Michael J. Furlong
To be published by Routledge, New York
7,864 words
Abstract
Precise and accurate measures are needed for research to advance knowledge about school
violence and safety concerns. This chapter provides an overview of school violence and safety
measurement issues. After introducing the history of measurement of school violence and
safety, we provide a review of recent studies of school violence with a focus on their
measurement approaches and practices. We then describe procedures commonly used to
measure school safety- and violence-related variables, review prominent threats to the reliability
and validity of these procedures, and suggest mechanisms for improvement.
Keywords: school violence, school safety, methodology, self-report, psychometrics,
epidemiology, research, assessment, measurement
An Overview of Measurement Issues in School Violence and School Safety Research
Progress has been made in understanding school violence and safety measurement issues;
however, despite calls to action (e.g., Furlong, Morrison, Cornell, & Skiba, 2004; Sharkey,
Furlong, & Yetter, 2004), school violence and safety research continues to be hindered by the
limited scrutiny given to the psychometric properties of its measurement tools. This chapter
focuses on how researchers have measured school violence and safety variables and the
implications of their practices for data validity. After a historical summary, recent school
violence research is examined with a focus on these studies’ measurement practices. Existing
methods to measure school violence, and crucial measurement considerations, are described. We
conclude with a list of important measurement considerations for school violence research.
School Violence and Safety Measurement in Historical Context
School violence and safety instruments predominantly include items developed to
estimate population trends to monitor public health conditions. Although items on population-
based surveys, such as the Youth Risk Behavior Survey (YRBS; Centers for Disease Control and
Prevention [CDC], 2004, 2009, 2010), have undergone content scrutiny by researchers, they are
not typically subjected to rigorous psychometric analyses. This is because, in part, their original
purpose was to obtain population estimates of behaviors and not to assess individual differences,
or to provide information about the root causes of school violence. Nonetheless, researchers
often use these surveillance surveys to examine individual differences and the association among
risk and health behaviors. However, because the items were not developed for this specific
purpose, their sensitivity and validity for complex measurement purposes are not established.
To date, only the Survey of School Crime and Safety was originally and specifically
designed to assess school safety and violence conditions; however, it is completed by school
principals and does not produce student-, staff-, or parent-level information. Although many of
the most prominent sources of information about school safety and violence were not specifically
designed for this purpose, instruments modified to include school-context items still provide the
best information available about school violence and safety and are used for the reports
Indicators of School Crime and Safety (e.g., Dinkes, Kemp, & Baum, 2009) and Crime, Violence,
Discipline, and Safety in U.S. Public Schools (Neiman & DeVoe, 2009). There is strong
momentum to suggest that surveillance instruments will continue to be widely used because they
have established baseline information that informs public policy.
Measurement Practices of Recent School Violence and Safety Research
To provide a perspective on recent school violence research measurement practices, we
conducted a search of peer-reviewed journal articles with the descriptor of “school violence” for
2009 in the PsycInfo database (a list of these studies and review summary table is available from
the authors). This search yielded 59 articles focused on youth in kindergarten through Grade 12
in the following categories: (a) qualitative or theoretical discourse regarding school shootings (n
= 18), (b) relations between variables (n = 16), (c) bullying and victimization (n = 7), (d) school
violence prevalence (n = 6), (e) other theory/review (n = 4), (f) prevention and intervention
studies (n = 4), and (g) book reviews and journal editorials (n = 4). Approximately half (n = 33)
of these articles were empirical studies. Although this review was designed to be inclusive, it
yielded only a sampling of articles on the topic of school violence. For example, the search only
identified 12 of 29 articles published in the Journal of School Violence in 2009. Identifying
critical scholarship in the field of school violence would be facilitated with commonly applied
keywords and descriptors such as “school violence,” “school safety,” and where appropriate
“measurement.”
Standards for reporting quantitative methods in the social sciences necessitate a
theoretical rationale for the selection of variables, clearly defined variables, and established
psychometric characteristics of measures (e.g., Osborne, 2010). To examine the degree to which
recent studies use psychometrically sound measures, we examined 32 (one was inaccessible) of
the empirical articles. First, we counted the different types of measures used in each study. Each
study could only contribute one count per measure, but multiple measures could be counted per
study. The most commonly used type of measurement was self-report (n = 24; e.g., Rose,
Espelage, & Monda-Amaya, 2009), distantly followed by teacher report (n = 4; e.g., Yavuzer,
Gundogdu, & Dikici, 2009), and neighborhood crime records (n = 2; e.g., Cornell, Sheras,
Gregory, & Fan, 2009). The following measurement strategies were used in only one study:
mapping tools, interviews and focus groups, observations, sociometric questionnaire, card sort,
disciplinary records, trained professional rating, and participatory action research. Self-report
was by far the most popular type of measure employed, particularly when examining relations
between variables. Although most self-report studies focused on youth, one self-report measure
gathered teachers' reports of their own bullying experiences and of how these experiences
affected their school's bullying interventions (Kokko & Pörhölä, 2009).
Second, we examined the methods section of each article to evaluate the psychometric
properties of measures used. Although the majority of the studies used established measures
(e.g., California School Climate & Safety Survey [Furlong, 1996; Furlong et al., 2005] or the
YRBS), only 2 studies (Rose et al., 2009; Saylor & Lech, 2009) provided both reliability and
validity information for the school violence scales used. Most (n = 17) reported reliability, but no
validity information (e.g., Cornell et al., 2009). In a few cases (n = 3; e.g., Estévez, Murgui, &
Musitu, 2009), measures were modified for the specific study and the resulting reliability, but no
validity, data were reported. Several studies reported no psychometric information (n = 5; e.g.,
Chen & Astor, 2009). Other scholars developed their own measures as part of the study (n = 8),
but fewer than half (n = 3; e.g., Sela-Shayovitz, 2009) reported psychometric properties of the
resulting scale. One study adapted a measure by translating it into different languages, but did
not report any psychometric data regarding the final scales (Linares, Días, Fuentes, & Acién,
2009). Overall, it was clear that some scales used in school violence research are well developed
with multiple citations reflecting prior development and use. Yet, in many cases, scales were
developed specifically for one study, at times without any reliability or validity analysis
presented.
This examination of recent studies in the area of school violence indicated limited
consensus about the use of common measures to examine school violence. In 32 studies, 31
different approaches or scales were used to measure school violence. When established
measures were used, oftentimes, psychometric properties for the study sample were
underreported (e.g., Turner, Powell, Langhinrichsen-Rohling, & Carson, 2009). Measures were
adapted for unique purposes without revalidation and even translated without documenting new
psychometric components (e.g., Linares et al., 2009). These recent research practices demand
attention because they suggest that the validity of school violence scholarship is questionable due
to its measurement practices. Our analysis also revealed popular and neglected areas of study.
Discourse regarding school shootings was highly popular, accounting for more than a quarter of
the articles. The study of bullying and victimization was also popular. Neglected, however, were
several areas needing additional scholarship such as school violence prevention, crisis
intervention, gang interdiction, school discipline practices, and the influence of school climate on
school safety.
School Violence and Safety Measurement Instruments
Despite the need for improved measurement practices, a body of evidence is available for
some commonly used instruments. This section provides an overview of procedures to assess
school violence and safety. The most frequent way of measuring safety and violence-related
concerns is through self-report surveys, including population-based surveillance surveys. Other
common methods include mandated districtwide reporting of school discipline referrals.
Additional methods such as mapping tools and sociometric questionnaires are used less
frequently, and thus, are not detailed here.
Comprehensive School Violence and Safe School Instruments
National surveys of students’ self-reported behaviors and experiences are the most
common form of data collection regarding school violence and provide information about the
widespread prevalence of a problem. However, because this information is aggregated over
broad contexts, it is useful for estimating state or national trends, but its applicability to local
conditions is restricted. At the school level, students and teachers may be more interested in how
their school compares to similar schools in the community, or in changes over time in
response to interventions. School- and district-level assessment of school violence indicators is
crucial to indicate specific local needs, as well as to provide a baseline against which treatment
efficacy can later be evaluated (Benbenishty, Astor, & Zeira, 2003). Moreover, individual
development occurs within the context of school and community influences, and large-scale
surveys rarely account for contextual variables (other than geographic region and city size).
Therefore, a person’s characteristics inherently reflect many environmental influences over a
protracted time period, so that attempts to partition out the effects of environment from
individual-level variables are unlikely to yield large effects. Oftentimes, the sophistication of the
analytic methods used to examine predictors of aggression and violence (such as hierarchical
linear modeling) exceeds the psychometric sophistication of the measures used to gather the
data (Sullivan, 2002).
Youth Risk Behavior Surveillance Survey (YRBS). The YRBS was originally
developed in the late 1980s as a youth health risk surveillance system (e.g., see Kann, 2001;
Kann et al., 2000). It is an anonymous self-report survey administered to students within their
classroom. The YRBS is administered biennially to a representative sample of United States
high school students. Its content focuses on health-risk behaviors that may result in later
disability, mortality, morbidity, and/or significant social problems (CDC, 2009) including
alcohol and drug use, unintended pregnancies, dietary behaviors, physical activity, and sexual
activity. Results are used to monitor health-risk behaviors among high school students, evaluate
the impact of various efforts to decrease the prevalence of health-risk behaviors, and monitor the
progress of national health objectives (e.g., CDC, 2010). However, there is limited reliability
and validity data to support the use of the YRBS (beyond its epidemiological purpose) to
describe characteristics of youths who engage in high-risk behaviors (see CDC, 2004). For
instance, the stability of weapon-carrying items, as measured through test-retest reliability, was
low, and reliability analyses did not address the different time frames of items (i.e., 30-day vs.
lifetime; Brener et al., 2002). Finally, the validity of the responses of youth who selected the
most extreme response options was not supported (see Furlong, Sharkey, Bates, & Smith, 2004).
California Healthy Kids Survey (CHKS). The CHKS is a series of assessment modules
developed by WestEd’s Human Development Program in collaboration with Duerr Evaluation
Resources for the California Department of Education (California Department of Education
[CDE], 2010). Items for the CHKS were taken from the YRBS and the California Student
Substance Use Survey. The CHKS provides schools with a method of collecting ongoing youth
health and risk behavior data and, since 1998, has been successfully administered across the
United States and internationally. Three separate versions of the CHKS are available for use at
the elementary, middle school, and high school levels (CDE, 2010) and supplemental topics may
be added to the core surveys. The surveys contain items concerning youth nutrition and physical
activity, sexual behavior, exposure to prevention and intervention activities, risk and protective
factors, alcohol, tobacco and other drug use as well as items specific to violence, school safety,
gang involvement, and delinquency (CDE, 2010). The surveys include items to assess the
truthfulness of each respondent’s answers. The results can be compared to results from the
YRBS and various health-risk surveys. Benefits of using the CHKS include: meeting categorical
program requirements (such as Title IV), identifying program goals and high-risk groups (e.g.,
CDE, 2010), and identifying risk and protective factors (through its Resiliency and Youth
Development Module), which has an extensive report of its psychometric properties (Hanson &
Kim, 2007).
Mandated Districtwide Reporting
The federal Gun-Free Schools Act of 1994 (GFSA; 1994) imposed specific reporting
requirements for each state regarding the number of students who engage in a variety of violent
behaviors. Unfortunately, data submitted by districts were not comparable (Kingery &
Coggeshall, 2001) because (a) some states reported more detailed information, (b) districts
differed in how they categorized violent behavior, (c) inconsistencies in time periods assessed
interfered with comparing data, and (d) federally-mandated definitions of the classifications of
weapons and violent behavior were often not specified in student surveys (for instance, the
GFSA definitions included bombs and gas as firearms). Kingery and Coggeshall (2001) also
cited other problems that interfered with a clear understanding of the data. They noted that (a)
students may hide serious incidents from staff, (b) school personnel may fail to detect many
violence incidents or only document observed infractions, (c) staff are not often trained in a
standardized fashion, and (d) school personnel may feel pressured to underreport violence so as
not to reflect negatively on their school or district.
Discipline Referrals
School discipline data, such as office referrals, suspensions, and expulsions, have the
potential to provide valuable information about students’ risk for future infractions. Sugai,
Sprague, Horner, and Walker (2000) observed that all schools collect discipline data to some
extent, so discipline data may be the most effective way for school systems to understand the
relationships and behavioral contexts that disrupt schools. Discipline data can show (a) which
student behaviors are of greatest concern; (b) whether or not there are disproportionate referrals
by gender, ethnicity, or special education status; and (c) whether or not there are disproportionate
referrals made by certain teachers or during certain periods of instruction (e.g., recess, P.E.,
reading, and math). This information may be used to identify targets for intervention. However,
methods for collecting discipline data need careful consideration and procedures need to be
standardized across classroom, playground, and other school settings.
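The kinds of summaries described above, such as checking for disproportionate referrals by group, amount to simple tabulation and rate comparison. A minimal sketch, in which the record fields, group labels, and enrollment figures are all hypothetical, might look like this:

```python
# Hypothetical sketch: summarizing office discipline referrals (ODRs)
# to check for disproportionality across student groups. The field
# names, group labels, and enrollment counts are illustrative only.
from collections import Counter

referrals = [
    {"student": "s1", "group": "A", "behavior": "defiance"},
    {"student": "s2", "group": "B", "behavior": "fighting"},
    {"student": "s3", "group": "A", "behavior": "defiance"},
    {"student": "s4", "group": "A", "behavior": "tardy"},
]
enrollment = {"A": 60, "B": 40}  # students enrolled per group

def referral_rates(referrals, enrollment):
    """Referrals per enrolled student, by group."""
    counts = Counter(r["group"] for r in referrals)
    return {g: counts[g] / n for g, n in enrollment.items()}

rates = referral_rates(referrals, enrollment)
# Risk ratio: group A's referral rate relative to group B's; a value
# far from 1.0 would prompt a closer look at referral practices.
risk_ratio = rates["A"] / rates["B"]
```

The same tabulation extends naturally to referrals by behavior category, teacher, or instructional period; the harder problem, as the text notes, is standardizing what counts as a referral in the first place.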
Morrison, Peterson, O’Farrell, and Redding (2004) note that although these data are
easily obtained, little is known about the reliability or validity for predicting future aggressive
acts with either school-level or individual-level data. Additionally, Morrison and Skiba (2001)
caution that school discipline data are not as straightforward as they might appear. Behavior
referrals, suspensions, and other disciplinary actions reflect not only students’ behavior, but also
teachers’ tolerance for disruptive behavior, teachers’ skills in classroom management,
administrative discipline policies, and other classroom, school, and community factors, although
they often fail to document the contribution of these environmental influences. Thus, predicting
disruptive and violent behavior from school discipline data is problematic because it must
account for these multiple levels of influence.
Persistent School Violence and Safety Measurement Issues
Although numerous studies examine factors affecting the measurement accuracy of other
health-risk behaviors, such as substance use and dietary behavior, studies examining behaviors
related to violence are scarce (Brener, Billy, & Grady, 2003). Researchers have identified threats
to the validity of school violence survey data and recommended a variety of design practices for
enhancing survey accuracy. These strategies include maintaining confidentiality or anonymity of
responses (Ong & Weiss, 2006), verifying the survey’s reading level (Stapleton, Cafarelli,
Almario, & Ching, 2010), checking for the honesty of answers (Rosenblatt & Furlong, 1997),
including sufficient numbers of items to measure a given construct (Bradley & Sampson, 2006),
and asking questions about past experiences in ways that are most likely to elicit accurate recall
(Coggeshall & Kingery, 2001). Despite the need for enhanced methodological rigor, information
about the reliability and validity of methods used to collect self-report surveys is limited.
Implausible Responding Patterns
Investigators need to address the possibility that students may not respond honestly. This
issue is related to, but distinct from, the occurrence of missing or spoiled survey responses to
specific questions. For example, with the YRBS, one report (Brener, Kann, et al., 2004)
indicated that unusable responses were obtained for 0.4% (age) to 15.5% (suicide attempt) of all
2003 YRBS responses. Data screening methods should detect response inconsistencies or
implausibly extreme patterns of responding, which may indicate not just that specific item
responses were spoiled, but that the validity of the entire set of responses is questionable.
Although data validity screening should be an essential element of all large-scale student
surveys, unfortunately, few school violence surveys have included such safeguards (Cornell &
Loper, 1998; Kingery & Coggeshall, 2001), and many others do not report whether or how they
evaluated the quality of student responses (Furlong, Morrison et al., 2004). To investigate this
issue, researchers from the University of Oregon examined dubious and inconsistent responders
to a state survey and eliminated 1.9% of cases on that basis (Oregon Healthy Teens, 2004). This
finding supports the general utility of students' self-reports; however, it suggests that surveys
that do not apply such screening standards may overestimate very low-incidence risk behaviors,
because inconsistent responders report higher rates of risk behaviors (Cornell & Loper, 1998;
Rosenblatt & Furlong, 1997).
The CHKS uses high-quality data screening and is a carefully developed instrument,
having undergone more than six years of rigorous development and review by a standing panel
of independent experts. Analysis of CHKS data includes a data-validation procedure that screens
out records meeting a combination of criteria, including inconsistent patterns of responding,
implausible reports of drug use, endorsement of a fictitious drug, or failure to
assent to having answered survey questions honestly. The CHKS procedure is rigorous and
typically produces a listwise case elimination of about 2–3% of cases (e.g., O’Brennan &
Furlong, 2010). In their examination of responses to the California School Climate and Safety
Survey, Rosenblatt and Furlong (1997) compared a group of students who failed reliability and
validity checks to a matched control group. They found systematic bias in the way failed
students responded to the survey, including higher ratings of school violence victimization and
campus danger, poorer grades, and fewer good friends. By contrast, comparison students were
more likely to be detected by the social desirability item.
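A validity screen of the kind described here is, at bottom, a set of per-record flags followed by listwise elimination. The sketch below paraphrases the screening criteria from the text; the field names, the inconsistency threshold, and the record structure are all assumptions for illustration, not the actual CHKS procedure:

```python
# Illustrative sketch of a CHKS-style validity screen. Criteria are
# paraphrased from the text; field names and the inconsistency
# threshold are hypothetical, not the actual CHKS algorithm.
def is_suspect(record):
    """Flag a survey record for listwise elimination."""
    if record.get("fictitious_drug_endorsed"):     # endorsed a drug that does not exist
        return True
    if not record.get("honesty_assent", True):     # did not assent to answering honestly
        return True
    if record.get("inconsistency_count", 0) >= 2:  # contradictory answers on paired items
        return True
    return False

def screen(records):
    """Listwise elimination: keep only records passing every check."""
    return [r for r in records if not is_suspect(r)]

records = [
    {"id": 1, "honesty_assent": True, "inconsistency_count": 0},
    {"id": 2, "fictitious_drug_endorsed": True},
    {"id": 3, "honesty_assent": False},
]
kept = screen(records)  # only record 1 survives the screen
```

Because the flags combine disjunctively, a record failing any single check is dropped, which is what produces the listwise elimination of roughly 2-3% of cases reported for the CHKS.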
Although surveyors are often concerned with participants responding in a socially
desirable manner, another possibility is that youths involved with antisocial and aggressive peers
will exaggerate their involvement in delinquent activities as an alternative form of social desirability.
For instance, Cross and Newman-Gonchar (2004) examined responses on three surveys: the
Colorado Youth Survey, the Safe Schools/Healthy Students survey (SS/HS), and a Perception
Survey. When examining the pattern of extreme and inconsistent responses, they found that not
only did rates vary substantially by survey and by school, but extreme and inconsistent responses
inflated rates of violent behavior much more at one school than at the other school. Even small
percentages of questionable responses had the ability to inflate estimates of risk behavior
substantially. For example, by excluding the fewer than 3% of responses to the Colorado Youth
Survey that were suspect, Cross and Newman-Gonchar eliminated 70% of reported incidents of
gun-carrying. It is important to note that such small differences in rates matter for low-incidence
behaviors (such as school gun possession) because even high-quality large-scale
surveys such as the YRBS are designed to have a 5% confidence interval (Brener, Kann et al.,
2004). However, school violence researchers rarely consider the effects of implausible responses
on their study results.
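The inflation mechanism described above is simple arithmetic: when a behavior is rare, a handful of suspect respondents can contribute most of the reported incidents. The numbers below are invented for illustration (chosen so that 3% of records carry 70% of the reports, echoing the Cross and Newman-Gonchar finding), not data from any actual survey:

```python
# Hypothetical arithmetic: how a few suspect responders inflate a
# low-incidence prevalence estimate. All numbers are illustrative.
n_total = 1000
n_suspect = 30                 # 3% of records fail validity checks
gun_reports_valid = 6          # gun-carrying reports among valid records
gun_reports_suspect = 14       # reports concentrated among suspect records

# Prevalence with and without screening out suspect records.
rate_unscreened = (gun_reports_valid + gun_reports_suspect) / n_total
rate_screened = gun_reports_valid / (n_total - n_suspect)

# Share of all reported incidents removed by dropping 3% of cases.
share_removed = gun_reports_suspect / (gun_reports_valid + gun_reports_suspect)
```

Here screening removes 70% of reported incidents and cuts the estimated prevalence from 2.0% to about 0.6%, a shift large enough to matter when, as the text notes, surveys like the YRBS are designed around a 5% confidence interval.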
Survey Administration Procedures
Cross and Newman-Gonchar (2004) examined the impact of survey administrator
training on student response patterns. District prevention specialists coached a group of teachers
about the importance of collecting quality data, and they were taught to explain the uses and
importance of the data to their students and to ask them to respond honestly. Rates of invalid
responses to surveys administered by trained versus untrained teachers were compared, and
results indicated that the trained administrators obtained far lower rates of highly suspect
responses (3%) than the untrained administrators (28%). Relatedly, researchers using
large-sample databases rarely report data on school staff's adherence to a standard
administration protocol.
In addition to administration procedures, responses may be influenced by survey format.
Turner et al. (1998) provided one of the few investigations of how traditional paper-and-pencil
formats and computer-assisted presentation influence response rates. Based on similar
studies about youth substance use, computer presentations were expected to produce higher self-
report rates. Turner and colleagues found that the computer format produced substantially higher
prevalence rates than did the paper-and-pencil format for weapon-carrying, acts of violence, and
threatened violence. The authors explained that the computer format, with the benefit of audio
presentation and computerization, was likely to promote more accurate responding for sensitive
questions. This explanation is supported by evidence of similarly high rates from studies that
rely on retrospective accounts by adults regarding sensitive adolescent behaviors.
In a related study, researchers from the CDC compared paper-and-pencil versions to
several online versions of the YRBS (Denniston et al., 2010). They found that seeking passive
parental consent for both the hard copy and the online options produced similar usable response
rates. One difference was that when the survey was completed online the youth were more likely
to say that they did not think the answers were private, given the visibility of monitors in the
computer lab administration setting. The students were also given one other option. They were
given a card with the URL for the survey and asked to complete it on their own time over the
following two weeks, with a reminder after one week. When given this option, only 24% of the
students completed the survey. Although not discussed by these researchers, this latter finding
suggests that adolescents' intrinsic motivation to complete school violence and safety surveys
may be low.
In an important study, Hilton, Harris, and Rice (2003) examined the consistency of youth
self-reports of violence victimization and perpetration (not school violence) and compared
prevalence rates derived from traditional paper-and-pencil reports to those elicited by the same
experiences modeled in an audio vignette. They found that the same youths reported 2 to 3
times more violence perpetration and victimization using the self-report format. This finding,
considered in relation to other studies (e.g., Hilton, Harris, & Rice, 1998), led these researchers
to conclude that, “…although we would not suggest that standard paper-and-pencil surveys yield
completely inaccurate data, the accumulated evidence, including these results, calls on
researchers in the field of interpersonal violence to redouble efforts to demonstrate and improve
the factual accuracy of their primary dependent measures” (Hilton et al., 2003, p. 235).
Item Wording
It is also important to evaluate if all respondents understand school violence survey
questions in the same way. Brener, Grunbaum, et al. (2004) found that differences in item
wording across three national surveys resulted in significantly different rates of behavior. One
challenge with surveys of school violence and safety is that questions are worded so broadly as
to leave a great deal of interpretation to individual responders (Cornell & Loper, 1998). A
survey question that asks students if they carry a weapon without defining the word “weapon”
has the potential to result in students reporting their hunting rifles or pocketknives as weapons.
For example, the SS/HS (Safe Schools Healthy Students, 2004) item, "During the past 12
months, how often have you been picked on or bullied by a student on school property?" is
ambiguous for several reasons. First, it is unclear whether being picked on and being bullied are
to be interpreted as separate experiences, or if they are implied to be equivalent. Beyond this, the
item does not define "bullying" in a way that is consistent with best research practices. It could
be argued that this item does not measure "bullying" victimization per se, but rather each
student's understanding of this word. To promote consistent and accurate responses, items
should be unambiguous, easy to read, and all terms should be clearly defined (Fowler, 1993).
Item Response Time Frames
Many commonly used school violence and safety instruments include items that refer to
past behavior using a variety of time frames (e.g., 30 days, 6 months, 1 year). It would seem
logical to presume that survey responses in any given month provide a cross-section (in time) of
students’ behaviors and experiences and that the reported incidence of these behaviors would be
higher over a much longer period of time. For example, if 10% of students report carrying a
weapon to school in the past 30 days, one might expect this percentage to be higher if the same
sample of students were asked to report about past year weapon possession. However, no
published research has confirmed this. According to Schwartz (1999), asking about past-month
incidents is likely to convey to respondents that researchers want to know about less serious but
common events. In contrast, students might interpret asking about fights in the past year as
seeking information about less frequent but more serious fights (Schwartz, 1999).
To explore this issue, Hilton and colleagues (1998) examined differences in self-reports
across 1-month, 6-month, and 1-year time periods. They reported “…standard self-reports of
interpersonal violence were insensitive to the specified time frame; for example, participants
reported almost the same number of violent acts in the past month as in the past year, something
that could not be factually true” (p. 234). Similarly, based on their review of what is known
about influences of response time referents, Brener et al. (2003) concluded that multiple factors
affect student recollection of school violence experiences.
Survey designers should be mindful that respondents often find it difficult to accurately
remember past events (Cornell & Loper, 1998; Fowler, 1993). For this reason, survey questions
that ask about past behavior or experiences can incorporate any of a variety of techniques for
enhancing recall. Converse and Presser (1986) recommend: (a) using simple language; (b)
asking about experiences within a narrow reference period (within no more than a six-month
period); (c) using memory landmarks (e.g., asking about behavior “since the beginning of the
school year” or “since New Year’s”); and (d) stimulating recall by describing concrete events
(“Instead of asking respondents if they have experienced ‘assaults,’ for instance, ... ask if
anyone…used force by grabbing, punching, choking, scratching, or biting” p. 22).
Unfortunately, school violence and safety research has not yet thoroughly attended to these
measurement concerns.
Item-Response Options
Not only do surveys differ in the response time frame, but they also vary in the number
and type of response options offered. Often, items that appear on one survey are included on
other questionnaires with a different number of response options, without stating a rationale for
the change in response options. For example, the YRBS asks, “During the past 30 days, on how
many days did you carry a weapon such as a gun, knife, or club on school property?” The
following five response options are offered: 0 days, 1 day, 2 or 3 days, 4 or 5 days, and 6 or more
days. However, when applied by Farrell and Meyer (1997) to evaluate the effectiveness of a
violence prevention program, the item included six response options: Never, 1-2 times, 3-5
times, 6-9 times, 10-19 times, and 20 times or more. Although it might appear that similar items
with different numbers of response options yield equivalent results, this is not the case
(Schwartz, 1999). Schwartz also noted that respondents tend to choose response options near the
middle of a response scale.
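When results from such differently scaled items must nonetheless be compared, one pragmatic workaround is to collapse both versions to the coarsest level of measurement they share, typically a binary any/none indicator, before computing prevalence. The sketch below uses hypothetical response labels modeled on the two formats described above; it is an illustrative workaround, not a procedure from either instrument, and the collapse discards all frequency information.

```python
# Collapse two differently scaled versions of a weapon-carrying item to a
# shared binary indicator ("any" vs. "none"), the coarsest common metric.

YRBS_NONE = {"0 days"}       # five-option format: 0 days ... 6 or more days
FARRELL_NONE = {"Never"}     # six-option format: Never ... 20 times or more

def any_vs_none(responses, none_labels):
    """Return the proportion of respondents reporting any occurrence."""
    return sum(r not in none_labels for r in responses) / len(responses)

# Hypothetical responses from two samples administered the two item versions.
yrbs_sample = ["0 days", "0 days", "1 day", "0 days", "2 or 3 days"]
farrell_sample = ["Never", "1-2 times", "Never", "Never", "3-5 times"]

print(any_vs_none(yrbs_sample, YRBS_NONE))        # → 0.4
print(any_vs_none(farrell_sample, FARRELL_NONE))  # → 0.4
```

Because only the any/none prevalence survives the collapse, this approach supports cross-survey prevalence comparisons but cannot recover differences in reported frequency.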
School Violence and Safety Instrument Psychometric Properties
The survey method typically used to assess school safety and violence seeks to identify
indicators that increase the likelihood of negative outcomes. This practice differs substantially
from that most often used in school psychology, which establishes in advance the psychometric
properties of a scale for identification or diagnosis purposes (Cornell & Loper, 1998).
Unfortunately, some reports have used survey instruments for identification purposes without
adequately examining the psychometric properties of the scales. There is limited research
examining the reliability and validity of school violence and safety instruments.
Stability of Responses
As with any self-report measure, measures of violence and safety should have rigorous
psychometric testing to establish reliability and validity (Rosenblatt & Furlong, 1997). Given
that school violence and safety instruments typically do not measure latent traits, the reliability
of greatest interest is the consistency of responses over a brief interval. Unfortunately, testing of
response stability is infrequent. Brener et al. (2002) completed the only study to date that has
examined the reliability of the YRBS items that inquired about school-associated behaviors.
They examined the responses of 4,619 of 6,802 eligible students, derived from a convenience
sample, who completed the YRBS twice over a two-week time period. They subsequently
converted the responses of all items into binary format to compute a kappa statistic, a measure of
response consistency that is corrected for chance agreement. Kappa coefficients ranged from .41
to .68 (in the moderate to substantial range; Landis & Koch, 1977) for the four YRBS items that
directly assess school violence content. The incidence of one behavior (“Injured in a physical
fight one time in the past 12 months”) was significantly higher at time 2 compared to time 1.
Though Brener et al. concluded that, in general, the YRBS is a reliable instrument and this study
is cited often to justify the use of the YRBS, there are several methodological questions with this
study, particularly when school violence items are examined. These issues pertain to the
appropriateness of excluding inconsistent responses prior to analysis; converting responses to
binary format and using the kappa statistic; using a 14-day (average) retest period for items with
30-day or 12-month reference timeframes; and interpreting kappa coefficients for items with
different reference timeframes (e.g., past 30 days and past 12 months). As a result, this analysis
likely generated the most favorable possible test-retest reliability. Even so, the reliability
coefficients for the school items were only in the moderate range.
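The kappa computation described above can be made concrete with a small worked example. The sketch below uses invented binary test-retest responses to a single item (not data from Brener et al.) and computes Cohen's kappa, agreement corrected for the agreement expected by chance alone.

```python
def cohens_kappa(time1, time2):
    """Cohen's kappa for two parallel lists of binary (0/1) responses.

    kappa = (p_observed - p_chance) / (1 - p_chance), where p_chance is the
    agreement expected if the two administrations were independent.
    """
    n = len(time1)
    # Observed agreement: proportion giving the same answer at both administrations.
    p_obs = sum(a == b for a, b in zip(time1, time2)) / n
    # Chance agreement from the marginal "yes" rates of each administration.
    p1_yes = sum(time1) / n
    p2_yes = sum(time2) / n
    p_chance = p1_yes * p2_yes + (1 - p1_yes) * (1 - p2_yes)
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical responses to a binary-coded item at two administrations.
t1 = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
t2 = [1, 0, 0, 0, 0, 0, 0, 1, 1, 0]
print(round(cohens_kappa(t1, t2), 2))  # → 0.52 ("moderate" per Landis & Koch, 1977)
```

Note that 8 of 10 respondents answered consistently, yet kappa is only .52; with rare behaviors, most raw agreement is expected by chance, which is exactly why kappa rather than percent agreement is reported.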
Validity of Measures
Skiba et al. (2004) noted that instruments can contribute significantly to the field of
school violence and safety only when they include the complex and representative set of factors
involved. They recommended using empirical methods such as factor analysis to justify
inclusion of items in surveys, rather than relying only on professional judgment, as done in the
past. Regarding the content of school safety and violence measures, Skiba et al. (2004) pointed
out that many surveys focus on significant acts of violence, such as weapon carrying. However,
although extreme violence is the end result of an unsafe school climate, it may not
provide enough variance to yield meaningful comparisons, particularly for typical schools. For
this reason, Skiba et al. (2004) stated that researchers need to focus less on serious violent acts,
especially as smaller discipline problems are more frequent and are part of a common trajectory
towards future antisocial behavior.
A few researchers have developed school violence and safety instruments using
psychometric approaches to scale development (Furlong, Sharkey, et al., 2004; Skiba et al.,
2004). Although these instruments are not appropriate for surveillance-style surveys, they have
potential to provide options when used as outcome measures in local evaluations and studies
using controlled experimental designs. Recently, Greif and Furlong (2004a, 2004b) extended the
psychometric approach by using item response analysis to create a unidimensional school
violence victim scale. Such a scale could be used to track victimization experiences across time.
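To make the item response approach concrete, the sketch below scores a respondent on a dichotomous Rasch model using a simple grid-search maximum-likelihood estimate. The item set, difficulty values, and estimator here are invented for illustration; they are not the parameters or methods of Greif and Furlong's actual scale.

```python
import math

def rasch_p(theta, b):
    """Probability of endorsing an item with difficulty b at trait level theta."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def score(responses, difficulties, grid=None):
    """Grid-search maximum-likelihood estimate of a respondent's trait level.

    responses: list of 0/1 item endorsements; difficulties: item b parameters.
    """
    if grid is None:
        grid = [g / 10.0 for g in range(-40, 41)]  # theta in [-4.0, 4.0]

    def log_lik(theta):
        return sum(
            math.log(rasch_p(theta, b)) if x else math.log(1 - rasch_p(theta, b))
            for x, b in zip(responses, difficulties)
        )

    return max(grid, key=log_lik)

# Invented difficulties: milder victimization items are "easier" to endorse
# than severe ones (e.g., teasing vs. being threatened with a weapon).
difficulties = [-1.5, -0.5, 0.5, 1.5]
print(score([1, 1, 0, 0], difficulties))  # → 0.0 (endorses the two easier items)
```

Because every item is calibrated on the same latent continuum, scores from different administrations remain comparable, which is what makes such a scale usable for tracking victimization experiences across time.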
Table 1 provides a review of the school violence and safety measures along with their limitations
and recommendations for improvement.
Conclusion
Efforts to refine school violence and safety measurement practices are needed. Although
more sophisticated techniques are being employed, such as web administration with programmed
skip patterns, it is essential that researchers continue to examine the fundamental aspects of
measurement procedures. Significant limitations currently exist, perhaps most notably the
continued use of items from instruments with little or no known validity
evidence. Researchers and consumers of research should be aware of the dangers of drawing
conclusions based on measures with limited or poor psychometric properties. Similarly, as new
measures are being developed it is critical that researchers do not rely on a “bootstrapping”
approach, in which they continually try to validate new measures against known inferior
measures until enough evidence is accumulated to demonstrate that the newly developed
measure is superior. In this chapter, we provided a review of the current state of research and
current available instruments and highlighted many issues affecting measurement accuracy. The
key recommendations of the chapter are summarized in Table 2. While there is still much to be
done, it is hoped that the concepts and guidelines discussed here will lead to improved research
in school violence and safety measurement.
References
Benbenishty, R., Astor, R. A., & Zeira, A. (2003). Monitoring school violence: Linking
national-, district-, and school-level data over time. Journal of School Violence, 2, 29–50.
doi:10.1300/J202v02n02_03
Bradley, K. D., & Sampson, S. O. (2006). Constructing a quality assessment through Rasch
techniques: The process of measurement, feedback, reflection and change. In X. Liu &
W. Boone (Eds.), Applications of Rasch measurement in science education (pp. 23–44).
Maple Grove, MN: JAM Press.
Brener, N. D., Billy, J. O. G., & Grady, W. R. (2003). Assessment of factors affecting the
validity of self-reported health-risk behavior among adolescents: Evidence from the
scientific literature. Journal of Adolescent Health, 33, 436–457. doi:10.1016/S1054-
139X(03)00052-1
Brener, N. D., Grunbaum, J. A., Kann, L., McManus, T., & Ross, J. (2004). Assessing health risk
behaviors among adolescents: The effect of question wording and appeals for honesty.
Journal of Adolescent Health, 35, 91–100. doi:10.1016/j.jadohealth.2003.08.013
Brener, N. D., Kann, L., Kinchen, S. A., Grunbaum, J., Whalen, L., … Ross, J. G. (2004).
Methodology of the Youth Risk Behavior Surveillance System. MMWR
Recommendations and Reports, 53(RR-12), 1–13.
Brener, N. D., Kann, L., McManus, T., Kinchen, S. A., Sundberg, E. C., & Ross, J. G. (2002).
Reliability of the 1999 Youth Risk Behavior Survey Questionnaire. Journal of Adolescent
Health, 31, 336–342. doi:10.1016/S1054-139X(02)00339-7
California Department of Education (CDE). (2010). California Healthy Kids Survey web site.
Retrieved September 5, 2010, from http://chks.wested.org/
Centers for Disease Control and Prevention (CDC). (2004). Methodology of the Youth Risk
Behavior Surveillance System. MMWR, 53(RR-12), 1–11.
Centers for Disease Control and Prevention (CDC). (2009). 2009 Youth Risk Behavior Survey.
Retrieved from www.cdc.gov/yrbss
Centers for Disease Control and Prevention (CDC). (2010). Youth Risk Behavior Surveillance
Survey — United States, 2009. Surveillance Summaries, MMWR, 59, No. SS-5.
Available from http://www.cdc.gov/HealthyYouth/yrbs/index.htm
Chen, J. K., & Astor, R. A. (2009). The perpetration of school violence in Taiwan: An analysis of
gender, grade level and school type. School Psychology International, 30, 568–584.
doi:10.1177/0143034309107076
Coggeshall, M. B., & Kingery, P. M. (2001). Cross-survey analysis of school violence and
disorder. Psychology in the Schools, 38, 107–116. doi:10.1002/pits.1003
Converse, J. M., & Presser, S. (1986). Survey questions: Handcrafting the standardized
questionnaire. Newbury Park, CA: Sage.
Cornell, D., Sheras, P., Gregory, A., & Fan, X. (2009). A retrospective study of school safety
conditions in high schools using the Virginia threat assessment guidelines versus
alternative approaches. School Psychology Quarterly, 24, 119–129.
doi:10.1037/a0016182
Cornell, D. G., & Loper, A. B. (1998). Assessment of violence and other high-risk behaviors
with a school survey. The School Psychology Review, 27, 317–330.
Cross, J. E., & Newman-Gonchar, R. (2004). Data quality in student risk behavior surveys and
administrator training. Journal of School Violence, 3, 89–108.
doi:10.1300/J202v03n02_06
Denniston, M. D., Brener, N. D., Kann, L., Eaton, D. K., McManus, T., … Ross, J. G. (2010).
Comparison of paper-and-pencil versus Web administration of the Youth Risk Behavior
Survey (YRBS): Participation, data quality, and perceived privacy and anonymity,
Computers in Human Behavior, 26, 1054–1060. doi:10.1016/j.chb.2010.03.006
Dinkes, R., Kemp, J., & Baum, K. (2009). Indicators of school crime and safety: 2009 (NCES
2010–012/ NCJ 228478). Washington, DC: National Center for Education Statistics,
Institute of Education Sciences, U.S. Department of Education, and Bureau of Justice
Statistics, Office of Justice Programs, U.S. Department of Justice.
Estévez, E., Murgui, S., & Musitu, G. (2009). Psychological adjustment in bullies and victims of
school violence. European Journal of Psychology of Education, 24, 473–483.
doi:10.1007/BF03178762
Farrell, A. D., & Meyer, A. L. (1997). The effectiveness of a school-based curriculum for
reducing violence among urban sixth-grade students. American Journal of Public Health,
87, 979–984. doi:10.2105/AJPH.87.6.979
Fowler, F. J., Jr. (1993). Survey research methods (2nd ed.). Newbury Park, CA: Sage.
Furlong, M. J. (1996). Tools for assessing school violence. In S. Miller, J. Brodine, & T. Miller
(Eds.), Safe by design: Planning for peaceful school communities (pp. 71–84). Seattle,
WA: Committee for Children.
Furlong, M. J., Greif, J. L., Bates, M. P., Whipple, A. D., Jimenez, T. C., & Morrison, R. (2005).
Development of the California School Climate and Safety Survey–Short Form.
Psychology in the Schools, 42, 137–149. doi:10.1002/pits.20053
Furlong, M. J., Morrison, G. M., Cornell, D., & Skiba, R. (2004). Methodological and
measurement issues in school violence research: Moving beyond the social problem era.
Journal of School Violence, 3(2/3), 5–12. doi:10.1300/J202v03n02_02
Furlong, M. J., Sharkey, J. D., Bates, M. P., & Smith, D. C. (2004). An examination of the
reliability, data screening procedures, and extreme response patterns for the Youth Risk
Behavior Surveillance Survey. Journal of School Violence, 3(2/3), 109–130.
doi:10.1300/J202v03n02_07
Greif, J. L., & Furlong, M. J. (2004a). Towards precision in measuring school violence
victimization using IRT. Presentation at the 20th Annual Meeting of the International
Society for Traumatic Stress Studies, New Orleans, LA.
Greif, J. L., & Furlong, M. J. (2004b). Using item response analysis to develop a unidimensional
school violence victimization scale. In Persistently safe schools: The National
Conference of the Hamilton Fish Institute on School and Community Violence.
Washington, DC: Hamilton Fish Institute, George Washington University. Available
from http://gwired.gwu.edu/hamfish/AnnualConference/2004/
Gun-Free Schools Act (GFSA), 20 USC §8921. (1994).
Hanson, T. L., & Kim, J. O. (2007). Measuring resilience and youth development: The
psychometric properties of the Healthy Kids Survey. (Issues & Answers Report, REL
2007–No. 034). Washington, DC: U.S. Department of Education, Institute of Education
Sciences, National Center for Education Evaluation and Regional Assistance, Regional
Educational Laboratory West. Retrieved from http://ies.ed.gov/ncee/edlabs
Hilton, N. Z., Harris, G. T., & Rice, M. E. (1998). On the validity of self-reported rates of
interpersonal violence. Journal of Interpersonal Violence, 13, 58–72.
doi:10.1177/088626098013001004
Hilton, N. Z., Harris, G. T., & Rice, M. E. (2003). Correspondence between self-reports of
interpersonal violence. Journal of Interpersonal Violence, 18, 223–239.
doi:10.1177/0886260502250065
Kann, L. (2001). The Youth Risk Behavior Surveillance System: Measuring health-risk
behaviors. American Journal of Health Behavior, 25, 272–277.
Kann, L., Kinchen, S. A., Williams, B. I., Ross, J. G., Lowry, R., Grunbaum, J. A., & Kolbe, L.
J. (2000). Youth Risk Behavior Surveillance – United States, 1999. Journal of School
Health, 70, 271–285. doi:10.1111/j.1746-1561.2000.tb07252.x
Kingery, P. M., & Coggeshall, M. B. (2001). Surveillance of school violence, injury, and
disciplinary actions. Psychology in the Schools, 38, 117–126. doi:10.1002/pits.1004
Kokko, T. H. J., & Pörhölä, M. (2009). Tackling bullying: Victimized by peers as a pupil, an
effective intervener as a teacher? Teaching and Teacher Education, 25, 1000–1008.
doi:10.1016/j.tate.2009.04.005
Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical
data. Biometrics, 33, 159–174. doi:10.2307/2529310
Linares, J. J. G., Días, A. J. C., Fuentes, M. D. C. P., & Acién, F. L. (2009). Teachers’
perception of school violence in a sample from three European countries. European
Journal of Psychology of Education, 24, 49–59. doi:10.1007/BF03173474
Morrison, G. M., Peterson, R., O’Farrell, S., & Redding, M. (2004). Using office referral records
in school violence research: Possibilities and limitations. Journal of School Violence,
3(2/3), 139–149. doi:10.1300/J202v03n02_04
Morrison, G. M., & Skiba, R. (2001). Predicting violence from school misbehavior: promises
and perils. Psychology in the Schools, 38, 173–184. doi:10.1002/pits.1008
Neiman, S., & DeVoe, J. F. (2009). Crime, violence, discipline, and safety in U.S. public
schools: Findings from the School Survey on Crime and Safety: 2007–08 (NCES 2009-
326). Washington, DC: National Center for Education Statistics, Institute of Education
Sciences, U.S. Department of Education.
O’Brennan, L. M., & Furlong, M. J. (2010). Relations between students’ perceptions of school
connectedness and peer victimization. Journal of School Violence, 9, 375–391.
doi:10.1080/15388220.2010.509009
Ong, A. D., & Weiss, D. J. (2006). The impact of anonymity on responses to sensitive questions.
Journal of Applied Social Psychology, 30, 1691–1708. doi:10.1111/j.1559-
1816.2000.tb02462.x
Oregon Healthy Teens. (2004). OHT 2004 survey methodology. Retrieved September 5, 2004,
from http://www.dhs.state.or.us/dhs/ph/chs/youthsurvey/ohteens/2004/methods.shtml
Osborne, J. W. (2010). Correlation and other measures of association. In G. R. Hancock & R. O.
Mueller (Eds.), The reviewer’s guide to quantitative methods in the social sciences (pp.
55–69). New York, NY: Routledge.
Rose, C. A., Espelage, D. L., & Monda-Amaya, L. E. (2009). Bullying and victimization rates
among students in general and special education: A comparative analysis. Educational
Psychology, 29, 761–776. doi:10.1080/01443410903254864
Rosenblatt, J. A., & Furlong, M. J. (1997). Assessing the reliability and validity of student self-
reports of campus violence. Journal of Youth and Adolescence, 26, 187–202.
doi:10.1023/A:1024552531672
Safe Schools Healthy Students. (2004). The faces of the Safe Schools/Healthy students. Retrieved
September 4, 2004, from http://www.sshs.samhsa.gov/initiative/faces.aspx
Saylor, C. F., & Leach, J. B. (2009). Perceived bullying and social support in students accessing
special inclusion programming. Journal of Developmental and Physical Disabilities, 21, 69–80.
doi:10.1007/s10882-008-9126-4
Schwartz, N. (1999). Self-reports: How the questions shape the answers. American Psychologist,
54, 93–105. doi:10.1037/0003-066X.54.2.93
Sela-Shayovitz, R. (2009). Dealing with school violence: The effect of school violence
prevention training on teachers’ perceived self-efficacy in dealing with violent events.
Teaching and Teacher Education, 25, 1061–1066. doi:10.1016/j.tate.2009.04.010
Sharkey, J. D., Furlong, M. J., & Yetter, G. (2004). An overview of measurement issues in
school violence and school safety issues. In S. R. Jimerson & M. J. Furlong (Eds.),
Handbook of school violence and school safety: From research to practice (pp. 121–
134). Mahwah, NJ: Erlbaum.
Skiba, R., Simmons, A. B., Peterson, R., McKelvey, J., Ford, S., & Gallini, S. (2004). Beyond
guns, drugs and gangs: The structure of student perceptions of school safety. In M. J.
Furlong, M. P. Bates, D. C. Smith, & P. Kingery (Eds.), Appraisal and prediction of
school violence: Issues, methods and contexts (pp. 149–171). Hauppauge, NY: Nova
Science.
Stapleton, L. M., Cafarelli, M., Almario, M. N., & Ching, T. (2010). Prevalence and
characteristics of student attitude surveys used in public elementary schools in the United
States. Practical Assessment, Research & Evaluation, 15(9). Available from
http://pareonline.net/getvn.asp?v=15&n=9
Sugai, G., Sprague, J. R., Horner, R. H., & Walker, H. M. (2000). Preventing school violence:
The use of office discipline referrals to assess and monitor school-wide discipline
interventions. Journal of Emotional and Behavioral Disorders, 8, 94–101.
doi:10.1177/106342660000800205
Sullivan, M. L. (2002). Exploring layers: Extended case method as a tool for multilevel analysis
of school violence. Sociological Methods & Research, 31, 255–285.
doi:10.1177/0049124102031002005
Turner, C. F., Ku, L., Rogers, S. M., Lindberg, L. D., Pleck, J. H., & Sonenstein, F. L. (1998).
Adolescent sexual behavior, drug use, and violence: Increased reporting with computer
survey technology. Science, 280, 867–873. doi:10.1126/science.280.5365.867
Turner, L. A., Powell, A. E., Langhinrichsen-Rohling, J., & Carson, J. (2009). Helping families
initiative: Intervening with high risk students through a community, school, and district
attorney partnership. Child and Adolescent Social Work Journal, 26, 209–223.
doi:10.1007/s10560-009-0167-z
Yavuzer, Y., Gundogdu, R., & Dikici, A. (2009). Teachers’ perceptions about school violence in
one Turkish city. Journal of School Violence, 8, 29–41.
doi:10.1080/15388220802067797
Table 1
School Violence and Safety Measures: Intended Use, Limitations, and Recommendations
Type: General Surveillance
Intended use: Obtain population estimates broadly about youth risk.
Limitations: Not subjected to rigorous psychometric analysis; limited information about school violence and safety; does not inform individual development.
Recommendations: Conduct psychometric analysis for the new purpose prior to extending their use.

Type: School Violence Items in Surveillance Surveys
Intended use: Obtain population estimates about school violence and safety.
Limitations: The most rigorous survey is administered only to school principals and does not produce student-, staff-, or parent-level information; does not inform individual development.
Recommendations: Include items designed specifically to assess school violence and safety in more broadly administered assessments. Make sure reliability and validity have been tested for the intended purposes.

Type: School Violence and Safety Measurement Instruments
Intended use: Measure latent constructs of youth violence.
Limitations: Threats to validity such as reading level, honesty of answers, response options, recall, and extreme responses.
Recommendations: Advance knowledge of how to overcome threats to validity by studying these issues. Create and validate more advanced measurement tools with the knowledge gained.

Type: Mandated Districtwide Reporting
Intended use: Track student violence across schools.
Limitations: Data submitted by districts are not comparable.
Recommendations: Identify an alternative method or implement extensive training, standardization, and incentives for accurate and comparable reporting.

Type: Discipline Referrals
Intended use: Examine school safety; evaluate whether referrals are disproportionate; predict students’ risks for future infractions.
Limitations: Lack of standard procedures within and across schools; lack of psychometric scrutiny.
Recommendations: Limit cross-analysis of data from different sources unless discipline procedures, reporting, and data collection are standardized within and across settings. Evaluate data for reliability and validity.
Table 2
Implications for Practice: Recommendations for the Use of School Violence Self-report
Procedures
Constructing Surveys
Clearly define key terms
Keep questions concrete
Use memory aids when asking about past events
Include multiple questions to measure each main construct
Check survey’s reading level for the target population
Include items that screen for honesty and accuracy
Selecting Existing Surveys
Check reliability and validity for the population to be tested
Administering Surveys
Ensure confidentiality / anonymity
Train staff to adhere to a standard protocol
Explain purpose of survey to students and how results are used
Be prepared to read questions aloud to students and to clarify
Debrief students afterward: What were they thinking?
Reporting Survey Results
Present information separately to students, staff, parents
Solicit perceptions of survey data
Brainstorm alternative courses of action
Maintain a collaborative, team-oriented stance
Next Steps
Link assessment to planning and intervention
Choose activities and programs with empirical support
Reassess students periodically to measure program effectiveness
See discussions, stats, and author profiles for this publica.docxSee discussions, stats, and author profiles for this publica.docx
See discussions, stats, and author profiles for this publica.docx
 

More from Liz Adams

What Is Creative Writing. Essay Topics And Example
What Is Creative Writing. Essay Topics And ExampleWhat Is Creative Writing. Essay Topics And Example
What Is Creative Writing. Essay Topics And ExampleLiz Adams
 
Free Printable Spider-Shaped Writing Templates. This PD
Free Printable Spider-Shaped Writing Templates. This PDFree Printable Spider-Shaped Writing Templates. This PD
Free Printable Spider-Shaped Writing Templates. This PDLiz Adams
 
Find Out How To Earn 398Day Using Essay Wri
Find Out How To Earn 398Day Using Essay WriFind Out How To Earn 398Day Using Essay Wri
Find Out How To Earn 398Day Using Essay WriLiz Adams
 
Fish - All-Day Primary. Online assignment writing service.
Fish - All-Day Primary. Online assignment writing service.Fish - All-Day Primary. Online assignment writing service.
Fish - All-Day Primary. Online assignment writing service.Liz Adams
 
009 Essay Outline Template Mla Format Thatsnotus
009 Essay Outline Template Mla Format Thatsnotus009 Essay Outline Template Mla Format Thatsnotus
009 Essay Outline Template Mla Format ThatsnotusLiz Adams
 
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTube
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTubeRain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTube
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTubeLiz Adams
 
New Year Writing Paper By Burst Into First TPT
New Year Writing Paper By Burst Into First TPTNew Year Writing Paper By Burst Into First TPT
New Year Writing Paper By Burst Into First TPTLiz Adams
 
Prudential Center Events New Jersey Live Entertain
Prudential Center Events New Jersey Live EntertainPrudential Center Events New Jersey Live Entertain
Prudential Center Events New Jersey Live EntertainLiz Adams
 
College Essay Literary Criticism Essay Outline
College Essay Literary Criticism Essay OutlineCollege Essay Literary Criticism Essay Outline
College Essay Literary Criticism Essay OutlineLiz Adams
 
Paper With Writing On It - College Homework Help A
Paper With Writing On It - College Homework Help APaper With Writing On It - College Homework Help A
Paper With Writing On It - College Homework Help ALiz Adams
 
Free Clipart Pencil And Paper 10 Free Cliparts
Free Clipart Pencil And Paper 10 Free ClipartsFree Clipart Pencil And Paper 10 Free Cliparts
Free Clipart Pencil And Paper 10 Free ClipartsLiz Adams
 
Hamburger Writing By Food For Taught Teachers Pay
Hamburger Writing By Food For Taught Teachers PayHamburger Writing By Food For Taught Teachers Pay
Hamburger Writing By Food For Taught Teachers PayLiz Adams
 
How To Avoid Plagiarism In Writing Research - Essay Hel
How To Avoid Plagiarism In Writing Research - Essay HelHow To Avoid Plagiarism In Writing Research - Essay Hel
How To Avoid Plagiarism In Writing Research - Essay HelLiz Adams
 
Writing An Academic Essay. Online assignment writing service.
Writing An Academic Essay. Online assignment writing service.Writing An Academic Essay. Online assignment writing service.
Writing An Academic Essay. Online assignment writing service.Liz Adams
 
Writing An Introduction To A Research Paper
Writing An Introduction To A Research PaperWriting An Introduction To A Research Paper
Writing An Introduction To A Research PaperLiz Adams
 
School Essay Essays For Kids In English. Online assignment writing service.
School Essay Essays For Kids In English. Online assignment writing service.School Essay Essays For Kids In English. Online assignment writing service.
School Essay Essays For Kids In English. Online assignment writing service.Liz Adams
 
Importance Of Exercise In Daily Life Essay. Importan
Importance Of Exercise In Daily Life Essay. ImportanImportance Of Exercise In Daily Life Essay. Importan
Importance Of Exercise In Daily Life Essay. ImportanLiz Adams
 
Vocabulary For Essay Writing Essay Writing Skills, Acade
Vocabulary For Essay Writing Essay Writing Skills, AcadeVocabulary For Essay Writing Essay Writing Skills, Acade
Vocabulary For Essay Writing Essay Writing Skills, AcadeLiz Adams
 
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.NEW LeapFrog LeapReader Deluxe. Online assignment writing service.
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.Liz Adams
 
System Proposal Sample. Online assignment writing service.
System Proposal Sample. Online assignment writing service.System Proposal Sample. Online assignment writing service.
System Proposal Sample. Online assignment writing service.Liz Adams
 

More from Liz Adams (20)

What Is Creative Writing. Essay Topics And Example
What Is Creative Writing. Essay Topics And ExampleWhat Is Creative Writing. Essay Topics And Example
What Is Creative Writing. Essay Topics And Example
 
Free Printable Spider-Shaped Writing Templates. This PD
Free Printable Spider-Shaped Writing Templates. This PDFree Printable Spider-Shaped Writing Templates. This PD
Free Printable Spider-Shaped Writing Templates. This PD
 
Find Out How To Earn 398Day Using Essay Wri
Find Out How To Earn 398Day Using Essay WriFind Out How To Earn 398Day Using Essay Wri
Find Out How To Earn 398Day Using Essay Wri
 
Fish - All-Day Primary. Online assignment writing service.
Fish - All-Day Primary. Online assignment writing service.Fish - All-Day Primary. Online assignment writing service.
Fish - All-Day Primary. Online assignment writing service.
 
009 Essay Outline Template Mla Format Thatsnotus
009 Essay Outline Template Mla Format Thatsnotus009 Essay Outline Template Mla Format Thatsnotus
009 Essay Outline Template Mla Format Thatsnotus
 
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTube
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTubeRain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTube
Rain Text Effect - Photoshop Tutorial - Write On Foggy Window - YouTube
 
New Year Writing Paper By Burst Into First TPT
New Year Writing Paper By Burst Into First TPTNew Year Writing Paper By Burst Into First TPT
New Year Writing Paper By Burst Into First TPT
 
Prudential Center Events New Jersey Live Entertain
Prudential Center Events New Jersey Live EntertainPrudential Center Events New Jersey Live Entertain
Prudential Center Events New Jersey Live Entertain
 
College Essay Literary Criticism Essay Outline
College Essay Literary Criticism Essay OutlineCollege Essay Literary Criticism Essay Outline
College Essay Literary Criticism Essay Outline
 
Paper With Writing On It - College Homework Help A
Paper With Writing On It - College Homework Help APaper With Writing On It - College Homework Help A
Paper With Writing On It - College Homework Help A
 
Free Clipart Pencil And Paper 10 Free Cliparts
Free Clipart Pencil And Paper 10 Free ClipartsFree Clipart Pencil And Paper 10 Free Cliparts
Free Clipart Pencil And Paper 10 Free Cliparts
 
Hamburger Writing By Food For Taught Teachers Pay
Hamburger Writing By Food For Taught Teachers PayHamburger Writing By Food For Taught Teachers Pay
Hamburger Writing By Food For Taught Teachers Pay
 
How To Avoid Plagiarism In Writing Research - Essay Hel
How To Avoid Plagiarism In Writing Research - Essay HelHow To Avoid Plagiarism In Writing Research - Essay Hel
How To Avoid Plagiarism In Writing Research - Essay Hel
 
Writing An Academic Essay. Online assignment writing service.
Writing An Academic Essay. Online assignment writing service.Writing An Academic Essay. Online assignment writing service.
Writing An Academic Essay. Online assignment writing service.
 
Writing An Introduction To A Research Paper
Writing An Introduction To A Research PaperWriting An Introduction To A Research Paper
Writing An Introduction To A Research Paper
 
School Essay Essays For Kids In English. Online assignment writing service.
School Essay Essays For Kids In English. Online assignment writing service.School Essay Essays For Kids In English. Online assignment writing service.
School Essay Essays For Kids In English. Online assignment writing service.
 
Importance Of Exercise In Daily Life Essay. Importan
Importance Of Exercise In Daily Life Essay. ImportanImportance Of Exercise In Daily Life Essay. Importan
Importance Of Exercise In Daily Life Essay. Importan
 
Vocabulary For Essay Writing Essay Writing Skills, Acade
Vocabulary For Essay Writing Essay Writing Skills, AcadeVocabulary For Essay Writing Essay Writing Skills, Acade
Vocabulary For Essay Writing Essay Writing Skills, Acade
 
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.NEW LeapFrog LeapReader Deluxe. Online assignment writing service.
NEW LeapFrog LeapReader Deluxe. Online assignment writing service.
 
System Proposal Sample. Online assignment writing service.
System Proposal Sample. Online assignment writing service.System Proposal Sample. Online assignment writing service.
System Proposal Sample. Online assignment writing service.
 

Recently uploaded

Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfMahmoud M. Sallam
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxEyham Joco
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17Celine George
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceSamikshaHamane
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...jaredbarbolino94
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaVirag Sontakke
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentInMediaRes1
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Celine George
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxthorishapillay1
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersSabitha Banu
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementmkooblal
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxsocialsciencegdgrohi
 

Recently uploaded (20)

Pharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdfPharmacognosy Flower 3. Compositae 2023.pdf
Pharmacognosy Flower 3. Compositae 2023.pdf
 
Types of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptxTypes of Journalistic Writing Grade 8.pptx
Types of Journalistic Writing Grade 8.pptx
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 
Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...Historical philosophical, theoretical, and legal foundations of special and i...
Historical philosophical, theoretical, and legal foundations of special and i...
 
Painted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of IndiaPainted Grey Ware.pptx, PGW Culture of India
Painted Grey Ware.pptx, PGW Culture of India
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Meghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media ComponentMeghan Sutherland In Media Res Media Component
Meghan Sutherland In Media Res Media Component
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)ESSENTIAL of (CS/IT/IS) class 06 (database)
ESSENTIAL of (CS/IT/IS) class 06 (database)
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
DATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginnersDATA STRUCTURE AND ALGORITHM for beginners
DATA STRUCTURE AND ALGORITHM for beginners
 
Hierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of managementHierarchy of management that covers different levels of management
Hierarchy of management that covers different levels of management
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptxHistory Class XII Ch. 3 Kinship, Caste and Class (1).pptx
History Class XII Ch. 3 Kinship, Caste and Class (1).pptx
 

An Overview of Measurement Issues in School Violence and School Safety Research

Progress has been made in understanding school violence and safety measurement issues; however, despite calls to action (e.g., Furlong, Morrison, Cornell, & Skiba, 2004; Sharkey, Furlong, & Yetter, 2004), school violence and safety research continues to be hindered by the limited scrutiny given to the psychometric properties of its measurement tools. This chapter focuses on how researchers have measured school violence and safety variables and the implications of their practices for data validity. After a historical summary, recent school violence research is examined with a focus on these studies' measurement practices. Existing methods to measure school violence are then described, and we conclude with a list of crucial measurement considerations for school violence research.

School Violence and Safety Measurement in Historical Context

School violence and safety instruments predominantly comprise items developed to estimate population trends for the purpose of monitoring public health conditions. Although items on population-based surveys, such as the Youth Risk Behavior Survey (YRBS; Centers for Disease Control and Prevention [CDC], 2004, 2009, 2010), have undergone content scrutiny by researchers, they are not typically subjected to rigorous psychometric analyses. This is, in part, because their original purpose was to obtain population estimates of behaviors, not to assess individual differences or to provide information about the root causes of school violence. Nonetheless, researchers often use these surveillance surveys to examine individual differences and the associations among risk and health behaviors. Because the items were not developed for this specific purpose, however, their sensitivity and validity for complex measurement purposes are not established.
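The distinction between population estimation and individual-difference measurement is easy to see in how surveillance data are analyzed. As a rough, hypothetical sketch (the responses and sampling weights below are invented for illustration, not YRBS data), a weighted prevalence estimate is the kind of quantity these surveys were designed to produce:

```python
# Minimal sketch with invented data: surveillance items are built to yield
# weighted population prevalence estimates, not individual-level scores.
import numpy as np

# 1 = respondent endorsed a hypothetical risk-behavior item, 0 = did not
responses = np.array([0, 1, 0, 0, 1, 0, 0, 0, 1, 0])

# Sampling weights from a stratified cluster design (invented values);
# weights adjust for unequal selection probabilities across strata
weights = np.array([1.2, 0.8, 1.0, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.8])

# Weighted prevalence: the estimand the survey design supports
prevalence = np.average(responses, weights=weights)
print(f"Estimated prevalence: {prevalence:.1%}")
```

The same items give no warrant for scoring or ranking individual students; that use would require separate evidence of reliability and validity at the person level.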
To date, only the Survey of School Crime and Safety was originally and specifically designed to assess school safety and violence conditions; however, it is completed by school principals and does not produce student-, staff-, or parent-level information. Although many of the most prominent sources of information about school safety and violence were not specifically designed for this purpose, modifications of instruments to include school-context items still provide the best information available about school violence and safety and are used for the reports Indicators of School Crime and Safety (e.g., Dinkes, Kemp, & Baum, 2009) and Crime, Violence, Discipline, and Safety in U.S. Public Schools (Neiman & DeVoe, 2009). There is overwhelming momentum to suggest that surveillance instruments will continue to be widely used because they have established baseline information that is being used to inform public policy.

Measurement Practices of Recent School Violence and Safety Research

To provide a perspective on recent school violence research measurement practices, we conducted a search of peer-reviewed journal articles with the descriptor "school violence" for 2009 in the PsycInfo database (a list of these studies and a review summary table is available from the authors). This search yielded 59 articles focused on youth in kindergarten through Grade 12 in the following categories: (a) qualitative or theoretical discourse regarding school shootings (n = 18), (b) relations between variables (n = 16), (c) bullying and victimization (n = 7), (d) school violence prevalence (n = 6), (e) other theory/review (n = 4), (f) prevention and intervention studies (n = 4), and (g) book reviews and journal editorials (n = 4). Approximately half (n = 33) of these articles were empirical studies. Although this review was designed to be inclusive, it yielded only a sampling of articles on the topic of school violence. For example, the search identified only 12 of the 29 articles published in the Journal of School Violence in 2009.
Identifying critical scholarship in the field of school violence would be facilitated by commonly applied keywords and descriptors such as "school violence," "school safety," and, where appropriate, "measurement."

Standards for reporting quantitative methods in the social sciences require a theoretical rationale for the selection of variables, clearly defined variables, and established psychometric characteristics of measures (e.g., Osborne, 2010). To examine the degree to which recent studies used psychometrically sound measures, we examined 32 of the 33 empirical articles (one was inaccessible). First, we counted the different types of measures used in each study. Each study could contribute only one count per measure, but multiple measures could be counted per study. The most commonly used type of measurement was self-report (n = 24; e.g., Rose, Espelage, & Monda-Amaya, 2009), distantly followed by teacher report (n = 4; e.g., Yavuzer, Gundogdu, & Dikici, 2009) and neighborhood crime records (n = 2; e.g., Cornell, Sheras, Gregory, & Fan, 2009). The following measurement strategies were each used in only one study: mapping tools, interviews and focus groups, observations, sociometric questionnaires, card sorts, disciplinary records, trained professional ratings, and participatory action research. Self-report was by far the most popular type of measure, particularly when examining relations between variables. Although most self-report studies focused on youth, one self-report measure asked teachers about their own bullying experiences and how these experiences affected their school's bullying interventions (Kokko & Pörhölä, 2009). Second, we examined the methods section of each article to evaluate the psychometric properties of the measures used.
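Of those psychometric properties, the statistic most commonly reported in the reviewed studies is internal-consistency reliability (Cronbach's alpha). As a minimal sketch of what that computation involves, using invented item-level responses for a hypothetical four-item self-report scale:

```python
# Cronbach's alpha for a k-item scale: compares the sum of item variances
# with the variance of total scores. Data below are invented for illustration.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents-by-items matrix of item scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Five respondents answering a hypothetical 4-item scale (1-5 ratings)
data = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 1],
    [3, 3, 2, 3],
])
print(f"alpha = {cronbach_alpha(data):.2f}")
```

For these invented responses alpha is about .95, well above the conventional .70 benchmark; note that alpha alone says nothing about validity, which is exactly the gap the review below documents.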
Although the majority of the studies used established measures (e.g., the California School Climate & Safety Survey [Furlong, 1996; Furlong et al., 2005] or the YRBS), only two studies (Rose et al., 2009; Saylor & Lech, 2009) provided both reliability and validity information for the school violence scales used. Most (n = 17) reported reliability but no validity information (e.g., Cornell et al., 2009). In a few cases (n = 3; e.g., Estévez, Murgui, & Musitu, 2009), measures were modified for the specific study and the resulting reliability, but not validity, data were reported. Several studies reported no psychometric information (n = 5; e.g., Chen & Astor, 2009). Other scholars developed their own measures as part of the study (n = 8), but fewer than half of these (n = 3; e.g., Sela-Shayovitz, 2009) reported psychometric properties of the resulting scale. One study adapted a measure by translating it into different languages but did not report any psychometric data regarding the final scales (Linares, Días, Fuentes, & Acién, 2009). Overall, it was clear that some scales used in school violence research are well developed, with multiple citations reflecting prior development and use. Yet, in many cases, scales were developed specifically for one study, at times without any reliability or validity analysis presented.

This examination of recent studies in the area of school violence indicated limited consensus about the use of common measures. In 32 studies, 31 different approaches or scales were used to measure school violence. When established measures were used, psychometric properties for the study sample were often underreported (e.g., Turner, Powell, Langhinrichsen-Rohling, & Carson, 2009). Measures were adapted for unique purposes without revalidation and even translated without documentation of new psychometric analyses (e.g., Linares et al., 2009). These research practices demand attention because they suggest that the validity of school violence scholarship is compromised by its measurement practices.

Our analysis also revealed popular and neglected areas of study. Discourse regarding school shootings was highly popular, accounting for more than a quarter of the articles.
The study of bullying and victimization was also popular. Neglected, however, were several areas needing additional scholarship, such as school violence prevention, crisis intervention, gang interdiction, school discipline practices, and the influence of school climate on school safety.

School Violence and Safety Measurement Instruments

Despite the need for improved measurement practices, a body of evidence is available for some commonly used instruments. This section provides an overview of procedures used to assess school violence and safety. The most frequent way of measuring safety- and violence-related concerns is through self-report surveys, including population-based surveillance surveys. Another common method is mandated districtwide reporting of school discipline referrals. Additional methods, such as mapping tools and sociometric questionnaires, are used less frequently and thus are not detailed here.

Comprehensive School Violence and Safe School Instruments

National surveys of students' self-reported behaviors and experiences are the most common form of data collection regarding school violence and provide information about the widespread prevalence of a problem. However, because this information is aggregated over broad contexts, it is useful for estimating state or national trends, but its applicability to local conditions is restricted. At the school level, students and teachers may be more interested in how their school compares with similar schools in the community or in examining changes over time in response to interventions. School- and district-level assessment of school violence indicators is crucial for identifying specific local needs, as well as for providing a baseline against which treatment efficacy can later be evaluated (Benbenishty, Astor, & Zeira, 2003). Moreover, individual development occurs within the context of school and community influences, and large-scale surveys rarely account for contextual variables (other than geographic region and city size). A person's characteristics inherently reflect many environmental influences over a protracted time period, so attempts to partition the effects of environment from individual-level variables are unlikely to yield large effects. Oftentimes, the sophistication of the analytic methods used to examine predictors of aggression and violence (such as hierarchical linear modeling) supersedes the psychometric sophistication of the measures used to gather the data (Sullivan, 2002).

Youth Risk Behavior Survey (YRBS). The YRBS was originally developed in the late 1980s as a youth health-risk surveillance system (e.g., see Kann, 2001; Kann et al., 2000). It is an anonymous self-report survey administered to students within their classrooms, given biennially to a representative sample of United States high school students. Its content focuses on health-risk behaviors that may result in later disability, mortality, morbidity, and/or significant social problems (CDC, 2009), including alcohol and drug use, unintended pregnancies, dietary behaviors, physical activity, and sexual activity. Results are used to monitor health-risk behaviors among high school students, evaluate the impact of efforts to decrease the prevalence of those behaviors, and monitor progress toward national health objectives (e.g., CDC, 2010). However, there are limited reliability and validity data to support the use of the YRBS (beyond its epidemiological purpose) to describe characteristics of youths who engage in high-risk behaviors (see CDC, 2004). For instance, the stability of weapon-carrying items, as measured through test-retest reliability, was low, and reliability analyses did not address the different time frames of items (i.e., 30-day vs. lifetime; Brener et al., 2002). Finally, the validity of the responses of youth who selected the most extreme response options was not supported (see Furlong, Sharkey, Bates, & Smith, 2004).

California Healthy Kids Survey (CHKS).
The CHKS is a series of assessment modules developed by WestEd’s Human Development Program in collaboration with Duerr Evaluation
Resources for the California Department of Education (California Department of Education [CDE], 2010). Items for the CHKS were taken from the YRBS and the California Student Substance Use Survey. The CHKS provides schools with a method of collecting ongoing youth health and risk behavior data and, since 1998, has been successfully administered across the United States and internationally. Three separate versions of the CHKS are available for use at the elementary, middle school, and high school levels (CDE, 2010), and supplemental topics may be added to the core surveys. The surveys contain items concerning youth nutrition and physical activity, sexual behavior, exposure to prevention and intervention activities, risk and protective factors, and alcohol, tobacco, and other drug use, as well as items specific to violence, school safety, gang involvement, and delinquency (CDE, 2010). The surveys include items to assess the truthfulness of each respondent’s answers. The results can be compared to results from the YRBS and various health-risk surveys. Benefits of using the CHKS include meeting categorical program requirements (such as Title IV), identifying program goals and high-risk groups (e.g., CDE, 2010), and identifying risk and protective factors (through its Resiliency and Youth Development Module), which has an extensive report of its psychometric properties (Hanson & Kim, 2007).

Mandated Districtwide Reporting

The federal Gun-Free Schools Act (GFSA, 1994) imposed specific reporting requirements for each state regarding the number of students who engage in a variety of violent behaviors.
Unfortunately, data submitted by districts were not comparable (Kingery & Coggeshall, 2001) because (a) some states reported more detailed information, (b) districts differed in how they categorized violent behavior, (c) inconsistencies in time periods assessed interfered with comparing data, and (d) federally-mandated definitions of the classifications of
weapons and violent behavior were often not specified in student surveys (for instance, the GFSA definitions included bombs and gas as firearms). Kingery and Coggeshall (2001) also cited other problems that interfered with a clear understanding of the data. They noted that (a) students may hide serious incidents from staff, (b) school personnel may fail to detect many violence incidents or only document observed infractions, (c) staff are not often trained in a standardized fashion, and (d) school personnel may feel pressured to underreport violence so as not to reflect negatively on their school or district.

Discipline Referrals

School discipline data, such as office referrals, suspensions, and expulsions, have the potential to provide valuable information about students’ risk for future infractions. Sugai, Sprague, Horner, and Walker (2000) observed that all schools collect discipline data to some extent, so discipline data may be the most effective way for school systems to understand the relationships and behavioral contexts that disrupt schools. Discipline data can show (a) which student behaviors are of greatest concern; (b) whether there are disproportionate referrals by gender, ethnicity, or special education status; and (c) whether there are disproportionate referrals made by certain teachers or during certain periods of instruction (e.g., recess, P.E., reading, and math). This information may be used to identify targets for intervention. However, methods for collecting discipline data need careful consideration, and procedures need to be standardized across classroom, playground, and other school settings. Morrison, Peterson, O’Farrell, and Redding (2004) note that although these data are easily obtained, little is known about their reliability or validity for predicting future aggressive acts with either school-level or individual-level data.
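When schools do examine referral patterns, disproportionality is often summarized with a simple risk ratio: one group’s referral rate divided by a comparison group’s rate. The sketch below illustrates this computation only; the counts are hypothetical and are not drawn from any study cited here.

```python
def referral_risk_ratio(referred_a, enrolled_a, referred_b, enrolled_b):
    """Risk ratio: group A's office-referral rate divided by group B's.

    A ratio near 1.0 suggests proportionate referrals; values well above
    1.0 flag possible disproportionality worth closer investigation.
    """
    rate_a = referred_a / enrolled_a
    rate_b = referred_b / enrolled_b
    return rate_a / rate_b

# Hypothetical counts: 40 of 200 students in group A were referred,
# versus 30 of 300 students in group B.
ratio = referral_risk_ratio(40, 200, 30, 300)
print(round(ratio, 2))  # 0.20 / 0.10 -> 2.0
```

As the surrounding text cautions, such a ratio reflects referral practices as much as student behavior, so it identifies questions to ask, not conclusions to draw.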
Additionally, Morrison and Skiba (2001) caution that school discipline data are not as straightforward as they might appear. Behavior
referrals, suspensions, and other disciplinary actions reflect not only students’ behavior, but also teachers’ tolerance for disruptive behavior, teachers’ skills in classroom management, administrative discipline policies, and other classroom, school, and community factors, yet they often fail to document the contribution of these environmental influences. Thus, predicting disruptive and violent behavior from school discipline data is problematic because it must account for these multiple levels of influence.

Persistent School Violence and Safety Measurement Issues

Although numerous studies examine factors affecting the measurement accuracy of other health-risk behaviors, such as substance use and dietary behavior, studies examining behaviors related to violence are scarce (Brener, Billy, & Grady, 2003). Researchers have identified threats to the validity of school violence survey data and recommended a variety of design practices for enhancing survey accuracy. These strategies include maintaining confidentiality or anonymity of responses (Ong & Weiss, 2006), verifying the survey’s reading level (Stapleton, Cafarelli, Almario, & Ching, 2010), checking the honesty of answers (Rosenblatt & Furlong, 1997), including sufficient numbers of items to measure a given construct (Bradley & Sampson, 2006), and asking questions about past experiences in ways that are most likely to elicit accurate recall (Coggeshall & Kingery, 2001). Despite the need for enhanced methodological rigor, information about the reliability and validity of self-report survey methods remains limited.

Implausible Responding Patterns

Investigators need to address the possibility that students may not respond honestly. This issue is related to, but different from, the occurrence of missing or spoiled survey responses to specific questions.
For example, with the YRBS, one report (Brener, Kann, et al., 2004) indicated that unusable responses were obtained for 0.4% (age) to 15.5% (suicide attempt) of all
2003 YRBS responses. Data screening methods should detect response inconsistencies or implausibly extreme patterns of responding, which may indicate not just that specific item responses were spoiled, but that the validity of the entire set of responses is questionable. Although data validity screening should be an essential element of all large-scale student surveys, few school violence surveys have included such safeguards (Cornell & Loper, 1998; Kingery & Coggeshall, 2001), and many others do not report whether or how they evaluated the quality of student responses (Furlong, Morrison, et al., 2004). To investigate this issue, researchers from the University of Oregon screened dubious and inconsistent responders to a state survey and eliminated 1.9% of cases on those grounds (Oregon Healthy Teens, 2004). This finding supports the general utility of students’ self-reports; however, it suggests that surveys that do not apply such standards may produce results that overestimate very low-incidence risk behaviors, because inconsistent responders report higher rates of risk behaviors (Cornell & Loper, 1998; Rosenblatt & Furlong, 1997). The CHKS uses high-quality data screening and is a carefully developed instrument, having undergone more than six years of rigorous development and review by a standing panel of independent experts. Analysis of CHKS data includes a data-validation procedure that screens out records that meet a combination of criteria, including inconsistent patterns of responding, implausible reports of drug use, endorsement of a fictitious drug, or failure to assent to having answered survey questions honestly. The CHKS procedure is rigorous and typically produces listwise elimination of about 2–3% of cases (e.g., O’Brennan & Furlong, 2010).
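Listwise validity screening of this general kind can be sketched as a set of boolean rules applied to each record, with a record discarded when any rule trips. The field names, thresholds, and rules below are illustrative assumptions in the spirit of the procedure described above, not the actual CHKS criteria.

```python
# Hypothetical screening rules; field names and criteria are illustrative only.
FICTITIOUS_DRUG_ITEM = "used_derbisol"  # endorsing a drug that does not exist

def fails_screening(record):
    """Flag an entire record as invalid (listwise) if any rule trips."""
    if record.get(FICTITIOUS_DRUG_ITEM):           # endorsed the fictitious drug
        return True
    if not record.get("answered_honestly", True):  # did not assent to honesty item
        return True
    # Internal consistency: a lifetime count cannot be below a past-30-day count.
    if record.get("fights_lifetime", 0) < record.get("fights_30day", 0):
        return True
    return False

records = [
    {"fights_lifetime": 2, "fights_30day": 1, "answered_honestly": True},
    {"fights_lifetime": 0, "fights_30day": 5, "answered_honestly": True},  # inconsistent
    {"used_derbisol": True, "fights_lifetime": 1, "fights_30day": 0},      # fake drug
]
kept = [r for r in records if not fails_screening(r)]
print(len(kept))  # 1 of 3 records survives screening
```

Because low-incidence behaviors are concentrated among implausible responders, even a small percentage of records removed this way can change prevalence estimates noticeably, which is the point the surrounding studies make.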
In their examination of responses to the California School Climate and Safety Survey, Rosenblatt and Furlong (1997) compared a group of students who failed reliability and validity checks to a matched control group. They found systematic bias in the way failed
students responded to the survey, including higher ratings of school violence victimization, campus danger, poor grades, and few good friends. By contrast, comparison students were more likely to be detected by the social desirability item. Although surveyors are often concerned with participants responding in a socially desirable manner, one concern is that youths involved with antisocial and aggressive peers will exaggerate their involvement in delinquent activities as an alternative form of social desirability. For instance, Cross and Newman-Gonchar (2004) examined responses on three surveys: the Colorado Youth Survey, the Safe Schools/Healthy Students survey (SS/HS), and a Perception Survey. When examining the pattern of extreme and inconsistent responses, they found not only that rates varied substantially by survey and by school, but also that extreme and inconsistent responses inflated rates of violent behavior much more at one school than at the other. Even small percentages of questionable responses can inflate estimates of risk behavior substantially. For example, by excluding the fewer than 3% of responses to the Colorado Youth Survey that were suspect, Cross and Newman-Gonchar eliminated 70% of reported incidents of gun-carrying. It is important to note that such small differences in rates matter for low-incidence behaviors (such as school gun possession) because even high-quality large-scale surveys such as the YRBS are designed only to achieve a ±5% margin of error (Brener, Kann, et al., 2004). However, school violence researchers rarely consider the effects of implausible responses on their study results.

Survey Administration Procedures

Cross and Newman-Gonchar (2004) examined the impact of survey administrator training on student response patterns.
District prevention specialists coached a group of teachers about the importance of collecting quality data, and they were taught to explain the uses and
importance of the data to their students and to ask them to respond honestly. Rates of invalid responses to surveys administered by trained versus untrained teachers were compared, and results indicated that the trained administrators obtained far lower rates of highly suspect responses (3%) than the untrained administrators (28%). Related to this finding, researchers using large-sample databases rarely present data on adherence to a standard administration protocol by school staff. In addition to administration procedures, responses may be influenced by survey format. Turner et al. (1998) provided one of the few investigations of how traditional paper-and-pencil formats and computer-assisted presentation influence response rates. Based on similar studies of youth substance use, computer presentations were expected to produce higher self-report rates. Turner and colleagues found that the computer format produced substantially higher prevalence rates than did the paper-and-pencil format for weapon-carrying, acts of violence, and threatened violence. The authors explained that the computer format, with the benefit of audio presentation and computerization, was likely to promote more accurate responding to sensitive questions. This explanation is supported by evidence of similarly high rates from studies that rely on retrospective accounts by adults regarding sensitive adolescent behaviors. In a related study, researchers from the CDC compared paper-and-pencil versions to several online versions of the YRBS (Denniston et al., 2010). They found that seeking passive parental consent for both the hard-copy and the online options produced similar usable response rates. One difference was that when the survey was completed online, the youth were more likely to say that they did not think their answers were private, given the visibility of monitors in the computer lab administration setting.
The students were also given one other option. They were given a card with the URL for the survey and asked to complete it on their own time over the
following two weeks, with a reminder after one week. When asked to complete the survey on their own time, only 24% of the students did so. Although not discussed by these researchers, this latter finding is noteworthy because it suggests that adolescents’ intrinsic motivation to complete school violence and safety surveys may be low. In an important study, Hilton, Harris, and Rice (2003) examined the consistency of youth self-reports of violence victimization and perpetration (not school violence) and compared prevalence rates derived from traditional paper-and-pencil reports to those elicited by the same experiences modeled in audio vignettes. They found that the same youths reported 2 to 3 times more violence perpetration and victimization using the self-report format. This finding, considered in relation to other studies (e.g., Hilton, Harris, & Rice, 1998), led these researchers to conclude that, “…although we would not suggest that standard paper-and-pencil surveys yield completely inaccurate data, the accumulated evidence, including these results, calls on researchers in the field of interpersonal violence to redouble efforts to demonstrate and improve the factual accuracy of their primary dependent measures” (Hilton et al., 2003, p. 235).

Item Wording

It is also important to evaluate whether all respondents understand school violence survey questions in the same way. Brener, Grunbaum, et al. (2004) found that differences in item wording across three national surveys resulted in significantly different rates of behavior. One challenge with surveys of school violence and safety is that questions are worded so broadly as to leave a great deal of interpretation to individual responders (Cornell & Loper, 1998).
A survey question that asks students if they carry a weapon without defining the word “weapon” has the potential to result in students reporting their hunting rifles or pocketknives as weapons.
For example, the SS/HS (Safe Schools Healthy Students, 2004) item, “During the past 12 months, how often have you been picked on or bullied by a student on school property?” is ambiguous for several reasons. First, it is unclear whether being picked on and being bullied are to be interpreted as separate experiences, or if they are implied to be equivalent. Beyond this, the item does not define “bullying” in a way that is consistent with best research practices. It could be argued that this item does not measure “bullying” victimization per se, but rather each student’s understanding of this word. To promote consistent and accurate responses, items should be unambiguous and easy to read, and all terms should be clearly defined (Fowler, 1993).

Item Response Time Frames

Many commonly used school violence and safety instruments include items that refer to past behavior using a variety of time frames (e.g., 30 days, 6 months, 1 year). It would seem logical to presume that survey responses in any given month provide a cross-section (in time) of students’ behaviors and experiences and that the reported incidence of these behaviors would be higher over a much longer period of time. For example, if 10% of students report carrying a weapon to school in the past 30 days, one might expect this percentage to be higher if the same sample of students were asked to report about past-year weapon possession. However, no published research has confirmed this. According to Schwartz (1999), asking about past-month incidents is likely to convey to respondents that researchers want to know about less serious but common events.
In contrast, students might interpret asking about fights in the past year as seeking information about less frequent but more serious fights (Schwartz, 1999).

To explore this issue, Hilton and colleagues (1998) examined differences in self-reports
across 1-month, 6-month, and 1-year time periods. They reported that “…standard self-reports of interpersonal violence were insensitive to the specified time frame; for example, participants reported almost the same number of violent acts in the past month as in the past year, something that could not be factually true” (p. 234). Similarly, based on their review of what is known about the influence of response time referents, Brener et al. (2003) concluded that multiple factors affect student recollection of school violence experiences. Survey designers should be mindful that respondents often find it difficult to accurately remember past events (Cornell & Loper, 1998; Fowler, 1993). For this reason, survey questions that ask about past behavior or experiences can incorporate any of a variety of techniques for enhancing recall. Converse and Presser (1986) recommend: (a) using simple language; (b) asking about experiences within a narrow reference period (within no more than a six-month period); (c) using memory landmarks (e.g., asking about behavior “since the beginning of the school year” or “since New Year’s”); and (d) stimulating recall by describing concrete events (“Instead of asking respondents if they have experienced ‘assaults,’ for instance, ... ask if anyone…used force by grabbing, punching, choking, scratching, or biting,” p. 22). Unfortunately, school violence and safety research has not yet thoroughly attended to these measurement concerns.

Item Response Options

Not only do surveys differ in response time frame, but they also vary in the number and type of response options offered. Often, items that appear on one survey are included on other questionnaires with a different number of response options, without a stated rationale for the change.
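When analysts must compare items administered with different response scales, one defensible strategy is to collapse both scales to the coarsest common category, typically “any occurrence versus none,” before computing prevalence. The sketch below uses hypothetical option labels and responses to illustrate the idea.

```python
# Collapse two differently scaled items to a common binary "any occurrence"
# indicator so that prevalence is comparable at all. Labels are hypothetical.
SCALE_A_NONE = {"0 days"}   # a days-based response scale
SCALE_B_NONE = {"Never"}    # a frequency-based response scale

def any_occurrence(response, none_options):
    """True if the response indicates at least one occurrence."""
    return response not in none_options

survey_a = ["0 days", "1 day", "0 days", "2 or 3 days"]
survey_b = ["Never", "Never", "1-2 times", "Never"]

prev_a = sum(any_occurrence(r, SCALE_A_NONE) for r in survey_a) / len(survey_a)
prev_b = sum(any_occurrence(r, SCALE_B_NONE) for r in survey_b) / len(survey_b)
print(prev_a, prev_b)  # 0.5 0.25
```

Collapsing discards all frequency information, so it is a fallback for cross-survey comparison, not a substitute for administering identically scaled items.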
For example, the YRBS asks, “During the past 30 days, on how many days did you carry a weapon such as a gun, knife, or club on school property?” The
following five response options are offered: 0 days, 1 day, 2 or 3 days, 4 or 5 days, and 6 or more days. However, when the item was adapted by Farrell and Meyer (1997) to evaluate the effectiveness of a violence prevention program, it included six response options: Never, 1–2 times, 3–5 times, 6–9 times, 10–19 times, and 20 times or more. Although it might appear that similar items with different numbers of response options yield equivalent results, this is not the case (Schwartz, 1999). In addition, Schwartz noted that survey respondents are most likely to choose response options near the middle of response scales.

School Violence and Safety Instrument Psychometric Properties

The survey method typically used to assess school safety and violence seeks to identify indicators that increase the likelihood of negative outcomes. This practice differs substantially from that most often used in school psychology, which establishes in advance the psychometric properties of a scale for identification or diagnosis purposes (Cornell & Loper, 1998). Unfortunately, some reports have used survey instruments for identification purposes without adequately examining the psychometric properties of the scales. There is limited research examining the reliability and validity of school violence and safety instruments.

Stability of Responses

As with any self-report measure, measures of violence and safety should undergo rigorous psychometric testing to establish reliability and validity (Rosenblatt & Furlong, 1997). Given that school violence and safety instruments typically do not measure latent traits, the reliability of greatest interest is the consistency of responses over a brief interval. Unfortunately, testing of response stability is infrequent. Brener et al. (2002) completed the only study to date that has examined the reliability of the YRBS items that inquired about school-associated behaviors.
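The statistic at the center of the Brener et al. (2002) work, Cohen’s kappa, corrects observed test-retest agreement for the agreement expected by chance alone. A minimal sketch for a binary (yes/no) item administered twice; the counts are hypothetical, not the Brener et al. data.

```python
def cohens_kappa_binary(n11, n10, n01, n00):
    """Cohen's kappa for a binary item administered at two time points.

    n11: "yes" at both administrations; n00: "no" at both;
    n10 and n01: discordant responses. Kappa = (po - pe) / (1 - pe),
    where po is observed agreement and pe is chance agreement.
    """
    n = n11 + n10 + n01 + n00
    p_observed = (n11 + n00) / n
    # Chance agreement from the marginal "yes" proportions at each time.
    p_yes_t1 = (n11 + n10) / n
    p_yes_t2 = (n11 + n01) / n
    p_chance = p_yes_t1 * p_yes_t2 + (1 - p_yes_t1) * (1 - p_yes_t2)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical test-retest table for one yes/no item (n = 100).
print(round(cohens_kappa_binary(20, 10, 10, 60), 2))  # -> 0.52
```

Note that 80% raw agreement here yields a kappa of only about .52, which is why chance-corrected coefficients are the appropriate benchmark for categorical survey items.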
They examined the responses of 4,619 of 6,802 eligible students, derived from a convenience
sample, who completed the YRBS twice over a two-week period. They subsequently converted the responses to all items into binary format to compute a kappa statistic, a measure of response consistency that is corrected for chance agreement. Kappa coefficients ranged from .41 to .68 (in the moderate to substantial range; Landis & Koch, 1977) for the four YRBS items that directly assess school violence content. The incidence of one behavior (“Injured in a physical fight one time in the past 12 months”) was significantly higher at Time 2 than at Time 1. Though Brener et al. concluded that, in general, the YRBS is a reliable instrument, and this study is often cited to justify use of the YRBS, there are several methodological concerns with this study, particularly when the school violence items are examined. These concerns pertain to the appropriateness of excluding inconsistent responses prior to analysis; converting responses to binary format and using the kappa statistic; using a 14-day (average) retest period for items with 30-day or 12-month reference time frames; and interpreting kappa coefficients for items with different reference time frames (e.g., past 30 days and past 12 months). As a result, this analysis likely generated the most favorable test-retest reliability estimates possible. Even so, the reliability coefficients for the school items were only in the moderate range.

Validity of Measures

Skiba et al. (2004) noted that instruments can contribute significantly to the field of school violence and safety only when they include the complex and representative set of factors involved. They recommended using empirical methods such as factor analysis to justify the inclusion of items in surveys, rather than relying only on professional judgment, as was done in the past. Regarding the content of school safety and violence measures, Skiba et al.
(2004) pointed out that many surveys focus on significant acts of violence, such as weapon carrying. However, although extreme violence is the end result within a context of unsafe school climate, it may not
provide enough variance to yield meaningful comparisons, particularly for typical schools. For this reason, Skiba et al. (2004) stated that researchers need to focus less on serious violent acts, especially because smaller discipline problems are more frequent and are part of a common trajectory toward future antisocial behavior. A few researchers have developed school violence and safety instruments using psychometric approaches to scale development (Furlong, Sharkey, et al., 2004; Skiba et al., 2004). Although these instruments are not appropriate for surveillance-style surveys, they have the potential to provide options when used as outcome measures in local evaluations and in studies using controlled experimental designs. Recently, Greif and Furlong (2004a, 2004b) extended the psychometric approach by using item response analysis to create a unidimensional school violence victimization scale. Such a scale could be used to track victimization experiences across time. Table 1 provides a review of the school violence and safety measures along with their limitations and recommendations for improvement.

Conclusion

Efforts to refine school violence and safety measurement practices are needed. Although more sophisticated techniques are being employed, such as web administration with programmed skip patterns, it is essential that researchers continue to examine the fundamental aspects of measurement procedures. Significant limitations currently exist, perhaps most notably the continued and sustained use of items from instruments with little, or no known, validity evidence. Researchers and consumers of research should be aware of the dangers of drawing conclusions based on measures with limited or poor psychometric properties. Similarly, as new measures are developed, it is critical that researchers not rely on a “bootstrapping” approach, in which they continually try to validate new measures against known inferior
measures until enough evidence is accumulated to demonstrate that the newly developed measure is superior. In this chapter, we provided a review of the current state of research and currently available instruments and highlighted many issues affecting measurement accuracy. The key recommendations of the chapter are summarized in Table 2. While there is still much to be done, it is hoped that the concepts and guidelines discussed here will lead to improved research in school violence and safety measurement.
References

Benbenishty, R., Astor, R. A., & Zeira, A. (2003). Monitoring school violence: Linking national-, district-, and school-level data over time. Journal of School Violence, 2, 29–50. doi:10.1300/J202v02n02_03

Bradley, K. D., & Sampson, S. O. (2006). Constructing a quality assessment through Rasch techniques: The process of measurement, feedback, reflection and change. In X. Liu & W. Boone (Eds.), Applications of Rasch measurement in science education (pp. 23–44). Maple Grove, MN: JAM Press.

Brener, N. D., Billy, J. O. G., & Grady, W. R. (2003). Assessment of factors affecting the validity of self-reported health-risk behavior among adolescents: Evidence from the scientific literature. Journal of Adolescent Health, 33, 436–457. doi:10.1016/S1054-139X(03)00052-1

Brener, N. D., Grunbaum, J. A., Kann, L., McManus, T., & Ross, J. (2004). Assessing health risk behaviors among adolescents: The effect of question wording and appeals for honesty. Journal of Adolescent Health, 35, 91–100. doi:10.1016/j.jadohealth.2003.08.013

Brener, N. D., Kann, L., Kinchen, S. A., Grunbaum, J., Whalen, L., … Ross, J. G. (2004). Methodology of the Youth Risk Behavior Surveillance System. MMWR Recommendations and Reports, 53(RR-12), 1–13.

Brener, N. D., Kann, L., McManus, T., Kinchen, S. A., Sundberg, E. C., & Ross, J. G. (2002). Reliability of the 1999 Youth Risk Behavior Survey Questionnaire. Journal of Adolescent Health, 31, 336–342. doi:10.1016/S1054-139X(02)00339-7

California Department of Education (CDE). (2010). California Healthy Kids Survey web site. Retrieved September 5, 2010, from http://chks.wested.org/
Centers for Disease Control and Prevention (CDC). (2004). Methodology of the Youth Risk Behavior Surveillance System. MMWR, 53(RR-12), 1–11.

Centers for Disease Control and Prevention (CDC). (2009). 2009 Youth Risk Behavior Survey. Retrieved from www.cdc.gov/yrbss

Centers for Disease Control and Prevention (CDC). (2010). Youth Risk Behavior Surveillance Survey — United States, 2009. Surveillance Summaries, MMWR, 59, No. SS-5. Available from http://www.cdc.gov/HealthyYouth/yrbs/index.htm

Chen, J. K., & Astor, R. A. (2009). The perpetration of school violence in Taiwan: An analysis of gender, grade level and school type. School Psychology International, 30, 568–584. doi:10.1177/0143034309107076

Coggeshall, M. B., & Kingery, P. M. (2001). Cross-survey analysis of school violence and disorder. Psychology in the Schools, 38, 107–116. doi:10.1002/pits.1003

Converse, J. M., & Presser, S. (1986). Survey questions: Handcrafting the standardized questionnaire. Newbury Park, CA: Sage.

Cornell, D., Sheras, P., Gregory, A., & Fan, X. (2009). A retrospective study of school safety conditions in high schools using the Virginia threat assessment guidelines versus alternative approaches. School Psychology Quarterly, 24, 119–129. doi:10.1037/a0016182

Cornell, D. G., & Loper, A. B. (1998). Assessment of violence and other high-risk behaviors with a school survey. The School Psychology Review, 27, 317–330.

Cross, J. E., & Newman-Gonchar, R. (2004). Data quality in student risk behavior surveys and administrator training. Journal of School Violence, 3, 89–108. doi:10.1300/J202v03n02_06
Denniston, M. D., Brener, N. D., Kann, L., Eaton, D. K., McManus, T., … Ross, J. G. (2010). Comparison of paper-and-pencil versus Web administration of the Youth Risk Behavior Survey (YRBS): Participation, data quality, and perceived privacy and anonymity. Computers in Human Behavior, 26, 1054–1060. doi:10.1016/j.chb.2010.03.006

Dinkes, R., Kemp, J., & Baum, K. (2009). Indicators of school crime and safety: 2009 (NCES 2010-012/NCJ 228478). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, and Bureau of Justice Statistics, Office of Justice Programs, U.S. Department of Justice.

Estévez, E., Murgui, S., & Musitu, G. (2009). Psychological adjustment in bullies and victims of school violence. European Journal of Psychology of Education, 24, 473–483. doi:10.1007/BF03178762

Farrell, A. D., & Meyer, A. L. (1997). The effectiveness of a school-based curriculum for reducing violence among urban sixth-grade students. American Journal of Public Health, 87, 979–984. doi:10.2105/AJPH.87.6.979

Fowler, F. J., Jr. (1993). Survey research methods (2nd ed.). Newbury Park, CA: Sage.

Furlong, M. J. (1996). Tools for assessing school violence. In S. Miller, J. Brodine, & T. Miller (Eds.), Safe by design: Planning for peaceful school communities (pp. 71–84). Seattle, WA: Committee for Children.

Furlong, M. J., Greif, J. L., Bates, M. P., Whipple, A. D., Jimenez, T. C., & Morrison, R. (2005). Development of the California School Climate and Safety Survey–Short Form. Psychology in the Schools, 42, 137–149. doi:10.1002/pits.20053

Furlong, M. J., Morrison, G. M., Cornell, D., & Skiba, R. (2004). Methodological and measurement issues in school violence research: Moving beyond the social problem era.
Journal of School Violence, 3(2/3), 5–12. doi:10.1300/J202v03n02_02

Furlong, M. J., Sharkey, J. D., Bates, M. P., & Smith, D. C. (2004). An examination of the reliability, data screening procedures, and extreme response patterns for the Youth Risk Behavior Surveillance Survey. Journal of School Violence, 3(2/3), 109–130. doi:10.1300/J202v03n02_07

Greif, J. L., & Furlong, M. J. (2004a). Towards precision in measuring school violence victimization using IRT. Presentation at the 20th Annual Meeting of the International Society for Traumatic Stress Studies, New Orleans, LA.

Greif, J. L., & Furlong, M. J. (2004b). Using item response analysis to develop a unidimensional school violence victimization scale. In Persistently safe schools: The National Conference of the Hamilton Fish Institute on School and Community Violence. Washington, DC: Hamilton Fish Institute, George Washington University. Available from http://gwired.gwu.edu/hamfish/AnnualConference/2004/

Gun-Free Schools Act (GFSA), 20 USC §8921. (1994).

Hanson, T. L., & Kim, J. O. (2007). Measuring resilience and youth development: The psychometric properties of the Healthy Kids Survey (Issues & Answers Report, REL 2007–No. 034). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory West. Retrieved from http://ies.ed.gov/ncee/edlabs

Hilton, N. Z., Harris, G. T., & Rice, M. E. (1998). On the validity of self-reported rates of interpersonal violence. Journal of Interpersonal Violence, 13, 58–72. doi:10.1177/088626098013001004

Hilton, N. Z., Harris, G. T., & Rice, M. E. (2003). Correspondence between self-reports of
  • 26. Running head: SCHOOL VIOLENCE MEASUREMENT ISSUES 26 interpersonal violence. Journal of Interpersonal Violence, 18, 223–239. doi:10.1177/0886260502250065 Kann L. (2001). The Youth Risk Behavior Surveillance System: Measuring health-risk behaviors. American Journal of Health Behavior, 25, 272–277. Kann, L., Kinchen, S. A., Williams, B. I., Ross, J. G., Lowry, R., Grunbaum, J. A., & Kolbe, L. J. (2000). Youth Risk Behavior Surveillance – United States, 1999. Journal of School Health, 70, 271–285. doi:10.1111/j.1746-1561.2000.tb07252.x Kingery, P. M., & Coggeshall, M. B. (2001). Surveillance of school violence, injury, and disciplinary actions. Psychology in the Schools, 38, 117–126. doi:10.1002/pits.1004 Kokko, T. H. J., & Pörhölä, M. (2009). Tackling bullying: Victimized by peers as a pupil, an effective intervener as a teacher? Teaching and Teacher Education, 25, 1000–1008. doi:10.1016/j.tate.2009.04.005 Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33, 159–174. doi:10.2307/2529310 Linares, J. J. G., Días, A. J. C., Fuentes, M. D. C. P., & Acién, F. L. (2009). Teachers’ perception of school violence in a sample from three European countries. European Journal of Psychology of Education, 24, 49–59. doi:10.1007/BF03173474 Morrison, G. M., Peterson, R., O’Farrell, S., & Redding, M. (2004). Using office referral records in school violence research: Possibilities and limitations. Journal of School Violence, 3(2/3), 139–149. doi:10.1300/J202v03n02_04 Morrison, G. M., & Skiba, R. (2001). Predicting violence from school misbehavior: promises and perils. Psychology in the Schools, 38, 173–184. doi:10.1002/pits.1008 Neiman, S., & DeVoe, J. F. (2009). Crime, violence, discipline, and safety in U.S. public
  • 27. Running head: SCHOOL VIOLENCE MEASUREMENT ISSUES 27 schools: Findings from the School Survey on Crime and Safety: 2007–08 (NCES 2009- 326). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. O’Brennan, L. M., & Furlong, M. J. (2010). Relations between students’ perceptions of school connectedness and peer victimization. Journal of School Violence, 9, 375–391. doi:10.1080/15388220.2010.509009 Ong, A. D., & Weiss, D. J. (2006). The impact of anonymity on responses to sensitive questions. Journal of Applied Social Psychology, 30, 1691–1708. doi:10.1111/j.1559- 1816.2000.tb02462.x Oregon Healthy Teens. (2004). OHT 2004 survey methodology. Retrieved September 5, 2004, from http://www.dhs.state.or.us/dhs/ph/chs/youthsurvey/ohteens/2004/methods.shtml Osborne, J. W. (2010). Correlation and other measures of association. In G. R. Hancock & R. O. Mueller (Eds.), The reviewer’s guide to quantitative methods in the social sciences (pp. 55-69) New York, NY: Routledge. Rose, C. A. Espelage, D. L., & Monda-Amaya, L. E. (2009). Bullying and victimization rates among students in general and special education: A comparative analysis. Educational Psychology, 29, 761–776. doi:10.1080/01443410903254864 Rosenblatt, J. A., & Furlong, M. J. (1997). Assessing the reliability and validity of student self- reports of campus violence. Journal of Youth and Adolescence, 26, 187–202. doi:10.1023/A:1024552531672 Safe Schools Healthy Students. (2004). The faces of the Safe Schools/Healthy students. Retrieved September 4, 2004, from http://www.sshs.samhsa.gov/initiative/faces.aspx Saylor, C. F., & Leach, J. B. (2009). Perceived bullying and social support in students accessing
  • 28. Running head: SCHOOL VIOLENCE MEASUREMENT ISSUES 28 special inclusion programming. Journal Development Physical Disabilities, 21, 69–80. doi:10.1007/s10882-008-9126-4 Schwartz, N. (1999). Self reports: How the questions shape the answers. American Psychologist, 54, 93–105. doi:10.1037/0003-066X.54.2.93 Sela-Shayovitz, R. (2009). Dealing with school violence: The effect of school violence prevention training on teachers’ perceived self-efficacy in dealing with violent events. Teaching and Teacher Education, 25, 1061–1066. doi:10.1016/j.tate.2009.04.010 Sharkey, J. D., Furlong, M. J., & Yetter, G. (2004). An overview of measurement issues in school violence and school safety issues. In S. R. Jimerson & M. J. Furlong (Eds.), Handbook of school violence and school safety: From research to practice (pp. 121– 134). Mahwah, NJ: Erlbaum. Skiba, R., Simmons, A. B., Peterson, R., McKelvey, J., Ford, S., & Gallini, S. (2004). Beyond guns, drugs and gangs: The structure of student perceptions of school safety. In M. J. Furlong, M. P. Bates, D. C. Smith, & P. Kingery (Eds.), Appraisal and prediction of school violence: Issues, methods and contexts (pp. 149–171). Hauppauge, NY: Nova Science. Stapleton, L. M., Cafarelli, M., Almario, M. N., & Ching, T. (2010). Prevalence and characteristics of student attitude surveys used in public elementary schools in the United States. Practical Assessment, Research & Evaluation. 15(9). Available from http://pareonline.net/getvn.asp?v=15&n=9 Sugai, G., Sprague, J. R., Horner, R. H., & Walker, H. M. (2000). Preventing school violence: The use of office discipline referrals to assess and monitor school-wide discipline interventions. Journal of Emotional and Behavioral Disorders, 8, 94–101.
  • 29. Running head: SCHOOL VIOLENCE MEASUREMENT ISSUES 29 doi:10.1177/106342660000800205 Sullivan, M. L. (2002). Exploring layers: Extended case method as a tool for multilevel analysis of school violence. Sociological Methods & Research, 31, 255–285. doi:10.1177/0049124102031002005 Turner, C. F., Ku, L., Rogers, S. M., Lindberg, L. D., Pleck, J. H., & Sonenstein, F. L. (1998). Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science, 280, 867–873. doi:10.1126/science.280.5365.867 Turner, L. A., Powell, A. E., Langhinrichsen-Rohling, J., & Carson, J. (2009). Helping families initiative: Intervening with high risk students through a community, school, and district attorney partnership. Child and Adolescent Social Work Journal, 26, 209–223. doi:10.1007/s10560-009-0167-z Yavuzer, Y., Gundogdu, R., & Dikici, A. (2009). Teachers’ perceptions about school violence in one Turkish city. Journal of School Violence, 8, 29–41. doi:10.1080/15388220802067797
Table 1
School Violence and Safety Measures: Intended Use, Limitations, and Recommendations

General Surveillance
Intended use: Obtain population estimates broadly about youth risk.
Limitations: Not subjected to rigorous psychometric analysis; limited information about school violence and safety; does not inform individual development.
Recommendations: Conduct psychometric analysis for the new purpose prior to extending their use.

School Violence Items in Surveillance Surveys
Intended use: Obtain population estimates about school violence and safety.
Limitations: The most rigorous survey is administered only to school principals and does not produce student-, staff-, or parent-level information; does not inform individual development.
Recommendations: Include items designed specifically to assess school violence and safety in more broadly administered assessments. Make sure reliability and validity have been tested for the intended purposes.

School Violence and Safety Measurement Instruments
Intended use: Measure latent constructs of youth violence.
Limitations: Threats to validity such as reading level, honesty of answers, response options, recall, and extreme responses.
Recommendations: Advance knowledge of how to overcome threats to validity by studying these issues. Create and validate more advanced measurement tools with the knowledge gained.

Mandated Districtwide Reporting
Intended use: Track student violence across schools.
Limitations: Data submitted by districts are not comparable.
Recommendations: Identify an alternative method or implement extensive training, standardization, and incentives for accurate and comparable reporting.

Discipline Referrals
Intended use: Examine school safety; evaluate whether there are disproportionate referrals; predict students' risks for future infractions.
Limitations: Lack of standard procedures within and across schools; lack of psychometric scrutiny.
Recommendations: Limit cross-analysis of data from different sources unless discipline procedures, reporting, and data collection are standardized within and across settings. Evaluate data for reliability and validity.
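Table 1's closing recommendation, to evaluate discipline referral data for reliability, can be made concrete with an inter-rater agreement check: have two staff members independently code the same set of referrals and compute Cohen's kappa, the observer-agreement statistic discussed by Landis and Koch (1977). The sketch below is illustrative only; the referral codes and ratings are hypothetical, and a real analysis would use a district's own coding scheme.

```python
# Minimal sketch: Cohen's kappa for two staff members independently
# coding the same discipline referrals (all codes and data hypothetical).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal code frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Ten hypothetical referrals ("D" = defiance, "F" = fighting, "O" = other)
a = ["D", "D", "F", "O", "D", "F", "F", "O", "D", "D"]
b = ["D", "D", "F", "O", "F", "F", "F", "O", "D", "O"]
print(round(cohens_kappa(a, b), 2))  # → 0.7
```

By Landis and Koch's benchmarks, a kappa in the 0.61–0.80 range indicates substantial agreement; lower values would signal that the coding procedure needs standardization before cross-school comparisons are attempted.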
Table 2
Implications for Practice: Recommendations for the Use of School Violence Self-report Procedures

Constructing Surveys
Clearly define key terms
Keep questions concrete
Use memory aids when asking about past events
Include multiple questions to measure each main construct
Check survey's reading level for the target population
Include items that screen for honesty and accuracy

Selecting Existing Surveys
Check reliability and validity for the population to be tested

Administering Surveys
Ensure confidentiality / anonymity
Train staff to adhere to a standard protocol
Explain purpose of survey to students and how results are used
Be prepared to read questions aloud to students and to clarify
Debrief students afterward: What were they thinking?

Reporting Survey Results
Present information separately to students, staff, parents
Solicit perceptions of survey data
Brainstorm alternative courses of action
Maintain a collaborative, team-oriented stance

Next Steps
Link assessment to planning and intervention
Choose activities and programs with empirical support
Reassess students periodically to measure program effectiveness
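Two of Table 2's recommendations, including multiple questions per construct and checking reliability for the population actually tested, come together in a routine internal-consistency check such as Cronbach's alpha. The following sketch uses an entirely hypothetical four-item victimization scale; it is offered as an illustration of the computation, not as a validated instrument.

```python
# Minimal sketch: Cronbach's alpha for a hypothetical four-item
# victimization scale (rows = students, columns = items).
def cronbachs_alpha(scores):
    """scores: list of per-student lists, one response per item."""
    k = len(scores[0])          # number of items
    def variance(values):       # sample variance (n - 1 denominator)
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / (len(values) - 1)
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical responses from six students on a 0-3 frequency scale
data = [
    [0, 1, 0, 1],
    [2, 2, 1, 2],
    [3, 3, 2, 3],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
    [2, 3, 2, 2],
]
print(round(cronbachs_alpha(data), 2))  # → 0.95
```

An alpha this high would suggest the items hang together as a single construct for this (hypothetical) sample; the same scale administered to a different population should be rechecked rather than assumed reliable, which is the point of the "Selecting Existing Surveys" entry above.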