Tips for Completing this Word Document Electronically

1) This document was created using Microsoft Word X on a Macintosh. When used on your computer, this version of the document might not have retained all of the formatting used to create the original document. (It depends upon your computer and word-processing software.) Therefore, please print a copy of the pdf version of this document and use it as a guide for how this Word document was intended to look.

2) This document has been protected in an effort to make it easier for you to use. As a result, there are only specific areas where the user can enter text. The parts of the document where you can enter text are the cells that are unshaded and/or contain an asterisk (*). By using your “tab” key you can advance from one unshaded cell to the next (or from one cell with an asterisk to the next cell with an asterisk).

3) If you find it necessary to unprotect the document so that you can better maneuver within it or more easily apply formatting, go to the “Tools” pull-down menu at the top of the screen. In this menu you will see the option to “Unprotect Document.” Once you select “Unprotect Document,” you will be able to make changes to any area of the document.

4) Some sections of this document contain “check box” fields. These fields will only work if the document is “Protected.” When active, you should be able to click on a box and an “X” should appear inside it. If you are unable to activate a check box, replace the box with an “X” to mark the appropriate choice.

5) Please do not make any changes to the text written in any part of this document, unless the changes are necessary to make this Word version similar to the pdf version.

6) The font used to create this document was “Times New Roman” (10 point and 12 point), and the left and right margins were set to 1 inch. Thus, to retain as much of the original formatting as possible, you may want to make sure that the same font and margins are applied to this document on your computer.

© The University of Iowa, January 2003. This document may be copied and distributed to Iowa schools for use within the state of Iowa. Other uses are prohibited.
District Name: *          Contact Person’s Name: *
Review Date: *            Phone Number: *

Technical Adequacy Evidence: Locally-Developed Assessment

General Instructions: This template is used to summarize evidence to support the use of a locally-developed assessment tool as part of a “district-wide assessment system.” (The evidence requested in this template is similar to, but not exactly the same as, evidence requested in the template for a commercially-developed assessment.) Provide the following information for the specific assessment tool being reviewed. Then, complete each of the three required parts of this template.

Title: *                  Grade Level(s): *
Content Area(s): *        Development Completion Date: *

In which ways are scores from this assessment tool used in your district’s Annual Progress Report (APR)?
  *  Report student achievement using a multiple measure
  *  Monitor annual achievement goals

Whenever the evidence reported in this template is in the form of a reference (or citation), complete citation information needs to be provided so that those who are evaluating the quality/sufficiency of this evidence can locate the referenced material. In the space below, list the references cited within this summary of technical adequacy evidence for this assessment tool.
*
Summary of Technical Adequacy Evidence

Part 1: Technical Quality (What is the technical quality of the assessment tool?)

1.1 Provide a general description of the knowledge and skills, including their relative emphasis, that the assessment tool is designed to measure.
*

1.2 Describe (or reference) the steps used to develop the assessment tool and support materials.
*

1.3 Describe (or reference) the procedures used to ensure that the assessment tool is fair and free of bias.
*

1.4 Provide an estimate of the reliability of student scores (e.g., internal consistency such as Coefficient α, KR-20, or KR-21) or decision consistency. (An illustrative computation sketch follows at the end of this part.)
*

1.5 If multiple test forms were used, describe the evidence obtained to ensure their equivalence.
*

1.6 If achievement levels (or performance standards) were used, provide the verbal descriptors of student performance for each level and describe the procedures used to create these verbal descriptors and to establish the corresponding cut points.
*
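Note on item 1.4 (illustrative, not part of the official template): one way a district might compute a KR-20 internal-consistency estimate from item data scored 0/1 is sketched below in Python. The function name, the use of the population variance of total scores, and the sample responses are all assumptions made for illustration; any standard statistics package will give equivalent results.

    # Minimal sketch: KR-20 internal consistency for items scored 0 or 1.
    # KR-20 = (k / (k - 1)) * (1 - sum(p_i * (1 - p_i)) / variance_of_total_scores)
    from statistics import pvariance

    def kr20(responses):
        """responses: one list per examinee, each item scored 0 or 1."""
        n = len(responses)                      # number of examinees
        k = len(responses[0])                   # number of items
        totals = [sum(r) for r in responses]    # total score per examinee
        var_total = pvariance(totals)           # population variance of totals
        pq_sum = 0.0
        for i in range(k):
            p = sum(r[i] for r in responses) / n   # proportion answering item i correctly
            pq_sum += p * (1 - p)
        return (k / (k - 1)) * (1 - pq_sum / var_total)

    # Made-up example: 5 examinees, 5 items
    scores = [
        [1, 1, 1, 1, 1],
        [1, 1, 0, 1, 0],
        [0, 0, 1, 0, 0],
        [1, 1, 1, 1, 1],
        [0, 0, 0, 1, 0],
    ]
    print(round(kr20(scores), 2))  # about 0.81 for this sample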
Part 2. Implementation (How was the assessment administered and scored?)

2.1 What administration procedures were followed for the assessment, and what steps were taken to ensure that these procedures were uniformly followed across schools (classes) and in consecutive years?

a. Describe (or reference) the administration procedures.
*

b. If the administration procedures were modified for any class or building within the district or for different school years, describe how and why the administration procedures differed.
*

c. From the list below, mark the steps taken to ensure/verify that the standard conditions were followed across settings and occasions:
  *  Provided training or instruction to all test administrators on the use of standard conditions.
  *  Observed test administrations in selected classrooms in the building.
  *  Documented any irregularities or incidences that might distort the quality of the data.
  *  If optional conditions were possible (e.g., calculator availability), the conditions were applied to all students.
  *  Other: *

d. From the list below, mark the steps taken to ensure/verify that student scores were not compromised (at any setting or on any occasion).
  *  Prior to the test administration, students did not have access to assessment items or tasks.
  *  During the test administration, students did not receive assistance that could lead to inflated scores (e.g., cheating, cueing, extra help from teacher, or inappropriate accommodation or modification).
  *  After the test administration, student responses/answers were not altered.
  *  Other: *

2.2 Were any accommodations or modifications made when this assessment was administered? If so, describe the accommodations or modifications used.
*

2.3 When objective scoring is incorporated manually, what is done to ensure the accuracy of the scores?

a. Describe the qualifications of the scorers.
*

b. From the list below, mark the steps taken to ensure the accuracy of the scores.
  *  Checked the consistency of the scoring (e.g., scored responses twice) for a sample.
  *  Checked the accuracy of the score conversions, as appropriate (e.g., raw score to scaled score, or percent correct, conversions) for a sample.
  *  Verified the accuracy of the keyed entries for database updates/creation.
  *  Other: *
2.4 When constructed-response tasks require raters to assign scores to student responses, what were the scoring procedures and what was done to ensure the quality of the scores across raters and occasions?

a. Describe the panel of raters participating in the scoring by providing the following: 1) the number of raters who evaluated student responses to tasks from this assessment, and 2) the qualifications of the raters in terms of the sufficiency of their content expertise and the experiences they have had working with students at the relevant grade level.
*

b. Describe (or reference) the training opportunities provided to raters by providing descriptions of the following: 1) the training procedures and/or materials used to assist raters in developing a common understanding of the scoring criteria, and 2) the procedures used to monitor the scoring process to determine if/when retraining was necessary.
*

c. Describe (or reference) the process followed to score student responses by providing a description of the following: 1) the procedure used to conceal student identities from the raters (if the identities were not concealed, describe what was done to reduce the likelihood that this knowledge influenced the ratings); and 2) the procedure used to determine the final score assigned to a student’s response (if a given response received more than one rating, describe how differences were adjusted, if at all, when the ratings assigned to this response were not the same).
*

d. Describe (or reference) the steps taken to ensure the comparability of the ratings across raters and over time by providing a description of the following: 1) the aspects of the scoring materials and/or procedures that were used to increase comparability of ratings across raters (e.g., task-specific scoring criteria, anchor papers, training papers, facilitators/table leaders received standardized training); and 2) the procedures used to ensure the consistency of the scoring and training procedures from one setting or year to the next.
*

2.5 When constructed-response tasks require raters to assign scores to student responses, what is the consistency (reliability) of scores across raters (i.e., rater consistency), and how was this determined?

a. Describe the method used to estimate rater consistency by answering the following questions: 1) Was rater consistency computed based on independent scores, or based on a consensus or arbitration process? 2) If only one rater was used, how was rater consistency determined?
*

b. For each constructed-response task on this assessment scored during the scoring session, what was the total number of student responses scored and what percent of these responses were used in computing rater consistency?
*

c. For the constructed-response task(s) on this assessment, provide data summarizing the extent of rater consistency achieved and describe how these data were obtained. At a minimum, summarize the extent of “percent exact agreement.” (An illustrative computation sketch follows below.)
*
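Note on item 2.5c (illustrative, not part of the official template): “percent exact agreement” is simply the share of double-scored responses on which two raters assigned identical scores. A minimal Python sketch, using hypothetical ratings, is shown below.

    # Minimal sketch: percent exact agreement between two raters who
    # independently scored the same sample of student responses.
    def percent_exact_agreement(rater_a, rater_b):
        """rater_a, rater_b: equal-length lists of scores for the same responses."""
        matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
        return 100.0 * matches / len(rater_a)

    # Made-up example: eight responses scored on a 4-point rubric
    rater_a = [3, 2, 4, 1, 3, 2, 2, 4]
    rater_b = [3, 2, 3, 1, 3, 2, 1, 4]
    print(percent_exact_agreement(rater_a, rater_b))  # 75.0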
Part 3. Interpretation (How are the assessment results interpreted and reported?)

3.1 From the list below, mark the steps taken to establish the integrity of the scores prior to making interpretations with them.
  *  Score data were checked for completeness to ensure that the results of all students tested were included. If scores were missing, corrective action was taken.
  *  Suspicious scores (i.e., extremely high or low scores) were checked for accuracy, and, if appropriate, erroneous scores were corrected.
  *  Group scores were checked to ensure that scores of students with IEPs and scores from all other subgroups were included in district- and building-wide scores, and that the number of students in each of these subgroups was reasonable.
  *  Other: *

3.2 Describe how the results from this assessment tool were reported in the Annual Progress Report.
*

3.3 From the list below, mark the steps taken to ensure that the displays/presentations (e.g., graphs, tables, figures, or text) of assessment information in the Annual Progress Report were accurate and clear.
  *  An accurate verbal description of the achievement domain measured by the instrument was provided (e.g., “reading comprehension” not just “reading”).
  *  Graphs or other visuals were constructed using size and scale that do not mislead.
  *  All displays of data were labeled completely and accurately (e.g., student vs. school norms, Iowa vs. national norms, mean vs. median grade equivalent, etc.).
  *  Group sizes were included with all displays of data.
  *  Where achievement levels were used, the verbal descriptors of the performance associated with the achievement levels were also given (e.g., a description of what “proficient” means).
  *  Other: *

3.4 From the list below, mark the steps taken to ensure that the conclusions/interpretations made in the Annual Progress Report are supported by the data reported.
  *  The conclusions/interpretations were examined to verify that they were consistent with the data that were reported.
  *  The generalizations were examined to verify that they did not go beyond the knowledge/skills actually measured by the instrument (e.g., just “reading comprehension” instead of all content standards in reading).
  *  The generalizations were examined to verify that they did not go beyond the grade level for which data were reported.
  *  Common or potential misinterpretations or misunderstandings were identified, and cautions were included in the APR to address them.
  *  Other: *
