Validating Practice Comprehensive Tests
For Student and Institutional Bar Success
Professor Yaira S. Ortiz-Medina
Pontifical Catholic University of Puerto Rico
School of Law
Professor Laurie Zimet
UC Hastings College of the Law
Goals
By the end of this session, participants will be able to:
 identify the key elements of a test validation process;
 design a test validation process for their schools;
 interpret the results and make decisions based on the process's outcomes.
Reliability
The degree to which an assessment tool produces stable and consistent results.
Phelan and Wren (2005-06)
Validity
How well a test measures what it purports to measure.
Phelan and Wren (2005-06)
Reliability Coefficient KR20
 Measures test reliability and is an overall measure of internal consistency.
A higher value indicates a stronger relationship among the items on the test.
Kelley (1939)
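As a rough sketch (not part of the original slides), KR-20 can be computed directly from a 0/1 response matrix. The function name and data layout below are illustrative assumptions; most item-analysis software reports this value for you.

```python
from statistics import pvariance

def kr20(responses):
    """Kuder-Richardson Formula 20 for dichotomously scored (0/1) items.

    `responses` is a list of examinee rows, each a list of 0/1 item scores.
    Returns the internal-consistency estimate; values near 1 indicate
    items that hang together, values near 0 indicate little consistency.
    """
    n = len(responses)           # number of examinees
    k = len(responses[0])        # number of items
    # Sum of p*q over items, where p = proportion correct on the item
    pq_sum = 0.0
    for i in range(k):
        p = sum(row[i] for row in responses) / n
        pq_sum += p * (1 - p)
    # Population variance of the examinees' total scores
    total_var = pvariance(sum(row) for row in responses)
    return (k / (k - 1)) * (1 - pq_sum / total_var)
```

A low value, like the .38 reported later in this deck, suggests the items are not measuring a single construct consistently.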
Response Frequencies
The percentage of times that each alternative was selected.
Kelley (1939)
Non-Distractor
 An item option that is not chosen by any student. Such options provide no
information for distinguishing different levels of student performance. When
an item has too many non-distractors, it needs to be revisited and possibly
revised.
Kelley (1939)
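A minimal sketch of both ideas, assuming each item's responses are recorded as a list of selected option letters (the function names and the A-D option set are illustrative assumptions):

```python
from collections import Counter

def option_frequencies(choices):
    """Percentage of examinees selecting each option on one item.

    `choices` is a list of selected options, e.g. ['A', 'C', 'A', 'B'].
    """
    n = len(choices)
    return {opt: 100 * cnt / n for opt, cnt in Counter(choices).items()}

def non_distractors(choices, options=("A", "B", "C", "D")):
    """Options on an item that no examinee chose.

    Items where this list is long are candidates for revision, since
    unchosen options carry no information about performance levels.
    """
    chosen = set(choices)
    return [opt for opt in options if opt not in chosen]
```

For example, if no one ever picks option D on an item, D is a non-distractor and the item is effectively a three-option question.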
Point-Biserial
 Indicates whether a question discriminated well between stronger and weaker
students. The point-biserial ranges from -1 to 1. A positive value indicates
that students who did well on the test overall were more likely to answer the
question correctly.
Kelley (1939)
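The point-biserial is simply the Pearson correlation between a 0/1 item score and the total test score. A sketch under that definition (the function name and inputs are illustrative assumptions; many packages also subtract the item from the total before correlating):

```python
from statistics import mean, pstdev

def point_biserial(item_scores, total_scores):
    """Pearson correlation between a dichotomous (0/1) item and total scores.

    Ranges from -1 to 1. Positive values mean higher-scoring examinees
    tended to answer this item correctly (the item discriminates well);
    values near 0 or negative flag weak items.
    """
    n = len(item_scores)
    mx, my = mean(item_scores), mean(total_scores)
    # Population covariance between item and total scores
    cov = sum((x - mx) * (y - my)
              for x, y in zip(item_scores, total_scores)) / n
    return cov / (pstdev(item_scores) * pstdev(total_scores))
```

Items with near-zero or negative values, like the weak questions flagged in the checklist below, are the ones to revise first.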
Checklist Of Findings
 Example (O = Observed, N = Not observed, NA = Not applicable)

[O] Results allow me to identify the mean score.
    Comment: Mean score is 15.73.
Checklist Of Findings (Part A)
O = Observed, N = Not observed, NA = Not applicable

[O] Results allow me to identify the maximum score.
    Comment: Maximum score is 21.00.
[O] Results allow me to identify the minimum score.
    Comment: Minimum score is 9.00.
[N] The Reliability Coefficient (KR20) is above .70.
    Comment: KR20 is .38.
[O] The detailed item analysis provided allows me to identify weak questions.
    Comment: Weak questions are 8 and 9. Questions 2, 3, and 5 should be revised.
[N] The number of items is adequate for the test's purpose.
    Comment: The number of items per subject is not representative.
Checklist Of Findings (Part B)
O = Observed, N = Not observed, NA = Not applicable

[O] The Subjects Learning Objective Report allows me to identify subjects that need more attention.
    Comment: The weakest subject is General Theory of Obligations. Civil Procedural Law, Evidence, Ethics, and Torts might need reinforcement.
[O] Procedures before, during, and after the test are specified.
    Comment: They are specified but might need more structure.
[O] The results report allows ASP staff to design an action plan.
    Comment: ASP could develop an action plan affecting the peer-tutoring program and workshops.