2. Introduction
The terminology of evidence-centered design (ECD) is best understood through its representation of layers in the design and
implementation of educational assessment. Assessment is viewed as five coordinated layers of activities,
processes, and elements.
The layers most directly related to test development are domain modeling and the conceptual assessment framework.
[Figure: the five ECD layers stacked from foundation upward: domain analysis; domain modeling; conceptual assessment framework; assessment implementation; assessment delivery]
3. Assessment as argument
Validity research concerns ways in which we can build and use assessments that support interpretations and
uses of results.
In this diagram:
A: Alternative explanation: a competing account of the data, supported by further data;
B: Backing: support for the warrant;
C: Claim: a proposition we wish to support with data;
D: Data: observations offered in support of the claim;
R: Rebuttal: circumstances under which the warrant does not hold;
W: Warrant: a generalization that justifies the inference
from the particular data to the particular claim.
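The Toulmin elements above can be sketched as a small data structure. This is a hypothetical Python illustration of the schema's labeled parts, not a component of ECD itself; the example argument about a speaking assessment is invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class ToulminArgument:
    """One assessment argument, with fields labeled as in the Toulmin diagram."""
    claim: str        # C: proposition we wish to support
    data: list[str]   # D: observations offered in support of the claim
    warrant: str      # W: generalization licensing the inference from D to C
    backing: str      # B: support for the warrant
    # R: circumstances under which the warrant does not hold
    rebuttals: list[str] = field(default_factory=list)
    # A: competing explanations of the data
    alternatives: list[str] = field(default_factory=list)

# Hypothetical argument for an interview-based speaking task
arg = ToulminArgument(
    claim="Examinee can sustain conversation on everyday topics",
    data=["Responded appropriately to five interview prompts"],
    warrant="Performance on interview tasks reflects conversational ability",
    backing="Domain analysis of language use in interview settings",
    rebuttals=["Examinee had rehearsed these exact prompts"],
)
print(arg.claim)
```

Making the elements explicit in this way mirrors how domain modeling moves from a narrative argument toward the structured specifications of the conceptual assessment framework.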
4. ECD layers
1. Domain analysis
Features: marshals information that has implications for assessment in the targeted domain; provides the backing
for assessment arguments.
Role: Gather substantive information about what is to be assessed.
Key entities: Representation forms; analysis of language learning and language use in a specific context or
content area.
Examples of knowledge representations: Language framework; language curriculum; proficiency guidelines;
syllabi.
5. ECD layers
2. Domain modeling
Features: articulates the assessment argument.
Role: Express assessment argument in narrative form.
Key entities: Knowledge, skills, and capabilities; task features; potential observations.
Examples of knowledge representations: Toulmin diagrams; design patterns; claims, evidence, and task
worksheets.
6. ECD layers
3. Conceptual assessment framework
Features: provides a structure for coordinating the substantive, statistical, and operational aspects of an assessment. The
operational elements of assessment machinery are specified in the form of three models, which instantiate the assessment
argument: the student model, the evidence model, and the task model.
Role: Express assessment argument in specifications for tasks, measurement models, and scoring procedures.
Key entities: Student, evidence, and task models; task model variables; rubrics; measurement models; test
specifications.
Examples of knowledge representations: Task shells; task templates; graphical representation of measurement models.
7. ECD layers
4. Assessment implementation
Features: includes activities such as authoring tasks, fine-tuning rubrics and examples, and fitting measurement models. In
this layer, developments in assessment-engineering technology, such as adaptive testing, interaction in simulation
environments, and the capture and evaluation of complex performances, are put into effect in accordance with the
specifications laid out in the CAF.
Role: Implement assessment, including writing tasks and fitting measurement models.
Key entities: Task materials; pretest data for item and task scoring.
Examples of knowledge representations: Items and tasks; design of test materials and score reports; item parameter
data files.
8. ECD layers
5. Assessment delivery
Features: includes processes for selecting and administering test items, interacting with examinees, reporting scores,
and providing feedback to appropriate parties. The information, structure, and rationale of these processes embody the
assessment argument articulated in domain modeling, are structured by the blueprint laid out in the CAF, and utilize the
operational elements created in assessment implementation.
Role: Coordinate interaction of students and tasks; task and test-level scoring and reporting.
Key entities: Task as presented; responses as collected; scores as evaluated.
Examples of knowledge representations: Rendering of test materials; individual and group-level reports; data files.
9. Critical issues and topics
Conceptions of capabilities and competencies shape the assessment argument;
A perspective on language learning and language use suggests the kinds of inferences we should
draw about examinees, from what kinds of evidence, in what kinds of assessment situations.
10. Language tests developed using ECD
TOEFL iBT
TOEIC Speaking and Writing tests
Advanced Placement (AP) Spanish Language Exam
11. High points of ECD
Supported by other conceptual frameworks, e.g. the Principled Assessment Designs for Inquiry
(PADI) project
Structured measurement models and task design
Reusability and modularity
12. Assessment use arguments
Assessment use arguments (AUAs) extend the argument-based approach beyond the ECD design phase to
assessment use and validation;
An AUA addresses the social context in which an assessment is meant to be used;
Claims in an AUA concern the intended uses of an assessment and are justified in terms of the beneficence of
consequences and the equitability and value sensitivity of decisions made on the basis of assessment results.