Assessing the Validity of Inferences Made from Assessment Results
Sources of Validity Evidence
• Validity evidence can be gathered during the development of the assessment or after the
assessment has been developed.
• Some of the methods used to gather validity evidence can support more than one type of
source (e.g., test content, internal structure).
• Large-scale assessment developers and local classroom assessment developers often use different methods to gather validity evidence.
o Large-scale assessment developers use more formal, objective, systematic, and statistical methods to establish validity.
o Teachers use more informal and subjective methods, which often do not involve the use of statistics.
Evidence Based on Test Content
• Questions one seeks to answer when gathering validity evidence based on test content or construct:
o Does the content of the items that make up the assessment fully represent the concept or construct the assessment is intended to measure?
o Does the assessment accurately represent the major aspects of the concept or
construct and not include material that is irrelevant to it?
o To what extent do the assessment items represent a larger domain of the concept
or construct being measured?
• The greater the extent to which an assessment represents all facets of a given concept or
construct, the better the validity support based on the test content or construct. There is
no specific statistical test associated with this source of evidence.
• Methods used to gather validity evidence based on test content or construct:
o Large-Scale Assessments
§ Have experts in the concept or construct being measured create the
assessment items and the assessment itself.
§ Have experts in the concept or construct review the assessment to judge how well it measures the concept or construct. These experts would consider the following during the review process:
§ The extent to which the content of the assessment represents the content or construct’s domain or universe.
§ How well the items, tasks, or subparts of the assessment fit the definition of the construct and/or the purpose of the assessment.
§ Whether the content or construct is underrepresented, or whether there are content or construct-irrelevant aspects of the assessment that may result in unfair advantages for one or more subgroups (e.g., Caucasians, African Americans).
§ The relevance, importance, clarity, and freedom from bias of the assessment’s items or tasks.
o Local Classroom Assessment
§ Develop assessment blueprints that indicate what will be assessed as well as the nature of the learning (e.g., knowledge, application) that should be represented on the assessment.
§ Build a complete set of learning objectives or targets, showing the number of items and/or the percentage of items/questions on the assessment devoted to each (see the sketch after this list).
§ Discuss the assessment and its content coverage with others (e.g., teachers, administrators, content experts).
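To make the blueprint idea above concrete, here is a minimal, illustrative sketch in Python; the learning objectives, cognitive levels, and item counts are hypothetical examples, not values taken from these notes. It simply tallies how many items, and what percentage of the assessment, are devoted to each objective, which is the kind of record a teacher can inspect to judge whether the content domain is fully and proportionately represented.

# Illustrative only: hypothetical objectives, learning types, and item counts.
# A blueprint maps each learning objective to the number of items written for it,
# broken down by the nature of the learning (e.g., knowledge vs. application).

blueprint = {
    "Objective 1: Define validity and reliability":          {"knowledge": 4, "application": 1},
    "Objective 2: Identify sources of validity evidence":    {"knowledge": 3, "application": 3},
    "Objective 3: Interpret a test blueprint":                {"knowledge": 2, "application": 4},
    "Objective 4: Critique items for construct irrelevance":  {"knowledge": 1, "application": 2},
}

total_items = sum(sum(levels.values()) for levels in blueprint.values())
print(f"Total items on the assessment: {total_items}\n")
print(f"{'Learning objective':<58}{'Items':>6}{'% of test':>11}")
for objective, levels in blueprint.items():
    n = sum(levels.values())
    print(f"{objective:<58}{n:>6}{n / total_items:>10.0%}")

# Objectives with no items point to underrepresented content; a very large share
# on a single objective may signal overrepresentation of that part of the domain.
missing = [obj for obj, levels in blueprint.items() if sum(levels.values()) == 0]
if missing:
    print("\nObjectives with no items:", ", ".join(missing))

A spreadsheet or a simple table accomplishes the same thing; the point is to make the distribution of items across objectives explicit so that underrepresented or overrepresented content is easy to spot.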
Introduction to writing research questions and determining what variables to use. Introductory concepts for school personnel interested in action research.
1 Network Analysis and Design This assignment is.docxoswald1horne84988
1
Network Analysis and Design
This assignment is worth 30%.
Deadline: Mon, Week 12
Part A: HQ LAN Upgrade (35%)
Background:
ABC is a big company in the US. ABC has employed you as the IT officer of the company.
Your job is to analyse the performance of the HQ LAN, suggest changes to improve the
network performance and provide a report to your boss.
Settings:
Run all simulations for 30 minutes to simulate a working day.
The graphs should be time averaged
Duplicate scenario for each possible setup
Tasks:
1. Analyse the current performance of the HQ LAN for each level and comment on it.
You are required to show all relevant graphs. The graphs for each level can be
overlaid. (10%)
2. Some staffs are unhappy about the speed of the network. Anything that takes more
than 1 second is not desirable. You have decided to try the following to improve the
network performance. Show the relevant graphs and comment on the results: (5%)
a. Increase the link speeds of
i. HQ_Router1 to HQ_Router3 from 1 Gbps to 10 Gbps and
ii. HQ_Router2 to HQ_Router3 from 1 Gbps to 10 Gbps
b. Increase the LANs for level 1, 2 and 3 from 100 Mbps to 1 Gbps
c. Try out 1 other way that meets the requirement.
3. After meeting the requirement, the company has decided to purchase an Ethernet
Server and placed it in the HQ LAN. (10%)
a. Rename it to HQ Server
b. Use a 1Gbps link
c. Set Application: Supported Services to All
d. Set statistics to view the following:
i. Server DB Task Processing Time (Heavy)
ii. Server Email Task Processing Time (Heavy)
iii. Server HTTP Task Processing Time (Heavy)
iv. Server Performance Task Processing Time
e. Show the performance of the HQ Server with the required graphs and
comment on the results
f. Justify the location of the server
g. State at least 3 security measures you will take to protect the HQ LAN from
malicious attacks
4. What would you do so that all the 4 statistics of the HQ server are less than 0.025 s?
Show all relevant graphs. (3 marks)
2
5. Prepare a report and state the additional amount of money that is needed for the
changes you have made to meet the additional requirements. Refer to the given price
list in the Appendix. (7%)
a. Your report should include a content page, a summary of the addressed issues,
objectives, budgeting, proposed solutions and conclusion.
Part B: Network Design (65%)
Background:
Due to your excellent work in the analysis of the HQ LAN, you are now assigned the new
task of designing the LAN for one of ABC’s client, XYZ. The company XYZ is made up of 4
sections and the number of people in each section is as shown below.
1. Research – 20
2. Technical – 10
3. Guests – 4
4. Executives – 2
Set up the following staff profile:
1. Research: file transfer (light), web browsing (heavy) and file print (light)
2. Technical: Database Access (heavy), telnet (heavy) and email (light)
3. Guests: Em.
1 Name _____________________________ MTH129 Fall .docxoswald1horne84988
1
Name: _____________________________
MTH129 Fall 2018 - FINAL EXAM A
Show all work neatly on paper provided. Label all work. Place final answers on the answer sheet.
PART I: Omit 1 complete question. Place an “X” on the problems & answer space you are omitting.
1. Find the inverse of the following functions:
a. 𝑓(𝑥) = 2𝑥 − 3
b. 𝑓(𝑥) =
3𝑥 +1
𝑥−2
2. If 𝑓(𝑥) = 𝑥 2 − 2𝑥 + 3 and 𝑔(𝑥) = −3𝑥 + 4, find the following:
a. (𝑓°𝑔)(𝑥) b. (𝑓°𝑔)(2)
3. Find the domain for the following expression:
a) √𝑥 + 5 𝑏) 7𝑥 2 + 3𝑥 − 1 𝑐)
𝑥 2+4
𝑥 2−9
4. Find the radian measures of the angles with the given degree measures.
a) 81°
Find the degree measures of the angles with the given radian measures.
b)
13𝜋
6
5. Solve the following equations:
a) (5t) = 20
b) 6000 = 40(15)t
6. Expand the following logarithmic expressions:
a. log(𝐴𝐵2 )
b. ln(
4
√3
)
7. Describe how the graph of each function can be obtained from the graph f
a. 𝑦 = 𝑓(𝑥) − 8
b. 𝑦 = 𝑓(𝑥 + 4) − 5
8. A real number t is given 𝑡 =
2𝜋
3
a. Find the reference number for t.
b. Find the terminal point P(x,y) on the unit circle determined by t
c. The unit circle is centered at __________________ and has a radius of _________________
PART II: Omit 1 complete question. Place an “X” on the problems & answer space you are omitting.
2
1. A sum of $7,000 is invested at an interest rate of 4
1
2
% per year, compounding monthly. (round all answers to
the nearest cent)
a. Find the amount of the investment after 2
1
2
years.
b. How long will it take for the investment to amount to $12,000?
c. Using the information in part (a), find the amount of the investment if compounded quarterly.
2. When a company charges price p dollars for one of its products, its revenue is given by
𝑅 = 𝑓(𝑝) = 500𝑝(30 − 𝑝)
a. Create a quadratic function for price with respect to revenue.
b. What price should they charge in order to maximize their revenue?
c. What is the maximum revenue?
d. What would be the revenue if the price was set at $10?
e. Sketch a rough graph – indicate the intercepts and the maximum coordinates.
3. The charges for a taxi ride are an initial charge of $2.50 and $0.85 for each mile driven.
a. Write a function for the charge of a taxi ride as a linear function of the distance traveled.
b. What is the cost of a 12 mile trip?
c. Find the equation of a line that passes through the following points: (1,-2) , (2,5) Express in 𝑦 =
𝑚𝑥 + 𝑏 form
d. Graph part ( c )
4. a. Divide the following polynomial and factor completely.
𝑃(𝑥) = 3𝑥 4 − 9𝑥 3 − 2𝑥 2 + 5𝑥 + 3; 𝑐 = 3
b. Given polynomial−𝑥 2 + 5𝑥 − 6, state the end behavior of its graph.
c. Using the polynomial on part ( c ), would this g
More Related Content
Similar to 1 Assessing the Validity of Inferences Made from Assess.docx
Introduction to writing research questions and determining what variables to use. Introductory concepts for school personnel interested in action research.
1 Network Analysis and Design This assignment is.docxoswald1horne84988
1
Network Analysis and Design
This assignment is worth 30%.
Deadline: Mon, Week 12
Part A: HQ LAN Upgrade (35%)
Background:
ABC is a big company in the US. ABC has employed you as the IT officer of the company.
Your job is to analyse the performance of the HQ LAN, suggest changes to improve the
network performance and provide a report to your boss.
Settings:
Run all simulations for 30 minutes to simulate a working day.
The graphs should be time averaged
Duplicate scenario for each possible setup
Tasks:
1. Analyse the current performance of the HQ LAN for each level and comment on it.
You are required to show all relevant graphs. The graphs for each level can be
overlaid. (10%)
2. Some staffs are unhappy about the speed of the network. Anything that takes more
than 1 second is not desirable. You have decided to try the following to improve the
network performance. Show the relevant graphs and comment on the results: (5%)
a. Increase the link speeds of
i. HQ_Router1 to HQ_Router3 from 1 Gbps to 10 Gbps and
ii. HQ_Router2 to HQ_Router3 from 1 Gbps to 10 Gbps
b. Increase the LANs for level 1, 2 and 3 from 100 Mbps to 1 Gbps
c. Try out 1 other way that meets the requirement.
3. After meeting the requirement, the company has decided to purchase an Ethernet
Server and placed it in the HQ LAN. (10%)
a. Rename it to HQ Server
b. Use a 1Gbps link
c. Set Application: Supported Services to All
d. Set statistics to view the following:
i. Server DB Task Processing Time (Heavy)
ii. Server Email Task Processing Time (Heavy)
iii. Server HTTP Task Processing Time (Heavy)
iv. Server Performance Task Processing Time
e. Show the performance of the HQ Server with the required graphs and
comment on the results
f. Justify the location of the server
g. State at least 3 security measures you will take to protect the HQ LAN from
malicious attacks
4. What would you do so that all the 4 statistics of the HQ server are less than 0.025 s?
Show all relevant graphs. (3 marks)
2
5. Prepare a report and state the additional amount of money that is needed for the
changes you have made to meet the additional requirements. Refer to the given price
list in the Appendix. (7%)
a. Your report should include a content page, a summary of the addressed issues,
objectives, budgeting, proposed solutions and conclusion.
Part B: Network Design (65%)
Background:
Due to your excellent work in the analysis of the HQ LAN, you are now assigned the new
task of designing the LAN for one of ABC’s client, XYZ. The company XYZ is made up of 4
sections and the number of people in each section is as shown below.
1. Research – 20
2. Technical – 10
3. Guests – 4
4. Executives – 2
Set up the following staff profile:
1. Research: file transfer (light), web browsing (heavy) and file print (light)
2. Technical: Database Access (heavy), telnet (heavy) and email (light)
3. Guests: Em.
1 Name _____________________________ MTH129 Fall .docxoswald1horne84988
1
Name: _____________________________
MTH129 Fall 2018 - FINAL EXAM A
Show all work neatly on paper provided. Label all work. Place final answers on the answer sheet.
PART I: Omit 1 complete question. Place an “X” on the problems & answer space you are omitting.
1. Find the inverse of the following functions:
a. 𝑓(𝑥) = 2𝑥 − 3
b. 𝑓(𝑥) =
3𝑥 +1
𝑥−2
2. If 𝑓(𝑥) = 𝑥 2 − 2𝑥 + 3 and 𝑔(𝑥) = −3𝑥 + 4, find the following:
a. (𝑓°𝑔)(𝑥) b. (𝑓°𝑔)(2)
3. Find the domain for the following expression:
a) √𝑥 + 5 𝑏) 7𝑥 2 + 3𝑥 − 1 𝑐)
𝑥 2+4
𝑥 2−9
4. Find the radian measures of the angles with the given degree measures.
a) 81°
Find the degree measures of the angles with the given radian measures.
b)
13𝜋
6
5. Solve the following equations:
a) (5t) = 20
b) 6000 = 40(15)t
6. Expand the following logarithmic expressions:
a. log(𝐴𝐵2 )
b. ln(
4
√3
)
7. Describe how the graph of each function can be obtained from the graph f
a. 𝑦 = 𝑓(𝑥) − 8
b. 𝑦 = 𝑓(𝑥 + 4) − 5
8. A real number t is given 𝑡 =
2𝜋
3
a. Find the reference number for t.
b. Find the terminal point P(x,y) on the unit circle determined by t
c. The unit circle is centered at __________________ and has a radius of _________________
PART II: Omit 1 complete question. Place an “X” on the problems & answer space you are omitting.
2
1. A sum of $7,000 is invested at an interest rate of 4
1
2
% per year, compounding monthly. (round all answers to
the nearest cent)
a. Find the amount of the investment after 2
1
2
years.
b. How long will it take for the investment to amount to $12,000?
c. Using the information in part (a), find the amount of the investment if compounded quarterly.
2. When a company charges price p dollars for one of its products, its revenue is given by
𝑅 = 𝑓(𝑝) = 500𝑝(30 − 𝑝)
a. Create a quadratic function for price with respect to revenue.
b. What price should they charge in order to maximize their revenue?
c. What is the maximum revenue?
d. What would be the revenue if the price was set at $10?
e. Sketch a rough graph – indicate the intercepts and the maximum coordinates.
3. The charges for a taxi ride are an initial charge of $2.50 and $0.85 for each mile driven.
a. Write a function for the charge of a taxi ride as a linear function of the distance traveled.
b. What is the cost of a 12 mile trip?
c. Find the equation of a line that passes through the following points: (1,-2) , (2,5) Express in 𝑦 =
𝑚𝑥 + 𝑏 form
d. Graph part ( c )
4. a. Divide the following polynomial and factor completely.
𝑃(𝑥) = 3𝑥 4 − 9𝑥 3 − 2𝑥 2 + 5𝑥 + 3; 𝑐 = 3
b. Given polynomial−𝑥 2 + 5𝑥 − 6, state the end behavior of its graph.
c. Using the polynomial on part ( c ), would this g
1 Lab 8 -Ballistic Pendulum Since you will be desig.docxoswald1horne84988
1
Lab 8 -Ballistic Pendulum
Since you will be designing your own procedure you will have two
class periods to take the required data.
The goal of this lab is to measure the speed of a ball that is fired
from a projectile launcher using two different methods. The
Projectile launcher has three different settings, “Short Range,”
“Medium Range” and “Long Range,” however you will only need to
determine the speed for any ONE of these Range settings.
Method 1 involves firing the ball directly into the “Ballistic
Pendulum” shown below in Figure 2 for which limited instructions will be provided. Method 2
is entirely up to your group. While you have significant freedom to design your own procedure,
you will need to worry about the random and systematic uncertainties you are introducing
based on your procedure. This manual will provide a few hints to help reduce a few of those
uncertainties.
The ballistic pendulum pictured in Figure 2 is important canonical problem students study to
explore the conservation of momentum and energy. The ball is fired by the projectile launcher
into a “perfectly inelastic collision” with the pendulum. The pendulum then swings to some
maximum angle which is measured by an Angle Indicator.
Caution: The pendulum has a plastic hinge and Angle Indicator which are both fragile. Be
gentle.
Study the ballistic pendulum carefully. Before we begin, here are a few things to consider and
be aware of in Figure 2:
Projectile launcher
Angle indicator (curved
black bar)
Clamp
Pendulum (can be removed
for measurements)
Figure 2: Ballistic Pendulum
Plumb bob
Firing string
Release
point
Figure 1: Projectile Launcher
Bolt for removing pendulum
2
A. Clamping the ballistic pendulum to the table will reduce random uncertainties in the
speed with which the projectile launcher releases the ball. Similarly, you should check
that the various bolts are snug and that the ball is always fully inside the launcher (not
rolling around inside the barrel of launcher).
B. If the lab bench is not perfectly horizontal the plumb bob and angle indicator will not
read zero degrees before you begin your experiment. You should fix AND/OR account
for these discrepancies.
C. In Figure 3 you will notice a tiny gap between the launcher and the pendulum. This
important gap prevents the launcher from contacting the pendulum directly as the ball
is fired. Without this gap an unknown amount of momentum is transferred from the
launcher directly to the pendulum (in addition to the momentum transferred by the
ball) significantly complicating our experiment.
Figure 3: Important gap between Launcher and Pendulum
Equipment
1 Ballistic Pendulum (shown in Figure 2)
A bag with three balls
1 loading rod
1 Clamp
1 triple beam balance scale
Safety goggles for each group member
Any equipment found in your equipment drawer.
Reasonable equipment reque.
1 I Samuel 8-10 Israel Asks for a King 8 When S.docxoswald1horne84988
1
I Samuel 8-10
Israel Asks for a King
8 When Samuel grew old, he appointed his sons as Israel’s leaders.[a]2 The
name of his firstborn was Joel and the name of his second was Abijah, and
they served at Beersheba. 3 But his sons did not follow his ways. They turned
aside after dishonest gain and accepted bribes and perverted justice.
4 So all the elders of Israel gathered together and came to Samuel at
Ramah. 5 They said to him, “You are old, and your sons do not follow your
ways; now appoint a king to lead[b] us, such as all the other nationshave.”
6 But when they said, “Give us a king to lead us,” this displeasedSamuel; so
he prayed to the LORD. 7 And the LORD told him: “Listen to all that the people
are saying to you; it is not you they have rejected, but they have rejected
me as their king. 8 As they have done from the day I brought them up out of
Egypt until this day, forsaking me and serving other gods, so they are doing
to you. 9 Now listen to them; but warn them solemnly and let them
know what the king who will reign over them will claim as his rights.”
10 Samuel told all the words of the LORD to the people who were asking him
for a king. 11 He said, “This is what the king who will reign over you will claim
as his rights: He will take your sons and make them serve with his chariots
and horses, and they will run in front of his chariots. 12 Some he will assign to
be commanders of thousands and commanders of fifties, and others to plow
his ground and reap his harvest, and still others to make weapons of war
and equipment for his chariots. 13 He will take your daughters to be
perfumers and cooks and bakers. 14 He will take the best of your fields and
vineyards and olive groves and give them to his attendants. 15 He will take a
tenth of your grain and of your vintage and give it to his officials and
attendants. 16 Your male and female servants and the best of your cattle[c] and
donkeys he will take for his own use. 17 He will take a tenth of your flocks,
and you yourselves will become his slaves. 18 When that day comes, you will
cry out for relief from the king you have chosen, but the LORD will not
answer you in that day.”
https://www.biblegateway.com/passage/?search=1%20Samuel+8&version=NIV#fen-NIV-7371a
https://www.biblegateway.com/passage/?search=1%20Samuel+8&version=NIV#fen-NIV-7375b
https://www.biblegateway.com/passage/?search=1%20Samuel+8&version=NIV#fen-NIV-7386c
2
19 But the people refused to listen to Samuel. “No!” they said. “We wanta
king over us. 20 Then we will be like all the other nations, with a king to lead
us and to go out before us and fight our battles.”
21 When Samuel heard all that the people said, he repeated it before
the LORD. 22 The LORD answered, “Listen to them and give them a king.”
Then Samuel said to the Israelites, “Everyone go back to your own town.”
Samuel Anoints Saul
9 There was a Benjamite, a man of standing, whose n.
1 Journal Entry #9 What principle did you select .docxoswald1horne84988
1
Journal Entry #9
What principle did you select?
I selected principle 1 of part 1, “Don’t criticize, condemn or complain”.
Who did you interact with?
For this assignment I interacted with my younger cousin.
What was the context?
I had visited my Aunty and she and her husband asked me to stay a while as I was on school
break. They accommodated me and I decided in return to help look after my cousin in the period
when he got out of school and before they got back from work. He is 5 years old and can be quite
the handful.
What did you expect?
I expected that an authoritative approach would easily compel him to follow my instructions so
that the transition from school life into home life would be easy.
What happened?
At first, I used commanding language to get him to change out of his uniform or properly store
his back pack and books before stepping out to play. The first day was difficult and the way I
deal with him were not getting through. On the 2nd day, the same was observed. On the 3rd day,
before he could drop his back pack and run out, I offered to make him a sandwich to eat before
he left to play if he would change and clean up. He rushed up stairs and freshened up. On the
next day, he came home and rushed up to change and freshen up all on his own. I had not
initially offered; but I made him a sandwich regardless.
How did it make you feel?
It made me feel good to be able to get through to my cousin. After this, if I ever needed him to
do something in a better way than previously, I would encourage him onto a different way of
accomplishing the same. I would often offer praise after adoption of the new suggested method
was adopted or offered incentive.
2
What did you learn?
I learnt that in criticizing a person’s action, it is difficult to deter their belief in their methods,
values or beliefs. This usually just gives them the will to justify or defend their positions. It is
almost an exercise in futility to attempt to effect change by complaining, condemning or
criticizing.
What surprised you?
I was surprised by how fast the change was effected after the shift in direction I took to approach
my cousin. In not criticizing his way of doing things any longer and employing a different tactic,
I was able to influence his routine as well as build good rapport with him.
Going forward, how can you apply what you learnt?
Going forward I will attempt to understand that everyone has a belief or image of their own that I
should respect. These beliefs, systems and values are crucial to their inherent dignity and to
criticize or attack this will only fuel conflict.
Running head: Physical activity project 1
Physical activity project:
A 7-day analysis and action plans
Student Name
National University
Physical activity project 2
Introduction
Physical activity (PA) has been a major component of public health since the rise of
chronic illnesses .
1
HCA 448 Case 2 for 10/04/2018
Recently, a patient was transferred to a cardiac intensive care unit (CICU) at Methodist Hospital.
Methodist is a 250-bed hospital, which is one of five hospitals in the University Health System.
The patient was a retired 72-year-old man, who recently (i.e., 25 days ago) had a mild heart
attack and was treated and released from a sister hospital, which is in the same system as
Methodist Hospital. An otherwise health individual, Mr. Charlie Johnson (a husband, father of 4,
and grandfather of 12) is in now need or lots of medication and a battery of tests. To the nurses
on shift, it appears that the entire Johnson family is in patient’s room watching the clinical staff
treated Mr. Johnson. The family overhears everything and they want to know what is being done
to (and for) their loved one. In addition, they want to know the meaning behind the various beeps
coming from the many machines attached to Mr. Johnson.
Over the past 10 years, the latest U.S. News and World report has ranked Methodist Hospital as
one of the Best Hospitals for Cardiology & Heart Surgery. However, it is important to note that
over the past few years, the unit has dropped in the rankings.
Katherine Ross RN, the patient care director of the CICU, which has 14 beds, has held this post
for two years. (See Figure) The unit has a $20 million budget. Ms. Ross has worked at Methodist
Hospital for 16 years. She spends 50 percent of her time on patient safety, 25 percent on staffing
and recruitment, and 20 percent with nurses in relation to their satisfaction with the work and
with families relative to their satisfaction with care. Ten percent of Ms. Ross’s time is spent on
administrative duties. According to Ms. Ross, “I like is working with exceptional nurses who are
very smart and do what it takes with limited resources. However, we don’t always feel
empowered, despite the existence of shared governance, a structure I help to coordinate.”
2
Relationship with Nurses on the Unit:
Nurses on the unit work a three day a week, 12 hours a shift. Ms. Ross says, “we did an
employee opinion survey that went to all employees on the unit, 50 people in all, but only 13
responded. Some of them weren’t sure who their supervisor was. The employees aren’t happy
but our patients are happy.” She adds that “my name is on the unit, not the medical director’s. If
anything goes wrong with the unit, they blame it on nursing. Yet I’m brushed off by people
whom I have to deal with outside of the unit. For example, we have a problem with machines
that analyze blood gases. I spoke with the people there about the technology. This was four
weeks ago. It’s a patient safety issue. I sent them e-mails. I need the work to get done, the staff
don’t feel empowered if I’m not empowered. This goes for other departments as well. For
example, respiratory therapy starts using a new ventilator witho.
1
HC2091: Finance for Business
Trimester 2 2018
Group Assignment
Assessment Value: 20%
Due Date: Sunday 23:59 pm, Week 10
Group: 2- 4 students
Length: Min 2500 words
INSTRUCTIONS
Students are required to form a group to study, undertake research, analyse and conduct academic
work within the areas of business finance covered in learning materials Topics 1 to 10 inclusive.
The assignment should examine the main issues, including underlying theories, implement
performance measures used and explain the firm financial performance. Your group is strongly
advised to reference professional websites, journal articles and text books in this assignment (case
study).
Tasks
This assessment task is a written report and analysis of the financial performance of a selected
listed company on the ASX in order to provide financial and investment advice to a wealthy
investor. This assignment requires your group to undertake a comprehensive examination of a
firm’s financial performance based on update financial statements of the chosen companies.
Group Arrangement
This assignment must be completed IN Group. Each group can be from 2 to maximum 4 student
members. Each group will choose 1 company and once the company has been chosen, the other
group cannot choose the same company. First come first served rule applies here, it means you
need to form your group, choose on company from the list of ASX and register them with your
lecturer as soon as possible. Once your lecturer registers your chosen company, it cannot be
chosen by any other group. Your lecturer then will put your group on Black Board to enable you
to interact and discuss on the issues of your group assignment using Black Board environment.
However, face to face meeting, discussion and other methods of communication are needed to
ensure quality of group work. Each group needs to have your own arrangement so that all the
group members will contribute equally in the group work. If not, a Contribution Statement,
which clearly indicated individual contribution (in terms of percentage) of each member, should
be submitted as a separate item in your assignment. Your individual contribution then will be
assessed based on contribution statement to avoid any free riders.
2
Submission
Please make sure that your group member’s name and surname, student ID, subject name, and
code and lecture’s name are written on the cover sheet of the submitted assignment.
When you submit your assignment electronically, please save the file as ‘Group Assignment-
your group name .doc’. You are required to submit the assignment at Group Assignment
Final Submission, which is under Group Assignment and Due Dates on Black Board.
Submitted work should be your original work showing your creativity. Please ensure the self-
check for plagiarism to be done before final submission (plagiarism check is not over 30% .
1 ECE 175 Computer Programming for Engineering Applica.docxoswald1horne84988
1
ECE 175: Computer Programming for Engineering Applications
Homework Assignment 6
Due: Tuesday March 12, 2019 by 11.59 pm
Conventions: Name your C programs as hwxpy.c where x corresponds to the homework number and y
corresponds to the problem number. For example, the C program for homework 6, problem 1 should be
named as hw6p1.c.
Write comments to your programs. Programs with no comments will receive PARTIAL credit. For each
program that you turn in, at least the following information should be included at the top of the C file:
- Author and Date created
- Brief description of the program:
- input(s) and output(s)
- brief description or relationship between inputs and outputs
Submission Instructions: Use the designated Dropbox on D2L to submit your homework.
Submit only the .c files.
Problem 1 (15 points) Write a program that returns the minimum value and its location, max
value and its location and average value of an array of integers. Your program should call a
single function that returns that min and its location, max and its location and mean value of
the array. Print the results in the main function (not within the array_func function).
See sample code execution below. The declaration of this function is given below:
void array_func (int *x, int size, int *min_p, int *minloc_p, int *max_p, int *maxloc_p, double *mean_p)
/* x is a pointer to the first array element
size is the array size
min_p is a pointer to a variable min in the main function that holds the minimum
minloc_p is a pointer to a variable minloc in the main function that holds the location where the
minimum is.
max_p is a pointer to a variable max in the main function that holds the maximum
maxloc_p is a pointer to a variable maxloc in the main function that holds the location where the
maximum is.
mean_p is a pointer to a variable mean in the main function that holds the mean */
Declare the following array of integers within the main function:
Sample code execution:
int data_ar[] = { -3, 5, 6, 7, 12, 3, 4, 6, 19, 23, 100, 3, 4, -2, 9, 43, 32, 45,
32, 2, 3, 2, -1, 8 };
int data_ar2[] = { -679,-758,-744,-393,-656,-172,-707,-32,-277,-47,-98,-824,-695,
-318,-951,-35,-439,-382,-766,-796,-187,-490,-446,-647};
int data_ar3[] = {-142, -2, -56, -60, 114, -249, 45, -139, -25, 17, 75, -27, 158,
-48, 33, 67, 9, 89, 33, -78, -180, 186, 218, -274};
2
Problem 2 (20 points): A barcode scanner verifies the 12-digit code scanned by comparing the
code’s last digit to its own computation of the check digit calculated from the first 11 digits as
follows:
1. Calculate the sum of the digits in the odd-numbered indices (the first, third, …, ninth
digits) and multiply this sum by 3.
2. Calculate the sum of the digits in the even-numbered indices (the 0th, second, … tenth
digits).
3. Add the results from step 1 and 2. If the last digit of the addition result is 0, then 0 is the
check digit. .
1 Cinemark Holdings Inc. Simulated ERM Program .docxoswald1horne84988
1
Cinemark Holdings Inc.: Simulated ERM Program
Ben Li, Assistant Vice President of Compliance, is assigned the responsibility of developing an ERM
program at Cinemark Holdings Inc. (CHI). Over the past year, Ben has put in place the following ERM
activities:
Risk Identification and Assessment
The risk identification and assessment process steps are as follows:
1) Conduct online surveys of the heads of the 10 business segments and their 1-2 direct reports (15
people) and their mid-level managers (80 people). Exhibit 1 shows the instructions that are
included in the online survey. Exhibit 2 shows samples of the information collected from the
online survey.
2) Each of the 10 business segments separately organizes and compiles the results of the online
survey. They typically compile a robust list of 70-80 potential key risks. Each business segment
then prioritizes their top-5 risks and reports them to Ben Li, resulting in a total of 50 key risks (a
partial sample of the top-50 risk list is shown in Exhibit 3).
3) A consensus meeting is conducted where the 50 risks are shared with the top 10 members of
senior management in an open-group setting at an offsite one-day event. The 50 risks are each
discussed one at a time, after which the facilitator has the group collectively discuss and score
them for likelihood and severity. The risk ranking is calculated as the likelihood score plus the
severity score; the control effectiveness score is used to determine if there is room to improve
the controls and is used in the risk decision making process step. The top-20 risks are identified
as the key risks to CHI and are selected for additional mitigation and advanced to the risk
decision making stage. A Heat Map (see Exhibit 4) is provided to assist in this effort.
4) The 30 risks remaining from the 50 discussed at the consensus meeting are considered the non-
key risks, and these are monitored with key risk indicators to see if, over time, either the
likelihood and/or severity is increasing to the level which would result in one of these being
elevated to a key risk.
Risk Decision Making
Ben Li formed a Risk Committee to look at the risk identification and assessment information and to
define CHI’s risk appetite and risk limits, which were defined as follows:
Risk Appetite
CHI will maintain its overall risk profile in a manner consistent with our mission and vision and with the
expectations of our shareholders.
Risk Limits
CHI will also avoid any individual risk exposures deemed excessive by its Risk Committee; the individual
risk exposures will be determined separately for each key risk. CHI has zero tolerance for risks related to
internal fraud or violations of the employee code of conduct.
2
Ben Li expanded the role of the Risk Committee to also select and implement the risk mitigation for each
of the 20 key risks, at the same time as the committee determines the risk limits. .
1 Figure 1 Picture of Richard Selzer Richard Selz.docxoswald1horne84988
1
Figure 1 Picture of Richard Selzer
Richard Selzer
What I Saw at the Abortion
I am a surgeon. Sick flesh is everyday news. Escaping blood, all the outpourings of
disease, meaty tumors that terrify–I touch these to destroy them. But I do not make symbols of
them.
What I am saying is that I have seen and I am used to seeing. I am a man who has a
trade, who has practiced it long enough to see no news in any of it. Picture me, then. A
professional in his forties, three children, living in a university town—so, necessarily, well—
enlightened? Enough, anyhow. Successful in my work, yes. No overriding religious posture.
Nothing special, then, your routine fellow, trying to do his work and doing it well enough. Picture
me, this professional, a sort of scientist, if you please, in possession of the standard admirable
opinions, positions, convictions, and so on–on this and that matter–on abortion, for example.
All right. Now listen.
It is the western wing of the fourth floor of a great university hospital. I am present
because I asked to be present. I wanted to see what I had never seen: an abortion.
The patient is Jamaican. She lies on the table in that state of notable submissiveness I
have always seen in patients. Now and then she smiles at one of the nurses as though
acknowledging a secret.
A nurse draws down the sheet, lays bare the abdomen. The belly mounds gently in the
twenty-fourth week of pregnancy. The chief surgeon paints it with a sponge soaked in red
antiseptic. He does this three times, each time a fresh sponge. He covers the area with a sterile
sheet, an aperture in its center. He is a kindly man who teaches as he works, who pauses to
reassure the woman.
He begins.
“A little pinprick,” he says to the woman. He inserts the point of a tiny needle at the
midline of the lower portion of her abdomen, on the downslope. He infiltrates local anesthetic into
the skin, where it forms a small white bubble.
The woman grimaces. “That is all you will feel,” the doctor says, “except for a little
pressure. But no more pain.” She smiles again. She seems to relax. She settles comfortably on
the table. The worst is over.
The doctor selects a three-and-one-half-inch needle bearing a central stylet. He places
the point at the site of the previous injection. He aims it straight up and down, perpendicular.
Next he takes hold of her abdomen with his left hand, palming the womb, steadying it. He thrusts
with his right hand. The needle sinks into the abdominal wall.
“Oh,” says the woman quietly.
But I guess it is not pain she feels. It is more a recognition that the deed is being done. Another
thrust and he has speared the uterus.
“We are in,” he says. He has felt the muscular wall of the organ gripping the shaft of his
needle. A further slight pressure on the needle advances it a bit more. He takes his left hand
2
from the woman’s abdomen. He retracts the filament of the stylet from the bar.
1 Films on Africa 1. A star () next to a film i.docxoswald1horne84988
1
Films on Africa
1. A star (*) next to a film indicates that portions of that film might be shown in class in the course of
the semester.
2. All films are in DVD format, unless indicated otherwise.
3. Available: at the Madden and Fresno County Public Libraries, via Netflix, Blackboard or on-line.
4. For the on-line films, you can click on the link and this will lead you directly to the film.
5. Please be advised that a few films have the following notice: Warning: Contains scenes which some
viewers may find disturbing. You decide whether you want to watch them or not.
6. Some films are available on-line via VOD.
7. Let your instructor know if a link is no longer working.
The Africans (9 VHS films – each 60 min or 5 DVDs – each 120 min): Co-
production of WETA-TV and BBC-TV. Presented by Ali A. Mazrui. 1986.
Available at Madden Media & Fresno Public Libraries
Vol. 1 – The Nature of a continent*
Summary: Examines Africa as the birthplace of humankind and discusses
the impact of geography on African history, including the role of the Nile
in the origin of civilization and the introduction of Islam to Africa through its Arabic borders.
Vol. 2 – A Legacy of lifestyles*
Summary: This program explores how African contemporary lifestyles are influenced by
indigenous, Islamic and Western factors. It compares simple African societies with those that
are more complex and centralized, and examines the importance of family life.
Vol. 3 – New gods
Summary: This program examines the factors that influence religion in Africa, paying particular
attention to how traditional religions, Islam, and Christianity co-exist and influence each other.
Vol. 4 – Tools of exploitation
Summary: The impact of the West on Africa and the impact of Africa on the development of the
West are contrasted with an emphasis on the manner in which Africa's human and natural
resources have been exploited before, during, and after the colonial period.
Vol. 5 – New conflicts
Summary: Explores the tensions inherent in the juxtaposition of 3 African heritages, looking at
the ways in which these conflicts have contributed to the rise of the nationalist movement, the
warrior tradition of indigenous Africa, the jihad tradition of Islam, and modern guerilla warfare.
Vol. 6 – In search of stability
Summary: Gives an overview of the several means of governing in Africa. Examines new social
orders to illustrate an Africa in search of a viable form of government in the post-independence
period.
1.
2
Vol. 7 – A Garden of Eden in decay?
Summary: Identifies the problems of a continent that produces what it does not consume and
consumes what it does not produce. Shows Africa's struggle between economic dependence
and decay.
Vol. 8 – A Clash of cultures*
Summary: Discusses the conflicts and compromises which emerge from the coexistence of
many African traditions and modern life. Explores the question of whet.
1 Contemporary Approaches in Management of Risk in .docxoswald1horne84988
1
Contemporary Approaches in Management of Risk in Engineering Organizations
Assignment-1
Literature review
Student name: Hari Kiran Penumudi
student id: 217473484
Table of Contents
2
INTRODUCTION………………………………………………………………………3-4
OBJECTIVES & DELIVERABLES…………………………………………………....4
REVIEW OF LITERATURE…………………………………………………………....5-13
Risk and Risk Management………………………………………………………5-6
Risk Management Frameworks……………………………………………….....6-10
Importance of Risk Management in Engineering………………………….........10-13
GENERAL PROBLEM STATEMENT…………………………………………………13-14
RESEARH STRATEGY…………………………………………………………………14-15
RESOURCES REQUIREMENTS……………………………………………………….16
PROJECT PLANNING…………………………………………………………………..16
REFERNCES…………………………………………………………………………….17-19
Contemporary Approaches in Management of Risk in Engineering Organizations
3
Introduction
The term, ‘risk’ as defined by the Oxford English dictionary is a possibility to meet with any
kind of danger or suffer harm. Risk is a serious issue that every organization has to deal with in
their everyday operations. However, nature and magnitude of risks largely vary from
organization to organization and often depend on the type of the organization. Therefore,
organizations irrespective of their type of operations keep a risk management team that looks
after every risk to which an organization is vulnerable. Organizations in the field of engineering
also have to come across some inherent risks that negatively impact their operations. Engineering
may be defined as the process of applying science to practical purposes of designing structures,
systems, machines and similar things. Therefore, like every other organization, risk assessment
and management is also an integral part of engineering organizations. Since the task of
engineering is mostly complex, the risks in this area are also very complicated. If risks in
engineering field are not mitigated effectively it may produce long-term danger that may affect
both the organizational services and the society in whole. Hence, the activity of risk management
within engineering organizations must be undertaken seriously and measured thoroughly in order
to reduce the threat of risks. Amyotte et al., (2006) simply puts it like within the engineering
practice, an inbuilt risk is always present. Studies have found that despite the knowledge of
inherent risks within the field and activity of engineering, organizations are not very aware in
imparting knowledge about risk management to their engineers. From this the need of education
regarding the risk management approaches arises. Therefore, this paper tries to find out
approaches to management of risks and importance of these approaches within the area of
engineering. Bringing on the contemporary evidence from the literature review related to risk
management approaches, the paper examines how those approaches can be helpful for
4 .
1
Assignment front Sheet
Qualification Unit number and title
Pearson BTEC Levels 4 and 5 Higher
Nationals in Health and Social Care (RQF)
HNHS 17: Effective Reporting and Record-keeping in
Health and Social Care Services
Student name Assessor name Internal Verifier
B. Maher F. Khan
Date issued: Final Submission:
12/10/2018 18/01/2019
Assignment title
Effective Reporting and Record-keeping in Health and Social
Care services
Submission Format
This work will be submitted in 2 different formats:
Assessment 1 should be submitted as a word-processed report document in a standard report
style, which requires the use of headings, titles and appropriate captions. You may also choose
to include pictures, graphs and charts where relevant to support your work. The recommended
word count for this assignment is 1500–2000 words, though you will not be penalised for
exceeding this total.
Assessment 2 requires the submission of evidence from a mock training event on record-
keeping. This will include a set of materials used in the event, to include an electronic
presentation, evidence of your own record-keeping across a range of types of records, as well as
where you will demonstrate you have evaluated the effectiveness of your own completion of
relevant records. The recommended word count for the presentation is 1000–1500 words
(including speaker notes), though you will not be penalised for exceeding this total.
For both assessments, any material that is derived from other sources must be suitably
referenced using a standard form of citation. Provide a bibliography using the Harvard
referencing system.
Unit Learning Outcomes
LO1 Describe the legal and regulatory aspects of reporting and record keeping in a care setting
LO2 Explore the internal and external recording requirements in a care setting
Assignment Brief and Guidance
2
Purpose of this assignment:
The purpose of the assignment is to assess the learner firstly in relation to both the legal and
regulatory aspects of reporting and record keeping in a care setting through producing an internal
evaluative review of record keeping in their own care setting. Secondly, the learner will be
assessed on the internal and external recording requirements in a care setting. Thirdly, the learner
will be assessed on Review the use of technology in reporting and recording service user care in a
care setting and fourthly the learner will demonstrate how to keep and maintain records in own care
setting in line with national and local policies.
Breakdown of assignment:
Assignment:
You need to produce one written piece of work of 2,500 words (+/- 10%) covering all the
assessment criterion in LO1-LO4 as one document.
Unit Learning Outcomes
LO1 Describe the legal and regulatory aspects of reporting and record keeping in a care
setting
LO2 Explore the internal and external recording.
1 BBS300 Empirical Research Methods for Business .docxoswald1horne84988
1
BBS300 Empirical Research Methods for Business
TSA, 2018
Assignment 1
Due: Sunday, 7 October 2018,
23:55 PM
This assignment covers material from Sessions 1-4 and is worth 20% of your total mark
of BBS300. Your solutions should be properly presented, and it is important that you
double-check your spelling and grammar and thoroughly proofread your assignment
before submitting. Instructions for assignment submission are presented in
the “Assignment 1” link and must be strictly adhered to. No marks will be
awarded to assignments that are submitted after the due date and time.
All analyses must be carried out using SPSS, and no marks will be awarded
for assignment questions where SPSS output supporting your answer is not
provided in your Microsoft Word file submitted for the Assignment.
Questions
In this assignment, we will examine the “Real Estate Market” dataset (described at the
end of the assignment ) and “Employee Satisfaction” dataset. Before beginning the
assignment, read through the descriptions of these dataset and their variables carefully.
The “Real Estate Market” dataset can be found in the file “realestatemarket.sav,” and
the “Employee Satisfaction” dataset can be found in the file “employeesatisfaction.sav.”
You will need to carefully inspect both SPSS data files to be sure that the
specification of variable types is correct and, where appropriate, value
labels are entered.
1. (12 marks)
2
Use appropriate graphical displays and measures of centrality and dispersion
to summarise the following four variables in the “Real Estate Market” dataset. For
graphical displays for numeric data, be sure to comment on not only the shape of
the distribution but also compliance with a normal distribution. Be sure to
include relevant SPSS output (graphs, tables) to support your answers.
(a) Price.
(b) Lot Size.
(c) Material.
(d) Condition.
2. (8 marks)
Again consider the variable Price, which records the property price (in AUD). It
is of interest to know if this is associated with the distance of the property is
located to the train station. It i s al so of i nter e st t o kn o w if th e p rop ert y
pri ce s are a sso ciate d with di st an ce to t h e ne ar e st b u s sto p. Carry out
appropriate statistical techniques to assess whether there is a significant
association between the property price and distance to the nearest train (To train)
station and the nearest bus stop (To bus). Be sure to thoroughly assess the
assumptions of your particular analysis, and be sure to include relevant SPSS
output (graphs, tables) to support your answers.
3. (7 marks)
Consider the “Employee Satisfaction” dataset, which asked participants to provide their
level of regularity to a series of thirteen statements. Conduct an appropriate analysis
to assess the reliability of responses to these statements. If the reliability will
increa.
1 ASSIGNMENT 7 C – MERGING DATA FILES IN STATA Do.docxoswald1horne84988
1
ASSIGNMENT 7 C – MERGING DATA FILES IN STATA
Download the world development data covering the years 2000-2016 from the website
“http://databank.worldbank.org/data/reports.aspx?source=World-Governance-Indicators” for the
following upper-middle-income countries.
Countries of Interest:
Albania Ecuador Montenegro
Algeria Equatorial Guinea Namibia
American Samoa Fiji Nauru
Argentina Gabon Panama
Azerbaijan Grenada Paraguay
Belarus Guyana Peru
Belize Iran, Islamic Rep. Romania
Bosnia and Herzegovina Iraq Russian Federation
Botswana Jamaica Samoa
Brazil Kazakhstan Serbia
Bulgaria Lebanon South Africa
China Libya St. Lucia
Colombia Macedonia, FYR St. Vincent and the Grenadines
Costa Rica Malaysia Suriname
Croatia Maldives Thailand
Cuba Marshall Islands Tonga
Dominica Mauritius Turkey
Dominican Republic Mexico Turkmenistan
Tuvalu
Venezuela, RB
Variables of Interest
Control of Corruption: Estimate
Government Effectiveness: Estimate
Political Stability and Absence of Violence/Terrorism:
Estimate
Regulatory Quality: Estimate
Rule of Law: Estimate
Voice and Accountability: Estimate
2
STEP 1 - Download the data from the World-Governance-Indicators database as shown below
STEP 2 - Check the variables of interest
3
Please make sure you are checking the variables with “Estimates”.
TO VIEW THE DEFINITIONS OF THE VARIABLES
4
Step 3 – Select countries of interest
5
Step 4 – Click on “Time” and select the “year range” you are interested in (2000-2016)
6
Step 5 – Click on the “Layout” as shown below
Change the time layout to “Row,” series to “Column” and Country to “Row.”
Next, click on the “apply changes.”
Step 6 – Click on the “Download option” and select “Excel” as shown below
7
STEP 7: Using Excel, Replace the Missing Values With “.” (See previous assignments)
STEP 8: SAVE THE EXCEL DATA FILE ON YOUR COMPUTER PREFERABLY IN A
FOLDER
STEP 9: IMPORT YOUR DATA INTO STATA AND NAME YOUR DATA SET
“WORLD_GOVERNANCE_INDICATORS.” (See previous assignments for steps)
8
STEP 10; RENAME THE VARIABLES AS SHOWN BELOW (See previous assignments for
steps)
Using stata, merge the data set from “ASSIGNMENT 3B” with this dataset
VERY IMPORTANT Note: Merging two datasets requires that both have at least one variable in
common (either string or numeric).
This statement requires that the variable name for “Time” and “Country” should be the same in the two
data set
MERGING THE DATASET FROM “ASSIGNMENT 3” WITH THE DATA FROM THE
WORLD GOVERNANCE INDICATORS
Merging data files in stata
https://www.youtube.com/watch?v=EV-5PztbHs0
https://www.youtube.com/watch?v=Uh7C0mlhB3g&t=54s
https://www.youtube.com/watch?v=2etG_34ODoc
I will strongly encourage you to watch these videos before merging
I will also strongly recommend you read the notes in the link below before you star.
1 Assessment details for ALL students Assessment item.docxoswald1horne84988
1
Assessment details for ALL students
Assessment item 3 - Individual submission
Due date: Week 12 Monday (1 Oct 2018) 11:55 pm AEST
Weighting:
Length:
50% (or 50 marks)
There is no word limit for this report
Objectives
This assessment item relates to the unit learning outcomes as stated in the unit profile.
Enabling objectives
1. Analyse a case study and identify issues associated with the business;
2. Develop and deploy the application in IBM Bluemix;
3. Evaluate existing and new functionalities to address business problems;
4. Prepare a document to report your activities using text and multimedia (for example screenshots, videos).
General Information
The purpose of this assignment is to create a cloud based simulating environment which will help to
identify/understand the problem stated in the given case study using analysis tools available in IBM
Bluemix. In assignment three, you are working individually. By doing this assignment, you will
learn to use skills and knowledge of emerging technologies like cloud computing, IoT, to simulate a
business scenario to capture operational data and share with a visualization tool. You will acquire a
good understanding of smart application design in a cloud environment for efficient application
configuration and deployment.
What do you need to do?
The assignment requires you to do the following -
• Download the ‘Starter_Code_For_Assignment_Three.rar’ given in week 8 to
configure, and deploy a cloud based smart/IoT (Internet of Things) application to
simulate the business case.
• Choose a case study out of given two below and analyse the case study to
understand the business problem and design a solution for those problems.
• Deploy the starter source code in your Bluemix account and modify it to address
all required milestones mentioned in your chosen case study.
• Finally prepare a report according to given format and specifications below and
submit it in Moodle.
2
Report format and specifications -
You are required to submit a written report in a single Microsoft Word (.doc or .docx)
document. There is no word limit but any unnecessary information included in the report
may result in reduced marks.
The report must contain the following content (feel free to define your own sections,
as long as you include all the required content):
o Cover page/title page and Table of contents
o URL of the app and login details of the IBM Bluemix account
o Introduction
o Case study analysis which will report –
o Business problems you have identified in the case study
o Possible solutions for each and how do these solutions address the
business problems?
o What are the solutions you implemented in the application?
o The step by step process you have followed to configure and deploy the smart app
for business case simulation. You may choose to use screenshots and notes to
enrich your report but you must have a video of the pr.
1
CDU APA 6th
Referencing Style Guide
(February 2019 version)
2
Contents
APA Fundamentals
Reference List
Citing in the text
  Paraphrase
  Direct quotes
  Secondary source
  Personal communications
Examples
  Book
  eBook
  Journal article with doi
  Journal article without doi
  Web page
Books - print and online
  Single author
  eBook/electronic book
Journal articles, Conference papers and Newspaper articles
Multimedia
  YouTube or Streaming video
Online images
Web sources and online documents
  Web page
  Document from a website
Legislation and cases
Common abbreviations
Appendix 1: How to write an APA reference when information is missing
Appendix 2: Author layout.
BIOL 102: Lab 9
Simulated ABO and Rh Blood Typing
Objectives:
After completing this laboratory assignment, students will be able to:
• explain the biology of blood typing systems ABO and Rh
• explain the genetics of blood types
• determine the blood types of several patients
Introduction:
Before Karl Landsteiner discovered the ABO human blood groups in 1901, it was thought that all blood was the
same. This misunderstanding led to fatal blood transfusions. Later, in 1940, Landsteiner was part of a team
who discovered another blood group, the Rh blood group system. There are many blood group systems known
today, but the ABO and the Rh blood groups are the most important ones used for blood transfusions. The
designation Rh is derived from the Rhesus monkey in which the existence of the Rh blood group was
discovered.
Although all blood is made of the same basic elements, not all blood is alike. In fact, there are eight different
common blood types, which are determined by the presence or absence of certain antigens – substances that
can trigger an immune response if they are foreign to the body – on the surface of the red blood cells (RBCs
also known as erythrocytes).
ABO System:
The antigens on RBCs are agglutinating antigens or agglutinogens. They have been designated as A and B.
Antibodies against antigens A and B begin to build up in the blood plasma shortly after birth. A person
normally produces antibodies (agglutinins) against those antigens that are not present on his/her erythrocytes
but does not produce antibodies against those antigens that are present on his/her erythrocytes.
• A person who is blood type A will have A antigens on the surface of her/his RBCs and will have
antibodies against B antigens (anti-B antibodies). See picture below.
• A person with blood type B will have B antigens on the surface of her/his RBCs and will have antibodies
against antigen A (anti-A antibodies).
• A person with blood type O will have neither A nor B antigens on the surface of her/his RBCs and has
BOTH anti-A and anti-B antibodies.
• A person with blood type AB will have both A and B antigens on the surface of her/his RBCs and has
neither anti-A nor anti-B antibodies.
The individual’s blood type is based on the antigens (not the antibodies) he/she has. The four blood groups
are known as types A, B, AB, and O. Blood type O, characterized by an absence of A and B agglutinogens, is
the most common in the United States (45% of the population). Type A is the next in frequency, found in 39%
of the population. The incidences of types B and AB are 12% and 4%, respectively.
Table 1: The ABO System

Blood Type | Antigens on RBCs | Antibodies in the Blood   | Can GIVE Blood to Groups | Can RECEIVE Blood from Groups
A          | A                | Anti-B                    | A, AB                    | O, A
B          | B                | Anti-A                    | B, AB                    | O, B
AB         | A and B          | Neither anti-A nor anti-B | AB                       | O, A, B, AB
O          | Neither A nor B  | Both anti-A and anti-B    | O, A, B, AB              | O
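The give/receive columns in Table 1 follow directly from the antigen–antibody rule described above: a transfusion is compatible only when the recipient's plasma contains no antibodies against the donor's RBC antigens. A minimal Python sketch of that rule (purely illustrative; not part of the lab procedure):

    # ABO compatibility: a recipient can receive donor RBCs only if the
    # recipient has no antibodies against the donor's antigens.
    ANTIGENS = {"A": {"A"}, "B": {"B"}, "AB": {"A", "B"}, "O": set()}

    def antibodies(blood_type):
        """A person makes antibodies against every ABO antigen they lack."""
        return {"A", "B"} - ANTIGENS[blood_type]

    def can_receive(recipient, donor):
        """True if none of the donor's antigens meet a recipient antibody."""
        return ANTIGENS[donor].isdisjoint(antibodies(recipient))

    for recipient in ("A", "B", "AB", "O"):
        compatible = [d for d in ("A", "B", "AB", "O") if can_receive(recipient, d)]
        print(recipient, "can receive RBCs from:", compatible)

Running the loop reproduces the "Can RECEIVE Blood from Groups" column of Table 1 (and, read the other way, the "Can GIVE" column).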
Business Intelligence Case
Project Background
Mell Industries is a national manufacturing firm, based in Chicago, that specializes in textiles.
Starting out as a small factory in Warrenville, Illinois, the firm experienced a period of steady
growth over the past twenty-four years, steadily opening new warehouses and factories in the
surrounding areas of Michigan and Indianapolis before eventually moving its base of operations to
Chicago. Because of this expansion, Mell Industries is at the height of its production and hopes to avoid any
interference with, or deceleration of, its growth.
In recent years, the firm has been under heavy media scrutiny for allegedly compensating its
female staff unfairly compared to their male counterparts. The scrutiny began when a disgruntled
employee leaked the company payroll, allegedly showing an unjust income gap between a
female employee and her male counterpart. Gender pay gaps of this kind are highly criticized, so as a
precaution Mell Industries has hired Cal Poly Pomona to conduct research to determine the validity of
these claims. Mell Industries has provided Cal Poly Pomona with a data set of a sample population of
747 employees. Mell Industries has also offered Cal Poly Pomona compensation for any promising
information gathered. Mell Industries may use information gathered from this project in future
employee compensation decisions.
The initial dataset has been given to you as an Excel spreadsheet titled
Case_dataset.xlsx, consisting of 12 columns labeled:
● Column A - Employee ID
● Column B - Gender
● Column C - Date of Birth
● Column D - Date of Hire
● Column E - Termination Date
● Column F - Occupation
● Column G - Salary
● Column H to L - Employee Evaluation Metrics
In addition, Mell Industries provided the latest annual employee performance review evaluation
results rating each employee in various performance categories. They have turned over this information
separately and as a consultant, it is your task to provide Mell Industries with the most accurate and
relevant information in a digestible form. Furthermore, using excel skills learned during the course, you
will manipulate and analyze the data set in order to make appropriate managerial decisions. You will
utilize excel functions highlighted in this project as well as a pivot table and chart to form a decision
support system in order to answer the critical thinking questions.
Project Objective
The purpose of this project is to perform a methodical data analysis to assist the company make
an informed decision. This could also serve as a basis for implementing critical adjustments to certain
business aspects if necessary. Illustrate the business process by condensing a large set of data, to
present relevant information with data visualization. We will be utilizing Microsoft Excel 2016 to
complete this project.
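Although the project itself is completed in Microsoft Excel 2016, the core salary comparison can be sketched equivalently in Python with pandas. The file name and column labels below come from the case description, but the exact spreadsheet headers are an assumption, so treat this only as an illustration of the kind of pivot-table summary expected:

    import pandas as pd

    # Load the sample of 747 employees provided by Mell Industries.
    df = pd.read_excel("Case_dataset.xlsx")

    # Keep current employees only (no termination date) for the pay comparison.
    active = df[df["Termination Date"].isna()]

    # Mean and median salary by occupation and gender -- roughly what an Excel
    # pivot table with Occupation/Gender rows and Salary values would show.
    summary = (active
               .groupby(["Occupation", "Gender"])["Salary"]
               .agg(["count", "mean", "median"])
               .round(2))
    print(summary)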
TA.
Assessing the Validity of Inferences Made from Assessment Results
Sources of Validity Evidence
• Validity evidence can be gathered during the development of
the assessment or after the
assessment has been developed.
• Some of the methods used to gather validity evidence can
support more than one type of
source (e.g., test content, internal structure).
• Large scale assessment and local classroom assessment
developers often use different
methods to gather validity evidence.
o Large scale assessment developers use more formal, objective,
systematic, and
statistical methods to establish validity.
o Teachers use more informal and subjective methods which
often do not involve
the use of statistics.
Evidence based on Test Content
• Questions one is striving to answer when gathering validity
evidence based on test
content or construct:
o Does the content of items that make-up the assessment fully
represent the concept
or construct the assessment is trying to measure?
o Does the assessment accurately represent the major aspects of
the concept or
construct and not include material that is irrelevant to it?
o To what extent do the assessment items represent a larger
domain of the concept
or construct being measured?
• The greater the extent to which an assessment represents all
facets of a given concept or
construct, the better the validity support based on the test
content or construct. There is
no specific statistical test associated with this source of
evidence.
• Methods used to gather validity evidence based on test content
or construct
o Large Scale Assessments
§ Have experts in the concept or construct being measured
create the
assessment items and the assessment itself.
§ Have experts in the concept or construct examine the
assessment and
review it to see how well it measures the concepts or construct.
These
experts would think about the following during the review
process:
§ The extent to which the content of the assessment
represents the content or construct’s domain or
universe.
§ How well the items, tasks, or subparts of the
assessment fit the definition of the construct and/or
the purpose of the assessment.
§ Is the content or construct underrepresented, or are
there content or construct-irrelevant aspects of the
assessment that may result in unfair advantages for
one or more subgroups (e.g., Caucasians, African
Americans)?
§ What is the relevance, importance, clarity, and lack
of bias in the assessment’s items or tasks?
o Local Classroom Assessment
§ Develop assessment blueprints which indicate what will be
assessed as
well as the nature of the learning (e.g., knowledge, application,
etc.) that
should be represented on the assessment.
§ Build a complete set of learning objectives or targets, showing
number of
items and/or percentage of items/questions on the assessment
devoted to
each.
§ Discuss with others (e.g., teachers, administrators, content
experts, etc.)
what constitutes essential understandings and principles.
§ Ask another teacher to review your assessment for clarity and
purpose.
§ Review assessments before using them to make judgments
about whether
the assessment, when considered as a whole, reflects what it
purports to
measure.
§ Review assessments to make judgments about whether items
on the
assessment accurately reflects the manner in which the concepts
were
taught.
§ Have procedures in place to ensure that the nature of the
scoring criteria
reflect important objectives. For example, if students are
learning a
science skill in which a series of steps needs to be performed,
the
assessment task should require them to show their work.
§ When scoring answers, give credit for partially correct answers
when possible.
§ Examine assessment items to see if they are favoring groups
of students
more likely to have useful background knowledge—for instance,
boys or
girls.
Evidence based on Response Processes
• Questions one is striving to answer when gathering validity
evidence based on response
processes:
o Do the assessment takers understand the items on the
assessment to mean what
the assessment developer intends them to mean?
o To what extent do the actions and thought processes of the
assessment takers
demonstrate that they understand the concept or construct in the
same way it is
defined by the assessment developer?
• The greater the extent to which the assessment developer can
be certain that the actions
and thought processes of assessment takers demonstrate that
they understand the concept
or construct in the same way he/she (assessment developer) has
defined it, the greater the
validity support via evidence response processes. There is no
specific statistical test
associated with this source of evidence.
• Methods used to gather validity evidence based on response
processes
o Large Scale assessments
§ Analyses of individuals’ responses to items or tasks via
interviews with
respondents
§ Studies of the similarities and differences in responses
supplied by
various subgroups of respondents
§ Studies of the ways that raters, observers, interviewers, and
judges collect
and interpret data
§ Longitudinal studies of changes in responses to items or tasks
o Local classroom assessment
§ Use different methods to measure the same learning objective.
For
example, one could use teacher observation and quiz
performance. The
closer the results of the two match the greater the validity
support.
§ If the test should be assessing specific cognitive levels of
learning (e.g.,
application), make sure the items reflect this cognitive level.
For example,
essay items would be more appropriate than fill-in-the-blank
items for
measuring application of knowledge.
§ Reflect on whether or not students could answer items on the assessment
without really knowing the content. For example, is the student performing
well because of good test-taking skills and not necessarily because they
know the content?
§ Ask students before or after taking the assessment how they
interpret the
items to make sure it is in line with how you expected them to
interpret the
items. This procedure is called a think aloud and gives one
insight into the
thought processes of the student.
Evidence based on Internal Structure
• Question one is striving to answer when gathering validity
evidence based on internal
structure:
o To what extent are items in a particular assessment measuring
the same thing?
• If there is more than one item (ideally 6 to 8 items) on a test
measuring the same thing
and students’ individual answers to these items are highly
related/correlated, the greater
the validity support via evidence of internal structure. There
are specific statistical tests
associated with this source of evidence, but these statistical
tests are typically used by
large-scale assessment developers and not with local classroom
assessments.
• Methods used to gather validity evidence based on internal
structure
o Large Scale assessments
§ Factor- or cluster analytical studies
§ Analyses of item interrelationships, using item analyses
procedures (e.g.,
item difficulty, etc.)
§ Differential item functioning (DIF) studies
o Local classroom assessments
§ Include in one assessment multiple items (ideally 6 to 8) assessing the
same thing. Don’t rely on a single item. For example, on an assessment
use several items to measure the same skill, concept, principle, or
application. Using 6 to 8 items provides enough consistency to conclude from the
results that students do or do not understand the concept measured by that
set of items. One would expect a student who understands the concept to
perform well on all the items measuring that concept, and as consistency
among the item results increases, so too does the validity support.
Consistency of responses provides good evidence for the validity of the
inference that a student does or does not know the concept.
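At the classroom level, the "consistency across 6 to 8 items" idea can be quantified with inter-item correlations or an internal-consistency index such as Cronbach's alpha. The following Python sketch uses invented 0/1 item scores purely to illustrate the calculation; it is not a required part of the procedure described above.

    import numpy as np

    # Rows = students, columns = six items intended to measure the same concept
    # (1 = correct, 0 = incorrect). Invented data for illustration only.
    scores = np.array([
        [1, 1, 1, 1, 0, 1],
        [1, 1, 0, 1, 1, 1],
        [0, 0, 1, 0, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 1, 0, 0, 0, 1],
        [1, 0, 1, 1, 1, 0],
    ])

    k = scores.shape[1]
    sum_item_var = scores.var(axis=0, ddof=1).sum()   # sum of the item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of the total scores
    alpha = (k / (k - 1)) * (1 - sum_item_var / total_var)
    print(f"Cronbach's alpha for the {k} items: {alpha:.2f}")

The higher the alpha (and the stronger the inter-item correlations), the stronger the internal-structure evidence that the items are measuring the same thing.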
Evidence based on Relations to Other Variables
• Questions one is striving to answer when gathering validity
evidence based on
relationships to other variables.
o To what extent are the results of an assessment related to the
results of a different
assessment that measures the “same” thing?
o To what extent are the results of an assessment related to the
results of a different
assessment that measures a different thing?
• If observed relationships match predicted relationships, then
the evidence supports the
validity of the interpretation. There are specific statistical tests
associated with this
source of evidence, but these statistical tests are typically used
by large assessment
developers and not with local classroom assessments.
• Methods used to gather validity evidence based on
relationships to other variables
o Large Scale assessments
§ Correlational studies of
• the strengths and direction of the relationships between the
measure and external “criterion” variables
• the extent to which scores obtained with the measure predict
external “criterion” variables at a later date
§ Group separation studies, based on decision theory,
• That examines the extent to which a score obtained with an
instrument accurately predicts outcome variables.
§ Convergent validity studies
• That examines the strength and direction of the relationships
between the measure and the other variables that the measure
should, theoretically, have high correlations with.
§ Discriminant validity studies
• That examines the strength and direction of the relationships
between the measure and the other variables that the measure
should, theoretically, have low correlations with.
§ Experimental studies
• That test hypotheses about the effects of intervention on
scores
obtained with an instrument
§ Known-group comparison studies
• That test hypotheses about expected differences in average
scores
across various groups of respondents.
§ Longitudinal studies
• That test hypotheses about expected differences
o Local classroom assessments
o Compare one group of students who are expected to obtain
high scores with
students of another group who are expected to obtain low
scores. One would
expect the scores to match the prior expectations (e.g., students
expected to
score low would actually score low) and the more the results
match the
expectations the greater the validity support.
o Compare scores obtained before instruction with scores
obtained after
instruction. The expectation would be that scores would
increase so increases
from pre to post would provide validity support.
o Compare two measures of the same thing and look for
similarities or
discrepancies between scores. For example, you could compare
homework
and quiz performance and the fewer discrepancies or the more
similarities, the
greater the validity support.
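For the local comparisons just described (high- vs. low-scoring groups, pre- vs. post-instruction, or two measures of the same objective), the degree of match can be summarized with a simple correlation. A short Python sketch with invented homework and quiz scores, shown only as an illustration:

    import numpy as np

    # Invented scores for ten students on two measures of the same objective.
    homework = np.array([72, 85, 90, 60, 78, 95, 55, 88, 70, 82])
    quiz     = np.array([70, 80, 92, 58, 75, 97, 60, 85, 68, 79])

    r = np.corrcoef(homework, quiz)[0, 1]
    print(f"Correlation between homework and quiz scores: r = {r:.2f}")
    # A strong positive r (the observed relationship matching the predicted one)
    # adds validity support; a weak or negative r signals a discrepancy.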
Evidence based on Consequences of Testing
• Questions one is striving to answer when gathering validity
evidence on consequences of
testing:
o What are possible intended and unintended results of having
students take the
assessment?
o Are the intended and unintended results of the assessment
used to make decisions
about the levels of student learning?
• Evidence based on consequences of testing is established by
considering both positive
and negative consequences of engaging in testing and using the
test scores for decision-making purposes. For example, a consequence of
high-stakes testing could be increased use of drill-and-practice
instructional practices, with important subjects that are
not tested being ignored.
• Methods used to gather validity evidence based on
consequences of testing
o Large Scale assessments
§ Longitudinal Studies of the extent to which expected or
anticipated
benefits of testing are realized.
§ Longitudinal Studies of the extent to which unexpected or
unanticipated
negative consequences of testing occur.
o Local Classroom assessments
§ Before giving the assessment, articulate the intended consequences you
expect, such as assessment results that will help you address, through
instruction, areas in which students’ skills are lacking. Also articulate
possible unintended consequences, such as low scores on an assessment
lowering a student’s perceived self-efficacy. Another unintended
consequence of heavy use of objective items could be to encourage
students to learn for recognition, whereas essay items motivate students
to learn in a way that stresses the organization of information.
§ After giving the assessment compare predicted, intended
consequences
with actual consequences.
Evidence based on instruction
• Question one is striving to answer when gathering validity
evidence based on instruction:
o How well does the content taught, or the content the students
have the opportunity to learn, align with what is actually assessed on an
assessment?
• This is only a consideration for local classroom assessment
and not large scale
assessments
o Methods used to gather validity evidence based on instruction
§ The teacher should reflect on how he/she taught the information by
asking questions like: Were my instructional methods appropriate? Did I
spend enough time on each concept? What is the match between what I
taught and what was assessed? Have students had adequate opportunities
to learn what I am assessing? Were the concepts on the assessment
actually taught, and taught well enough, so that students can perform
well and demonstrate their understanding? Answering these questions
should help determine whether performance on the assessment is due to
learning or to other factors such as the format of the assessment,
gender, social desirability, and other possible influences on assessment
performance besides instruction.
Virginia Standards of Learning Assessments Technical Report
2014–2015 Administration Cycle

Table of Contents

List of Tables and Figures
PART I: HISTORICAL OVERVIEW AND SUMMARY OF PROGRAMS
1. INTRODUCTION
2. STUDENT ASSESSMENTS IN VIRGINIA
2.1 Historical Overview of SOL Assessments
2.1.1 High Academic Standards
2.1.2 Tests to Measure Student Progress on the SOL
2.1.3 Accountability for Student Achievement
2.2 Overview of Current Virginia SOL Assessments
2.2.1 Online Testing in Virginia
2.2.2 Computerized Adaptive Testing
2.2.3 Current SOL Assessments
3. DEVELOPMENT OF SOL ASSESSMENTS
3.1 Content Standards, Curriculum Frameworks, and Test Blueprints
3.1.1 Standards of Learning (SOL)
3.1.2 Curriculum Frameworks
3.1.3 Test Blueprints
3.2 Item Development
3.2.1 New Item Development
3.2.2 New Item Content Review Committees
3.3 New Writing Prompt Development
3.3.1 Specifications and Development
3.3.2 New Writing Prompt Review Committees
3.4 Field Testing
3.4.1 Embedded Field Testing of MC and TEIs
3.4.2 Stand-Alone Field Testing of Writing Prompts
3.4.3 Sampling
3.4.4 Data Review Committees
3.4.5 Statistics Reviewed During Data Review Meetings
3.5 Test Construction
3.5.1 Procedures
3.5.2 Test Form Review Committees
3.5.3 Procedures for Computerized Adaptive Administrations
4. TEST ADMINISTRATION
4.1 Training and Materials
4.2 Testing Windows
4.3 Test Security Procedures
4.4 Testing Accommodations
4.4.1 Testing Accommodations for Students with Disabilities
4.4.2 Testing Accommodations for LEP Students
5. WRITING SCORING
5.1 Human Scorer Recruitment and Qualifications
5.2 Rangefinding
5.3 Scorer Training and Qualifying Procedures
5.3.1 Anchor Sets
5.3.2 Practice Sets
5.3.3 Qualifying Sets
…
10.1 Validity Evidence Based on Test Content
10.1.1 Relation to Content Standards
10.1.2 Educator Input on Item Development
10.1.3 SOL Assessment Alignment Studies
10.2 Validity Evidence Based on Response Processes
10.2.1 Item Development
10.2.2 Technology Enhanced Items (TEIs)
10.3 Validity Evidence Based on Internal Structure
10.3.1 Internal Consistency
10.3.2 Differential Item Functioning (DIF)
10.3.3 Unidimensionality Evaluation
10.4 Validity Evidence Based on Relationships to Other Variables
11. ALTERNATE AND ALTERNATIVE ASSESSMENTS
12. RESOURCES
PART II: STATISTICAL SUMMARIES FOR SPRING 2015
1. OVERVIEW OF STATISTICAL SUMMARIES
1.1 Administration Results
1.2 Reliability Estimates for Non-Writing Assessments
1.3 Reliability Estimates for Writing Assessments
1.4 Decision Consistency and Accuracy Indices
1.5 CAT Pool Information
1.6 Raw Score-to-Scale Score Conversion Tables and Conditional SEMs
2. SPRING 2015 STATISTICAL SUMMARY
2.1 Administration Results
2.1.1 Participation by Mode of Administration
2.1.2 Percent of Students in each Performance Level
2.1.3 Scale Score Summary Statistics
2.2 Reliability Estimates for Multiple-Choice/Technology Enhanced Item Assessments
2.2.1 Overall Reliability Estimates
2.2.2 Reliability Estimates by Gender
2.2.3 Reliability Estimates by Ethnic Group
2.3 Reliability Estimates for Writing Assessments
2.3.1 Stratified Alpha
2.3.2 Inter-Rater Reliability
2.4 Decision Consistency and Accuracy Indices
2.5 CAT Pool Information
2.5.1 Number of Items in each CAT Pool Overall and by Reporting Category
2.5.2 Rasch Item Difficulty Distribution Overall and by Reporting Category
2.5.3 Item Exposure Rates
2.5.4 Pool Usage
2.5.5 On Target Rates for Meeting Content and Statistical Criteria
2.5.6 CSEM
2.6 Raw Score to Scale Score (RSSS) Conversion Tables and Conditional SEM
REFERENCES
List of Tables and Figures
PART I: HISTORICAL OVERVIEW AND SUMMARY OF PROGRAMS
Table 2.2.3.1 Virginia Standards of Learning Assessments at Each Grade Level
Table 7.2.1 Virginia SOL Standard Setting Schedule
Figure 8.1.1 Sample Item Characteristic Curve for a Dichotomous Item
Figure 8.1.2 Sample Category Response Curves for a Polytomous Item
Figure 8.3.3.1 Linking Process for the SOL Assessments
PART II: STATISTICAL SUMMARIES FOR SPRING 2015
Table 2.1.1.1 Percentage of Tests Taken by Mode: Grades 3–8
Table 2.1.1.2 Percentage of Tests Taken by Mode: Content-Specific History
Table 2.1.1.3 Percentage of Tests Taken by Mode: End-of-Course
Table 2.1.1.4 Percentage of Tests Taken by Mode: VMAST Mathematics and Reading
Table 2.1.2.1 Grades 3–8 Passing Rates
Table 2.1.2.2 Content-Specific History Passing Rates
Table 2.1.2.3 End-of-Course Passing Rates
Table 2.1.2.4 VMAST Grades 3–8 & End-of-Course Passing Rates
Table 2.1.3.1 Summary Statistics for Grades 3–8 Reading Online
Table 2.1.3.2 Summary Statistics for Grade 6 Mathematics Online
Table 2.1.3.3 Summary Statistics for Grades 5 and 8 Science Online
Table 2.1.3.4 Summary Statistics for High School End-of-Course Online
Table 2.1.3.5 Summary Statistics for Grade 8 and End-of-Course Writing Tests
Table 2.2.1.1 Reliability for Grades 3–8 Reading
Table 2.2.1.2 Reliability for Grade 6 Mathematics
Table 2.2.1.3 Reliability for Grades 5 and 8 Science
Table 2.2.1.4 Reliability for High School End-of-Course Tests
Table 2.2.2.1 Reliability for Grades 3–8 Reading by Gender
Table 2.2.2.2 Reliability for Grade 6 Mathematics by Gender
Table 2.2.2.3 Reliability for Grades 5 and 8 Science by Gender
Table 2.2.2.4 Reliability for High School End-of-Course Tests by Gender
Table 2.2.3.1 Reliability for Grades 3–8 Reading by Ethnic Group
Table 2.2.3.2 Reliability for Grade 6 Mathematics by Ethnic Group
Table 2.2.3.3 Reliability for Grades 5 and 8 Science by Ethnic Group
Table 2.2.3.4 Reliability for High School End-of-Course Tests by Ethnic Group
Table 2.3.1.1 Reliability for Grade 8 and End-of-Course Writing
Table 2.3.1.2 Reliability for Grade 8 and End-of-Course Writing by Gender
Table 2.3.1.3 Reliability for Grade 8 and End-of-Course Writing by Ethnic Group
Table 2.3.2.1 Inter-Rater Reliability for the Grade 8 Writing Assessment: Prompts 2809, 2813, 2816, 2817, 2853, and 2854
Table 2.3.2.2 Inter-Rater Reliability for the End-of-Course Writing Assessment: Prompts 2101, 2103, 2106, 2108, 2116, and 2124
Table 2.4.1 Decision Consistency and Accuracy Indices for Grades 3–8 Reading Online Tests
Table 2.4.2 Decision Consistency and Accuracy Indices for Grade 6 Mathematics Online Test
Table 2.4.3 Decision Consistency and Accuracy Indices for Grades 5 and 8 Science Online Tests
Table 2.4.4 Decision Consistency and Accuracy Indices for High School End-of-Course Online Tests
Table 2.4.5 Decision Consistency and Accuracy Indices for Grade 8 and End-of-Course Writing Tests
Table 2.5.1.1 Number of Items in the Mathematics CAT Pools Overall and by Reporting Category
Table 2.5.2.1 Rasch Item Difficulty for Items in the Mathematics CAT SOL Pool Overall and by Reporting Category
Table 2.5.2.2 Rasch Item Difficulty for Items in the Mathematics CAT Plain English Pool Overall and by Reporting Category
Table 2.5.2.3 Rasch Item Difficulty for Items in the Mathematics CAT SOL Audio Pool Overall and by Reporting Category
Table 2.5.2.4 Rasch Item Difficulty for Items in the Mathematics CAT Plain English Audio Pool Overall and by Reporting Category
Figure 2.5.2.1 Rasch Item Difficulty Distribution for All Items in the Grade 6 Mathematics CAT SOL Pool
Figure 2.5.2.2 Rasch Item Difficulty Distribution for All Items in the Grade 6 Mathematics CAT Plain English Pool
Figure 2.5.2.3 Rasch Item Difficulty Distribution for All Items in the Grade 6 Mathematics CAT Audio Pool
Figure 2.5.2.4 Rasch Item Difficulty Distribution for All Items in the Grade 6 Mathematics CAT Plain English Audio Pool
Table 2.5.3.1 Maximum Item Exposure Rate by Mathematics CAT Pool
Figure 2.5.3.1 Item Exposure Rates for Grade 6 Mathematics CAT Pools
Table 2.5.4.1 Percentage of Items Used within each Mathematics CAT Pool
Table 2.5.6.1 Average CSEM for each Mathematics CAT Pool
Table 2.5.6.2 Average CSEM at the Basic, Proficient, and Advanced Cut Scores for each Mathematics CAT Pool
Figure 2.5.6.1 Average CSEM across the Score Scale for the Grade 6 Mathematics SOL CAT Pool
Figure 2.5.6.2 Average CSEM across the Score Scale for the Grade 6 Mathematics Plain English CAT Pool
Figure 2.5.6.3 Average CSEM across the Score Scale for the Grade 6 Mathematics SOL Audio CAT Pool
Figure 2.5.6.4 Average CSEM across the Score Scale for the Grade 6 Mathematics Plain English Audio CAT Pool
Table 2.6.1 RSSS Conversions for Grade 3 Reading
Table 2.6.2 RSSS Conversions for Grade 4 Reading
Table 2.6.3 RSSS Conversions for Grade 5 Reading
Table 2.6.4 RSSS Conversions for Grade 5 Science
Table 2.6.5 RSSS Conversions for Grade 6 Reading
Table 2.6.6 RSSS Conversions for Grade 7 Reading
Table 2.6.7 RSSS Conversions for Grade 8 Reading
Table 2.6.8 RSSS Conversions for Grade 8 Science
Table 2.6.9 RSSS Conversions for EOC Reading
Table 2.6.10 RSSS Conversions for EOC Earth Science
Table 2.6.11 RSSS Conversions for EOC Biology
Table 2.6.12 RSSS Conversions for EOC Chemistry
Table 2.6.13 RSSS Conversions for EOC Algebra I
Table 2.6.14 RSSS Conversions for EOC Geometry
Table 2.6.15 RSSS Conversions for EOC Algebra II
Table 2.6.16 RSSS Conversions for EOC World Geography
Table 2.6.17 RSSS Conversions for Grade 8 Writing Core 1
Table 2.6.18 RSSS Conversions for Grade 8 Writing Core 2
Table 2.6.19 RSSS Conversions for Grade 8 Writing Core 3
Table 2.6.20 RSSS Conversions for EOC Writing Core 1
Table 2.6.21 RSSS Conversions for EOC Writing Core 2
Table 2.6.22 RSSS Conversions for EOC Writing Core 3
PART I: HISTORICAL OVERVIEW AND SUMMARY OF
PROGRAMS
1. INTRODUCTION
The Virginia Standards of Learning (SOL) Assessment Program
Technical Report provides
information for users and other interested parties about the
development and technical
characteristics of the assessments within the Virginia
Assessment Program. The SOL technical
report is divided into two parts. Part I presents a summary of
the components of the Virginia
SOL assessment program from the 2014–2015 administration
cycle. Part II provides statistical
information based on results from spring 2015.
2. STUDENT ASSESSMENTS IN VIRGINIA
2.1 Historical Overview of SOL Assessments
In 1994, Virginia initiated significant reform of its K–12
educational system. This reform, which
has evolved over the last 20 years, consists of several major
elements discussed in the following
sections: high academic standards, tests to measure progress,
and accountability.
2.1.1 High Academic Standards
In 1995, the Virginia Board of Education adopted a set of
statewide standards: the Virginia SOL.
The Virginia SOL set forth minimum learning standards for
every child from K–12 in English,
mathematics, science, and history/social science. Over time, the
SOL were expanded to include
the areas of family life, economics and personal finance, fine
arts, foreign language, computer
technology, health and physical education, and driver education.
The board recognized the need for regular review and
evaluation of the SOL; therefore, in
September 2000, it approved a cyclical schedule for the review
of the standards. This has
resulted in each subject area undergoing a review and potential
revision every seven years.¹
2.1.2 Tests to Measure Student Progress on the SOL
Development of tests to measure the SOL began in 1996 with
heavy involvement of classroom
teachers, curriculum specialists, and other local educators
throughout Virginia. A statewide
census field test of the new SOL test items took place in the
spring of 1997. The first
administration of SOL tests took place in the spring of 1998,
and the program has expanded
significantly since that time.
The SOL assessment program is the cornerstone of Virginia’s system of accountability for public
schools and is authorized in Virginia law and administrative rules (see Article I, Section 15 and
Article VIII of the Constitution of Virginia and Title 22.1 Chapter 13.2 § 22.1-253.13:3C, Code
of Virginia). The purposes of the assessment program are to
• … SOL for Virginia public school students;
• …ndicates the progress of students and schools toward meeting achievement levels on the SOL;
• … programs; and
• …
¹ The review cycle can be accessed at the following website:
http://www.doe.virginia.gov/testing/assessment_committees/review_schedule.pdf
The federally enacted No Child Left Behind Act of 2001 (NCLB) reinforced many strategies
already present in Virginia’s public education system. For a
number of years, public educators
throughout the commonwealth have focused on instructional
standards, student assessment,
reporting of results, and continuous improvement. To respond to
NCLB, Virginia has maintained
its rigorous academic content standards, measuring students
against defined academic
performance standards, added grade-level assessments in
various subjects, and reported on the
progress of student subgroups at the school, the division, and
the state levels. The Virginia
Assessment Program has been used to meet state and federal
educational requirements including:
• …s and schools toward meeting established achievement levels; and
• …ing accountability information for school, school division, and state levels.
2.1.3 Accountability for Student Achievement
The Standards of Accreditation (SOA) for Virginia’s public
schools outlines the state
requirements for student testing and graduation, as well as the
requirements for the accreditation
of schools in the commonwealth. The SOA may be found on the
website of the Virginia
Department of Education:
http://www.doe.virginia.gov/boe/accreditation/.
2.2 Overview of Current Virginia SOL Assessments
The Virginia SOL assessments are standards-based tests
designed to measure student
performance on Virginia’s content standards in the areas of
reading, writing, mathematics,
science, and history/social science. The SOL tests contain primarily multiple-choice items;
however, the mathematics, English, and science assessments also include technology-enhanced
items (TEI).² TEIs are developed in a variety of formats that allow students
to indicate their
responses in ways other than the multiple-choice format. The
writing tests administered at grade
8 and high school include writing prompts in addition to
multiple-choice and TEI items.
² Technology-enhanced item (TEI) types were used operationally on all assessments with the
exception of paper-pencil tests and History/social science forms.
2.2.1 Online Testing in Virginia
In the 2000 session of the general assembly, legislation was
passed that required and funded a
statewide web-based technology initiative. The goal of this
initiative was for Virginia school
divisions to implement online, web-based SOL instruction,
remediation, and testing in Virginia’s
high schools. The initiative provided funding for school
divisions to purchase hardware,
software, and to upgrade network and Internet capabilities.
Because the initial focus of the project was Virginia’s high
schools, the online testing initiative
began with the end-of-course (EOC) SOL tests. The first online
EOC tests were administered in
fall 2001. Since that time, additional SOL tests have been moved to the web-based delivery
system in a phased approach so that all tests are now available in the online system.
2.2.2 Computerized Adaptive Testing
As part of the continuing effort to provide students with the
best possible testing experience,
Virginia added computerized adaptive testing (CAT) to the
Virginia Standards of Learning
Assessment Program in grade 6 mathematics during the 2014–
2015 school year. The Virginia
Department of Education plans to phase in CAT for math and reading assessments in grades 3–8
during the next few years. For additional information about
Virginia’s implementation of CAT,
refer to the Virginia Standards of Learning Computerized
Adaptive Testing Technical Manual.
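Although the report defers the details to the CAT technical manual, the general idea behind adaptive administration can be sketched: after each response the ability estimate is updated and the next item is chosen so that its Rasch difficulty is close to the current estimate. The Python sketch below is a generic illustration under the Rasch model with invented item difficulties and a crude ability update; it is not Virginia's actual item-selection algorithm.

    import math
    import random

    def rasch_p(theta, b):
        """Probability of a correct response under the Rasch model."""
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    def next_item(theta, pool, used):
        """Pick the unused item whose difficulty is closest to the ability estimate."""
        candidates = [i for i in range(len(pool)) if i not in used]
        return min(candidates, key=lambda i: abs(pool[i] - theta))

    pool = [-2.0, -1.2, -0.5, 0.0, 0.4, 1.1, 1.8, 2.5]   # invented Rasch difficulties
    theta, used = 0.0, set()
    for _ in range(5):
        i = next_item(theta, pool, used)
        used.add(i)
        correct = random.random() < rasch_p(0.7, pool[i])  # simulated examinee with true ability 0.7
        theta += 0.5 if correct else -0.5                   # crude fixed-step update, not a real MLE/EAP
        print(f"item b={pool[i]:+.1f}  correct={correct}  theta={theta:+.1f}")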
2.2.3 Current SOL Assessments
In 2014–2015, students in grades 3–8 and high school were
tested using SOL assessments in the
content areas listed in Table 2.2.1. The Virginia General
Assembly eliminated five SOL
assessments beginning with the 2014–2015 academic year:
Grade 3 Science, Grade 3 History,
Grade 5 Writing, US History to 1865, and US History 1865 to
Present.
High school tests were designed to address specific course
content, regardless of the student’s
current enrolled grade. The content-specific history assessments
are not grade-level dependent
and are typically taken in the upper elementary or middle school
years.
Table 2.2.3.1 Virginia Standards of Learning Assessments at Each Grade Level

SOL Content Area          | Grade 3 | Grade 4 | Grade 5 | Grade 6 | Grade 7 | Grade 8 | Content-Specific History | High School
English: Reading          |    •    |    •    |    •    |    •    |    •    |    •    |                          |      •
English: Writing          |         |         |         |         |         |    •    |                          |      •
Mathematics               |    •    |    •    |    •    |    •    |    •    |    •    |                          |
Science                   |         |         |    •    |         |         |    •    |                          |
Algebra I                 |         |         |         |         |         |         |                          |      •
Geometry                  |         |         |         |         |         |         |                          |      •
Algebra II                |         |         |         |         |         |         |                          |      •
Virginia and U.S. History |         |         |         |         |         |         |                          |      •
World History I           |         |         |         |         |         |         |                          |      •
World History II          |         |         |         |         |         |         |                          |      •
World Geography           |         |         |         |         |         |         |                          |      •
Earth Science             |         |         |         |         |         |         |                          |      •
Biology                   |         |         |         |         |         |         |                          |      •
Chemistry                 |         |         |         |         |         |         |                          |      •
Virginia Studies          |         |         |         |         |         |         |            •             |
Civics and Economics      |         |         |         |         |         |         |            •             |
3. DEVELOPMENT OF SOL ASSESSMENTS
The Virginia Department of Education works jointly with
Virginia educators and its testing
contractor to develop a series of tests to measure student
achievement on the SOL content
standards. The development of the SOL assessments involves
the use of test blueprints, item
development specifications, multiple review committees, and
field testing.
3.1 Content Standards, Curriculum Frameworks, and Test
Blueprints
3.1.1 Standards of Learning (SOL)
The SOL represent a broad consensus of what parents,
classroom teachers, and school
administrators—as well as academic, business, and community
leaders—believe schools should
teach and students should learn. In each of the four core areas
of English, mathematics, science,
and history/social science, a curriculum framework is provided
that details the specific
knowledge and skills students must possess to meet the content
standards for these subjects. The
SOL are reviewed and updated on a seven-year cycle.
3.1.2 Curriculum Frameworks
The SOL Curriculum Frameworks[3]
amplify the SOL and provide additional guidance to school
divisions and their teachers as they develop an instructional
program appropriate for their
students. The curriculum frameworks assist teachers as they
plan their lessons by identifying the
essential knowledge and skills students need to learn.
School divisions use the curriculum frameworks as a resource
for developing sound curricular
and instructional programs, but the curriculum frameworks are
not intended to limit the scope of
instructional programs. Additional knowledge and skills that
can enrich instruction and enhance
students’ understanding of the content identified in the SOL
should be included as part of quality
learning experiences.
[3] The curriculum frameworks and test blueprints can be accessed at the following website: http://www.doe.virginia.gov/testing/
3.1.3 Test Blueprints
The SOL test blueprint[3] serves as a guide for test construction. Each test covers a
number of
SOL. In the test blueprint, SOL are grouped into categories that
address related content or skills.
These categories are called reporting categories. When the
results of the SOL tests are reported,
the scores will be presented in terms of scores for each
reporting category and a total test score.
Each SOL is assigned to only one reporting category.
The number of test items that will be assessed in each reporting
category, as well as on the test as
a whole, can be found in the test blueprint. Because of the large
number of SOL in each grade-
level content area, not every SOL will be assessed on every
version (form) of an SOL test. By
necessity, to keep the length of a test reasonable, each test will
sample from the SOL within a
reporting category. However, every SOL is eligible for
inclusion on each form of an SOL test.
In some content areas, there are SOL that do not lend
themselves to assessment within the
current format of the SOL tests. The SOL not tested are listed as
“Excluded from Testing” at the
end of the blueprint for each test.
There is a specific blueprint for each test. Each blueprint
contains three components relevant to
each SOL test: general test information, a blueprint summary
table, and the expanded blueprint.
The general test information section provides information about
the following topics:
• … reporting categories;
A summary table of the blueprint displays the following
information:
• the number of test items in each reporting category;
• the number of field-test items on the test; and
• the total number of items (operational and field-test items) on the test.
The expanded blueprint provides full text for each SOL. In
addition, SOL that are excluded from
the test are categorized by the reason they were not included.
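To make the blueprint structure concrete, the following minimal Python sketch represents a blueprint summary table as a simple data structure and checks that the per-category counts add up. The test name, category names, and counts are invented for illustration and are not taken from an actual SOL blueprint.

    # Hypothetical blueprint summary: reporting categories with operational
    # item counts, plus embedded field-test items. All names/counts invented.
    blueprint = {
        "test": "Illustrative Grade 5 Mathematics",
        "reporting_categories": {
            "Number and Number Sense": 10,
            "Computation and Estimation": 12,
            "Measurement and Geometry": 14,
            "Probability, Statistics, Patterns, and Algebra": 14,
        },
        "field_test_items": 10,
    }

    operational_total = sum(blueprint["reporting_categories"].values())
    total_items_on_form = operational_total + blueprint["field_test_items"]
    print(f"Operational items: {operational_total}")
    print(f"Total items on a form: {total_items_on_form}")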
For grade 6 mathematics, the blueprint also contains
information for two types of tests, the
online CAT and the traditional test. Because tests administered
with CAT tailor the items to each
student’s ability level, fewer items are needed to accurately
measure student knowledge and
skills. Therefore, the numbers of SOL items per reporting
category and at the total test level,
reflected in the blueprint, are smaller for tests administered using
CAT compared to tests
administered non-adaptively.
3.2 Item Development
3.2.1 New Item Development
Each year Virginia’s assessment contractor conducts an
evaluation of the items available for
traditional test form construction and CAT. Item pool
distributions resulting from this
evaluation map the counts of items by SOL, by item type, by
Rasch item difficulty estimates, and
by cognitive complexity level. Based on the evaluation of
existing items, an item development
plan is developed for each test. Following approval of the item
development plans by the
Virginia Department of Education, new items are developed to
address the needs identified by
the evaluation. All items assess content specified by the SOL
and further defined in the
associated curriculum frameworks.
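As a rough sketch of the kind of item-pool distribution described above, the Python fragment below tabulates a pool of item records by SOL, item type, cognitive complexity, and binned Rasch difficulty. The field names and the three sample records are hypothetical, not actual pool data.

    from collections import Counter

    # Hypothetical item records mirroring the attributes named in the text.
    item_pool = [
        {"sol": "5.4", "type": "MC",  "rasch_b": -0.42, "complexity": 2},
        {"sol": "5.4", "type": "TEI", "rasch_b":  0.10, "complexity": 3},
        {"sol": "5.7", "type": "MC",  "rasch_b":  0.85, "complexity": 1},
        # ...one record per item in the pool
    ]

    def difficulty_bin(b):
        # Coarse bins for Rasch difficulty estimates.
        if b < -0.5:
            return "easy"
        return "medium" if b <= 0.5 else "hard"

    print(Counter(item["sol"] for item in item_pool))
    print(Counter(item["type"] for item in item_pool))
    print(Counter(item["complexity"] for item in item_pool))
    print(Counter(difficulty_bin(item["rasch_b"]) for item in item_pool))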
The item development process is multi-phased and involves a
variety of expert groups. Item
writers external to the testing vendors’ staff are trained on
requirements for the SOL assessment
program. Item writers author items in accordance with item
development plan-driven
assignments and the SOL item specifications. Item writers are
experienced in item authoring for
K-12 statewide assessments and have teaching experience in
their assigned subject matter and
grade span. Items are developed for the primary presentation
mode of online delivery.
Testing contractors’ content/assessment specialists review and
edit newly submitted items for
content accuracy and grade-level appropriateness and for
adherence to principles for quality item
construction, accessibility (i.e., universal design[4]), and fairness (e.g., bias, sensitivity, and limited
English proficiency). Content/assessment specialists are usually
former teachers in their
designated subject matter and grade span. Each item is coded
for the SOL it is intended to
measure. Items are reviewed and edited to ensure the annual
batch of new items meets expected
distributions of item difficulty and cognitive complexity levels
as required by the SOL being
assessed.
In addition to the initial review by the contractor’s assessment
specialists, there is a series of
internal item reviews involving different staff expertise. These
reviews include content reviews,
a professional editorial review, and a fairness review.
Additional guidance and feedback are
provided regarding the appropriateness of the content match to
the SOL and adherence to item
specifications through Virginia content review committee
meetings, as well as reviews
completed by Virginia Department of Education following these
meetings.
[4] The application of the principles of universal design to assessments entails a blend of good test design, consideration of as many users as possible, assistive technology where appropriate, and appropriate visual design (Dolan & Hall, 2001).
Item specifications are determined by the Virginia Department
of Education for appropriate
assessment of the SOL.
3.2.2 New Item Content Review Committees
On an annual basis, Virginia educators from across the state
participate in the development of the
SOL assessments. Every summer, content review committees
convene to review content
materials for the SOL program. Content committees are
composed primarily of educators
teaching the subject of the test, including special education
teachers. A small number of
committee members may be division curriculum staff or other
school division employees. They
represent all grade levels—grade 3 through high school—all
content areas, and the racial/ethnic
diversity of Virginia students. Committee members also
represent a geographical cross section of
Virginia. Approximately one-third of the members of every
committee are new each year in
order to provide a balance of experienced educators and new
members and to bring new
perspectives into committee meetings. The committee members
review the newly developed test
items to confirm that they appropriately and fairly measure
student knowledge and skills in
accordance with the SOL and curriculum frameworks.
The committee members receive an orientation to the SOL
assessment program, an overview of
the test development process, and information about their roles.
Training focuses on educators
making judgments about the match of content to SOL, the
appropriateness of the content for the
grade level, and fairness and accessibility issues. Committees
meet separately by grade level and
subject. In addition to reviewing the match of the items to the
SOL, content review committee
members also identify and note their concerns regarding
potential item bias in the areas of
gender, racial/ethnic, religious, socioeconomic, and regional
characteristics. Additionally, item
bias concerns regarding students with disabilities and limited
English proficiency (LEP) may be
noted. Following discussion, the committee as a whole
recommends that an item be accepted,
edited, or rejected. Each committee member is also provided an
individual comment (input)
form. While committee recommendations are made by
consensus, committee members also
record their individual recommendations, which may differ from
the committee consensus, on the
comment forms. All recommendations are tallied, and all
comments are compiled into a master
document that becomes an official record of the committee
review. Only after committee
recommendations are counted and comments recorded, is the
final decision about an item made.
As a result of the new item review process, some items are
eliminated from the prospective field-
test set, while others are edited in the manner directed for field
testing.
3.3 New Writing Prompt Development
3.3.1 Specifications and Development
Writing prompts are used to assess students’ writing skills on
the SOL writing tests in grade 8
and EOC. New writing prompts are developed and field tested
every four or five years as needed
to support test construction. English language arts content
specialists and item writers draft large
numbers of potential writing prompts. Each writing prompt
adheres to SOL specifications and is
written in the form of a question, an issue, or a hypothetical
situation.
3.3.2 New Writing Prompt Review Committees
As needed, the summer writing content review committees are
asked to provide input on new
writing prompts including evaluating the prompt’s clarity,
appropriateness for the SOL,
similarity to prior prompt topics,[5]
and perceived ability of the prompt to elicit an extended
written student response. The review process is similar to that
used for the review of new MC
and TEIs. The committee as a whole provides a consensus
recommendation, with individual
members’ comments captured on prompt comment forms. Based
on committee feedback, edits
may be made to the prompts prior to field testing.
3.4 Field Testing
Once items and prompts have been developed, reviewed, and
approved by the content review
committees and the Virginia Department of Education, they are
eligible for inclusion on a field
test.
3.4.1 Embedded Field Testing of MC and TEIs
Field-test items are embedded within the spring test forms in
such a way that they appear
throughout the operational test form and are not identifiable to
students. This allows for the
collection of data on the new items that is not impacted by
motivation, as might occur if the
students knew that the new items did not contribute to their
score.
The position of the field-test items is pre-determined for each
core form, as is the number of
field-test variations per core. Each form has the same number of
field-test items to keep the test
length consistent. The number of field-test forms is determined
based on how many new items
need to be field tested in a given year.
For tests administered using CAT (i.e., grade 6 mathematics),
students are randomly assigned a
set of field-test items. These field-test items are not adaptive,
but are constructed similarly to the
field-test sets used for the traditional core forms. The field-test
items are interspersed throughout
the CAT test so that students do not know when a particular
item is a field-test item or an
operational item.
In the fall and summer administrations where the population
taking the test is not representative
of the state’s student population, place-holder items are
included in the field-test positions to
maintain consistent test lengths. These items do not contribute
to a student’s score, nor are the
data used to update item statistics.
3.4.2 Stand-Alone Field Testing of Writing Prompts
For writing tests, new prompts are field tested as needed using a
separate, stand-alone field-test
administration. Typically, new writing prompts are developed
and field tested every four to five
years. The last stand-alone field test for writing occurred during
the 2011–2012 administration.
[5] New prompts are compared to old prompt pools to make sure that the same topic is not used again.
3.4.3 Sampling
During each spring test administration, test forms are
distributed throughout the commonwealth
in a way that will facilitate timely equating and the collection
of representative field-test data.
The manner in which test forms are distributed across the
school divisions is called the sampling
plan. The sampling procedures are based on data files
containing participation counts that
schools submit prior to the spring administration. These files
indicate the number of students in
each school who will take each test online or in paper-and-
pencil format. In conjunction with the
participation counts, the school division’s graduation date and
the date that school closes for the
year are considered when assigning test forms in the sampling
plan.
An attempt is made to assign test forms to divisions in such a
way that approximately equal
numbers of students respond to each field-test variation across
the cores.[6] Also, test forms are
assigned at the school division level so that all schools are
administered the same core for a
given test. The core that is assigned to a division by the above
process is labeled the Main form
for that division. Each division is also assigned an alternate
form. The alternate form is utilized
in retesting students in the case of a testing irregularity. For
instance, an administrator may need
to assign a different test form if the student becomes ill during a
test, or if there is a disruption
that prevents the student from completing the test.
The MC/TEI section of the writing tests is assigned to divisions
in the same way as the non-
writing tests. In addition, there are six to seven writing prompts
that are administered each
spring. Of the six to seven prompts, four or five are new writing
prompts that must be equated;
the other prompts have been equated during a previous
administration. In order to obtain enough
data to calibrate each new prompt for equating purposes, the
new prompts are randomly assigned
across the divisions.
The sampling process described above is not needed for CAT
tests because a different equating
design is used. This design is called pre-equating (see Section 8
for more information). Pre-
equating does not require additional student data, so students
can respond to any set of items in
the CAT pool and still receive a score report quickly. In
addition, students are administered
different sets of operational items, and the set of items
administered to a student during a first
attempt is blocked for that student during a second attempt. This
process provides a similar
solution to the main and alternate forms available for non-
adaptive administrations. Also, sets of
field-test items are randomly assigned to students during the
administration process. Therefore,
by design, approximately equal numbers of students respond to
each field-test item.
3.4.4 Data Review Committees
In addition to reviewing new items during the summer item
content review meetings, Virginia
educators review field-tested items. During the data review
meeting, committee members are
asked to review the appropriateness of items’ content, using the
field-test item statistics to inform
their judgments, as appropriate. During the data review meeting,
committees recommend
accepting or rejecting items. As with new item review, comment
(input) forms are the official
record of committee activity.
[6] Each core generally contains multiple forms. The core and linking items are identical across each form, but the embedded field-test items vary.
TEIs also reviews writing prompt
results (when available). As part of the training for the prompt
reviews, committee members are
provided with information on the scoring rubrics. The Virginia
SOL Writing Field Test Prompt
Evaluation Form, which Virginia’s testing contractor completes
as an evaluation of the prompts,
is also provided to the committee. This form is a hybrid of
qualitative and quantitative
information. During the scoring process for field-tested
prompts, scorers and team leaders record
their observations about the student responses to each prompt.
Team leaders then compile a
qualitative report that addresses the following questions:
• … do?
• … and provide specific information and details?
• … responses to the prompt, … recommend that this prompt be used for live testing?
The report also includes the following pieces of information for
each prompt:
• … the students’ written responses
Committee members review the prompt and responses to
ascertain whether the prompt actually
elicited responses that are complete and well elaborated.
Members also review the prompt itself
for appropriate content and to ensure fairness for all students. A
prompt that elicits responses that
are similar to lists, or a prompt that seems to confuse students is
considered to be poorly
performing and is usually recommended for rejection. In some
circumstances, a prompt will be
recommended for field testing again at a different grade level.
Feedback and comments from
committee members are added to the final report.
3.4.5 Statistics Reviewed During Data Review Meetings
For the purpose of reviewing the quality of new test items,
reviewers are provided with various
data to assist them in decision-making. These data include
classical statistics and item response
theory (IRT) statistics (Rasch measurement model).
The classical statistics calculated for the MC items/TEIs include
• … overall and by gender and ethnic group (African American, Caucasian, Hispanic, and Asian);
• item difficulty (p-values);
• by-option response distributions for all respondents by gender and ethnic group; and
• point-biserial correlations.
Classical statistics computed for field-tested writing prompts
include
• … overall and by gender and ethnic group (African American, Caucasian, Hispanic, and Asian); and
• … the writing domain raw and total scores.
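To illustrate two of the classical statistics listed above, the sketch below computes an item's p-value (proportion correct) and its point-biserial correlation with the total test score from a tiny, made-up matrix of scored 0/1 responses; it is a generic illustration, not the contractor's analysis code.

    from math import sqrt
    from statistics import pstdev

    # Made-up scored responses: rows are students, columns are items (1 = correct).
    responses = [
        [1, 1, 0, 1],
        [1, 0, 0, 1],
        [0, 1, 1, 1],
        [1, 1, 1, 0],
        [0, 0, 0, 1],
        [1, 1, 1, 1],
    ]

    def p_value(j):
        # Item difficulty: proportion of students answering item j correctly.
        col = [row[j] for row in responses]
        return sum(col) / len(col)

    def point_biserial(j):
        # Correlation between the 0/1 score on item j and the total test score.
        totals = [sum(row) for row in responses]
        col = [row[j] for row in responses]
        p = sum(col) / len(col)
        mean_right = sum(t for t, x in zip(totals, col) if x) / sum(col)
        mean_wrong = sum(t for t, x in zip(totals, col) if not x) / col.count(0)
        return (mean_right - mean_wrong) * sqrt(p * (1 - p)) / pstdev(totals)

    for j in range(4):
        print(f"item {j}: p = {p_value(j):.2f}, point-biserial = {point_biserial(j):.2f}")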
To supplement the classical statistics, item difficulty parameter
estimates based on the Rasch
IRT model are computed. More information about the Rasch
model is included in Section 8 of
this report.
Mantel-Haenszel Alpha and the associated chi-square
significance test are computed to evaluate
differential item functioning (DIF). The Mantel-Haenszel Alpha
is a common odds ratio
indicating whether it is more likely for one of the demographic
groups to answer a particular item
correctly. When this odds ratio is significantly different across
the various groups, the item is
flagged for further examination.
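The Mantel-Haenszel computation can be sketched as follows: examinees are matched on total score (the strata), and a common odds ratio is accumulated across strata from 2x2 tables of group membership by item correctness. The counts below are invented and the associated chi-square test is not shown; this is a generic illustration, not the contractor's implementation.

    # Per-stratum 2x2 counts: a = reference correct, b = reference incorrect,
    # c = focal correct, d = focal incorrect. Counts are invented.
    strata = [
        {"a": 40, "b": 10, "c": 35, "d": 15},
        {"a": 30, "b": 20, "c": 22, "d": 28},
        {"a": 15, "b": 30, "c": 10, "d": 35},
    ]

    num = den = 0.0
    for s in strata:
        n = s["a"] + s["b"] + s["c"] + s["d"]
        num += s["a"] * s["d"] / n
        den += s["b"] * s["c"] / n

    alpha_mh = num / den  # common odds ratio; values near 1.0 suggest no DIF
    print(f"Mantel-Haenszel alpha = {alpha_mh:.2f}")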
Response distributions for each demographic group indicate
whether members of a group were
drawn to one or more of the answer choices for the item. If a
large percentage of a particular
group selected an answer choice not chosen by other groups, the
item is inspected carefully.
Statistical analyses merely serve to identify test items that have
unusual characteristics. They do
not specifically identify items that are “biased;” such decisions
are made by item reviewers who
are knowledgeable about the state’s content standards,
instructional methodology, and student
testing behavior.
3.5 Test Construction
3.5.1 Procedures[7]
New core operational test forms are generally used for the first
time in the spring test
administration. For non-writing tests, generally three new core
forms are developed annually for
all EOC assessments, except EOC World Geography. Typically,
three new core forms are also
developed annually for all writing tests (at grade 8 and EOC).
For all other SOL tests two new
core forms are typically developed annually. In some cases,
fewer core forms are developed and
core forms from previous years are reused.
Test specifications and test construction guidelines are
developed and approved by the Virginia
Department of Education. Test construction guidelines provide
the operational process and the
established expectations (both psychometric and content
characteristics) to guide SOL forms
assembly. The goal is to create test forms that are equivalent in
content representation and
psychometric characteristics both within a year and across
years.
[7] These procedures apply to non-adaptive tests only. See Section 3.5.3 for information about tests administered using CAT.
A common item linking design is used year to year. Items from
a core form from the prior spring
test administration are placed in the new administration’s two or
three core forms and serve as
the anchor items. Anchor items are placed in the same, or nearly
the same sequence positions in
the new core form that they appeared in on the old form. For
tests with items that are associated
with passages (reading and writing), the passages (and
associated items) are placed as close as
possible to the same position within the new test form as they
were placed in the prior form.
Anchor items represent approximately 20–30% of the
operational forms. Content specialists
select anchor items, and psychometrics and the Virginia
Department of Education approve them.
Following the approval of anchor items, content specialists
select the remaining operational
items for each test. During the test construction process,
psychometricians review each form to
see whether it meets the test specification blueprint and the
statistical targets established to
produce forms of similar difficulty, statistical quality, and
content representation within and
across years.
These draft forms are reviewed by the Virginia Department of
Education. Any item replacements
are reviewed by psychometricians. The review process
continues iteratively until the Virginia
Department of Education has provided final approval.
3.5.2 Test Form Review Committees
The newly drafted operational test forms for each SOL
assessment are reviewed by content
review committees at the summer meetings. The new core forms
are reviewed for each SOL test.
Committee members receive training for this task including
information on the match of the
items on a form to the SOL test blueprint, the arrangement of
items within the form, and the
balance of topic coverage and item types. Members are asked to
confirm the appropriateness of
the item content and the accuracy of the keys.
Individual committee members have comment forms to record
their overall evaluation of the test
form, as well as comments on individual items. Individual
members’ comments and
recommendations are compiled into a master document
following the meeting. Committee
review may result in the need to replace one or more items on a
core form. These changes are
subject to review and approval by psychometrics and the
Virginia Department of Education.
Once operational test cores are finalized, content specialists
select field-test items and create
multiple sets that are embedded into the core forms to create
multiple core form variations. These
field-test sets are reviewed and approved by the Virginia
Department of Education.
For tests administered with CAT, the entire item pool is used to
build on-the-fly tests customized
to each student’s ability level. As such, a test form review for
CAT tests is not used. However,
the item and data review meetings are convened for new items
that are under consideration for
inclusion in the CAT item pool.
3.5.3 Procedures for Computerized Adaptive Administrations
Tests administered using CAT do not follow the same test
construction process as traditionally
administered tests. Instead of having an annual test building
cycle, a process is established for
embedding test construction into the CAT algorithm which
selects items on-the-fly from a large
pool of SOL items. Item selection is based on statistical and
content constraints that have been
established and approved by the Virginia Department of
Education. These constraints are built to
mimic many of the same types of statistical and content
constraints used to build traditional test
forms (e.g., meeting the test blueprint, having items with
adequate point-biserials). In order to
implement these constraints, additional metadata must be
created and added to the item bank for
items that are part of CAT pools. These additional metadata
allow for more targeted selection of
items during the adaptive administration process. Therefore, the
test construction procedures and
timing for CAT are very different from the test construction
process and timing used to build
traditional test forms.
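The following deliberately simplified Python sketch shows one way content constraints (reporting-category quotas) and statistical constraints (difficulty near the current ability estimate) can both enter an adaptive item-selection step. It is only an illustration of the idea; it is not the algorithm, metadata, or constraint set actually used for the SOL CAT.

    # Greedy, hypothetical selection step: among reporting categories still
    # below their quota, pick the unused item whose Rasch difficulty is
    # closest to the current ability estimate (theta).
    def pick_next_item(pool, administered, counts, quotas, theta):
        open_cats = {c for c, q in quotas.items() if counts.get(c, 0) < q}
        candidates = [it for it in pool
                      if it["id"] not in administered and it["category"] in open_cats]
        if not candidates:
            return None
        return min(candidates, key=lambda it: abs(it["rasch_b"] - theta))

    pool = [  # invented item metadata
        {"id": 1, "category": "Geometry", "rasch_b": -0.3},
        {"id": 2, "category": "Geometry", "rasch_b": 0.6},
        {"id": 3, "category": "Number Sense", "rasch_b": 0.1},
    ]
    print(pick_next_item(pool, administered=set(), counts={},
                         quotas={"Geometry": 1, "Number Sense": 1}, theta=0.2))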
Each year, the CAT item pools will be evaluated after the
administration to ensure that they are
functioning as intended and allow for the adaptive algorithm to
meet the content and statistical
constraints. Results of this evaluation are provided in Part II of
this report. For more information
about CAT test construction, refer to the Virginia Standards of
Learning Computerized Adaptive
Testing Technical Manual.
4. TEST ADMINISTRATION
4.1 Training and Materials
To ensure the successful administration of the SOL assessments,
Virginia Department of
Education staff provides training to the division directors of
testing (DDOTs) before each fall
and spring test administration. DDOTs in turn provide
appropriate training to the divisions’
school test coordinators (STCs). STCs provide training to the
schools’ examiners and proctors
including information about security requirements, manuals,
local directions received from the
DDOTs, and other pertinent information. They address training,
preparation of the test sites, and
the provision of accommodations for eligible students. In
addition, the Virginia Department of
Education provides a standardized training for examiners and
proctors that is available to school
divisions if they choose to use it.
Test implementation manuals contain detailed instructions about
administration procedures.
These manuals are provided on the Virginia Department of
Education website:
http://www.doe.virginia.gov/testing/test_administration/index.shtml.
4.2 Testing Windows
There are three test administrations: spring, summer, and fall.
The spring administration is the
main administration during which most students test. During the
spring administration SOL
assessments for all grades and subjects are provided. The
summer administration is available for
only grade 8 math and reading for those students who are
pursuing a Modified Standard Diploma
and for all EOC tests. This administration provides an
opportunity to retest for students who are
enrolled in summer school for EOC courses, for students who
need to retake an SOL test to
meet graduation requirements, and for transfer students who are
seeking to meet the testing
requirements for graduation. The fall administration is
available for only grades 6–8 and EOC.
This administration is available for students who are retesting
to meet graduation requirements
and for students taking courses with block schedules who
complete a course during the fall.
Some Virginia schools provide a block schedule for grades 6–8.
A fairly long testing window is provided for online assessments
so that schools have enough
time to coordinate student access to computers and to
accommodate different school calendars
across the state. Divisions can choose the weeks within the
testing window during which they
will administer the assessments, leaving sufficient time for
make-up testing. The testing window
for the writing assessments is earlier than the testing window
for the non-writing assessments to
provide extra time for the human-scoring of the short paper
(essay) component of the
assessment. In addition, the MC item/TEI component and the
short paper component can be
administered separately. Unlike the online tests, paper writing
tests are administered on a
specific day.
The testing calendar is posted on the Virginia Department of
Education website:
http://www.doe.virginia.gov/testing/test_administration/index.shtml.
4.3 Test Security Procedures
Everyone in the school division who has access to, or assists
with the administration of the
paper-and-pencil or online SOL assessments must read the Test
Security Guidelines and sign the
Test Security Agreement. The security agreement requires that
those involved in the test
administration exercise the necessary precautions to ensure the
security of the test content and all
test materials. This includes security procedures pertinent to the
receipt, inventory, distribution,
and storage of test materials. These forms are included in each
examiner’s manual and testing
implementation manual.[8]
4.4 Testing Accommodations
All students in tested grade levels and courses are expected to
participate in Virginia’s
assessment program, unless specifically exempted by state or
federal law or by Board of
Education regulations. Virginia’s assessment system includes
students with disabilities and LEP
students. Students with disabilities and LEP students may take
SOL tests with or without
accommodations or they may be assessed through alternate or
alternative assessments. The tests
that comprise the Virginia Assessment Program are offered in
English only; administration of the
tests in other languages is not permitted.
The individualized education program (IEP) team or 504
committee has the responsibility for
decisions regarding the need for and the selection of
accommodations for students with
disabilities. Similarly, the LEP committee determines how LEP
students will participate in the
SOL assessments and what, if any, accommodations should be
provided to individual LEP
students. Accommodations allow students with disabilities or
LEP designation more appropriate
access to test content so they can demonstrate their content
knowledge.
[8] These manuals may be downloaded from the following website: http://www.doe.virginia.gov/testing/test_administration/
Accommodations considered for testing should be those that the
student uses routinely during
classroom instruction and assessments, as identified in the
student’s IEP, 504 plan, or LEP
participation plan. The student should be familiar with an
accommodation because the use of an
unfamiliar accommodation during testing may have a negative
impact on the student’s
performance. However, it is important to note that certain
accommodations used for instruction
or classroom assessment may not be allowed on the statewide
assessment. Finally, providing an
accommodation based solely on its potential to enhance
performance beyond allowing for more
appropriate access is inappropriate.
4.4.1 Testing Accommodations for Students with Disabilities
There are many assessment options for students with
disabilities. These include the SOL
assessments without accommodations, the SOL assessments
with accommodations, and
alternative (on-grade level) or alternate assessments including
the Virginia Substitute Evaluation
Program (VSEP), the Virginia Grade Level Alternative (VGLA),[9]
the Virginia Modified
Achievement Standards Test (VMAST),[10]
and the Virginia Alternate Assessment Program
(VAAP). Information on state assessment options available to
students with disabilities is
provided in the Students with Disabilities: Guidelines for
Assessment Participation document
available on the Virginia Department of Education's website: http://doe.virginia.gov/testing/participation/index.shtml.
The SOL assessments must be considered by the IEP Team or
504 Committee before
alternate/alternative assessments are considered. Although many
students with disabilities will be
able to access the SOL assessments without accommodations,
others will require test
accommodations to address their disabilities and individual
needs. Test accommodations for
students with disabilities are grouped in the following
categories: time/scheduling, setting,
presentation, and response. The accommodations available
within each of these categories are
provided in the table below and are described in more detail on
the Virginia Department of
Education website.
http://www.doe.virginia.gov/testing/participation/guidelines_for_special_test_accommodations.pdf.
[9] VGLA is available for qualifying students with disabilities in grades 3–8 writing, science, and history.
[10] VMAST assessments for End-of-Course (EOC) Algebra I and EOC Reading will be available for eligible students with disabilities pursuing a Standard Diploma with credit accommodations. VMAST assessments for grade 8 mathematics and grade 8 reading are available for students with disabilities pursuing a Modified Standard Diploma. The Modified Standard Diploma is available only to students who entered the 9th grade for the first time prior to the 2013–2014 school year.
4.4.2 Testing Accommodations for LEP Students
Testing accommodation determinations for LEP students, made
by the LEP Committee, should
be based on the evidence collected from the student’s
educational record, such as:
years in U.S., prior schooling;
test scores, and other academic
testing achievement;
achievement and comments
from general education teachers; and
e Proficiency Level as reported on the
ACCESS for ELLs score report.
[11] Assessing Comprehension and Communication in English State-to-State for English Language Learners (ACCESS for ELLs®) is Virginia’s English language proficiency assessment.
There are two types of accommodations available for LEP
students on the Virginia SOL
assessments—direct and indirect linguistic accommodations.
Direct linguistic testing accommodations involve adjustments to
the language of the test. The
following direct linguistic testing accommodations are available
to LEP students on the SOL
assessments:
• Read-Aloud Test (English only)
• … (short-paper component only)
• Plain English Mathematics Test (… Algebra I)
The plain English mathematics test versions include items that
incorporate simpler language but
still measure the full breadth and depth of the SOL mathematics
content standards. Item writers
are trained to use the following guidelines:
• … have double meaning
• … bullets
Indirect linguistic testing accommodations involve adjustments
to the conditions under which
LEP students take SOL tests. The following indirect linguistic
testing accommodations are
available to LEP students on the SOL assessments:
Additional information about the accommodations available for
LEP students on the Virginia
SOL assessments is provided on the Virginia Department of
Education website.
http://www.doe.virginia.gov/testing/participation/lep_guidelines.pdf.
5. WRITING SCORING
5.1 Human Scorer Recruitment and Qualifications
The constructed response portion of the SOL writing assessment
is scored by human raters.
Highly qualified, experienced raters score all writing samples.
These raters are drawn from a
database of college graduates who completed the selection
process for scorers. The need for
ethnic and racial diversity is emphasized throughout the
selection process. Scorers for the
Virginia SOL writing test have a minimum of a bachelor’s
degree in an appropriate academic
discipline (e.g., English, education), demonstrated ability in
performance assessment scoring,
and, preferably, teaching experience at the elementary or
secondary level. The selection process
requires that each candidate successfully complete a personal
interview, online scorer training,
and attain a high enough score on a qualification activity.
In addition to the scorers, scoring supervisors, scoring
directors, and content specialists are
involved in the scoring process. Scoring supervisors are
selected based on their proven ability to
score responses accurately and communicate scoring standards
to scorers. Scoring directors are
chosen based on their expertise in evaluating writing and their
experience training and
supervising scorers. A writing content specialist monitors
quality and provides support and
direction for scoring directors. The content specialist is
assigned based on educational
background and scoring experience.
5.2 Rangefinding
The writing samples used for training scorers are selected from
the samples scored during the
rangefinding process. Rangefinding is the process of identifying
model writing samples for the
two writing domains (composing/written expression and usage
and mechanics) that characterize
each score point of the writing rubric (1–4). Scoring directors
and the content specialists work
with Virginia educators who served on the rangefinding
committees to create training sets. These
writing samples, and others identified by Virginia Department
of Education and testing
contractor staff, are used as scoring guides during scorer
training, qualifying, and calibration.
The primary goal of the training is to convey the decisions made
during rangefinding to the
scorers and to help them internalize the scoring protocol.
5.3 Scorer Training and Qualifying Procedures
Scorers are required to take several training modules where they
learn about using the online
scoring system and implementing the scoring rubric. Each
student essay response receives a
score on a scale of 1–4 points for each of the domains. The four
rubric score points represent the
following:
Training also includes a review of the anchor sets identified
during rangefinding. These sets
represent each of the four rubric scores and both writing
domains. The sets have been scored and
annotated so that scorers learn how and why the scores were
determined. Next they complete
practice sets which include additional writing samples. Finally,
they receive qualification writing
sets which they must score accurately in order to continue
participating in the scoring of student
essays.
5.3.1 Anchor Sets
Scorers review an anchor set for each of the two domains.
Anchor sets include clear examples of
each score point (1–4) and include annotations that provide a
rationale for the scores assigned
during rangefinding. These papers help the scorers internalize
the scoring rubric.
5.3.2 Practice Sets
After reviewing each anchor set, scorers practice on sample
papers, applying a score for the
domain they are reviewing. The sets include examples of each
score point. After applying scores
to practice papers, scorers review annotations. The annotations
provide feedback on true scores
(the scores approved by the rangefinding committee) and
explain the correct scores for the
practice papers.
5.3.3 Qualifying Sets
In order to qualify to score the Virginia SOL writing
assessment, scorers take four sets of 10
papers and must achieve 70% perfect agreement and 100%
adjacent agreement with
rangefinding-committee-approved scores for each domain on
two of the four sets. Scorers who
do not meet these standards are released from the project.
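As a sketch of how the qualifying thresholds above could be checked for a single 10-paper set, the fragment below compares a scorer's ratings with the committee-approved scores and applies the 70% perfect / 100% perfect-plus-adjacent criteria. The scores are invented, and the requirement to meet the criteria on two of the four sets is noted only in a comment.

    # Invented ratings on a single qualifying set (scale 1-4).
    true_scores   = [2, 3, 1, 4, 2, 3, 3, 2, 4, 1]   # committee-approved
    scorer_scores = [2, 3, 2, 4, 2, 3, 4, 2, 4, 1]

    n = len(true_scores)
    perfect  = sum(s == t for s, t in zip(scorer_scores, true_scores)) / n
    adjacent = sum(abs(s - t) <= 1 for s, t in zip(scorer_scores, true_scores)) / n

    # A scorer must meet these thresholds on two of the four qualifying sets.
    meets_set = perfect >= 0.70 and adjacent == 1.0
    print(f"perfect = {perfect:.0%}, perfect+adjacent = {adjacent:.0%}, meets set: {meets_set}")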
5.4 Scoring Procedures
5.4.1 Two Ratings with Resolution for Nonadjacent Scores
During each administration, all writing responses are scored
twice. Prior to 2014–2015, the two
scores were provided by two professional human scorers.
During summer 2013, a feasibility
study was conducted to evaluate the feasibility of using
automated scoring to provide one of the
two initial scores. The study showed that automated scoring
could be used successfully to score
SOL writing short-paper responses in this manner. The
automated scoring engine was trained on
all SOL writing prompts using student responses collected
during the field-testing. Beginning in
2014–2015, the automated scoring engine was used to provide
the second score for each student
response to facilitate more rapid reporting of student scores.
The same scoring rules and quality
control checks are used to ensure accurate scoring of student
writing prompt responses.
After a student response is scored by a human rater and the
automated scoring engine, the scores
are compared. If the scores match or are adjacent, they are
summed together to create the final
student score. If they are not adjacent, a scoring supervisor
provides a third score. If the third
score matches one of the first two, then it is summed with the
matching score to provide the final
score. If the third score is adjacent to one (or both) of the first
two scores, then the third score
and the higher of the adjacent scores are summed together to
provide the final score. If the three
scores do not match, and are not adjacent, then the response is
scored a fourth time by a scoring
director. If the fourth score matches any of the first three
scores, then it is summed with the
matching score which provides the final score. If the fourth
score is adjacent to any of the first
three scores, then it is summed with the highest adjacent score
to provide the final score.
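The resolution rules in the preceding paragraph can be summarized in the following Python sketch for a single domain score on the 1–4 scale. The backread override described next is not modeled, and the function is an illustration of the stated rules rather than the scoring system's actual code.

    def resolve_final_score(first, second, third=None, fourth=None):
        # Simplified restatement of the resolution rules described above.
        def adjacent(x, y):
            return abs(x - y) == 1

        # Two initial scores (human rater + automated engine).
        if first == second or adjacent(first, second):
            return first + second

        # Nonadjacent: a scoring supervisor provides a third score.
        if third in (first, second):
            return third * 2
        adj = [s for s in (first, second) if adjacent(third, s)]
        if adj:
            return third + max(adj)

        # Still unresolved: a scoring director provides a fourth score.
        if fourth in (first, second, third):
            return fourth * 2
        adj = [s for s in (first, second, third) if adjacent(fourth, s)]
        return fourth + max(adj) if adj else None

    print(resolve_final_score(2, 2))            # 4: the two scores match
    print(resolve_final_score(1, 3, third=2))   # 5: third adjacent to both; higher adjacent is 3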
For responses that receive a backread score, this third score can
overrule the outcome from the
first two scores. For example, if the first two scores were 2, but
the backread provided a score of
3, then instead of adding the first two scores together to provide
the final score, the backread
score and the highest adjacent score would be added together.
So in this example, the student
would receive a final score of 5 instead of the original score of
4. If the backread score was 4 in
this example, it would not match or be adjacent to the first two
scores, so the response would
require a fourth score.
5.4.2 Backreading
Backreading is a system that allows a scoring supervisor and/or
a scoring director to monitor an
individual rater’s score. Scoring supervisors read responses
already scored by scorers they are
monitoring. While backreading, the scoring supervisor can
evaluate the scorer’s performance,
provide feedback, and—if necessary—adjust an assigned score.
The scoring supervisor may also
halt the scoring activity of an individual or group of scorers
whose performance has declined.
The inter-rater reliability requirements are 65% perfect
agreement and 96% perfect plus adjacent
agreement for all scorers. Approximately 5% of papers are
included in the backreading process.
5.4.3 Validity Checks
Throughout scoring, scorers receive and score validity papers.
These papers are pre-scored
according to rangefinding standards. All scores on validity
papers are approved by the Virginia
Department of Education. Validity papers are used to monitor
consistency in scoring over time;
they are interspersed with and indistinguishable from other
student responses. Approved true
scores for these papers are loaded into the scoring system, and a
report is run that indicates what
percentage of accuracy a scorer achieves on validity papers in
scoring against the true score.
Validity papers are used as a check to ensure that scorers, as
well as scoring supervisors, do not
drift from the rubric but continue to score accurately. For
scorers that do not meet the 65%
perfect and 96% perfect plus adjacent agreement, targeted
calibration sets are sent to further
evaluate a scorer and provide re-training.
5.4.4 General Calibration Sets
Calibration is a process whereby scorers apply scores to student
papers that had been scored
previously by scoring directors and possibly a scoring
supervisor. Calibration sets include
student responses and are used as a training tool to improve
agreement among scorers. After
scorers take a calibration set, annotations provide an
explanation for the correct scores for the
responses.
Calibration is a form of training that creates consensus and
accuracy among scorers. Calibration
sets may focus on particular scoring issues—including
clarifying a scoring line, showing a
response that is unusual or problematic to score, or showing a
range of responses or performance
skills for a particular score point. The scoring directors present
scorers with calibration sets daily
throughout scoring.
5.4.5 Targeted Calibration Sets
Targeted calibration sets are similar to the general calibration
sets. They include pre-scored
student responses. However, like the qualifying sets, scorers
must achieve 70% exact agreement
and 100% exact plus adjacent agreement in order to be allowed to
continue scoring. Scorers who
do not attain the target accuracy rates are dismissed from the
scoring process.
5.5 Rescore Process
The primary purpose of the rescore process is to provide an
additional step to ensure that the
score assigned to the student’s writing sample produced as part
of the writing tests is an accurate
representation of the student’s achievement.
In fall 2014, the automatic rescore process was eliminated for
term graduates. All rescoring
requests now follow the same process, which is initiated by
school divisions. Requests to rescore
a student’s writing sample may be initiated by parents or by
school personnel. All requests must
be reviewed and approved by the school division before being
submitted. Requests should be
considered only if there is substantial evidence that the writing
sample should have received a
higher score. School division staff familiar with the rubric used
to score this assessment must
review the writing sample. Rescore requests should be approved
by the school division only if
the reviewers agree that the paper should have received a higher
score according to the rubric. A
school division may request that a student’s writing sample be
rescored if
• … ; AND
• … the writing sample produced by the student for the writing test should have received a higher score. Evidence of this requires that at least two school division staff familiar with the rubric used to score the writing short-paper portion of the writing test review the paper and agree that it should have received a higher score.
6. SCORES
6.1 Raw Scores
A raw score represents the number of points a student received
for correctly answering questions
on a test. For the SOL non-writing tests that consist of MC
items and TEIs only, the raw score
that a student earns is equal to the number of items the student
answers correctly. For the SOL
writing tests that have a MC/TEI component and a short-paper
component, the raw score of the
short-paper component is calculated as the weighted sum of the
ratings given for each domain,[12]
and the total raw score is the sum of the raw scores on the two
components (MC/TEI plus short
paper).
Because different tests have different item types and different
numbers of questions, the raw
score is only useful in relation to that test or content area. For
example, consider a student who
receives a raw score of 59 on mathematics and a raw score of 43
on reading. Now imagine that
there were 75 total items on the mathematics test and 50 total
items on the reading test. In simple
terms, this means the respective percentages correct would be
approximately 79% for the mathematics test
and 86% for the reading test.
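The arithmetic in this example can be reproduced directly; the short sketch below uses the raw scores and item counts given above.

    # Raw scores are only interpretable against that test's own item count.
    math_raw, math_items = 59, 75
    reading_raw, reading_items = 43, 50
    print(f"mathematics: {math_raw / math_items:.0%}")   # about 79%
    print(f"reading: {reading_raw / reading_items:.0%}")  # 86%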
[12] Each essay is scored on two elements: 1) composing and written