EQUITY AUDIT
Matt Hise
Name:
• Maria Immacolata Catholic School
Type:
• Parochial
Location:
• Houma, Louisiana
Enrollment:
• 180 Students
Demographic Snapshot
Overview of Data Analysis
Overview of MICS
• The students utilized for this analysis are currently in the 5th grade at
Maria Immacolata Catholic School in Houma, Louisiana
• Data was pulled from Stanford Score reports from 2010-2012, when
these students were in the 2nd-4th grade
• Since this is a smaller school, there is only one class per grade.
• Though there are a few more students in the current 5th grade class, only
those who have been consistently enrolled at MICS (and thus have yearly
score reports) were included
• I also chose to use the Scaled Score versus other figures provided,
as I felt this would provide the best snapshot of Adequate Yearly
Progress (AYP)
• To show variation among scores through the years, I used bar
graphs for each major subject
• Subjects that I analyzed include Total Math, Total Reading, Language,
Spelling, Science (2011-2012 only), and Social Studies (2011-2012
only)
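The year-to-year comparison described above can be sketched in code. The snippet below uses invented scaled scores (illustrative numbers only, not the actual MICS data) to compute the per-subject changes that the bar graphs would visualize.

```python
# Hypothetical scaled scores for one student (invented for illustration,
# not actual MICS data). Science and Social Studies have 2011-2012 only.
scores = {
    "Total Math":     {2010: 612, 2011: 628, 2012: 641},
    "Total Reading":  {2010: 598, 2011: 605, 2012: 619},
    "Language":       {2010: 604, 2011: 611, 2012: 615},
    "Spelling":       {2010: 590, 2011: 600, 2012: 607},
    "Science":        {2011: 602, 2012: 610},
    "Social Studies": {2011: 595, 2012: 604},
}

def yearly_change(by_year):
    """Scaled-score change between each pair of consecutive tested years."""
    years = sorted(by_year)
    return {later: by_year[later] - by_year[earlier]
            for earlier, later in zip(years, years[1:])}

changes = {subject: yearly_change(by_year)
           for subject, by_year in scores.items()}
print(changes["Total Math"])  # {2011: 16, 2012: 13}
```

Each subject's change dictionary corresponds to one bar group in the year-over-year graphs described above.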
• Stanford Achievement Test Series, Tenth Edition
• Otis-Lennon School Ability Test
Types of Assessment Data
Stanford Achievement Test Series, Tenth
Edition
• Valid and reliable tool needed for objective
measurement of achievement
• Administrators will obtain reliable data to evaluate
progress toward meeting the challenges set forth by
• the No Child Left Behind Act
• national and state standards and high expectations
• Teachers will identify and help children who are at risk
of being left behind
• Parents will understand what their children know and
can do and how they can help
Purpose of Assessment(s)
Otis-Lennon School Ability Test
• Includes tasks such as
• detecting likenesses and differences
• recalling words and numbers
• defining words
• following directions
• classifying
• establishing sequence
• solving arithmetic problems
• completing analogies
• Its intent is to assess thinking skills and provide an understanding
of a student's relative strengths and weaknesses in performing a
variety of reasoning tasks
• It is designed to measure a student's ability level independent of
what they are being taught at school
Purpose of Assessment(s)
Grade level: 5
Dates of administration: mid-April
Dates of calculation:
sent immediately following testing
Date of score distribution to teachers:
Early June
Date of score distribution to parents/students:
Submitted with back-to-school newsletter (July/August)
Timeline
How/When Scores Analyzed:
• How: Viewed on individual basis due to class size
(individual v. group), year to year
• When: Immediately upon receipt. First previewed by
administration, results then shared/discussed with
teachers
Timeline
Yvonne; Principal
• Thoughts:
• Standardized testing is
necessary
• Helps determine a
student's intellectual
growth
• Smaller schools allow for
an individualized look at
student progress
Informal Interview #1
Randy;
Administrative Assistant
• Thoughts:
• Standardized testing has
not changed much over
the years
• These days, government
wants to see results
• Is outdated, but
significant changes
would “shock” the
system
Informal Interview #2
Hailey; 2nd Grade Teacher
• Thoughts:
• Standardized testing is
good, but should be
updated
• Students, especially
younger, would benefit
from more visual/spatial
assessments
• Consider implementing
technology
Informal Interview #3
Hollie; 4th Grade Teacher
• Thoughts:
• Standardized testing
needs to remain
• Parents expect results,
especially in a Catholic
school
• Not broken, why fix it?
Informal Interview #4
Cathy; 7th Grade Teacher
• Thoughts:
• Standardized testing
needs to be updated
• More emphasis on
writing versus bubble-
sheets
• Agrees that changes
would have to be made
from the top, down
Informal Interview #5
• Are the assessments being used effectively? Why or why not?
• No. The current assessment approach is not an accurate reflection of student growth.
• Recommendations
• Consider new factors: technology-based assessments, incorporate emotional intelligence, place more emphasis
on writing.
• Help to increase student achievement?
• Appeal more to today's learner (through technology and attention to emotional intelligence).
• Communicated to everyone?
• Yes, but not to its fullest capacity.
• Are teachers well equipped to analyze data?
• Not entirely. Year-to-year progress is evident, but the staff is ill-equipped to process data in its entirety.
• Do teachers understand the purpose of assessments?
• Those who have worked in public schools do, while others who have spent most of their career in private
school do not.
• Conclusions about overall practices of building?
• Standardized testing needs to remain, but the school should adopt new practices that align with the Common
Core initiative (including the use of technology).
• Writing needs to be more fully assessed via independent testing.
• Place more emphasis on emotional intelligence.
• Place more emphasis on critical thinking.
Process Analysis
• Place more emphasis on emotional intelligence.
• “IQ alone has not been a strong predictor of performance at work
nor in life” (Sparkman, Maulding 644)
• “Life success factors seem to be influenced [more] by emotional
intelligence than cognitive ability” (Sparkman, Maulding 644)
• Place more emphasis on critical thinking.
• “Critical thinking is often listed in college catalogues as one of an
institution's educational goals” (Hatcher 29)
Support From Research
• Standardized testing needs to remain, but should adapt new practices that
align with the Common Core initiative (including the use of technology)
• Current assessments “measure skills too narrowly; return results that are ‘too
little, too late’ to be useful; and do not adequately assess whether students can
apply their skills to solve complex problems…” (Doorey 30)
• New assessments will be completed on computers or other digital devices, and results
would be returned within a few weeks
• Assessments will feature “complex, multipart tasks” (Doorey 30)
• Assessments will require students to “comprehend and analyze texts across all content
areas that are at a higher level of complexity than those that many districts now use”
(Doorey 30)
• Examples of new-style assessments
• PARCC
• Two-part summative; performance-based and end of year, required non-summative in speaking and listening
• Smarter Balanced
• “Strategically balanced” summative, interim, and formative assessments
Support From Research
• Writing needs to be more fully assessed via independent testing.
• With current testing, “new English teachers may inhibit their students'
writing as they mold instruction to standardized writing assessments”
(Brimi 53)
• “…state tests cause teachers to guide students to write formulaic
essays, usually of the five-paragraph variety” (Brimi 53)
• Suggestions: Adopt TCAP Writing Assessment
• Formulated in Tennessee, adopted in 1994
• Has been put through various trials, facets, and phases of implementation
• “This assessment measured student writing during their fifth, eighth, and
eleventh grade years via a timed (35 minute) writing session” (Brimi 56)
• Fifth grade students, for example, respond to a narrative prompt
• “Essays graded on a six-point rubric by readers from a private North Carolina
company” (Brimi 56)
Support From Research
Impressions
• Overall, I found that the students' scores varied very little from
year to year
• Interesting, as the group had a separate set of core-subject teachers each
year
• This possibly correlates with this group's natural tendencies with respect to
test-taking
• Most notable fluctuations were in Science and Social Studies
• Potentially due to having only two tested years of data vs. three
• Boys, for the most part, scored lower than the girls
• All students scored above 500 in each category from 2010-2012
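The group comparisons behind these impressions can be illustrated with a small sketch. The records below are invented for illustration (the actual class data is not reproduced here); the code computes gender-group averages and checks the above-500 observation.

```python
from statistics import mean

# Invented class records for illustration only (not the actual MICS roster).
records = [
    {"gender": "M", "math": 588, "reading": 575},
    {"gender": "M", "math": 601, "reading": 583},
    {"gender": "F", "math": 615, "reading": 622},
    {"gender": "F", "math": 609, "reading": 630},
]

def group_mean(records, gender, subject):
    """Average scaled score for one gender group in one subject."""
    return mean(r[subject] for r in records if r["gender"] == gender)

boys_math = group_mean(records, "M", "math")    # 594.5
girls_math = group_mean(records, "F", "math")   # 612
everyone_above_500 = all(
    score > 500 for r in records for score in (r["math"], r["reading"])
)
```

With these invented numbers, the boys' average falls below the girls' and every score clears 500, mirroring the two observations above.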
Interview Protocol Responses
Diane Eschete
• Public school teacher
• 30+ years of experience
• Master Teacher
• Tutor for elementary level grades
Tessie Adams
• Public school teacher
• 20+ years of experience
• Master Teacher
• Terrebonne Parish's 2004-2005
Elementary School Teacher of the
Year
• 2005 Teacher of the Year for
Louisiana
Question: Which indicators does your institution
currently use to evaluate student achievement?
• Diane: Standardized Test Results
• Tessie: Standardized Test Results, Benchmark testing (both monthly and
weekly)
Question: How important are standardized test results
to your institution? Why?
• Diane: The ultimate thing. Absolutely. Beyond obsessive at this point.
Drives everything. Curriculum, primarily money.
• Tessie: Very. They're important to the State Dept. to monitor, hold
schools, teachers, districts accountable for properly teaching students.
Result of NCLB.
Question: Which Standardized Tests does your
institution implement?
• Diane: Leap/iLeap
• Tessie: Leap/iLeap, ITBS Test, Explorer Assessment (9th), EOC (End of
Course); PARCC will take the place of the Leap/iLeap in 2014-2015
Question: Do you currently have a member on staff who
prepares standardized test results to be viewed by faculty?
• Diane: Yes
• Tessie: No
Question: Do you believe a visual interpretation of
standardized test results could be helpful to your institution?
Please explain.
• Diane: Absolutely. Pie charts, bar graphs. LOVE the bar graphs. More
colors. More manipulations, the better.
• Tessie: Definitely. Visual is always better. 80% of learning is visual. We
have Data Analysis Meetings twice a month. Teachers look at scores,
figure out solutions to low numbers.
Question: With the adoption of the Common Core Curriculum,
has your institution shifted testing methodology? If not, do you
plan to? Why or why not?
• Diane: Yes; has to be aligned with Common Core. Questions have to be
structured with Common Core, level of instruction. Everything has to be
aligned. Leap and iLeap are aligned with Common Core.
• Tessie: Nothing. Just being aware of standards. Rigor and relevance is
key, has been for some time (5 yrs). Common Core minimizes the number
of things, but on a deeper level. Education should be spiraled, not
overlapped. Building on concepts. Common Core is normalizing education
across the nation.
Question: Do you feel that gender plays a role in student
performance on standardized tests? If so, what trends have you
noted?
• Diane: No. Though boys have shown better results in Math and Science,
girls in ELA. Though not sure if that is still true.
• Tessie: No trends with gender. State Dept. analyzes subgroup deficiency
(ex. black males). Have never seen gender make a difference.
Socioeconomic status makes a difference, however.
Common Themes
Some common elements include:
• The use of Standardized Testing as a means of
measurement
• The importance of Standardized Test results
• The incorporation of visual test results to help map data
• No notable correlation of test results based on gender
Issues @MICS
• Maria Immacolata Catholic School does many things
correctly with respect to standardized testing.
• However, there are several notable issues that must be
addressed to guarantee the continued success of our
students.
• These issues include:
• Dated means of testing (Stanford 10)
• Sluggish pace with adoption of Common Core
• Lack of Data Analysis focus groups
• No designated staff member to interpret data
Suggestions
Issue #1: Dated means of testing (Stanford 10)
• Now that the Diocese has finally adopted the Common Core curriculum, it
must do away with the current testing method.
• We currently use the same Stanford 10 testing booklets each year,
and have the students' growth plotted accordingly.
• Though proven effective, the Stanford is arguably a dated means of
testing.
• Thus, rather than adopting the tests Louisiana public schools currently
use (Leap/iLeap), the Diocese should begin to implement testing via PARCC.
• In doing so, the Diocese will effectively be ahead of the game with
regard to standardized testing.
Suggestions
Issue #2: Sluggish pace with adoption of Common Core
• Catholic schools in Louisiana are essentially the last group to adopt
the Common Core curriculum.
• This is primarily due to the lack of federal funding, and thus no
“encouragement” from the government.
• However, as the Common Core has become more widespread
throughout Louisiana, the Diocese has opted to adopt it as well.
• The current plan is to introduce the curriculum via a “slow roll”
approach, beginning with the lower elementary grades during the
2013-2014 school year.
• Upper elementary grades will follow suit during the 2014-2015 school
year.
• This method is sound; however, it probably should have occurred a
few years earlier.
Suggestions
Issue #3: Lack of Data Analysis focus groups
• Due to the size of our school, MICS does not have any Data
Analysis focus groups.
• This issue came to my attention after Ms. Tessie shared her
school's method of identifying weak points with respect to testing.
• I think the idea behind this method is terrific, but I also believe that
the key is consistency.
• If a group does not meet consistently, then progress is unlikely.
• In order to bring about this change to our school, the current
leadership must insist on monthly meetings.
• Said meetings should include various approaches and methods for
increasing test scores.
Suggestions
Issue #4: No designated staff member to interpret data
• Upon interviewing my principal, it came to my attention that no
particular member of our staff interprets data trends with respect to
standardized testing.
• The only person who truly does this is our principal, who in turn
only looks for major discrepancies.
• It is relatively unreasonable to expect our current leadership to be
solely responsible for synthesizing data.
• Thus, it is my suggestion that we either hire an additional staff
member to analyze data, or simply designate this responsibility to
individual faculty members.
• This can be done by providing each incoming class's scores to its
future homeroom teacher.
• Ex. 4th grade test scores being handed off to the current 5th grade homeroom
teacher.
• By doing this, teachers can note current trends and implement
methodology throughout the school year to improve test scores.
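The handoff suggested above could be supported by a simple trend report. The sketch below is hypothetical (both the subjects chosen and the scores are invented): the receiving homeroom teacher flags any subject whose scaled score dropped from the prior year.

```python
# Hypothetical prior-year (4th grade) and current-year (5th grade) scaled
# scores for one incoming student; numbers are invented for illustration.
prior   = {"Total Math": 628, "Total Reading": 605, "Language": 611}
current = {"Total Math": 641, "Total Reading": 599, "Language": 615}

# Subjects where the score dropped, with the size of the drop, so the
# receiving teacher knows where to focus instruction.
flagged = {subject: current[subject] - prior[subject]
           for subject in prior if current[subject] < prior[subject]}
print(flagged)  # {'Total Reading': -6}
```

A designated staff member (or each homeroom teacher) could run a report like this at handoff time and revisit it as the year progresses.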
Synthesis
• The interviews I conducted allowed me to see standardized testing from
another perspective.
• By mirroring the ways that public schools handle standardized testing, I feel
that the private institutions can benefit immensely.
• While the current methods are by no means ineffective, they can certainly be
improved upon.
• Aspects such as updating testing material, quickening the adoption of
Common Core, and increasing focus on test results are all elements that can
be improved.
• As previously mentioned, current assessments “measure skills too narrowly;
return results that are ‘too little, too late’ to be useful; and do not adequately
assess whether students can apply their skills to solve complex problems…”
(Doorey 30).
• While updating the testing, MICS should note that “state tests cause teachers
to guide students to write formulaic essays, usually of the five-paragraph
variety” (Brimi 53).
Synthesis
• That said, an additional writing-type assessment, such as the TCAP
Writing Assessment, may be considered as a supplement.
• Arguably, while emphasis should be placed on critical thinking, as “critical
thinking is often listed in college catalogues as one of an institution's
educational goals” (Hatcher 29), there should be consideration toward
emotional intelligence too, as “IQ alone has not been a strong predictor of
performance at work nor in life” (Sparkman, Maulding 644).
• Lastly, it should be noted that “collaboration – if done in a structured and
focused manner – can be incredibly important in helping teachers develop
effective teaching practices and problem-solving skills” (Berry 10). With that in
mind, the teachers of MICS simply need to collaborate more with respect to a
variety of academic elements, including high-stakes testing.
• The ultimate goal is to guarantee the success of our students. Given the
quality of education at MICS, this will most certainly happen regardless;
however, the likelihood is simply enhanced by the aforementioned plan of
action.
Reflection
• Would I do anything differently if I did this again?
• If I were to approach this project again, I think I would have preferred a listing of the required project elements
earlier on. This is particularly true with regard to the interviews. A number of individuals whom I interviewed were
under various time constraints, and thus difficult to get ahold of at the expected time.
• What else do I want to know?
• I would like to know if a presentation of this caliber could be potentially used to sway administration, be it a
traditional school board or parochial curriculum development team. Also, how can this presentation be
summarized in such a way that would make it appropriate for a professional portfolio?
• What roadblocks did I encounter?
• With regard to roadblocks, the primary issue I encountered dealt with time constraints. While my time
management was, in my opinion, pretty good, relying on others slowed down the process. As the interviews, for
example, were done as a courtesy to me, it was difficult to express a sense of urgency without badgering.
• Final Recommendations:
• I think Dr. Edgehouse presented the information in a way that was easy to understand, as well as adaptable to
various educational climates. For future students I would suggest, above all, clarifying confusion by asking
questions. These questions can be directed toward fellow cohort members, a critical friends group, or the
professor. In doing so, one can gain not only a deeper understanding of the project, but the value of others'
perspectives as well. Overall, I found this project to be incredibly rewarding and fantastic practice for our roles
as future educational leaders.
• Berry, B., & Center for Teaching, Q. (2010). Teacher Effectiveness: The Conditions that Matter Most
and a Look to the Future. Center For Teaching Quality.
• Brimi, H. (2012). Teaching writing in the shadow of standardized writing
assessment: An exploratory study. American Secondary Education, 41(1), 52-77.
• Doorey, N. A. (2012). Coming Soon: A New Generation of Assessments. Educational
Leadership, 70(4), 28-34.
• Hatcher, D. L. (2011). Which test? Whose scores? Comparing standardized critical thinking
tests. New Directions For Institutional Research, 2011(149), 29-39.
• OLSAT Test - Otis-Lennon School Ability Test. (n.d.). Testing Mom. Retrieved January
29, 2013, from http://www.testingmom.com/olsat-test-otis-lennon-school-ability-test/
• Pearson. (n.d.). Assessment and Information. Retrieved January 29, 2013, from
www.pearsonassessments.com/HAIWEB/Cultures/en-us/Productdetail.htm?Pid=SAT10C
• Sparkman, L. G. (2012). Non-cognitive predictors of student success in
college. College Student Journal, 46(3), 642-652.
Works Cited

Equity Audit

  • 1.
  • 2.
    Name: • Maria ImmacolataCatholic School Type: • Parochial Location: • Houma, Louisiana Enrollment: • 180 Students Demographic Snapshot Overview of DataAnalysis
  • 3.
    Overview of MICS •The students utilized for this analysis are currently in the 5th grade at Maria Immacolata Catholic School in Houma, Louisiana • Data was pulled from Stanford Score reports from 2010-2012, when these students were in the 2nd-4th grade • Since this is a smaller school, there is only one class per grade. • Though there are a few more students in the current 5th grade class, these students have been consistently enrolled at MICS and thus have yearly score reports • I also chose to use the Scaled Score versus other figures provided, as I felt this would provide the best snapshot of Adequate Yearly Progress (AYP) • To show differentiation amongst scores through the years, I used bar graphs for each major subject • Subjects that I analyzed include Total Math, Total Reading, Language, Spelling, Science (2011-2012 only), and Social Studies (2011-2012 only)
  • 5.
    • Stanford AchievementTest Series, Tenth Edition • Otis-Lennon School Ability Test Types of Assessment Data
  • 6.
    Stanford Achievement TestSeries, Tenth Edition • Valid and reliable tool needed for objective measurement of achievement • Administrators will obtain reliable data to evaluate progress toward meeting the challenges set forth by • the No Child Left Behind Act • national and state standards and high expectations • Teachers will identify and help children who are at risk of being left behind • Parents will understand what their children know and can do and how they can help Purpose of Assessment(s)
  • 7.
    Otis-Lennon School AbilityTest • Includes tasks such as • detecting likenesses and differences • recalling words and numbers • defining words • following directions • classifying • establishing sequence • solving arithmetic problem • completing analogies • Its intent is to assess thinking skills and provide an understanding of a student's relative strengths and weaknesses in performing a variety of reasoning tasks • It is designed to get a measure of your child's ability level independent of what they're being taught at school Purpose of Assessment(s)
  • 8.
    Grade level: 5 Datesof administration: mid- April Dates of calculation: sent immediately following testing Date of score distribution to teachers: Early June Date of score distribution to parents/students: Submitted with back-to-school newsletter (July/August) Timeline
  • 9.
    How/When Scores Analyzed: •How: Viewed on individual basis due to class size (individual v. group), year to year • When: Immediately upon receipt. First previewed by administration, results then shared/discussed with teachers Timeline
  • 10.
    Yvonne; Principal • Thoughts: •Standardized testing is necessary • Helps determine a student‟s intellectual growth • Smaller schools allow for an individualized look at student progress Informal Interview #1
  • 11.
    Randy; Administrative Assistant • Thoughts: •Standardized testing has not changed much over the years • These days, government wants to see results • Is outdated, but significant changes would “shock” the system Informal Interview #2
  • 12.
    Hailey; 2nd GradeTeacher • Thoughts: • Standardized testing is good, but should be updated • Students, especially younger, would benefit from more visual/spatial assessments • Consider implementing technology Informal Interview #3
  • 13.
    Hollie; 4nd GradeTeacher • Thoughts: • Standardized testing needs to remain • Parents expect results, especially in a Catholic school • Not broken, why fix it? Informal Interview #4
  • 14.
    Cathy; 7th GradeTeacher • Thoughts: • Standardized testing needs to be updated • More emphasis on writing versus bubble- sheets • Agrees that changes would have to be made from the top, down Informal Interview #5
  • 15.
    • Are theassessments being used effectively? Why or why not? • No. Current assessment approach is not an accurate reflection of student growth. • Recommendations • Consider new factors: technology-based assessments, incorporate emotional intelligence, place more emphasis on writing. • Help to increase student achievement? • Appeal more to today‟s learner (through technology, adhering to emotional intelligence). • Communicated to everyone? • Yes. But not to it‟s fullest capacity. • Are teachers well equipped to analyze data? • Not entirely. Year-to-year progress is evident, but staff is ill-equipped to process data in it‟s entirety. • Do teachers understand the purpose of assessments? • Those who have worked in the public school do, while others who have spent most of their career in private school do not. • Conclusions about overall practices of building? • Standardized testing needs to remain, but should adapt new practices that align with the Common Core initiative (including the use of technology). • Writing needs to be more fully assessed via independent testing. • Place more emphasis on emotional intelligence. • Place more emphasis on critical thinking. Process Analysis
  • 16.
    • Place moreemphasis on emotional intelligence. • “IQ alone has not been a strong predictor of performance at work nor in life” (Sparkman, Maulding 644) • “Life success factors seem to be influenced by emotional intelligence than cognitive ability” (Sparkman, Maulding 644) • Place more emphasis on critical thinking. • “Critical thinking is often listed in college catalogues as one of an institution‟s educational goals” (Hatcher 29) Support From Research
  • 17.
    • Standardized testingneeds to remain, but should adapt new practices that align with the Common Core initiative (including the use of technology) • Current assessments “measure skills too narrowly; return results that are „too little, too late‟ to be useful; and do not adequately assess whether students can apply their skills to solve complex problems…” (Doorey 30) • New assessments will be completed on computers or other digital devices, and results would be returned within a few weeks • Assessments will feature “complex, multipart tasks” (Doorey 30) • Assessments will require students to “comprehend and analyze texts across all content areas that are at a higher level of complexity than those that many districts now use” (Doorey 30) • Examples of new-style assessments • PARCC • Two-part summative; performance-based and end of year, required non-summative in speaking and listening • Smarter Balanced • “Strategically balanced” summative, interim, and formative assessments Support From Research
  • 19.
    • Writing needsto be more fully assessed via independent testing. • With current testing, “new English teachers may inhibit their students‟ writing as they mold instruction to standardized writing assessments” (Brimi 53) • “…state tests cause teachers to guide students to write formulaic essays, usually of the five-paragraph variety” (Brimi 53) • Suggestions: Adopt TCAP Writing Assessment • Formulated in Tennessee, adopted in 1994 • Has been put through various trials, facets, and phases of implementation • “This assessment measured student writing during their fifth, eighth, and eleventh grade years via a timed (35 minute) writing session” (Brimi 56) • Fifth grade students, for example, respond to a narrative prompt • “Essays graded on a six-point rubric by readers from a private North Carolina company” (Brimi 56) Support From Research
  • 21.
    • Place moreemphasis on emotional intelligence. • “IQ alone has not been a strong predictor of performance at work nor in life” (Sparkman, Maulding 644) • “Life success factors seem to be influenced by emotional intelligence than cognitive ability” (Sparkman, Maulding 644) • Place more emphasis on critical thinking. • “Critical thinking is often listed in college catalogues as one of an institution‟s educational goals” (Hatcher 29) Support From Research
  • 22.
    Impressions • Overall, Ifound that the students scores differentiated very little from year to year • Interesting, as group had a separate set of core-subject teachers each year • This possibly correlates with this group‟s natural tendencies in respect to test-taking • Most notable fluctuations were in Science and Social Studies • Potentially due to having only two tested years of data vs. three • Boys, for the most part, scored lower than the girls • All students scored above 500 in each category from 2010-2012
  • 23.
    Interview Protocol Responses DianeEschete • Public school teacher • 30+ years of experience • Master Teacher • Tutor for elementary level grades Tessie Adams • Public school teacher • 20+ years of experience • Master Teacher • Terrebonne Parish's 2004-2005 Elementary School Teacher of the Year • 2005 Teacher of the Year for Louisiana
  • 24.
    Question: Which indicatorsdoes your institution currently use to evaluate student achievement? Standardized Test Results Standardized Test Results, Benchmark testing (both monthly and weekly)
  • 25.
    Question: How importantare standardized test results to your institution? Why? The ultimate thing. Absolutely. Beyond obsessive at this point. Drives everything. Curriculum, primarily money. Very. They're important to the State Dept. to monitor, hold schools, teachers, districts accountable for properly teaching students. Result of NCLB.
  • 26.
    Question: Which StandardizedTests does your institution implement? Leap/iLeap Leap/iLeap, ITBS Test, Explorer Assessment (9th), EOC (End of Course), PARCC will take place of the Leap/iLeap in 2014-2015
  • 27.
    Question: Do youcurrently have a member on staff who prepares standardized test results to be viewed by faculty? Yes No
  • 28.
    Question: Do youbelieve a visual interpretation of standardized test results could be helpful to your institution? Please explain. Absolutely. Pie charts, bar graphs. LOVE the bar graphs. More colors. More manipulations, the better. Definitely. Visual is always better. 80% of learning is visual. We have Data Analysis Meetings twice a month. Teachers look at scores, figure out solutions to low numbers.
  • 29.
    Question: With theadoption of the Common Core Curriculum, has your institution shifted testing methodology? If not, do you plan to? Why or why not? Yes; has to be aligned with Common Core. Questions have to be structured with Common Core, level of instruction. Everything has to be aligned. Leap and iLeap are aligned with Common Core. Nothing. Just being aware of standards. Rigor and relevance is key, has been for some time (5 yrs). Common Core minimizes the number of things, but on a deeper level. Education should be spiraled, not overlapped. Building on concepts. Common Core is normalizing education across the nation.
  • 30.
    Question: Do youfeel that gender plays a role in student performance on standardized tests? If so, what trends have you noted? No. Though boys have shown better results in Math and Science, girls in ELA. Though not sure if that is still true. No trends with gender. State Dept. analyzes subgroup deficiency (ex. black males). Have never seen gender make a difference. Socioeconomic makes a difference, however.
  • 31.
    Common Themes Some commonelements include: • The use of Standardized Testing as a means of measurement • The importance of Standardized Test results • The incorporation of visual test results to help map data • No notable correlation of test results based on gender
  • 32.
    Issues @MICS • MariaImmacolata Catholic School does many things correctly with respect to standardized testing. • However, there are several notable issues that must be addressed to guarantee the continued success of our students. • These issues include: • Dated means of testing (Stanford 10) • Sluggish pace with adaptation to Common Core • Lack of Data Analysis focus groups • No designated staff member to interpret data
  • 33.
    Suggestions Issue #1: Datedmeans of testing (Stanford 10) • With finally adopting the Common Core curriculum, the Diocese must do away with the current testing method. • We currently use the same Stanford 10 testing booklets each year, and have the students growth plotted accordingly. • Though proven effective, the Stanford is arguably a dated means of testing. • Thus, the Diocese should move beyond the current methods (Leap/iLeap), and begin to implement testing via PARCC. • In doing so, the Diocese will effectively be ahead of the game with regard to standardized testing.
  • 34.
    Suggestions Issue #2: Sluggishpace with adaptation to Common Core • Catholic schools in Louisiana are essentially the last group to adopt the Common Core curriculum. • This is primarily due to the lack of federal funding, thus no “encouragement” from the government. • However, as the Common Core has become more widespread throughout Louisiana, the Diocese has opted to adapt it as well. • The current plan is to introduce the curriculum via a “slow roll” approach, beginning with the lower elementary grades during the 2013-2014 school year. • Upper elementary grades will follow suit during the 2014-2015 school year. • This method is ideal, however it probably should have occurred a few years earlier.
Suggestions
Issue #3: Lack of data analysis focus groups
• Due to the size of our school, MICS does not have any data analysis focus groups.
• This issue came to my attention after Ms. Tessie shared her school's method of identifying weak points with respect to testing.
• I think the idea behind this method is terrific, but I also believe that the key is consistency.
• If a group does not meet consistently, then progress is unlikely.
• To bring about this change at our school, the current leadership must insist on monthly meetings.
• These meetings should cover various approaches and methods for increasing test scores.
Suggestions
Issue #4: No designated staff member to interpret data
• Upon interviewing my principal, it came to my attention that no particular member of our staff interprets data trends with respect to standardized testing.
• The only person who truly does this is our principal, who in turn looks only for major discrepancies.
• It is unreasonable to expect our current leadership to be solely responsible for synthesizing data.
• Thus, I suggest that we either hire an additional staff member to analyze data or designate this responsibility to individual faculty members.
• This can be done by providing each incoming class's scores to its future homeroom teacher.
• Ex.: 4th grade test scores being handed off to the current 5th grade homeroom teacher.
• By doing this, teachers can note current trends and implement methodology throughout the school year to improve test scores.
Synthesis
• The interviews I conducted allowed me to see standardized testing from another perspective.
• By mirroring the ways that public schools handle standardized testing, private institutions can benefit immensely.
• While the current methods are by no means ineffective, they can certainly be improved upon.
• Aspects such as updating testing material, quickening the adaptation to Common Core, and increasing focus on test results can all be improved.
• As previously mentioned, current assessments "measure skills too narrowly; return results that are 'too little, too late' to be useful; and do not adequately assess whether students can apply their skills to solve complex problems…" (Doorey 30).
• While updating its testing, MICS should note that "state tests cause teachers to guide students to write formulaic essays, usually of the five-paragraph variety" (Brimi 53).
Synthesis
• In saying so, an additional writing assessment, such as the TCAP Writing Assessment, may be considered as a supplement.
• While emphasis should be placed on critical thinking, as "critical thinking is often listed in college catalogues as one of an institution's educational goals" (Hatcher 29), there should also be consideration of emotional intelligence, as "IQ alone has not been a strong predictor of performance at work nor in life" (Sparkman and Maulding 644).
• Lastly, it should be noted that "collaboration – if done in a structured and focused manner – can be incredibly important in helping teachers develop effective teaching practices and problem-solving skills" (Berry 10). With that in mind, the teachers of MICS simply need to collaborate more on a variety of academic elements, including high-stakes testing.
• The ultimate goal is to guarantee the success of our students. Given the quality of education at MICS, this will most certainly happen regardless; however, the likelihood is simply enhanced by my aforementioned plan of action.
Reflection
• Would I do anything differently if I did this again?
• If I were to approach this project again, I would have preferred a listing of the required project elements earlier on. This is particularly true with regard to the interviews. A number of the individuals I interviewed were under various time constraints and thus difficult to reach at the expected time.
• What else do I want to know?
• I would like to know whether a presentation of this caliber could be used to sway administration, be it a traditional school board or a parochial curriculum development team. Also, how can this presentation be summarized in a way that would make it appropriate for a professional portfolio?
• What roadblocks did I encounter?
• The primary roadblock I encountered was time constraints. While my own time management was, in my opinion, quite good, relying on others slowed the process. Since the interviews, for example, were done as a courtesy to me, it was difficult to express a sense of urgency without badgering.
• Final Recommendations:
• I think Dr. Edgehouse presented the information in a way that was easy to understand, as well as adaptable to various educational climates. For future students I would suggest, above all, clarifying confusion by asking questions. These questions can be directed toward fellow cohort members, a critical friends group, or the professor. In doing so, one can gain not only a deeper understanding of the project but also the value of others' perspectives. Overall, I found this project to be incredibly rewarding and fantastic practice for our roles as future educational leaders.
Works Cited
• Berry, B., & Center for Teaching Quality. (2010). Teacher effectiveness: The conditions that matter most and a look to the future. Center for Teaching Quality.
• Brimi, H. (2012). Teaching writing in the shadow of standardized writing assessment: An exploratory study. American Secondary Education, 41(1), 52-77.
• Doorey, N. A. (2012). Coming soon: A new generation of assessments. Educational Leadership, 70(4), 28-34.
• Hatcher, D. L. (2011). Which test? Whose scores? Comparing standardized critical thinking tests. New Directions for Institutional Research, 2011(149), 29-39.
• OLSAT Test - Otis-Lennon School Ability Test. (n.d.). Testing Mom. Retrieved January 29, 2013, from http://www.testingmom.com/olsat-test-otis-lennon-school-ability-test/
• Pearson. (n.d.). Assessment and information. Retrieved January 29, 2013, from www.pearsonassessments.com/HAIWEB/Cultures/en-us/Productdetail.htm?Pid=SAT10C
• Sparkman, L. G. (2012). Non-cognitive predictors of student success in college. College Student Journal, 46(3), 642-652.