Creating Outcomes and Assessment Measures: Jarvis Christian College, Administrative Support Units

Speaker notes
  • Good morning, I am Cleopatra Allen, the Associate Director for Institutional Research, Planning, Assessment and Effectiveness at Wiley College.
  • We will expound on one institutional effectiveness process that Dr. Carter briefly mentioned: Program Assessment.
  • In discussing Program Assessment, let me give you a general definition of "What is Assessment?" Assessment is systematic and ongoing. It is the collection, review, and use of evidence about academic and administrative/educational support programs and services provided by the University for improving student learning and development. Assessment examines quantitative and qualitative evidence regarding student competence, uses this evidence to improve learning for current and future students, and presents results to stakeholders. Data is collected, analyzed, and shared to determine the skills, knowledge, and values students have gained from the University experience. Assessment should be:
    • Integrated: tied to the University mission and strategic goals.
    • Ongoing: part of the ongoing business of the unit.
    • Implemented gradually: it becomes part of the University culture slowly and is implemented carefully.
    • Multi-faceted: uses multiple methods of assessment on multiple samples and at various points in the learning process.
    • Pragmatic: practical, with obvious implications for faculty and students.
    • Faculty-designed and implemented.
    • Self-renewing: data and information must feed back into the system, at both the University and unit level.
    The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: 3.3.1.1 educational programs, to include student learning outcomes; 3.3.1.2 administrative support services; 3.3.1.3 educational support services; 3.3.1.4 research within its educational mission, if appropriate; 3.3.1.5 community/public service within its educational mission, if appropriate.
  • An objective is a measurable target with a time limit that must be met on the way to attaining a goal. Focus on outcomes, not tasks: the ability to focus on outcomes, not tasks, may very well be the primary thing that separates people who can improvise and flow with the unexpected from those who can't. A task is a means to an end; an outcome is the end itself. Understand that, and you will be able to flow with anything. Learning outcomes focus on what a student should know, think, or do as a result of a program.
  • SLO statements should be aligned with mission statements (and goals, if applicable). SLO statements should clearly indicate the level and type of competence required of graduates of a program. The basis for assessing student learning is learning outcome statements that clearly indicate and define the outcomes. SLO statements should be framed in terms of the program. Useful outcome statements are clear, simple declarative sentences, not bundled statements. SLO statements should focus on the learning result, not the learning process, and should be stated so that the outcome can be measured by more than one assessment method.
  • An outcome should address one important function or aspect of the unit, typically from a single perspective; do not bundle components, especially if they are unrelated. For example, "Students will learn to apply relevant theories of counseling as well as communicate the theories" bundles two components that would need to be assessed separately.
  • The objective should be written in measurable terms, with a target in mind, such that it acts as a standard or measuring stick by which you evaluate your efforts. The intended outcome should be one for which it is feasible to collect accurate and reliable data.
  • What types of things are you striving for? What directions do you want to move in? Does the outcome allow for variation in the services provided? Do you have room in your budget to assess it this year? Consider stretch targets to improve the program. At least two assessment methods should be used to evaluate each program outcome, and each assessment method has an associated performance indicator (level of performance). For example, with a target of a 10% improvement on a standardized exam, the assessment method is the examination and the performance indicator associated with it is the 10% improvement in exam score (a sketch of this pairing appears after these notes).
  • Consider the resources available to you, including finances, staffing, and time, and set objectives that can be successfully achieved. The outcome should help identify where program improvements are needed; for example, if a standardized exam is used, the sub-scores on the exam should be used to determine what needs to be improved. Also determine what standards are expected of students in your program. For some learning outcomes you may want 100% of graduates to achieve them; for others that expectation may be unrealistic, and you may instead specify the proportion of students who achieve a particular level (e.g., 80% of graduates pass the written portion of the standardized test on the first attempt; a sketch of this kind of criterion check also appears after these notes).
  • The objectives should address issues of current importance, so that the information obtained is useful to the unit in terms of efficiency, function, validation, and so on.
  • Poor: this is poor because it says neither what systems nor what information about each system students should know. Are they supposed to know everything about them or just the names? Should students be able to recognize the names, recite the central ideas, or criticize the assumptions? Better: this is better because it says what theories students should know, but it still does not detail exactly what they should know about each theory, or how deeply they should understand it. Best: this is the clearest and most specific statement of the three examples. It gives even beginning students an understandable and very specific target to aim for, and it gives faculty a reasonable standard against which to compare actual student performance. Example 4 (taken from A Program Guide for Outcomes Assessment at Geneva College, April 2000): Poor: the problem here is that the statement does not specify the type or quality of research to be done. Better: this specifies the type of research, but not the quality students must achieve; if a student independently does any research that is experimental or correlational, it would be viewed as acceptable. Best: here, the standard for students to aim for is clear and specific enough for faculty to agree about what students are expected to do, and therefore to agree reasonably well about whether students have achieved the objective. Even introductory students can understand the outcome statement, even if they do not know exactly what experimental and correlational research methods are.
  • Develop and write your program goals and intended outcome statements before selecting assessment methods. Do not develop an assessment instrument and then fit an intended outcome to it.
  • You may know what percentage of students meet the criterion, but the key question is what you will improve based on the data.
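
To make the pairing of assessment method and performance indicator concrete (the 10% exam-improvement example above), here is a minimal Python sketch. It is not part of the original presentation; the baseline mean, the current scores, and the helper function are hypothetical.

```python
# Minimal sketch: check a performance indicator against a baseline.
# The baseline mean, the current scores, and the 10% target are hypothetical.

def indicator_met(baseline_mean: float, current_scores: list[float],
                  target_improvement: float = 0.10) -> bool:
    """Return True if the mean score improved by at least the target fraction."""
    current_mean = sum(current_scores) / len(current_scores)
    improvement = (current_mean - baseline_mean) / baseline_mean
    print(f"Baseline {baseline_mean:.1f} -> current {current_mean:.1f} "
          f"({improvement:+.1%} vs. target {target_improvement:.0%})")
    return improvement >= target_improvement

# Assessment method: the standardized exam.
# Performance indicator: a 10% improvement in the mean exam score.
indicator_met(62.0, [70, 68, 75, 64, 71])  # about +12.3%, so the indicator is met
```

Here the exam is the assessment method, and the 10% improvement threshold is the performance indicator attached to it.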
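Similarly, for a proportion-based criterion (such as 80% of graduates passing the written portion on the first attempt) combined with sub-scores that locate needed improvements, here is a minimal sketch with hypothetical records and sub-score names.

```python
# Minimal sketch: compare a first-attempt pass rate with a proportion
# criterion, and use sub-scores to flag where improvement is needed.
# The records, the 80% target, and the sub-score names are hypothetical.

records = [  # one record per graduate
    {"passed": True,  "writing": 82, "analysis": 75},
    {"passed": True,  "writing": 78, "analysis": 69},
    {"passed": False, "writing": 64, "analysis": 58},
    {"passed": True,  "writing": 90, "analysis": 81},
    {"passed": False, "writing": 70, "analysis": 55},
]

pass_rate = sum(r["passed"] for r in records) / len(records)
print(f"First-attempt pass rate: {pass_rate:.0%} (criterion: 80%)")

# Sub-score means point to the area most in need of improvement.
for area in ("writing", "analysis"):
    mean = sum(r[area] for r in records) / len(records)
    print(f"  {area} mean: {mean:.1f}")
```

With a 60% pass rate against an 80% criterion, the lower analysis mean would be the first place to look for improvement.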

    1. Assessment Workshop (title slide).
    2. [Diagram: Institutional Effectiveness Processes, linking the Mission, the Strategic Planning Process, Program Assessment, Program Review, the Informed Budgeting Process, and the Planning Committee.]
    3. Assessment is a process of gathering and interpreting information to determine the degree to which a program is meeting established goals, and then using that information to enhance the program. As a part of institutional effectiveness, assessment is required at all levels. There is one core requirement and one comprehensive standard established by the Southern Association of Colleges and Schools that specifically address assessment and student learning outcomes: Core Requirement 2.5 and Comprehensive Standard 3.3.1.
    4. [Cycle diagram: Identify Expected Outcomes; Identify Where Outcomes Are Addressed; Determine Methods and Criteria to Assess Outcomes; Identify the Expected Performance; Identify Who Will Be Assessed; Schedule of Assessment; Interpret How Results Will Inform Teaching/Learning; Follow Up on Implemented Changes.] Source: Maki, P. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution (pp. 4-5). Sterling, VA: Stylus Publishing.
    5. SMART Outcomes / MATURE Assessment Methods.
    6. CREATING SMART OUTCOMES. What is an outcome?
       • Process Outcome: the end result of what a program or process is to do, achieve, or accomplish. Process outcomes can be as simple as completion of a task or activity, although this is not as meaningful as it could be and does not provide information for improvement. (Adapted from www.assessment.tamu.edu)
       • Student Learning Outcome: Student Learning Outcomes (SLOs) are specific statements that describe the learning that students are expected to achieve as a result of the program. That is, active verbs are used to identify the "actions, behaviors, dispositions, and ways of thinking or knowing that students should be able to demonstrate." (Maki, 2004)
    7. CREATING SMART OUTCOMES: S, Specific; M, Measurable; A, Attainable; R, Realistic; T, Time-Bound. (Drucker, 2006)
    8. CREATING SMART OUTCOMES. Specific: make the outcome specific to your unit or program (you can start with the key functions or major concerns of the unit). Learning outcomes describe the expected abilities, knowledge, values, and attitudes a student has based on their learning. Data should create an opportunity to make improvements. (Information source: University of Central Florida (2005), UCF Academic Program Assessment Handbook: Information, Analysis, and Assessment.)
    9. CREATING SMART OUTCOMES. Measurable:
       • The intended outcome should be one for which it is feasible to collect accurate and reliable data.
       • Learning outcomes: use Bloom's Taxonomy; rely on active verbs that identify what students should be able to demonstrate, represent, or produce (e.g., create, apply, formulate).
       • Process outcomes: focus on how accurate, how economical, how satisfied, how prompt, how many, etc.
       • Consider resources (e.g., staff, technology, assessment support, institution-level surveys).
       • Provide opportunities to triangulate data.
    10. CREATING SMART OUTCOMES. Aggressive: questions that can assist in defining aggressive outcomes:
       • How have the students' experiences with the unit contributed to their abilities, knowledge, values, and attitudes? Ask about cognitive skills (what does the student know?), performance skills (what does the student do?), and affective skills (what does the student care about?).
       • Based on your knowledge of the performance of the unit (benchmarks), is the outcome attainable?
    11. CREATING SMART OUTCOMES. Realistic: consider whether you have the human and/or financial resources to assess this outcome.
    12. CREATING SMART OUTCOMES. Time-Bound: can this outcome be achieved within the specified time frame?
    13. Sample outcomes.
       • Sample 1: Students will identify behaviors of healthy relationships and design ways in which they will engage in healthy relationships with others.
       • Sample 2: Develop the capacity to understand and interact effectively with others who differ in beliefs, behaviors, values, and worldview.
    14. EVALUATE THESE OUTCOMES.
       • Example 1: The Office of Financial Aid will decrease the default rate.
       • Example 2: Students will be satisfied with the Registrar's Office.
    16. SMART Outcomes / MATURE Assessment Methods.
    17. CREATING MATURE ASSESSMENT METHODS: M, Match; A, Appropriate; T, Target; U, Useful; R, Reliable; E, Efficient.
    18. CREATING MATURE ASSESSMENT METHODS. Match: match the outcome with the appropriate assessment method, and consider the ability of an assessment method to address specific assessment questions, as well as its relevance and utility.
    19. CREATING MATURE ASSESSMENT METHODS. Appropriate:
       • Which type of assessment method is appropriate? Direct methods measure achievement; indirect methods measure perception.
       • Are the resources available to use this method?
       • Would your department have control over the method of assessment?
    20. CREATING MATURE ASSESSMENT METHODS. Target: determine the desired level of performance.
       • Benchmark using institutional data, peer-institution data, or a nationally accepted standard for the field or discipline.
       • The criterion/benchmark for success should be stated in terms of percentages, percentiles, averages, or other quantitative measures.
       • Establish a reasonable benchmark; avoid absolutes such as "100%", "zero", and "all" when establishing criteria.
       • NOTE: if you have previously measured an outcome, it is helpful to use that measurement as the baseline for setting a target for next year (a sketch of deriving such a target appears after the transcript).
    21. CREATING MATURE ASSESSMENT METHODS. Useful: can you make inferences about progress towards the outcome? Achieving high-quality assessment requires addressing issues identified by Linn and Baker (1996) and Herman, Aschbacher, and Winters (1992), such as:
       • Does the selected method cover the curriculum objectives?
       • Does it match the desired level of complexity?
       • Can the results be generalized, and to what extent?
       • Will information be gained that will be useful in improving programs?
    22. CREATING MATURE ASSESSMENT METHODS. Reliable: a reliable assessment method is one that yields consistent responses over time (e.g., it lacks ambiguity and has a reasonable completion time). A test-retest sketch appears after the transcript.
    23. CREATING MATURE ASSESSMENT METHODS. Efficient: estimate the time required to develop, administer, and evaluate the various assessment methods. If possible, use two measures and look for ways to conduct a deeper analysis. Use both qualitative and quantitative assessment measures (qualitative examples: open-ended questions on surveys, focus groups, and structured interviews).
    24. CREATING MATURE ASSESSMENT METHODS: DIRECT METHODS. Case studies; hypothetical-situation responses; standardized observations; pre/post questionnaires; minute papers; portfolio assignments; standard rubrics; checklists; rubrics; expert evaluation.
    25. CREATING MATURE ASSESSMENT METHODS: INDIRECT METHODS. Cooperative Institutional Research Program (CIRP); College Student Expectations Questionnaire (CSXQ); Institutional Priorities Survey; National Survey of Student Engagement (NSSE); Your First College Year (YFCY); advisory board surveys; alumni surveys; assessment surveys; customer surveys; employer surveys; external peer review surveys; point-of-service surveys; focus groups; interviews; case studies; graduating seniors and graduates surveys; retention data; non-returning student surveys; student satisfaction surveys; entering freshmen surveys.
    27. Example Outcome 1 (Sample 1): Students will identify behaviors of healthy relationships and design ways in which they will engage in healthy relationships with others. How can we assess this outcome?
    29. References:
       • Central Michigan University Provost's Office. (n.d.). Formative and summative assessment. http://www.provost.cmich.edu/assessment/toolkit/formativesummative.htm
       • Drucker, P. F. (2006). The Practice of Management (originally published 1954). New York, NY: HarperCollins.
       • Stanford University. (2007). Analyzing Assessment Results. http://www.stanford.edu/dept/pres-provost/irds/assessment/analyze.pdf
       • University of Central Florida. (2005). UCF Academic Program Assessment Handbook: Information, Analysis, and Assessment. http://oeas.ucf.edu/doc/acad_assess_handbook.pdf
       • Maki, P. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution (pp. 4-5). Sterling, VA: Stylus Publishing.
    30. Contact: Mrs. Cleopatra Allen, callen@wileyc.edu, (903) 927-3295.
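Slide 20 notes that a previously measured outcome can serve as the baseline for setting next year's target. Below is a minimal Python sketch, not from the original presentation, of deriving a non-absolute benchmark (an average or percentile) from hypothetical baseline scores.

```python
# Minimal sketch: derive next year's target from last year's baseline.
# The baseline scores and the chosen statistics are hypothetical.
import statistics

baseline = [55, 61, 64, 68, 70, 72, 75, 78, 81, 90]  # last year's scores

mean = statistics.mean(baseline)
q1, q2, q3 = statistics.quantiles(baseline, n=4)  # quartile cut points
print(f"Baseline mean {mean:.1f}; 75th percentile {q3:.1f}")

# A reasonable, non-absolute target (avoiding "100%", "zero", "all"):
# for example, raise the mean score by five points next year.
target_mean = mean + 5
print(f"Target mean for next year: {target_mean:.1f}")
```

Stating the target as "raise the mean from 71.4 to 76.4" keeps the criterion quantitative without demanding an absolute such as a perfect pass rate.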

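Slide 22 defines a reliable method as one that yields consistent responses over time. One common way to quantify that consistency, offered here as an illustrative sketch rather than as part of the original workshop, is a test-retest check: administer the same instrument twice and correlate the results. The scores below are hypothetical.

```python
# Minimal sketch: test-retest reliability as the correlation between two
# administrations of the same instrument. All scores are hypothetical.
from statistics import correlation  # available in Python 3.10+

first_attempt = [72, 65, 88, 91, 60, 77]
second_attempt = [70, 68, 85, 93, 58, 80]

r = correlation(first_attempt, second_attempt)
print(f"Test-retest correlation: r = {r:.2f}")
```

A correlation close to 1.0 suggests the instrument ranks respondents consistently across administrations; a low value signals ambiguity or noise worth investigating.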