Designing an Evaluation of a Tertiary Preparatory Program within the University Context

1. Designing an Evaluation of a Tertiary Preparatory Program within the University Context
   By Ravinesh C Deo
2. Presentation Overview
   • Introduction to Program Evaluation
   • Benefits of an Evaluation
   • Problem Statement
     - Challenges during First Year of Tertiary Study
     - Preparatory Program in Schools
   • Designing an Evaluation
   • Ethical Factors to Consider
   • Reporting Back Mechanisms
   • Conclusion
     - Limitations of Current Plan
3. What is a Program Evaluation?
   • A process that examines the success of an educational event, syllabus design, content, implementation or achievement of objectives
     - assesses the merits of a product, a program or a practice
   • Conducted using a multi-faceted and systematic approach
     - focuses on design, implementation and effectiveness
   • Outcomes enable educational practitioners to gather evidence on whether their actions and decisions are appropriate
   • Helps determine whether their program fulfils the aims and ethos of their academic institution
4. The Tertiary Preparatory Program
   • The Open Access College of the University of Southern Queensland designed a teaching and learning program known as the “Preparatory Program in Schools”
     - designed for Year 11 students to complete a set of tertiary courses during their final school years
     - fulfils the College’s commitment to supporting students who are not able to make informed decisions about a career path
     - creates avenues for various career options
5. Why Evaluate the Tertiary Preparatory Program in Schools?
   • A new program, so there is a need to assess its educational quality
   • An opportunity to identify components that require improvement
   • Outcomes may be relevant to the success of the program in areas such as
     - policy development
     - decision-making
     - funding opportunities
     - partnerships between the institution and community
   • Promotes confidence that academic programs are being monitored, reviewed, modified and researched
   • Identifies challenges faced by first year students in adapting to university teaching and learning environments
6. Challenges Faced by First Year Students
   • School leavers face challenges during first year studies, as evidenced by
     - a national attrition rate of 28.6% in 2002
     - a Queensland attrition rate of 31.1%
     - after first year, a drop to 10% (national) and 11.5% in QLD (DETA, 2004)
   • Peel (1999) identified some challenges as
     - inadequate preparation for an independent style of learning
     - lack of prior exposure to tertiary environments
     - gaps between the course structures and expectations of university and school
     - an “expectation” that university will deliver life challenges
     - diversity of the teacher-student relationship and a more formal relationship with lecturers
7. Performance Linked to Lack of Motivation?
   • Research evidence
     - Lowe and Cook (2003) showed 20% of students could not adjust to the academic and social demands of a university
     - Hillman (2005) found that students from low socio-economic backgrounds have difficulties balancing work and study commitments
   • These could be worsened by poor motivation and lead to a diminishing interest in studying
   • As a result, performance during the first year at university becomes poor
8. The Government’s Decision
   • The Bradley Review examined our higher education sector
     - incorporated recommendations on encouraging participation of persons from low socio-economic backgrounds in higher education
     - focussed on enhancing partnerships between schools, institutions and communities through outreach activities
   • Declared that by 2020, 20% of tertiary enrolments are expected to be from low socio-economic regions (Bradley, Noonan, Nugent, & Scales, 2008)
9. Response of the Open Access College
   • Committed to social inclusion by supporting students who are unable to make informed decisions about prospective careers
   • Provided pathways to those who miss out on education opportunities
     - the Preparatory Program in Schools
     - created an alternative pathway for students to complete two tertiary level courses during their final school years
   • Designed to increase participation from the Ipswich & Moreton region
     - identified as socio-economically disadvantaged
     - only 35.7% and 19.4% respectively, of Year 12 students enter tertiary institutions (DETA, 2008; Australian Bureau of Statistics, 2006)
10. Designing an Evaluation
   • Program evaluations may be
     - objective-oriented
     - expertise-oriented
     - management-oriented
     - naturalistic/participant-oriented (McNeil, 1996)
   • To serve a number of purposes, we propose a mixed model whose choice is based around four main objectives:
     (a) assess attainment of educational standards by students
     (b) investigate how effectively students integrate into a tertiary learning environment
     (c) determine whether students experience increases in motivation for study
     (d) assist students in setting up a prospective career path
11. Acquisition of Data
   • Question 1: What are the effects of the program on motivation for studies, development of communication and mathematical skills, and decision-making on prospective career paths?
   • Purpose of the question: assess
     - changes in students’ learning profiles (mathematical reasoning & communication)
     - increases in motivation for studies and commitment towards action-planning and scheduling study activities
     - integration into a university teaching and learning environment
12. To address Question 1, data will be collected using 4 main avenues:
   • Academic Performance
     - the Preparatory Program in Schools has courses with 4 assignments each
     - based on attainments in each, marks will determine how well students are adjusting to the university learning environment
     - continuous monitoring will ensure specific trends in learning attributes are recorded progressively
     - assignments include a “learning diary” and a “summative essay” that examine plans for future careers
   • Weekly Journal Entry
     - as part of non-graded assessment, students will submit weekly journals
     - provides an opportunity to self-assess their learning journey and the freedom to provide their own perspectives
     - provides information unique to individual students (i.e. personal learning attributes)
13.
   • Covert Observations
     - used to collect data on students’ learning profiles within a natural learning environment
     - learning behaviours observed by the facilitator are allocated a rating on a numerical scale from 1-10
     - frequency will be weekly
   • Focus Groups
     - pre-defined student focus groups
     - stimulate discussion on a set of topics from each course
     - examine study management/problem solving abilities
     - frequency: twice a semester
14.
   • Question 2: What are the effects of the program on professional development of the College staff?
   • Purpose of the question:
     - obtain staff perceptions on the success of the program
     - reveal challenges (e.g. resources) the institution is facing
     - examine whether the program is having a positive impact on professional development
     - create options for further opportunities for staff that might help improve program delivery
15. To address Question 2, data will be collected using 2 main avenues:
   • Staff Surveys
     - describe general experiences
     - frequency will be twice annually
   • SWOT Analysis (staff present their roles/responsibilities)
     - Strengths (internal): identify new learning or pedagogical skills
     - Weaknesses (internal): identify challenges (e.g. anxiety with a “new” cohort)
     - Opportunities (external): opportunities available (e.g. ICT training)
     - Threats (external): assess the constraints (e.g. workloads)
16.
   • Question 3: To what extent did the program succeed in attracting community confidence?
   • Purpose of the question: recognise the community’s role & participation
   • Parent-Teacher Interviews
     - a random batch of parents invited for interview on a monthly basis
     - examine any growing community support for the program
   • Influences Questionnaire
     - based on the influences questionnaire (Taylor & Bedford, 2004)
     - questions centred on factors that influence a parent in withdrawing their child or continuing
     - 1 to 5 scale (1 = not interested, 2 = slightly, 3 = unsure, 4 = moderately, 5 = extremely interested)
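The 1-5 influences scale lends itself to simple summary scoring. The Python sketch below shows one way mean ratings per influence factor might be tabulated; the factor names and response values are invented for illustration and are not drawn from the evaluation itself.

```python
from statistics import mean

# Hypothetical responses to the 1-5 influences scale: each list holds the
# ratings several parents gave one influence factor. Factor names and values
# are invented for illustration only.
responses = {
    "career prospects":  [4, 5, 3, 4, 5],
    "workload concerns": [2, 3, 2, 1, 3],
    "school support":    [5, 4, 4, 5, 4],
}

def summarise(responses):
    """Mean rating per factor, sorted with the strongest influence first."""
    means = {factor: round(mean(scores), 2) for factor, scores in responses.items()}
    return dict(sorted(means.items(), key=lambda kv: kv[1], reverse=True))

summary = summarise(responses)  # here "school support" ranks highest (4.4)
```

Ranking the mean scores makes it easy to see, at a glance, which factors weigh most on a parent's decision to continue or withdraw their child.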
17. Validity and Reliability
   • Multi-Model Approach
     - several modes of collection & merging of results can lead to a valid, reliable and diverse construction of realities (e.g. Patton, 1997)
   • Student Dropout
     - a reduction in numbers can bias our results
     - to overcome this, we will compare characteristics of those students who have dropped out with those who remain
     - if the two groups are not significantly different in terms of characteristics, then dropouts did not affect our results (Heckman, Smith, & Taber, 1998)
   • Naturalistic Methods
     - one example is the use of covert observations (Lynch, 1996)
   • Well-defined & Systematic Procedure
     - can be replicated in future, thus ensuring reliability
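The dropout check above, comparing characteristics of students who left against those who remained, can be illustrated with a simple two-sample comparison. The Python sketch below computes Welch's t statistic on invented prior-attainment marks; a real evaluation would apply a proper significance test across several characteristics rather than this rough screen.

```python
from math import sqrt
from statistics import mean, variance

# Invented prior-attainment marks for students who remained in the program
# and for those who dropped out; real data would come from enrolment records.
remained = [62, 71, 68, 75, 64, 70, 66, 73]
dropped  = [60, 69, 65, 72, 63, 67]

def welch_t(a, b):
    """Welch's two-sample t statistic: (mean_a - mean_b) / sqrt(va/na + vb/nb)."""
    return (mean(a) - mean(b)) / sqrt(variance(a) / len(a) + variance(b) / len(b))

t = welch_t(remained, dropped)
# A |t| well below ~2 is a rough sign the two groups are similar on this
# characteristic, so attrition is less likely to have biased the results.
similar = abs(t) < 2.0
```

If the groups did differ markedly on a characteristic, the remaining sample could no longer be treated as representative and the conclusions would need to be qualified accordingly.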
18. Ethical Factors to Consider
   • Should be considered to avoid harm (Sanderson & Mazerolle, 2008)
   • Ethical clearance sought from the University Ethics Committee
   • Principle of Informed Consent (Evans & Jakupec, 1996)
     - all participants educated about the evaluation, its purpose and potential benefits (Wiles, Crow, Heath, & Charles, 2006)
   • Confidentiality of Information
     - using pseudonyms to avoid name disclosure (Richards & Schwartz, 2002)
   • Avoiding Loss of Life-Time Opportunities
     - for example falling behind contemporaries, being graded as unsuccessful or losing a career as an outcome of the evaluation (Israel & Hay, 2006)
     - outcomes not used to process admissions in any other academic programs
19.
   • Non-disclosure of Findings: no information disclosed to any third party
   • Avoid Psychosocial Consequences
     - e.g. loss of self-esteem, self-concept or confidence (Bibby, 1997)
     - done by ensuring challenged students’ cognitive limitations are not disclosed
   • Privacy of Information
     - granting freedom to decline to respond to any particular question (Sharp, 1994)
     - avoid discussing characteristics that could reveal the subject’s identity
   • Cultural Sensitivity
     - exercised to protect the identity of community and cultural denomination (Hafford & Lyon, 2007)
   • Data Security
     - deleting all subject identifiers
     - security devices such as passwords to protect electronic data
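The pseudonym and data-security measures above can be made concrete with a small sketch: each student name is replaced by a stable code derived from a salted hash, so progress can be tracked over time without storing identities. The salt value, name and score below are invented placeholders, not details from the evaluation.

```python
import hashlib

# Replace each student name with a stable pseudonymous code derived from a
# salted hash, so reports can follow individuals over time without storing
# identities. SALT, the name and the score are invented placeholders; a real
# evaluation would keep the salt secret and stored separately from the data.
SALT = "prep-program-eval"

def pseudonym(name: str) -> str:
    digest = hashlib.sha256((SALT + name).encode("utf-8")).hexdigest()
    return "S-" + digest[:8]  # short, stable, non-identifying code

records = [{"name": "Jane Citizen", "score": 78}]
deidentified = [{"id": pseudonym(r["name"]), "score": r["score"]} for r in records]
```

Because the same name always maps to the same code, fortnightly progress reports can be linked per student while the stored records carry no direct identifiers.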
20. Reporting Back Mechanisms
   • Progress Reports
     - fortnightly reports on changes in students’ learning profiles
   • Newsletters: parents and the school sent a monthly newsletter
     - ensuring that no specific results are disclosed to students, to avoid potential bias via the “Hawthorne effect” (Kuper, Lingard, & Levinson, 2008)
   • Consultative Meetings: progress reports discussed in consultative meetings
   • General Assembly Meeting: results merged in a master document
     - tabled in a College Assembly to gather insights and strategic courses of action such as
       - restructuring the program
       - providing professional development for staff
       - promoting the program across the wider community
21. Limitations of Evaluation Plan
   • Conditional Conclusions: outcomes hold under specific conditions and are couched in probabilities rather than absolute certainty
   • Time Frame: the proposed 2-semester timeframe may not be adequate to conduct a comprehensive assessment
   • Predisposition: selection is based on interview, so all interested students may be eager to join
     - may attract those who are predisposed to a positive outcome
     - measuring changes in profiles may overstate the achievements
   • Maturation: does not take into account effects of maturation
     - events outside the program cause changes in knowledge or behaviours
     - growth of abilities due to maturation is not quantified easily
     - a positive outcome due to maturation may be treated as a plus of the program
   • How to overcome these?
     - increase the length of study to a 2-5 year period
     - use counter-control mechanisms based on the previous year’s findings
22. References
   • Australian Bureau of Statistics. (2006). Socioeconomic indexes for areas. Retrieved April 20, 2011, from http://www.abs.gov.au/websitedbs/D3310114.nsf/home/Seifa_entry_page.
   • Bibby, M. (1997). Introduction: Education research and morality. In M. Bibby (Ed.), Ethics and education research (Review of Australian research in education no. 4). Melbourne, Vic: Australian Association for Research in Education.
   • DETA. (2008). Next step 2008: A report on the destinations of year 12 completers from 2007 in Queensland. Queensland: Department of Education, Training and the Arts.
   • Evans, T., & Jakupec, V. (1996). Research ethics in open and distance education: Context, principles and issues. Distance Education, 17(1), 72-94.
   • Hafford, C., & Lyon, K. (2007). The challenges of conducting evaluation in tribal communities. Washington, D.C.: Office of Family Assistance.
   • Heckman, J., Smith, J., & Taber, C. (1998). Accounting for dropouts in evaluations of social experiments. Review of Economics and Statistics, 80(1), 1-14.
   • Hillman, K. (2005). The first year experience: The transition from secondary school to university and TAFE in Australia. Victoria, Australia: Australian Council for Educational Research.
   • Israel, M., & Hay, I. (2006). Research ethics for social scientists: Between ethical conduct and regulatory compliance. London: Sage.
   • Kuper, A., Lingard, L., & Levinson, W. (2008). Critically appraising qualitative research. British Medical Journal, 337, a1035.
   • Lowe, H., & Cook, A. (2003). Mind the gap: Are students prepared for higher education? Journal of Further and Higher Education, 27(1), 53-76.
   • Lynch, B. (1996). Language program evaluation. Cambridge: Cambridge University Press.
   • McNeil, J. D. (1996). Curriculum: A comprehensive introduction (5th ed.). Los Angeles: Harper Collins College Publishers.
   • Patton, M. Q. (1997). Utilization-focused evaluation: The new century text (3rd ed.). Thousand Oaks, California: Sage.
   • Peel, M. (1999). Where to now? Transition from secondary to tertiary: A performance study. Higher Education Series (Vol. 36, pp. 13-16).
   • Richards, H. M., & Schwartz, L. J. (2002). Ethics of qualitative research: Are there special issues for health services research? Family Practice, 19(2), 135-139.
   • Sanderson, J., & Mazerolle, P. (2008). An evaluation of the book. Retrieved April 27, 2011, from http://www.fpq.com.au/pdf/EGB_Survey.pdf.
   • Sharp, C. A. (1994). What is appropriate evaluation? Ethics and standards. Evaluation News & Comment, 3(2), 34-41.
   • Wiles, R., Crow, G., Heath, S., & Charles, V. (2006). Anonymity and confidentiality. ESRC National Centre for Research Methods, NCRM Working Paper Series (Vol. 2/06).
