Assessing IFP students’ needs assessment
Florencia Franceschina, Manchester Metropolitan University
[email_address]
InForm Conference, University of Reading, 17 July 2010
1. Introduction
- Increased weight of the ‘student voice’ in HE (e.g., NSS, International Student Barometer, etc.)
- The ‘student voice’ is valued in different ways:
  - Students as customers
  - Students as individuals capable of self-directed learning
- Learning needs self-assessment is relevant to either view
Defining self-assessment
- “Self-assessment refers to the involvement of learners in making judgements about their own learning […]” (Boud and Falchikov, 1989: 529)
Why assess learning needs?
- Because it can help us to identify:
  - learners’ strengths and weaknesses
  - learners’ goals and expectations
Issue #1: How useful is learning needs self-assessment?
- Useful for policy makers
- Useful for curriculum developers
- Useful for teachers
- Useful for students
- “Whether it be a junior high school student wondering whether to crack open his social studies textbook one more time before the test or a medical student deciding whether to practice her intubating technique before her next shift, students make more effective decisions about where to apply their learning efforts when they can actually discern their strengths and weaknesses.” (Dunning et al., 2004: 85)
Issue #2: How valid is learning needs (self-)assessment?
- Do students know what they need?
- Do even experts know which needs we should explore and aim to address? For example, the relevance of some needs may not be apparent at the time of assessment (Grant, 2002: 157).
Issue #3: How reliable is learning needs self-assessment?
- Are all learners equally able to make accurate judgements about their learning needs?
- How good are IFP students at it?
2. Brief review of the literature

Self-assessment …
- … is difficult.
  - Meta-studies reveal typically low correlations between self-views and actual performance: .21 (Hansford and Hattie, 1982) and .39 (Falchikov and Boud, 1989). Similar findings are reported by Dunning et al. (2003, 2004), Powers (2002) and Ross (1998).
- … is not always valued by learners (e.g., compared to summative assessment).
- … is modulated by culture.
  - Self-regulation is subject to the influence of educational/cultural background factors (e.g., Purdie and Hattie, 1996; Olaussen and Bråten, 1999).
- … is modulated by level of study.
  - Students in advanced courses tend to be more accurate self-assessors than students on introductory courses (Falchikov and Boud, 1989).
- … is modulated by academic ability level.
  - “High-performing students were accurate, with accuracy improving over multiple exams. Low-performing students showed moderate prediction accuracy but good postdiction accuracy. Lowest performing students showed gross overconfidence in predictions and postdictions. Judgments of performance were influenced by prior judgments and not prior performance. Performance and judgments of performance had little influence on subsequent test preparation behavior.” (Hacker et al., 2000: 160)
- … is differently accurate depending on area of study.
  - Students in science tend to be more accurate self-assessors than students in other areas (Falchikov and Boud, 1989).
- … can be developed as a skill.
  - Although student self-assessment remains inaccurate after training, it does seem to be subject to some improvement. For example, McDonald (2010) has shown that involving students in meta-assessment training can help mathematics students to improve their test performance, and Chen (2008) found that through feedback and practice students’ self-assessments of English speaking skills become more aligned with tutor assessment.
Wide range of data collection methods available
- Formal / informal
- Planned / opportunistic
- Individual / group
- By self / peer / tutor / outside expert
Some commonly used tools
- Questionnaires
- Focus groups
- Online discussion fora
- Staff-student liaison committees
- Diaries
- Objective tests
- Observation
- Action research
3. Methodology
- Cohort of 51 IFP students
- Minimum language entry requirement: IELTS 5.5 or equivalent
- Followed up for the duration of the IFP (2009-2010)
Mixed nationalities
- Saudi Arabia (22%), Bahrain (14%), Nigeria (14%), Pakistan (8%), Turkey (8%), UAE (8%), China (4%), India (4%), Iran (4%), Sri Lanka (4%), DR Congo (2%), Hong Kong (2%), Kenya (2%), Turkish Cyprus (2%), USA (2%), Zimbabwe (2%)
Mixed linked degrees
Data collected
- Learning needs questionnaire
  - Student version (individual self-assessment)
  - Staff version (group assessment)
- Diagnostic language test
- Attendance records
- Coursework and exam results (all IFP modules)
- Staff interviews
- End-of-year module feedback (EAP module)
Analyses
- Perceived needs vs. observed academic performance
- “Objective” diagnostic measures vs. observed academic performance
4. Findings
- Overview
- “Objective” measures
- “Subjective” measures
- Improvements in judgements?
Correlations between ‘objective’ measures

|                    | Lang test at entry | EAP unit mark | IFP mark | EAP attend. | IFP attend. |
|--------------------|--------------------|---------------|----------|-------------|-------------|
| Lang test at entry | –                  | .22           | .36      |             |             |
| EAP unit mark      |                    | –             | .81      | .67         |             |
| IFP mark           |                    |               | –        | .65         |             |
| EAP attend.        |                    |               |          | –           |             |
| IFP attend.        |                    |               |          |             | –           |
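As an illustration of how a matrix like the one above can be produced (a minimal sketch, not the study's actual analysis; the column names and the numbers below are invented for the example), pairwise Pearson correlations can be computed directly from a table of per-student measures:

```python
import pandas as pd

# Hypothetical per-student records for the five 'objective' measures
# discussed on this slide; the values are made up for illustration only.
records = pd.DataFrame({
    "lang_test_entry": [48, 55, 62, 40, 70, 58],
    "ifp_attendance":  [82, 91, 95, 60, 88, 74],
    "eap_attendance":  [78, 93, 97, 55, 90, 70],
    "ifp_mark":        [52, 61, 68, 38, 64, 49],
    "eap_unit_mark":   [50, 63, 72, 35, 66, 47],
})

# Pairwise Pearson correlations between all measures, rounded as on the slide.
print(records.corr(method="pearson").round(2))
```

With a real cohort one would also want to report sample sizes and significance alongside the coefficients, which the slide does not show.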
How did “objective” measures work?
- Language test at entry was a poor predictor of later performance. This could be due to a number of reasons. (Similarly surprising patterns have been reported for UCAS tariff points vs. FY performance elsewhere.)
- Attendance was a reasonably good predictor. This confirms findings in previous studies (e.g., Hughes, 2009).
- The EAP unit was the best predictor of IFP performance.
Staff perceptions about students’ needs at entry
- Tutors most frequently identified general language weakness as an issue to be addressed.
- “when marking essays, it is obvious that many students have a very weak grasp of syntax sometimes, and this will hold them back in the future.”
- “Sometimes there have been some difficulties in their ability to express themselves effectively in written work but this is not always the case. The main problem I have found with international students who do have language difficulties is their lack of commitment in addressing the issue or them denying the fact there is a problem despite overwhelming evidence and continuous advice to attend support classes. Some students believe their English skills are better than they truly are.”
- “Required much repetition on basic tasks appear not to understand instructions.”
- “there are others who have a very poor command of the English language and struggle with their research and assignments.”
- Tutors and support staff thought the IFP students’ numeracy skills and subject background knowledge were, on the whole, the same as or better than those of UK students.
Relative difficulty of IFP modules (MMU ‘units’)
Students’ perceptions about their needs at entry
- Indirect method of calculating the match between students’ perceived needs and actual performance (a short sketch of this procedure follows the next slide):
  1. Self-assessment questionnaire answers were coded as 1 if the student ticked ‘I need to study more of this’/‘I need help with this’, and 0 if the student ticked ‘I don’t want much of this’/‘I don’t need help with this’.
  2. EAP unit summative assessment marks were coded as 0 if the mark was above the mean and 1 if below the mean.
  3. The match between 1 and 2 above was checked.
- Maximum possible score for successful matching of needs and actual performance: 5 (W, R, S, L, IT)
- IFP students’ average score: 2.5
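The sketch below illustrates the matching procedure just described, using hypothetical dictionaries for one student's questionnaire answers and marks; the cohort means are taken from the EAP unit performance table later in the deck, everything else (names, values) is invented for the example.

```python
SUBSKILLS = ["writing", "reading", "speaking", "listening", "it"]

def match_score(self_assessment, marks, cohort_means):
    """Count the subskills where a student's self-assessed need
    (1 = 'I need help with this', 0 = 'I don't need help with this')
    agrees with the observed need (1 = mark below the cohort mean,
    0 = mark at or above the cohort mean)."""
    score = 0
    for skill in SUBSKILLS:
        observed_need = 1 if marks[skill] < cohort_means[skill] else 0
        if self_assessment[skill] == observed_need:
            score += 1
    return score  # maximum possible score: 5 (one point per subskill)

# Hypothetical answers and marks for one student.
cohort_means = {"writing": 38, "reading": 43, "speaking": 50, "listening": 52, "it": 80}
answers = {"writing": 1, "reading": 0, "speaking": 1, "listening": 0, "it": 0}
marks = {"writing": 30, "reading": 45, "speaking": 55, "listening": 48, "it": 85}
print(match_score(answers, marks, cohort_means))  # -> 3 of the 5 subskills match
```

Averaging this score across the cohort gives the 2.5-out-of-5 figure reported above.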
Perceived vs observed order of difficulty of subskills
- Predicted by students: W < IT < L < S/R
- Observed: W (38%) < R (43%) < S (50%) < L (52%) < IT (80%)
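One way to quantify the (mis)match between these two orderings, offered here as an illustrative sketch rather than anything reported in the study, is a Spearman rank correlation. The ranks below read the orderings above from most to least difficult, with Speaking and Reading tied in the students' prediction:

```python
from scipy.stats import spearmanr

subskills = ["writing", "reading", "speaking", "listening", "it"]

# Rank 1 = most difficult / greatest need, following the orderings on this slide.
predicted = {"writing": 1, "it": 2, "listening": 3, "speaking": 4.5, "reading": 4.5}
observed  = {"writing": 1, "reading": 2, "speaking": 3, "listening": 4, "it": 5}

rho, p = spearmanr([predicted[s] for s in subskills],
                   [observed[s] for s in subskills])
print(f"Spearman's rho = {rho:.2f}")  # with only five subskills, treat this as descriptive
```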
Students’ perceptions about their needs at exit
- Needs questionnaire (at exit) answers match performance in subskills at an average score of 2.3 (no improvement).
End-of-year student feedback on EAP unit
Learning need as judged by students at the end of the IFP
[Chart: subskills IT, Writing, Reading, Listening, Speaking, plotted on a scale from lower to higher learning need]

Speaker note: This roughly matches the end-of-year indirect self-assessment (end-of-year survey). This may be an indication that the students have improved their self-assessment skills (but it may be due to other factors). Bear in mind that this is a very global measure of self-assessment accuracy.
EAP unit performance

| Assessment area | Mean score | SD | Median |
|-----------------|------------|----|--------|
| IT              | 80         | 27 | 87     |
| Listening       | 52         | 14 | 53     |
| Speaking        | 50         | 15 | 52     |
| Reading         | 43         | 16 | 47     |
| Writing         | 38         | 21 | 40     |
5. Summary and Conclusion
- Not all objective measures that seem relevant make useful predictions.
- Tutors’ and support staff’s global, subjective assessment of students’ learning needs seems to be accurate.
- These IFP students do not seem to be very good at assessing their learning needs in detail, but they are relatively better when their judgements are considered at a more general, group level.
- Given how unreliable IFP students’ judgements seem to be, we should be careful when using these judgements to inform our practice.
- The highest value of learning needs self-assessment is probably for the students themselves.
6. Practical suggestions
- Bear in mind that learning needs self-assessment is difficult, especially for IFP students.
- When interpreting IFP students’ needs self-assessment, remember that it can be influenced by:
  - cultural factors
  - academic ability level (good vs. weak students)
  - area of study (science vs. other)
  - assessment context (formative/summative; high/low stakes)
- Students become more accurate with training and practice, so provide them with opportunities to develop this skill.
- Close the feedback loop with staff and students, for example:
  - as part of regular assessment feedback
  - via websites, VLEs and notice boards
  - at programme committee meetings
  - through informal reporting by staff in class
- Whenever possible, combine self-assessment with other sources of information about your students’ learning needs.
- Take advantage of the wide range of data collection tools available.
- Methodological tip: ask students to make self-assessment judgements using the same scales as the ‘objective’ performance measures you will use for comparison. (This has its own problems, though.)
References
- Boud, David, and Falchikov, Nancy. 1989. Quantitative studies of student self-assessment in higher education: a critical analysis of findings. Higher Education 18: 529-549.
- Chen, Yuh-Mei. 2008. Learning to self-assess oral performance in English: a longitudinal case study. Language Teaching Research 12: 235-262.
- Dunning, David, Heath, Chip, and Suls, Jerry M. 2004. Flawed self-assessment: implications for health, education and the workplace. Psychological Science in the Public Interest 5: 31-54.
- Dunning, David, Johnson, Kerri, Ehrlinger, Joyce, and Kruger, Justin. 2003. Why people fail to recognize their own incompetence. Current Directions in Psychological Science 12: 83-87.
- Falchikov, Nancy, and Boud, David. 1989. Student self-assessment in higher education: a meta-analysis. Review of Educational Research 59: 395-430.
- Grant, Janet. 2002. Learning needs assessment: assessing the need. British Medical Journal 324: 156-159.
- Hacker, Douglas J, Bol, Linda, Horgan, Dianne D, and Rakow, Ernest A. 2000. Test prediction and performance in a classroom context. Journal of Educational Psychology 92: 160-170.
- Hughes, Nicola T. 2009. Attendance as a measure of student motivation and engagement. InForm 3: 7-8.
- McDonald, Betty. 2010. Improving learning through meta assessment. Active Learning in Higher Education 11: 119-129.
- Olaussen, Bodil S, and Bråten, Ivar. 1999. Students' use of strategies for self-regulated learning: cross-cultural perspectives. Scandinavian Journal of Educational Research 43: 409-432.
- Powers, Donald E. 2002. Self-assessment of reasoning skills. ETS Research Report RR-02-22. Princeton, NJ: Educational Testing Service (ETS).
- Purdie, Nola, and Hattie, John. 1996. Cultural differences in the use of strategies for self-regulated learning. American Educational Research Journal 33: 845-871.
- Ross, Stephen John. 1998. Self-assessment in second language testing: a meta-analysis and analysis of experiential factors. Language Testing 15: 1-20.