Selecting And Training Students For The External Review Panels

Selecting And Training Students For The External Review Panels: The Romanian Experience. Presented at 3rd Quality Assurance Forum in Budapest

  • behaved .... ANOSR contributed to the methodology ...
  • Based on the description of his/her roles and attributions, an ideal profile of the student evaluator was developed. The profile was built around four areas of characteristics: (1) Knowledge, (2) Skills, (3) Personal values and (4) Motivation.
  • The overall principle of the training: the interaction of students with QA experts and ANOSR’s senior representatives should offer a better understanding of quality assurance. Speakers: Romanian experts in QA.
  • patronizing, telling them what to do and not to do
  • planning and organizing the visit, managing time, tracking indicators, interviewing, taking notes, writing the report.
  • visit planning, taking notes, interviewing techniques, time management, working in an evaluation team, writing a report; simulation of a meeting with students; a full-day simulation including a real visit to a university and a meeting with a professor
  • (sparqs, German students’ accreditation pool, Flemish system of panel selection, Nordic system?)

    1. Traian Brumă and Violeta Caragea, University of Bucharest: SELECTING AND TRAINING STUDENTS FOR THE EXTERNAL REVIEW PANELS: THE ROMANIAN EXPERIENCE
    2. GREVA 2003
    4. SBU 2
    6. Students trying to CHANGE universities
    7. Students trying to IMPROVE universities
    8. ARACIS starts in 2006
    10. ANOSR was entrusted with selecting, training and nominating students for evaluation teams.
    11. Timeline:
      • March 2007: ANOSR’s Methodology.
      • April–May 2007: training sessions for 64 students.
      • May–June 2007: 11 pilot institutional evaluations involving 13 student evaluators.
      • August–October 2007: evaluating the impact of student participation in the review teams.
      • February–March 2008: rethinking the student evaluators’ training.
      • April 2008: 27 more students were trained.
      • June–July 2008: 6 more institutional evaluations.
    12. Our aim is to ...
    13. Continue the European conversation
    14. Selection
    15. Job description:
      • analyzing the self-evaluation report and other documents,
      • preparing the site visit,
      • participating in different visit activities (interviews, visiting academic infrastructure, etc.),
      • writing the external evaluation report.
    16. Job description -> profile: (1) Knowledge, (2) Skills, (3) Personal values and (4) Motivation.
    17. Profile -> selection instruments:
      • a letter of intent,
      • an essay,
      • a curriculum vitae,
      • plus an assessment grid (scoring 1 to 100).
    18. Nomination:
      • main criterion: the score (more than 50 points),
      • participation in training: mandatory,
      • additional criteria (e.g. having studied at an institution with a similar profile to the one being evaluated).
    19. Changes:
      • exclude the criteria related to personal values;
      • the essay was replaced by an interview conducted after the training;
      • a peer evaluation exercise was introduced.
    20. Training
    21. Learning outcomes for 2007:
      • understanding the global context of higher education and the role of QA;
      • understanding the role of QA in the Romanian context and the challenges ahead;
      • operating with basic concepts related to QA;
      • understanding the emerging principles of the QA process in Romania;
      • being able to work with the basic documents regarding QA;
      • understanding the students’ role in the institutional evaluation reviewers’ team.
    22. Training delivery:
      • facilitated by two students from ANOSR;
      • seven presentations delivered by guest speakers.
    23. Feedback: participants’ feedback was very positive. Some suggestions for improvement:
      • better time management,
      • more practical activities.
    24. Assessing the impact of the training program:
      • interviews with student evaluators,
      • evaluation of students’ reports, and
      • the report of the independent audit of the institutional evaluations’ pilot phase.
    25. Results (+):
      • students raised a significant number of (negative) issues;
      • students’ reports contained specific recommendations, grounded in the reality of university life;
      • their involvement was seen as mature, pragmatic and competent.
    26. Results (-):
      • a need for improvements regarding the site visit organization;
      • there is little awareness of the students’ role;
      • students’ reports did not sufficiently cover the issues for which their experience and perspective are valuable;
      • in some cases students’ interventions during the site visit had an imperative tone, lacked strong arguments, or tendentiously generalized situations encountered in one department to the whole university;
      • a guide for writing the report would have been necessary.
    27. Rethinking the training, with input consisting of:
      • results of the feedback meeting of the 2007 training coordinating team;
      • student interviews;
      • an independent evaluation report of the pilot phase.
    28. Conclusions:
      • the training for students was insufficient;
      • the major gap was the lack of evaluation skills.
    29. Changes in training:
      • the learning outcomes’ focus shifted to practical evaluation skills;
      • training delivery: role plays, simulations.
    30. To-do list (1):
      • better promotion to students of QA and of the possibility to get involved;
      • connecting the training program for external reviewers with the programs for students involved in internal QA;
      • joint training for students and other experts;
      • developing additional modules for the initial training program;
      • building an online platform for the student evaluators’ community.
    31. To-do list (2):
      • blended learning;
      • further improving the training methods;
      • improving the documentation (describing the role of students in review teams, developing a handbook for student evaluators);
      • inviting foreign student evaluators as guest speakers;
      • finding a dynamic balance between refreshing the pool and building on the experience;
      • recognition of ECTS credits awarded by all Romanian universities.
    32. Questions:
      • Is it worth extending student involvement to study programme evaluations? Do we have the capacity to do it?
      • How should we design the "pool" / "community" / "system" so that it is refreshed every year while building on the expertise and experience generated each year?
      • How many reviews should a student participate in?
      • How can we improve the integration of the student into the reviewers’ team?
      • Could we have a paid secretariat (project manager + 1 staff member)? Should it be run by ARACIS or by the NUS?
    33. Contact:
      • Violeta Caragea: [email_address]
      • Traian Bruma: [email_address]
