
ACB4 tec pre - p4 - presenting a technical paper


  1. Enabling Effective Synoptic Assessment via Algorithmic Constitution of Review Panels. Eng. Nisansa Dilushan de Silva, Dr. Shahani Markus Weerawarana, Dr. Amal Shehan Perera. Presented by: Nisansa Dilushan de Silva, Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka. IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE 2013)
  2. Outline: • Introduction • Challenges in creating assessment panels • Proposed solution • Results • Discussion • Conclusion • Future work and recommendation
  3. Introduction - Software Engineering Project (CS 3202) [2]: • 5th semester • Creativity & software engineering rigor • Precedes the industrial training (6th semester) • Comparable to a mini capstone project • Course design straddles several program ILOs [3]. [2] Weerawarana, S. M., Perera, A. S., & Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka. IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012. [3] Anderson, L. W. & Krathwohl, D. R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
  4. Software Engineering Project - Challenges: • A large number of students (n=101) • Assessment of the end-of-semester project demonstrations from both a technical and a creative standpoint • Needs a synoptic assessment methodology so that it "enables students to integrate their experiences, providing them with important opportunities to demonstrate their creativity" [4]. [4] Jackson, N. (2003). Nurturing Creativity through an Imaginative Curriculum. Imaginative Curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
  5. Software Engineering Project - Challenges: • Student projects cover a large spectrum of technologies. How can all of them be evaluated fairly? • Is it enough to use university academic staff only? [2] • Or should external evaluators from industry be involved, each an expert in some of the technologies used by students? • We went with the second approach, agreeing with Elton that assessment of creative work should be 'viewed in light of the work', highlighting as important "the ability of experts to assess work in their own field of expertise" and "the willingness to employ judgment" [5]. [2] Weerawarana, S. M., Perera, A. S., & Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka. IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012. [5] Elton, L. (2005). Designing Assessment for Creativity: An Imaginative Curriculum Guide. York: Higher Education Academy (in offline archive).
  6. Software Engineering Project - Challenges: • Is one evaluator, matched according to technological expertise, enough to fairly evaluate a student project? • Evaluation of creative projects is subjective, so a single evaluator's judgement might not be fair • We decided to use panels of evaluators, in accordance with Balchin's suggestion that "consensual assessment by several judges" can be used to enhance the reliability of subjective evaluation [6]. [6] Balchin, T. (2006). Evaluating creativity through consensual assessment. In N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds), Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
  7. Creating Evaluation Panels - Challenges: • Mapping individual evaluators to student projects according to their expertise in technology was a fairly easy task…
  8. Creating Evaluation Panels - Challenges: • But with panels it is not so easy to assign the 'best fit' panel of evaluators, comprising external industry experts and internal faculty members, considering the technologies used in each student's project.
  9. Creating Evaluation Panels - Challenges: • Multiple and often conflicting evaluator availability constraints • Balance of external versus internal evaluators in the panels • Minimization of the number of panel composition reshuffles • Avoidance of including the internal evaluator who assessed a project's mid-semester demonstration in that project's end-semester evaluation panel • Prevention of internal evaluators who mentored specific projects being included in the end-semester evaluation panel for the same projects
  10. Solution: Introduce an algorithm to automate the composition and scheduling process for the synoptic assessment panels.
  11. Input: Student technology requests (Matrix A). Total number of student requests = 6.
  12. Input: Evaluator technology expertise (Matrix B).
  13. More inputs: internal evaluator conflict data (Matrix C) - was a mentor, or was the mid-semester evaluator.
                      I. Evaluator 1   I. Evaluator 2   ...   I. Evaluator 7
      Student 1             1                0          ...         0
      Student 2             0                0          ...         1
      ...                  ...              ...         ...        ...
      Student 101           1                1          ...         0
  14. More inputs: evaluator availability (Matrix D) - some are available throughout the day, some arrive late, some plan to leave early.
                      Time slot 1   Time slot 2   ...   Time slot 24
      I. Evaluator 1        1             1       ...        0
      I. Evaluator 2        1             1       ...        1
      ...                  ...           ...      ...       ...
      I. Evaluator 7        1             1       ...        0
      E. Evaluator 1        1             1       ...        1
      E. Evaluator 2        0             0       ...        1
      ...                  ...           ...      ...       ...
      E. Evaluator 9        0             1       ...        0
  15. More inputs: other constraints are in place to ensure that a fair evaluation takes place [2]: • Grading calibration • Minimum number of evaluators in a panel • At least one internal evaluator in a panel. [2] Weerawarana, S. M., Perera, A. S., & Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka. IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
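
The inputs above map naturally onto simple arrays. The following is a minimal sketch, not the authors' code: it uses NumPy, the variable names are illustrative, and the random fill is placeholder data; only the shapes follow the slides.

```python
# Minimal sketch of the four inputs as 0/1 NumPy arrays.
# Shapes follow the slides; the random fill is placeholder data for illustration.
import numpy as np

n_students, n_techs = 101, 18              # students and technologies (from the slides)
n_internal, n_external, n_slots = 7, 9, 24
n_evaluators = n_internal + n_external

rng = np.random.default_rng(0)
A = rng.integers(0, 2, (n_students, n_techs))      # Matrix A: student technology requests
B = rng.integers(0, 2, (n_evaluators, n_techs))    # Matrix B: evaluator technology expertise
C = rng.integers(0, 2, (n_students, n_internal))   # Matrix C: internal-evaluator conflicts
D = rng.integers(0, 2, (n_evaluators, n_slots))    # Matrix D: evaluator availability per time slot
```
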
  16. Algorithm: Rough outline. • Create panel slots according to the evaluator availability.
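
The slides do not detail how panel slots are formed; one plausible reading, continuing the sketch above, is to group for each time slot the evaluators who are free in it according to Matrix D:

```python
# For each time slot, the indices of the evaluators available in it (from Matrix D).
# Candidate panel slots can then be drawn from these per-slot evaluator pools.
available_per_slot = {t: np.flatnonzero(D[:, t]) for t in range(n_slots)}
```
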
  17. Algorithm: Rough outline. • Calculate the student technology request feasibility.
                      E. Evaluator 1   E. Evaluator 2   ...   E. Evaluator 9
      Student 1             0                1          ...         0
      Student 2             1                2          ...         1
      ...                  ...              ...         ...        ...
      Student 101           1                1          ...         1
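
The exact feasibility measure is not spelled out on the slide. One reading consistent with the integer entries shown (e.g. the value 2 for Student 2) is that each cell counts how many of a student's requested technologies a given evaluator covers; continuing the sketch above:

```python
# Feasibility of student requests against evaluator expertise (assumed definition):
# feasibility[s, e] = number of technologies requested by student s that evaluator e covers.
feasibility = A @ B.T                      # shape: (n_students, n_evaluators)

# A request is feasible only if at least one evaluator covers that technology.
covered_techs = B.any(axis=0)              # per technology: is any expert available?
feasible_request_count = int(A[:, covered_techs].sum())
```
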
  18. Algorithm: Rough outline. • Create a set of panels with a randomization algorithm • Assign panel slots to the created panels • Apply internal evaluator constraints • Calculate the merit value for each panel slot against each project (Matrix F).
                      Panel-slot 1   Panel-slot 2   ...   Panel-slot 108
      Student 1            20              4        ...         0
      Student 2             9             32        ...        65
      ...                  ...            ...       ...        ...
      Student 101           0             19        ...        16
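
The precise merit function is not given either; the sketch below assumes the merit of placing a project in a panel slot is the number of the student's requested technologies that at least one evaluator in that panel covers, and it builds random placeholder panels only to show the shape of Matrix F:

```python
# Merit value of each project against each panel slot (Matrix F), under the
# assumption stated above; the random panels stand in for the randomised seed panels.
n_panel_slots = 108
panels = [rng.choice(n_evaluators, size=3, replace=False) for _ in range(n_panel_slots)]

def merit(requests, panel_members, expertise):
    covered = expertise[panel_members].any(axis=0)        # technologies covered by the panel
    return int((requests.astype(bool) & covered).sum())   # requested technologies that are covered

F = np.array([[merit(A[s], p, B) for p in panels] for s in range(n_students)])
```
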
  19. Algorithm: Rough outline. • Create the inverted value matrix (Matrix G) • Run the well-known combinatorial optimization method, the Hungarian algorithm [7], on the above data (this results in the Boolean assignment matrix H). [7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem. Naval Research Logistics Quarterly, 2:83–97.
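
SciPy's `scipy.optimize.linear_sum_assignment` solves the same assignment problem the Hungarian algorithm addresses and accepts the rectangular 101 × 108 case; a sketch of this step under the assumptions above (the authors' own implementation may of course differ):

```python
# Assignment step: invert the merit values (Matrix G) so that the solver, which
# minimises cost, effectively maximises total merit, then derive the Boolean
# assignment matrix H from the optimal project -> panel-slot pairing.
from scipy.optimize import linear_sum_assignment

G = F.max() - F                            # inverted value matrix (Matrix G)
rows, cols = linear_sum_assignment(G)      # optimal pairing of projects to panel slots
H = np.zeros(F.shape, dtype=bool)          # Boolean assignment matrix (Matrix H)
H[rows, cols] = True
total_value = int(F[rows, cols].sum())     # total merit of this schedule
```

Newer SciPy versions can also skip the explicit inversion via `linear_sum_assignment(F, maximize=True)`.
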
  20. Algorithm: Removing the randomness. • The underlying randomness of the base 'seed' evaluation panels fed into the core algorithm might introduce a certain degree of unfairness into the system. • To fix this, we introduced a second layer of indirection placed over the core algorithm. • The result was therefore a pool of panel assignment schedules instead of a single schedule, the best of which was selected as the winning schedule.
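
A sketch of that outer layer under the same assumptions as before: rebuild the randomised seed panels once per pool member, solve each, and keep the schedule with the highest total value. `build_panels` and `build_merit_matrix` are hypothetical helpers standing in for the steps sketched earlier, not the authors' API.

```python
# Outer layer: generate a pool of candidate schedules from different random
# seed panels and keep the one with the highest total assignment value.
def best_schedule(pool_size, build_panels, build_merit_matrix):
    best = None
    for seed in range(pool_size):
        panels = build_panels(np.random.default_rng(seed))   # random seed panels for this member
        F = build_merit_matrix(panels)                        # Matrix F for this candidate
        rows, cols = linear_sum_assignment(F, maximize=True)
        value = int(F[rows, cols].sum())
        if best is None or value > best[0]:
            best = (value, panels, rows, cols)
    return best
```
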
  21. Algorithm: Removing the randomness. [Figure: Algorithm pool results (left: pool size = 50, right: pool size = 2000); x-axis: epochs, y-axis: total value of assignment schema.]
  22. Results: • The algorithm was able to match 120 of the 141 feasible requests, an 85.12% success rate. • The average proportion of requests satisfied per student was 92.1% among the 83 students whose requests constituted the 141 feasible requests. • The average proportion of requests satisfied per technology was 71.69% across the 18 technologies.
  23. Discussion: • The advantage of automating the panel composition process became evident when some external evaluators made sudden changes to their time constraints, including cancellation of their commitment mere hours prior to the commencement of the project demonstrations. • In this critical situation the algorithm facilitated rapid recalculation, producing an alternate optimal schedule.
  24. Conclusion: This approach can be considered a major improvement over the manual assignment of panels for synoptic assessment.
  25. Future work & Recommendation: Future work is to implement an online application; our recommendation is that other educators could use such an application for a similar purpose.
  26. References
      [1] Weerawarana, S. M., Perera, A. S., & Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka (TALE 2012 presentation). IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
      [2] Weerawarana, S. M., Perera, A. S., & Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka. IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
      [3] Anderson, L. W. & Krathwohl, D. R. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
      [4] Jackson, N. (2003). Nurturing Creativity through an Imaginative Curriculum. Imaginative Curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
      [5] Elton, L. (2005). Designing Assessment for Creativity: An Imaginative Curriculum Guide. York: Higher Education Academy (in offline archive).
      [6] Balchin, T. (2006). Evaluating creativity through consensual assessment. In N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds), Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
      [7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem. Naval Research Logistics Quarterly, 2:83–97.
