  • Good evening. I'll be explaining the research conducted for "Enabling Effective Synoptic Assessment via Algorithmic Constitution of Review Panels" by the Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka.
  • The outline of the presentation is as follows.
  • First of all, let me give you a brief introduction to the University of Moratuwa.
  • The Department of Computer Science and Engineering is one of the seven departments in the Faculty of Engineering. The department offers the Honours Bachelor of Science degree in Engineering.
  • In the 5th semester of their studies, the students of the department have to complete a compulsory module called the Software Engineering Project. The module acts as a mini Capstone project for the students, who will undergo their industrial training in the following semester. Thus, the course straddles several program Intended Learning Outcomes. The students are expected to complete a project that enhances their creativity and software engineering rigor.
  • In handling this module, as lecturers we face a number of challenges. The first is the sheer number of student projects: since this is a compulsory module, each year we have roughly 100 student projects to grade. In the particular year in which the described algorithm was introduced, there were 101 students. Secondly, since the projects are supposed to be creative, they need to be evaluated from a creative standpoint as well as a technical standpoint. This calls for a synoptic assessment methodology. "Synoptic assessment encourages students to combine elements of their learning from different parts of a programme and to show their accumulated knowledge and understanding of a topic or subject area. A synoptic assessment normally enables students to show their ability to integrate and apply their skills, knowledge and understanding with breadth and depth in the subject. It can help to test a student's capability of applying the knowledge and understanding gained in one part of a programme to increase their understanding in other parts of the programme, or across the programme as a whole. Synoptic assessment can be part of other forms of assessment."
  • As explained by prior research on the same course, it is not enough just to have the internal academic staff grading these projects. We decided to involve external experts for the specific technologies that the students have used.
  • The next question is whether one evaluator, matched according to technological expertise, is enough to fairly evaluate a student project. As you might have already guessed, in cases such as this where creativity is involved, anyone's evaluation, be it by an expert in a given technology or not, is going to be subjective. Thus, this might not give us the fair judgment that we strived for. Again referring to prior research on creative assessment, we decided to involve panels of evaluators so that they can perform "consensual assessment".
  • Say we have a set of student projects and a set of evaluators, with each project using a set of technologies and each evaluator being an expert in a set of technologies. We can come up with a mapping relatively easily.
  • But when we have panels, it is not that easy to find the best-fit panel.
  • In creating evaluation panels, we face more challenges than just mapping the best-fit panels to projects.
  • Total number of student requests (r=244)
  • To handle the additional requirements explained previously, we have a few more inputs.
  • Some technologies might not have an expert. Thus, the total number of satisfiable student requests = 141. Our objective was to come up with the optimum panel allocation so that the maximum number of these satisfiable requests is fulfilled.
  • The reason for taking the product over the internal evaluators and the summation over the external evaluators is that the internal evaluator matrix (C) is essentially a binary matrix recording the existence or non-existence of constraints and conflicts, whereas the external evaluator matrix (E) is an integer matrix whose elements are the relevance factor of each external evaluator for each project. The damping weight w_2 prevents a single panel from gaining an above-average advantage over the others within the same time-slot category: it is initialised to a value proportional to the size of the relevant time-slot category and is reduced each time a given panel is reused. The constant weight w_1 ensures that the calculated gross merit value (V_{pro,g-pan}) is not completely subdued by the damping value w_2; it is set so that w_1 times the gross merit value exceeds the maximum possible w_2, even at the minimum attainable gross merit value. (A sketch of this merit computation appears after these notes.)
  • This “future work” has actually now started.
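
The deck names the ingredients of the merit value but not its exact closed form, so the following is a minimal sketch under stated assumptions: the gross merit of a panel for a project multiplies a binary internal-conflict term (from Matrix C) by the summed relevance of the panel's external members (from Matrix E), and the net merit is assumed to take the form w1 * gross - w2. The toy matrices, helper names, and the net-merit form are illustrative, not the paper's definition.

```python
import numpy as np

# Toy data (hypothetical). C is binary: C[i, k] = 1 if internal
# evaluator k has a conflict with student i. E is integer-valued:
# E[i, k] = relevance factor of external evaluator k for student i.
C = np.array([[1, 0],
              [0, 0]])
E = np.array([[0, 1, 2],
              [1, 2, 0]])

def gross_merit(student, internals, externals):
    # One conflicting internal member zeroes the panel's merit,
    # which is why a product (not a sum) is taken over internals.
    internal_term = np.prod([1 - C[student, k] for k in internals])
    # External relevance factors accumulate, hence the summation.
    external_term = sum(E[student, k] for k in externals)
    return internal_term * external_term

def net_merit(gross, w1, w2):
    # Assumed form: w1 scales the gross merit so it cannot be
    # subdued by the damping weight w2, which starts proportional
    # to the time-slot category size and shrinks on panel reuse.
    return w1 * gross - w2

print(gross_merit(0, internals=[1], externals=[1, 2]))  # -> 3
print(net_merit(3, w1=10, w2=4))                        # -> 26
```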

    1. Eng. Nisansa Dilushan de Silva, Dr. Shahani Markus Weerawarana, Dr. Amal Shehan Perera. Presented by: Nisansa Dilushan de Silva, Department of Computer Science and Engineering, University of Moratuwa, Sri Lanka. Enabling Effective Synoptic Assessment via Algorithmic Constitution of Review Panels. IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE 2013).
    2. Outline: • Introduction • Challenges in creating assessment panels • Proposed Solution • Results • Discussion • Conclusion • Future work and Recommendation
    3. Introduction - Software Engineering Project (CS 3202) [2]: • 5th semester • Creativity & Software Engineering rigor • Pre-industrial training (6th semester) • Is comparable to a mini Capstone project • Course design straddles several program ILOs [3]. [2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012. [3] Anderson, L. & Krathwohl, D. A. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
    4. Software Engineering Project: Challenges. • A large number of students (n=101) • Assessment of the end-of-semester project demonstrations • Technical standpoint • Creative standpoint • Needs a synoptic assessment methodology so that it "enables students to integrate their experiences, providing them with important opportunities to demonstrate their creativity" [4]. [4] Jackson, N. (2003). Nurturing Creativity through an Imaginative Curriculum. Imaginative Curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
    5. Software Engineering Project: Challenges. • Student projects cover a large spectrum of technologies. How to evaluate all of them fairly? • Is it enough to use university academic staff only? [2] • Involve external evaluators from the industry, each being an expert in some of the technologies used by students? • We went with the second approach, agreeing with Elton that assessment of creative work should be 'viewed in light of the work', highlighting as important aspects "the ability of experts to assess work in their own field of expertise" and "the willingness to employ judgment" [5]. [2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012. [5] Elton, L. (2005). Designing Assessment for Creativity: An Imaginative Curriculum Guide. York: Higher Education Academy (in offline archive).
    6. Software Engineering Project: Challenges. • Is one evaluator matched according to technological expertise enough to fairly evaluate a student project? • Creative projects -> evaluation is subjective • Thus might not be a fair judgement • We decided to use panels of evaluators, in accordance with what Balchin suggests when he states that "consensual assessment by several judges" can be used to enhance the reliability of subjective evaluation [6]. [6] Balchin, T. (2006). Evaluating creativity through consensual assessment, in N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds), Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
    7. Creating Evaluation Panels: Challenges. • Mapping individual evaluators to student projects according to their expertise in technology was a fairly easy task…
    8. Creating Evaluation Panels: Challenges. • But with panels it is not so easy to assign the 'best fit' panel of evaluators, comprising external industry experts and internal faculty members, considering the technologies used in the student's project.
    9. Creating Evaluation Panels: Challenges. • Multiple and often conflicting evaluator availability constraints • Balance of external versus internal evaluators in the panels • Minimization of the number of panel composition reshuffles • Avoidance of the same internal evaluator who assessed the mid-semester project demonstration being included in the end-semester evaluation panel • Prevention of internal evaluators who mentored specific projects being included in the end-semester evaluation panel for the same projects
    10. Solution: Introduce an algorithm to automate the composition and scheduling process for the synoptic assessment panels.
    11. Input: student technology requests (Matrix A). Total number of student requests = 6 (in the example matrix shown on the slide).
    12. Input: evaluator technology expertise (Matrix B).
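As a concrete illustration of the two input matrices, here is a minimal sketch with made-up data; the technology labels and matrix sizes are hypothetical, not the deck's real inputs (which covered 101 students and r = 244 requests).

```python
import numpy as np

technologies = ["Java", "PHP", "MySQL"]  # hypothetical labels

# Matrix A: A[i, j] = 1 if student i requested an evaluator
# for technology j (students x technologies).
A = np.array([[1, 0, 1],   # student 1 used Java and MySQL
              [0, 1, 1]])  # student 2 used PHP and MySQL

# Matrix B: B[k, j] = 1 if evaluator k is an expert
# in technology j (evaluators x technologies).
B = np.array([[1, 0, 0],   # evaluator 1 knows Java
              [0, 1, 1]])  # evaluator 2 knows PHP and MySQL

print("total student requests:", int(A.sum()))  # 4 in this toy case
```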
    13. More inputs: • Internal evaluator conflict data (Matrix C) • Was a mentor • Was the mid-semester evaluator

                      I. Evaluator 1   I. Evaluator 2   ...   I. Evaluator 7
       Student 1            1                0           ...         0
       Student 2            0                0           ...         1
       ...                 ...              ...          ...        ...
       Student 101          1                1           ...         0
    14. More inputs: • Evaluator availability (Matrix D) • Some are available throughout the day • Some arrive late • Some plan to leave early

                      Time slot 1   Time slot 2   ...   Time slot 24
       I. Evaluator 1       1             1        ...        0
       I. Evaluator 2       1             1        ...        1
       ...                 ...           ...       ...       ...
       I. Evaluator 7       1             1        ...        0
       E. Evaluator 1       1             1        ...        1
       E. Evaluator 2       0             0        ...        1
       ...                 ...           ...       ...       ...
       E. Evaluator 9       0             1        ...        0
    15. More inputs: • Other constraints in place to ensure a fair evaluation takes place [2] • Grading calibration • Minimum number of evaluators in a panel • At least one internal evaluator in a panel. [2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
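A minimal sketch of how the conflict matrix (C), the availability matrix (D), and the fairness rules above could be checked for one candidate panel. The thresholds, toy matrices, and the helper function are placeholders; the deck does not give the actual values or code.

```python
import numpy as np

# C[i, k] = 1 if internal evaluator k mentored or mid-semester
# evaluated student i's project (and so must not re-evaluate it).
C = np.array([[1, 0],
              [0, 0]])
# D[k, t] = 1 if evaluator k is available in time slot t
# (rows: evaluators 0-1 internal, 2 external in this toy case).
D = np.array([[1, 1, 0],   # leaves early
              [0, 1, 1],   # arrives late
              [1, 1, 1]])  # available all day

INTERNALS = {0, 1}
MIN_PANEL_SIZE = 2         # placeholder thresholds
MIN_INTERNAL_MEMBERS = 1

def panel_is_admissible(panel, student, slot):
    internals = [k for k in panel if k in INTERNALS]
    if len(panel) < MIN_PANEL_SIZE or len(internals) < MIN_INTERNAL_MEMBERS:
        return False
    if not all(D[k, slot] for k in panel):      # all members free?
        return False
    if any(C[student, k] for k in internals):   # mentor/mid-sem conflict?
        return False
    return True

print(panel_is_admissible([0, 2], student=0, slot=1))  # False: conflict
print(panel_is_admissible([1, 2], student=0, slot=1))  # True
```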
    16. Algorithm: Rough outline. • Create panel slots according to the evaluator availability.
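One plausible reading of this step, sketched below: group the availability matrix D by time slot, so each slot knows which evaluators could staff a panel then. This interpretation is an assumption; the deck does not detail the slot-creation rule.

```python
import numpy as np

D = np.array([[1, 1, 0],   # D[k, t]: evaluator k free in time slot t
              [0, 1, 1],
              [1, 1, 1]])

def evaluators_free_per_slot(D):
    # Map each time slot to the list of evaluators available in it.
    return {t: [k for k in range(D.shape[0]) if D[k, t]]
            for t in range(D.shape[1])}

print(evaluators_free_per_slot(D))
# {0: [0, 2], 1: [0, 1, 2], 2: [1, 2]}
```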
    17. Algorithm: Rough outline. • Calculate the student technology request feasibility

                      E. Evaluator 1   E. Evaluator 2   ...   E. Evaluator 9
       Student 1            0                1           ...         0
       Student 2            1                2           ...         1
       ...                 ...              ...          ...        ...
       Student 101          1                1           ...         1
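A sketch of the feasibility computation under the natural interpretation: a request (student, technology) is satisfiable only if some evaluator knows that technology, and an evaluator's relevance to a student (cf. the integer table above) counts how many of the student's requested technologies that evaluator covers. Toy matrices; the semantics are assumed, not quoted from the deck.

```python
import numpy as np

A = np.array([[1, 0, 1],   # student requests (students x technologies)
              [0, 1, 1]])
B = np.array([[1, 0, 0],   # evaluator expertise (evaluators x technologies)
              [0, 0, 1]])

covered = B.any(axis=0)    # does any evaluator know each technology?
feasible = A * covered     # mask out requests with no available expert
print(int(feasible.sum()), "of", int(A.sum()), "requests are satisfiable")

# Relevance of each evaluator to each student: shared technologies.
relevance = A @ B.T        # students x evaluators, integer-valued
print(relevance)
```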
    18. Algorithm: Rough outline. • Create a set of panels with a randomization algorithm • Assign panel slots to created panels • Apply internal evaluator constraints • Calculate the merit value for each panel slot against each project (Matrix F)

                      Panel-slot 1   Panel-slot 2   ...   Panel-slot 108
       Student 1           20              4         ...         0
       Student 2            9             32         ...        65
       ...                 ...            ...        ...        ...
       Student 101          0             19         ...        16
    19. Algorithm: Rough outline. • Create the inverted value matrix (Matrix G) • Run the well-known combinatorial optimization algorithm known as the Hungarian algorithm [7] on the above data (results in the Boolean Matrix H). [7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem, Naval Research Logistics Quarterly, 2:83–97.
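A minimal sketch of this assignment step using SciPy's implementation of the Hungarian algorithm (scipy.optimize.linear_sum_assignment); the solver minimises cost, hence the inversion of the merit matrix F into Matrix G. The toy F reuses the numbers from the Matrix F excerpt above; the deck's own implementation may differ.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

F = np.array([[20,  4,  0],   # merit values: projects x panel-slots
              [ 9, 32, 65],
              [ 0, 19, 16]])

G = F.max() - F               # inverted value matrix (Matrix G)
rows, cols = linear_sum_assignment(G)   # Hungarian algorithm [7]

H = np.zeros_like(F, dtype=bool)        # Boolean assignment (Matrix H)
H[rows, cols] = True
print("assigned slots:", dict(zip(rows.tolist(), cols.tolist())))
print("total merit:", int(F[rows, cols].sum()))  # 104 for this toy F
```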
    20. Algorithm: Removing the randomness. • The underlying randomness of the base 'seed' evaluation panels fed into the core algorithm might introduce a certain degree of unfairness to the system. • To fix this, we introduced a second layer of indirection placed over the core algorithm. • The result was now a pool of panel assignment schedules instead of a single schedule. • The best of these was selected as the winning schedule.
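A sketch of the derandomisation layer described above: run the seed-dependent core algorithm repeatedly and keep the schedule with the highest total value. `compose_and_assign` is a hypothetical stand-in for the randomized panel creation plus the Hungarian assignment; only the outer selection loop is the point here.

```python
import random

def best_schedule(epochs, compose_and_assign):
    # Each epoch feeds a fresh random seed to the core algorithm;
    # the highest-valued assignment schedule in the pool wins.
    best, best_value = None, float("-inf")
    for epoch in range(epochs):
        rng = random.Random(epoch)
        schedule, value = compose_and_assign(rng)
        if value > best_value:
            best, best_value = schedule, value
    return best, best_value

# Demo with a dummy core that just returns a random schedule value:
demo = lambda rng: ("schedule", rng.uniform(400, 520))
print(best_schedule(2000, demo))
```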
    21. Algorithm: Removing the randomness. [Figure: algorithm pool results (left: pool size = 50; right: pool size = 2000); x-axis: epochs, y-axis: total value of the assignment schema.]
    22. Results. • The algorithm was able to match 120 of the 141 feasible requests, an 85.1% success rate. • The average percentage of requests satisfied per student was 92.1% among the 83 students whose requests constituted the 141 feasible requests. • The average percentage of requests satisfied per technology was 71.69% among the 18 technologies.
    23. Discussion. • The advantage of automating the panel composition process shows when some external evaluators make sudden changes in their time constraints • Cancellation of their commitment mere hours prior to the commencement of the project demonstrations • In this critical situation, the algorithm facilitated a rapid recalculation that produced an alternate optimal schedule.
    24. Conclusion: This approach can be considered a major improvement over the manual assignment of panels for synoptic assessment.
    25. Future work & Recommendation: The future work is to implement an online application. Our recommendation is that other educators could use this application for a similar purpose.
    26. References:
       [1] TALE 2012 Presentation: Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
       [2] Weerawarana, S. M., Perera, A. S., Nanayakkara, V. (2012). Promoting Innovation, Creativity and Engineering Excellence: A Case Study from Sri Lanka, IEEE International Conference on Teaching, Assessment and Learning for Engineering 2012 (TALE 2012), Hong Kong, August 2012.
       [3] Anderson, L. & Krathwohl, D. A. (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.
       [4] Jackson, N. (2003). Nurturing Creativity through an Imaginative Curriculum. Imaginative Curriculum Project, Learning and Teaching Support Network, Higher Education Academy.
       [5] Elton, L. (2005). Designing Assessment for Creativity: An Imaginative Curriculum Guide. York: Higher Education Academy (in offline archive).
       [6] Balchin, T. (2006). Evaluating creativity through consensual assessment, in N. Jackson, M. Oliver, M. Shaw and J. Wisdom (eds), Developing Creativity in Higher Education: Imaginative Curriculum. Abingdon: Routledge.
       [7] Kuhn, H. W. (1955). The Hungarian method for the assignment problem, Naval Research Logistics Quarterly, 2:83–97.