Use of online quizzes to support inquiry-based learning in chemical engineering



Online quizzes have been developed to help prepare first-year undergraduate Chemical Engineering students for participating in group-based assignments carried out in an inquiry-based learning (IBL) format. These online quizzes, based within WebCT Vista, allow the students to test their understanding of the fundamental chemical process principles required for the assignments before they participate in the IBL activity. Currently, the class size is about 70 students, so it is important to develop the students' ability to carry out independent and self-directed learning to acquire these core skills. Using these online quizzes, the students are able to self-assess their strengths and weaknesses in core chemical engineering principles and practice so that they come to the IBL group work better prepared.
The effectiveness of the online quizzes has been evaluated using a triangulation approach incorporating a student questionnaire, a student focus group and an interview with the project leaders. Preliminary analysis of the results suggests that the students have found the online quizzes beneficial for developing their core skills in chemical process principles. The presentation will provide: a showcase for the online quizzes created; feedback from the first cohort of students to use the resources; and lessons learned and future developments.


1. Use of online quizzes to support inquiry-based learning in chemical engineering
   Dr Diane Rossiter, CPE; Dr Catherine Biggs, CPE; Bob Petrulis, CiLASS
2. Contents
   - Why? – Background to implementation
   - How? – Creating online quizzes
   - Was it worth it?
     - A student's perspective
     - Our perspective
   - Lessons learnt and future developments
3. Why? Background to implementation
   - Course aims – CPE1002
   - Inquiry-based learning format
   - Need for online quizzes – CiLASS project
4. Course Aims
   - CPE1002 aims to provide an introduction to the principles of chemical engineering through discussion of the chemical industry and the development and application of material balances over a range of equipment and processes.
   - CPE1002 is a core subject in Year 1.
5. Inquiry-based learning
   - First implemented in 2005/2006
     - Students were not seeing the connection between core technical skills and chemical engineering practice
     - Opportunity for change – Associate Prof Paul Lant (PAL) (University of Queensland)
     - Student numbers increasing by a factor of 2.5 (28 in 2003/04 to 70 in 2007/08)
6. What we tried to teach them
   - Core technical skills
     - Material balances, units, system boundaries etc.
     - Dealing with uncertainty
   - Personal skills
     - Working and communicating in a group
     - Independent and self-directed learning
   - Transferable skills
     - Technical reporting
     - Presentations
     - Communication
7. How we tried to teach them – a major shift of focus
   - Problem-based tutorials (2 hours per week)
     - Group assignments with real, authentic data; not always one neat solution
   - Lectures/keynotes (1 hour per week)
     - Overview and introduction
     - Directed learning
   - Homework sheets and paper-based quiz
     - Independent study and reinforcement
8. Typical format of a tutorial
   - Feedback on homework – small class; 4 staff using flipcharts located in different corners
   - IBL activity – working in groups, typically preparing an OHP with their proposed solution
   - Feedback on IBL activity – whole class
   - Homework/assignment – study time, queries
9. Why introduce online quizzes?
   - Student perspective
     - Independent self-assessment of core skills – identify strengths and weaknesses
     - Increase students' confidence to actively participate in (multi-cultural) group work
     - Instant feedback and directed study
10. Why introduce online quizzes?
   - Our perspective
     - Reduce staff marking load
     - Re-focus the scaffold of teaching: from "sage on the stage" to "guide on the side"
     - Opportunity for implementation – DR's experience in online courses
   CiLASS project: CiLASS is the Centre of Excellence in Inquiry-based Learning in the Arts and Social Sciences
11. Contents
   - Why? – Background to implementation
   - How? – Creating online quizzes
   - Was it worth it?
     - A student's perspective
     - Our perspective
   - Lessons learnt and future developments
12. Development of Online Quizzes
   - Computer-aided assessment (CAA) is not new (see HEA resources, e.g. Seale, 2002 – LTSN Generic Centre, for a review)
   - Variety of tools available, from custom-made (e.g. E3AN project Question Buddy, ASTutE) to generic quiz tools embedded within VLEs
   - Our choice – WebCT Vista quiz tool, supported at institutional level with training
   - 9 question styles – calculated, MCQ etc.
13. Creating Quiz Content
   - Mechanism: a 6-step process using Respondus 3.5 as the development tool
   - Reasons for choosing Respondus 3.5:
     - Ease of use of the interface; support available
     - Able to print the databank of questions
     - Use of Word templates for importing questions
     - Publishes to various VLEs, including WebCT Vista
14. Import from MS Word
   - Can import multiple choice, true-false, paragraph, short answer, matching, and multiple response questions from a file.
   - The questions must be organised in a format acceptable to Respondus, and the file must be stored in one of the following formats: plain text (.txt), rich text (.rtf), MS Word (.doc) or tab/comma delimited (.csv).
   - Example – Mixer flowsheet MCQ with figure
15. Figure 1: Word file format for a multiple choice style question

   Title: Mixer Flowsheet
   2) Using the flowsheet for the mixer provided and assuming steady state conditions with no reaction occurring, determine the overall material balance and select the correct answer from the following options:
   @ For steady state conditions and no reaction, the overall material balance becomes: total mass flowrate IN = total mass flowrate OUT
   *a. A + B = P
   @ Correct.
   b. A - B = P
   @ Incorrect, B is an input stream.
   c. A + P = B
   @ Incorrect, P is an output stream.
   d. P + B = A
   @ Incorrect, B is an input stream.

   (This Word import route is not possible for calculated style questions!)
   [Figure: mixer flowsheet with inlet streams A (kg/h) and B (kg/h) and outlet stream P (kg/h)]
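The steady-state balance behind the correct answer can be sketched in a few lines of Python (our own illustration; the function name is hypothetical and not part of the quiz databank):

```python
# Overall steady-state material balance for the mixer, with no reaction and
# no accumulation: total mass flowrate IN = total mass flowrate OUT, so A + B = P.
def mixer_outlet(a_kg_per_h: float, b_kg_per_h: float) -> float:
    """Return the outlet flowrate P (kg/h) from the overall balance A + B = P."""
    return a_kg_per_h + b_kg_per_h

# Example: streams A = 30 kg/h and B = 45 kg/h give P = 75 kg/h.
print(mixer_outlet(30.0, 45.0))  # 75.0
```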
16. For Calculated Questions
   - Need to enter details directly into the Respondus databank.
   - Example: an excess air calculation relating to a chemical reaction taking place
   - Screenshot 1 – what the student sees
   - Screenshot 2 – what the designer sees
   - Generates up to 100 alternatives of the same calculation!
17. Useful features of Calculated Questions
   - Generates up to 100 alternatives of the same calculation, which facilitates repeated use.
   - The answer can be required to a certain number of decimal places or significant figures – a key skill for engineers!
   - Can require units to be included with the answer – another key skill!
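The idea behind a calculated question – one template, many randomised numeric variants, each with an answer required to a set precision – can be sketched as follows. This is our own illustration of the mechanism, not Respondus code; the value ranges are hypothetical:

```python
import random

# Percent excess air = (air supplied - stoichiometric air) / stoichiometric air * 100
def excess_air_percent(air_supplied: float, air_stoich: float) -> float:
    return (air_supplied - air_stoich) / air_stoich * 100.0

def make_variants(n: int, seed: int = 0):
    """Generate n variants of the same excess-air problem with randomised
    flowrates; the answer is rounded to 1 decimal place, as a CAA tool
    might require of the student."""
    rng = random.Random(seed)
    variants = []
    for _ in range(n):
        stoich = round(rng.uniform(50.0, 150.0), 1)           # kmol/h (hypothetical range)
        supplied = round(stoich * rng.uniform(1.05, 1.5), 1)  # 5-50 % excess air
        variants.append((supplied, stoich, round(excess_air_percent(supplied, stoich), 1)))
    return variants

# e.g. up to 100 alternatives of the same calculation:
for supplied, stoich, answer in make_variants(3):
    print(f"Air supplied: {supplied} kmol/h; stoichiometric: {stoich} kmol/h; "
          f"answer: {answer} % excess air")
```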
18. Other question style – matching
19. Our Implementation
   - A databank of 78 online quiz questions relating to five core chemical principles topics was developed:
     - unit conversions using the unity brackets approach (26);
     - mass to mole conversions (8);
     - calculations and definitions relating to material balances (13);
     - material balance calculations without reactions (20); and
     - material balances with reactions (11).
   - Question types included calculated, multiple choice, short answer, matching pairs, fill-in-the-blanks and true-false.
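The unity brackets approach treats each conversion factor as a ratio equal to one, so multiplying by a chain of such brackets changes the units without changing the quantity. A minimal Python sketch of one such conversion (our own illustration, not one of the databank questions):

```python
# Convert km/h to m/s by chaining unity brackets:
#   v [km/h] * [1000 m / 1 km] * [1 h / 3600 s]
# Each bracketed ratio equals one, so only the units change.
def km_per_h_to_m_per_s(v_km_h: float) -> float:
    return v_km_h * 1000.0 / 3600.0

print(km_per_h_to_m_per_s(90.0))  # 25.0
```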
20. Key steps
   Step 1. Brainstorm core concepts and relevant questions (based on what students found hard in the past)
   Step 2. Type questions into MS Word (e.g. MCQ, true/false)
   Step 4. Type questions directly into Respondus v3.5 (e.g. calculated)
   Steps 3 & 5. Combine in Respondus and preview
   Step 6. Publish in WebCT and configure settings
21. Implementation into CPE1002
   - Available to students in self-test mode within WebCT Vista (2008); published to coincide with the relevant keynote lectures.
   - Feedback on individual questions directed students to the lecture notes and textbook.
   - The same databank of questions was also used for a WebCT Vista quiz (worth 10%):
     - In Week 9, this online test was made available with a time limit of 1 hour.
     - The online test was released for 7 days; at the same time the self-test quizzes were made unavailable.
     - Out of 70 students, 63 took the online test.
     - The average score for the online test was 8.3% (out of 10%). Low marks for some students were due to not having accessed the self-test quizzes earlier, and to being caught out by the exactness of the responses CAA requires.
22. Contents
   - Why? – Background to implementation
   - How? – Creating online quizzes
   - Was it worth it?
     - A student's perspective
     - Our perspective
   - Lessons learnt and future developments
23. Was it worth it?
   - Student questionnaires
   - Student focus group
   - Project leaders' interview
24. Student questionnaire results [1]
   To what extent has your use of the online self-test quizzes helped you ...
   Q9. ... to develop the core skills required for IBL activities?
     - Very much / quite a bit: 50%
     - Some: 31.5%
     - Very little: 7.4%
   (A further 1.8% didn't know, 3.7% didn't use the quizzes and 5.6% gave no response.)
   In total, 81.5% said the quizzes had helped at least to some extent in developing core skills.
   [1] End of Semester 1 feedback by Year 1 students, 15th Feb 2008; 54 responses out of 69.
25. Student focus group [2]
   How about the online quizzes? What worked?
   - "They were beneficial because of the immediate feedback. With the homework, it took a week or so. The online quizzes also referred you to the book and page for more information. If you just get a grade, that's meaningless."
   - "Because you could access the quiz more than once, that made it really helpful for revision."
   Did everyone else do the quizzes and the homeworks as well? [Yes.] Did you find that one was more useful than the other?
   - "The homeworks made us work hard, but the quizzes really helped us learn how to do the homeworks."
   - "I tried the quizzes a couple times, and they weren't very hard. The homeworks were more challenging, and you could talk to the teachers about them. That felt so much more helpful."
   A couple of you said you didn't use the quizzes at all. Could you say why?
   [2] Eight students; interviewer R. Petrulis (CiLASS), 3rd March 2008.
26. Project leaders' interview [3]
   BP: There were two responses (at the focus group) that I got when I asked about the online quizzes. One was that the students found them helpful for learning the material and for revision. Another, smaller group said they found them to be too basic.
   CB: That was quite interesting.
   BP: Those students said they preferred to do the homework.
   CB: The point of the quizzes was to help those who needed the basics; not to challenge those who needed to be challenged, because the homework assignments were there to do that. I think this shows that we were right to set this up in the first place.
   DR: It reinforces that you could describe something in three different ways and it would have meaning to different people. It's not that the quizzes, assignments and homework cover different things – they don't. They make the material accessible to different types of learners. I think that was highlighted by what you got in the focus group.
   [3] Interviewer – R. Petrulis (CiLASS); interviewees – D. Rossiter and C. Biggs, 20 March 2008.
27. Contents
   - Why? – Background to implementation
   - How? – Showcase of online quizzes
   - Was it worth it?
     - A student's perspective
     - Our perspective
   - Lessons learnt, conclusions and future developments
28. Lessons Learnt
   - Working as part of a team when creating quizzes is much better for beta testing, for being creative in question development, and for support.
   - The first 3 levels of Bloom's taxonomy of learning can be achieved with careful choice of question styles (see Cullen and Fielding, LTU at MMU, "MCQs – testing higher order skills", 2007):
     - Level 1, Knowledge reproduction – true/false, matching pairs, fill-in-the-blanks
     - Level 2, Comprehension – MCQ
     - Level 3, Application – calculated (invaluable for engineering)
   - Automated marking in WebCT Vista doesn't allow "marks for method", so other assessment strategies are needed for testing higher-order skills in engineering.
29. Lessons Learnt (2)
   - Bloom's Levels 4 & 5, Analysis and Evaluation: homeworks (formative), IBL assignments (40%) and final exam questions (50%) test the students' higher-order levels of learning.
   - Project time was offset by the time saved on marking homework and the paper-based test. Benefits are likely to be more obvious in future years.
   - Handing control of their learning to the students is effective for them and for us!
30. Conclusions
   - The majority of students were helped in developing their core skills by using the self-test quizzes.
   - Staff also benefited from developing their own CAA skills and from the change of role!
   - Some students noted in their reflective statements for the 2nd IBL assignment that they accessed the quizzes repeatedly to help them solve the authentic IBL problems:
   - "The online quizzes were really helpful because whenever I got stuck, I refer back to them and it never failed!"
31. Future Developments
   - Develop the databank of questions further.
   - Disseminate the quiz development mechanism within our own department and engage colleagues in discussion about roll-out to other courses, if appropriate.
32. Acknowledgements
   - Paul Lant, University of Queensland
   - CiLASS