David Maxwell & Lynnette Flynn,
Charles Sturt University
The Test Centre tool was implemented to aid student reflection, to inform the teaching approach, and for subject assessment purposes. The challenges the subject coordinator experienced, initial student attitudes, and suggested areas for improvement will be addressed.
This presentation reviews an initial use of the Test Centre tool by the subject coordinator of an undergraduate distance subject, ‘Advertising Principles’ (Bachelor of Media Communication). This subject requires students to gain an understanding of complex concepts in the advertising process: research, positioning, strategy, creative execution and media selection. There is a need to progressively assess student learning throughout the subject, for a number of reasons: as an aid to students reflecting on and monitoring their learning; so that student progress can inform the teaching approach of the subject; and to obtain evidence of student learning for assessment purposes. This presentation examines the level of knowledge and intuitiveness assumed of the user for a successful test implementation that meets the assessment requirements of the subject. The outcomes of the implementation and students’ attitudes to the Test Centre tool will be explored. The subject coordinator and educational designer will make suggestions for further development and improvement of the tool.
Using the Test Centre Tool: an opportunity to inform learning and teaching
1. Using the Test Centre tool: an opportunity to inform learning and teaching. David Maxwell, School of Communication, Charles Sturt University; Lynnette Flynn, Learning and Teaching Services, Charles Sturt University
4. Decisions… What are the options? What are the timeframes? What are the known issues? Decisions made! Initially 75 questions (5 topics) needed before the mid-semester break, and 75 (5 topics) in week 12
10. Lessons learned: Start loading the questions as early as possible, and beware of formatting issues! Consider the pros and cons of randomisation. Pre-circulate the planned test window dates and times; this appeared to reduce the problem of non-completion. Avoid religious festivals! Investigate the special access function. Consider how many attempts you want to give the students…
13. The future? Make the tool more user-friendly and intuitive (terminology?). Enable result data to be sorted numerically (in order of testing) as well as alphabetically. Enable result data to be re-processed into charts or graphs to assist verification of testing objectives. Provide the ability to export the results as a block into an Excel spreadsheet rather than one by one.
14. “The content definitely helped with my learning of certain advertising concepts, as well as jargon and relationships between agencies and their clients. This was possible because we were able to review the test once we'd taken it, and it continued to be available to us throughout the duration of the course, allowing us to refer back to it whenever we wanted.”
Editor's Notes
Introduction: David Maxwell, lecturer, School of Communication (Advertising & Commercial Radio), teaching via F2F and DE; an avid Interact user. Lynnette Flynn, Manager, Educational Design and Media Team, Arts Faculty, and Educational Designer, School of Communication.
Snapshot of the virtual class. Why did we use Test Centre? The Bachelor of Media Communication is a degree offered by distance education. The degree offers core subjects in technical and creative areas of media communication. Of particular interest to us is the subject Advertising Principles, which introduces students to the structure, application and process of advertising and its role in the marketing communication mix. For the subject this year, autumn semester 2009, there were 30 students enrolled. The aim is for students to become members of a virtual advertising agency to complete a team project, as well as individual assignments and a series of quizzes to determine achievement of learning milestones.
Test as done by internal students. About the subject. David: The subject is based on an internal subject where tests were done on paper and handed in. The learning design required students to be tested on knowledge gained from weeks 2 to 11. The testing was meant to be done weekly; the original plan was to email a set of questions to each of the 35 students in the cohort, have them complete and return the test, which would then be manually marked and the score returned with any desired feedback. Problems: creating the mailing list, gathering returns, grading and feedback, etc. Solution: use the Test Centre, which, once the questions were loaded, could execute the rest of the task without further time-consuming effort from the subject coordinator. There were many unknowns at the time of the decision. Lynn: There are many reasons why you test: as an aid to student reflection and monitoring of their learning, so they can gauge their own progress; to inform the teaching approach of the subject; and to obtain evidence of student learning for assessment purposes.
Decisions. David: What are the options? OASIS, Word docs, email lists. What are the timeframes? What are the known issues? Exporting results. Decisions made: 75 questions in the week before the mid-semester break and 75 in week 12; initially just Topics 1–5, 75 questions.
Table of contents as the student sees it. How did we get to this point? David: Many, many hours of loading questions; examined and interrogated the ‘how to’ manual. Lynn: Many, many hours of loading questions and formatting (more about that in Lessons Learned). First section completed! Students, go, go, go!
Announcement of student directions/instructions. You need to set up the use of the tool, and announce the how/when/what to get the students in there. They had had lots of announcements to say it was coming, but you need to judge the right amount of information: not too little and not too much.
Screen shot of an individual question, as the student sees it
Screen shot of the grade submissions screen. Success? Not yet. Students (well, most of them) completed successfully, but that is only half the story. Picture this: the output is not intuitive (explained on the next screen); it is not possible to export the grades to the subject grade sheet (a known problem before using the tool); religious festivals fell within the test window; and further access had to be granted for missing and/or late students.
Screen shot of the Summary of Results page. The output is not intuitive. Although the questions were loaded in order and not randomised, the answers came back scrambled (sorted alphabetically rather than numerically), which made it laborious to put the questions back into their categories so that the pattern of learning could be verified, which was the prime objective of using the test.
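The alphabetical-versus-numerical problem arises because a label such as ‘Q10’ sorts before ‘Q2’ when compared character by character. As an illustration only (the Test Centre itself offers no such option, and the question labels here are hypothetical), a short ‘natural sort’ key restores numeric order:

```python
import re

def natural_key(label):
    """Split a label like 'Q10' into text and number parts so that
    numeric portions compare by value, not character by character."""
    return [int(part) if part.isdigit() else part.lower()
            for part in re.split(r"(\d+)", label)]

labels = ["Q1", "Q10", "Q2", "Q21", "Q3"]
print(sorted(labels))                   # alphabetical: Q1, Q10, Q2, Q21, Q3
print(sorted(labels, key=natural_key))  # numerical:    Q1, Q2, Q3, Q10, Q21
```

Applying a key like this to exported result rows would return them to loading order without manual re-shuffling.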
Lessons Learned. From this first use of the Test Centre we established the need to:
- Start loading the questions as early as possible, and format carefully.
- Consider whether the questions need to be kept blocked or can be randomised.
- When naming the test or question pool, remember that the name appears in the student's view: the more clearly you identify the source material for the questions, the more open-book the test becomes.
- Pre-circulate the planned test window dates and times. This appeared to reduce the problem of non-completion. Avoid religious festivals.
- If you can make the special access tool work, it would overcome the need to duplicate the test under another name to achieve different access.
- Consider how many attempts you want to give the students: once they time out, you have to set new dates and opening and closing times for a student to be allowed in a second (or further) time.
Screen shot of the editing pane as seen by the creator. Many hours of formatting can be avoided if you take care to de-format your text first and/or use the Word paste function when pasting into the editor.
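Where questions arrive as Word-exported text, one way to pre-clean them before pasting is a small script that replaces ‘smart’ punctuation and strips embedded markup. This is a minimal sketch of the general idea, not a feature of the Test Centre; the `deformat` function name and the cleaning rules are our own assumptions about what typically clashes with the editor:

```python
import re

def deformat(text):
    """Pre-clean Word-exported question text before pasting into an editor."""
    # Replace common Word 'smart' punctuation with plain ASCII equivalents
    replacements = {
        "\u201c": '"', "\u201d": '"',   # curly double quotes
        "\u2018": "'", "\u2019": "'",   # curly single quotes
        "\u2013": "-", "\u2014": "-",   # en/em dashes
        "\u00a0": " ",                  # non-breaking spaces
    }
    for smart, plain in replacements.items():
        text = text.replace(smart, plain)
    # Strip any embedded HTML/Word markup tags
    text = re.sub(r"<[^>]+>", "", text)
    # Collapse runs of whitespace into single spaces
    return re.sub(r"\s+", " ", text).strip()

print(deformat("<b>\u201cWhat is positioning?\u201d</b>"))  # "What is positioning?"
```

Running each question through a cleaner like this before pasting would avoid the hidden-formatting clashes described above.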
Completed question without cleaning up formatting
We are using the Test Centre again this semester for a mixed set of questions: multiple choice and short answer. We are now experienced in loading and reading the results of multiple-choice output; the short answer format is new ground. The tool is certainly useful, and it would be even better if the following could be improved from a user's viewpoint:
- Develop a way to load questions from another source without having to de-format the source material so that it won't clash with the formatting embedded in the tool, which cannot be identified.
- Provide the user with the ability to unscramble the questions so they can be returned to their required groupings to verify the achievements required.
- Enable the score data to be reprocessed to present the results in graphs or charts, and enable the results to be exported as a block rather than one by one. This is not too hard if you have only a few students, but for a large cohort it would be a very slow process.
The end point: even with all the time taken, it is a useful tool.
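The block-export request above can be approximated outside the tool once the one-by-one downloads are done. The sketch below merges per-student result files into a single spreadsheet-ready CSV; the file layout (one CSV per student with `question,score` rows) is a hypothetical assumption, since the Test Centre's actual export format is not documented here:

```python
import csv
import glob

def combine_results(pattern, out_path):
    """Merge per-student result CSVs (hypothetical layout: one file per
    student containing 'question,score' rows) into a single file with
    one row per student and one column per question."""
    rows = []
    for path in sorted(glob.glob(pattern)):
        with open(path, newline="") as f:
            scores = {question: score for question, score in csv.reader(f)}
        rows.append({"student": path, **scores})
    # Collect every question label seen across all students
    questions = sorted({q for row in rows for q in row if q != "student"})
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["student", *questions])
        writer.writeheader()
        writer.writerows(rows)
```

The combined file opens directly in Excel, sidestepping the one-by-one copy for large cohorts.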