Foundationsf2f presv2[1]
Published on: BRIT staff day - Power P
Published in: Education, Technology, Business
  • QVDC – who we are and what we do: Marinda Chang and myself. eAssessment is a hot topic, and there is growing consensus about the opportunities afforded by the technology, particularly for a competency-based assessment environment. This session is designed for VET practitioners and managers who are confident with traditional approaches to assessment but are yet to explore the possibilities and considerations of eAssessment. In terms of Bloom's Taxonomy, it is targeted at Remembering and Understanding.   The first part of the session covers the background to eAssessment, the possibilities it presents, its benefits, and the Standards for NVR Registered Training Organisations (which have replaced the AQTF), with a focus on authenticity and validity.   The second part looks at some good-practice examples of eAssessment and offers a 'bridge to practice'.   Please note that the Standards for NVR Registered Training Organisations replaced the former AQTF standards for relevant applicants/RTOs, and are now the standards guiding nationally consistent, high-quality training and assessment services in the vocational education and training system. We will refer to the new standards in this session. For more information about the AQTF / Standards for NVR Registered Training Organisations, go to http://www.asqa.gov.au/about-asqa/national-vet-regulation/standards-for-nvr-registered-training-organisations.html
  • In this session, we want to challenge you to think about how you are currently doing assessment – this is a good opportunity to consider assessment broadly and eAssessment more specifically. So, by a show of hands:
  • PPT: this slide has a transition which adds the B&R results. Qn: How many people are doing eAssessment?   Generally, fewer than half of the people will raise their hands to this question; link this information to the benchmarking survey results. Alternative: depending on the number of people, you could ask individuals to identify the type of eAssessment being used; this information could be used later in the session.   So we have <insert> percentage of people who are doing eAssessment. This is interesting, as the Australian Flexible Learning Framework conducts national surveys of e-learning activity – their results suggest over forty per cent of Registered Training Organisations (RTOs) and more than sixty per cent of teachers and trainers are using some form of e-assessment (Australian Flexible Learning Framework 2010). So with this information in mind, let's think about what eAssessment is and how we define it. What is eAssessment? Australian Flexible Learning Framework (2010) 2010 VET teacher/trainer results, DEEWR, Canberra, [Accessed December 2011] http://e-learningindicators.flexiblelearning.net.au/docs/10results/2010_TeacherTrainer_results_StateTerritoryProviderType.pdf
  • – image only   Another show of hands: how many people in the room have been into a bank in the last 3 months?   Generally only a few people put up their hands.   And how many people have done their banking online?   Generally most people put up their hands.   Now, how many people call this e-banking? <Call out – hey honey, I'm just about to do some e-banking>   Most of us do our banking online, but we don't call it e-banking – it's just the way we do our banking. In many ways, we can think of eAssessment like this – it's just the way we do assessment. In fact, when I show you the definition of eAssessment, you'll find out why many people think that eAssessment is actually the dominant form of assessment in the VET sector.   http://www.news.com.au/money/banking/mobiles-deliver-banks-a-transaction-boom/story-e6frfmcr-1225954227259 http://www.smh.com.au/business/sighs-of-relief-heard-from-the-bankers-bunkers-20110915-1kbt9.html
  • Note: this slide has a transition – a red line around 'gathering quality evidence'. eAssessment can be defined as: E-assessment is the use of information technology for any assessment-related activity. The diagram in the presentation shows the end-to-end assessment process. The point is that even people doing 'traditional' forms of assessment are still recording and reporting via Student Management Systems – and are therefore doing eAssessment. It is important to keep in mind that the assessment process is more than just evidence gathering. If eAssessment were only about evidence gathering, that would imply evidence gathering is the most important part of the process; in fact, all parts of the assessment process are important. Australian Flexible Learning Framework (2011) E-assessment guidelines for the VET sector, DEEWR, Canberra, [Accessed December 2011] http://www.flexiblelearning.net.au/files/E-assessment%20guidelines%20for%20the%20VET%20sector.pdf So, how many people are doing eAssessment now you know the definition?   Generally most people raise their hands at this point.   So most of us are doing eAssessment – because when we think of eAssessment, most people think of the evidence collection process, and from this, the online quiz. Research tells us that the online quiz is the dominant eAssessment/evidence-gathering methodology.   When you think about how many people raised their hands, and consider the definition of eAssessment, you can see why many people think that eAssessment is actually the dominant form of assessment across the VET sector.   Callan, V. and Clayton, B. (2010) E-assessment and the AQTF: Bridging the divide between practitioners and auditors, Australian Flexible Learning Framework, Commonwealth of Australia, Canberra.
  • So what is happening in the eAssessment space nationally? In 2009, Victor Callan (from Queensland) and Berwyn Clayton (Victoria University) conducted research for the Australian Flexible Learning Framework that looked at the perception of a 'divide' between practitioners and auditors. The research is available on the Framework website, and there is a brochure that summarises their findings.   A few key issues they identified were: Lack of diversity in evidence collection – as mentioned earlier, the dominant form of evidence collection was the online quiz. Auditors' perceptions vs. validity – many auditors were suspicious of eAssessment methodologies and were perhaps 'more wedded to paper based methods' than they should be. However, some assessors had a 'set and forget' approach to eAssessment development, which could result in a lack of validity of those methods. Authenticity – there was a preoccupation (by both auditors and assessors) with authenticity: how do we know the person doing the assessment is the person getting the award? This is something we will spend more time on later in the session.   Victor and Berwyn also identified that the VET sector needed a set of good-practice guidelines and case-study examples of the eAssessment occurring across the sector.   With these recommendations in mind, the Australian Flexible Learning Framework approached the National Quality Council (NQC) to assist in resourcing the development of guidelines. Eventually the project took off, and Rob Stowell (who has done a great deal around assessment in the VET sector) was commissioned to develop a set of guidelines, with a colleague of his (Reece Lamshed) developing case studies.   The guidelines are available on the Framework website and the case studies should be available sometime this year. 
What is significant about these guidelines is that they are a starting point to support the development of eAssessment and provide a set of questions to ask when exploring different methodologies. Some of the questions may be for your IT departments, others are for managers, and, if you are purchasing eAssessment resources, for your service providers.   Last year, we hosted Rob and Reece, who presented on the guidelines; we also held a few other events, and as a result of their popularity we put together these sessions to support eAssessment across Queensland. You'll notice on the website that there is a range of other activities on, and we encourage you to participate.   There is also a range of other resources that can help you move into the eAssessment area – these will be available in the resources provided with this session.   So that's a bit of background – let's dig into eAssessment methodologies in a little more detail. http://www.flexiblelearning.net.au/files/WhatMatters_E-assessment_May2010.pdf
  • In this activity, we are going to brainstorm the different technologies that can be used during the eAssessment process.   Do this on a white board – ideally write up the assessment process and leave evidence gathering until last, as this will be the longest section. Alternative 1: get people to discuss in small groups and report back. Alternative 2: hand out categories to groups and ask groups to report back. Use the page below to assist in the debrief – there is a summary on PPT slide 9.   Debrief: see print out.
  • Plan assessment - Forums, blogs, email, voice boards and VOIP. Gather evidence - Mobile devices, video (POV), ePortfolio, blogs, wikis, digital stories, VOIP, virtual classroom, quiz, digital simulation, forum, etc. Support candidate - Virtual classroom, LMS, CLMS, etc. Assessment decision - Quiz, simulation, etc. Provide feedback - Email, SMS, LMS, virtual classroom, blog, forum, mobile phone, VOIP, etc. Record /Report - LMS and designated databases, etc. Review process - Forums, blogs, email, voice boards and VOIP.
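As a purely illustrative sketch, the stage-to-technology pairings in the note above could be held in a simple lookup table for the debrief. The structure and function names below are hypothetical, not part of the guidelines:

```python
# Hypothetical lookup table pairing each stage of the assessment
# process with the technologies listed in the session notes.
EASSESSMENT_TECHNOLOGIES = {
    "plan assessment": ["forums", "blogs", "email", "voice boards", "VOIP"],
    "gather evidence": ["mobile devices", "video (POV)", "ePortfolio", "blogs",
                        "wikis", "digital stories", "VOIP", "virtual classroom",
                        "quiz", "digital simulation", "forum"],
    "support candidate": ["virtual classroom", "LMS", "CLMS"],
    "assessment decision": ["quiz", "simulation"],
    "provide feedback": ["email", "SMS", "LMS", "virtual classroom", "blog",
                         "forum", "mobile phone", "VOIP"],
    "record/report": ["LMS", "designated databases"],
    "review process": ["forums", "blogs", "email", "voice boards", "VOIP"],
}

def technologies_for(stage):
    # Case-insensitive lookup; returns an empty list for unknown stages.
    return EASSESSMENT_TECHNOLOGIES.get(stage.lower(), [])
```

A table like this makes the point of the activity concrete: evidence gathering has by far the longest list, but every stage of the process has candidate technologies.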
  • Go through the benefits – this can be via the slide or via another activity, depending on the size and nature of the group. As you can see, there is a wide range of technologies that can be used across the assessment process, and these present a range of possibilities for all stakeholders. Improved quality of assessment evidence: using technology to enhance the type of evidence that is collected – we can move beyond traditional forms of evidence collection and into real-life, real-time collection. For example, hairdressing students collecting images of their work for inclusion in e-portfolios. More efficient assessment feedback processes that promote learning: using learning management systems and other technologies to allow for more immediate feedback. Reliable submission, storage and retrieval of assessment evidence: move away from lost assessments and from having to manually manage the significant workloads we have. Improved validity and reliability of assessment through use of authentic assessment tasks: rather than trying to replicate a workplace environment, use the technology to access the workplace and shift the assessment there. For example, in Tasmania, assessors are carrying smart phones loaded with the assessment tools and doing their assessments onsite. Enhanced learner engagement through interactive assessment with adaptive feedback: as learning management systems become more sophisticated, we are able to use branching to work through real-life scenarios. For example, using the lesson resource in Moodle to develop scenarios for learning. Promote peer and self assessment that better prepares learners for work and life in the community: peer and self assessment prepare the learner for the workplace, where both are a common feature. Technology allows us to have learners post information and then critique each other's work. 
More accurate and timely information on program effectiveness that can be used to inform program design and delivery: a feature of assessment should be the last part of the cycle, which is to review the assessment process. Using technology can allow us to incorporate this feedback into our assessment processes more easily. Reconstruct the purpose of assessment in VET: this is an exciting development. For many years, we have worked within a paradigm of assessment of learning – the learner does the learning and then undertakes an assessment. Technology allows us to move beyond this so that assessment is integrated into the learning process, and we move to a place where we have assessment for learning (not just assessment of learning). So eAssessment provides opportunities for all stakeholders. However, sometimes eAssessment can be more hassle than it's worth. Sure, the benefits are great – but there are a number of issues to consider – and who has the time? This is where the e-assessment guidelines come in. The guidelines are not about telling us 'how' to do eAssessment – they are in place to support the 'what': what do I need to consider and what should I be asking?
  • In this section, go through the guidelines briefly and how they can be used – the guidelines support the 'what' of eAssessment, not the 'how'. You could probably spend a lot of time going through these individually; however, the point here is to be able to direct people to the guidelines for more information about what they need to be considering.   Below is a summary of the guidelines and the main points in each of the categories.   One thing to note is that there is very little that is new about the guidelines: the majority are adapted from standards that already exist.   Another is that some of the guidelines are open to interpretation. For example, with the technical standards, you might determine that interoperability is not important for you, and so choose not to meet this standard. The important point is that you don't accidentally get locked into a specific form of technology without realising the implications.  
  • 1.1 Web based functionality E-assessment providers must meet minimum web based functionality requirements to support e-assessment 1.2 Desktop functionality E-assessment providers must meet minimum desktop functionality requirements to support e-assessment.
  • 2.1 Accessibility E-assessment providers must confirm that e-assessment resources and materials are accessible to people with disabilities 2.2 Portability  E-assessment resources and materials must be transportable between different repositories and learning management systems. 2.3 Desk top content formats E-assessment resources and materials must meet the standard for desk top content formats.
  • 2.4 Mobile content formats E-assessment resources and materials must meet the standards for mobile content formats. 2.5 Metadata To support discovery and re-use of e- assessment resources and materials across the VET system, assessment content should be described using the Vetdata standard.
  • 3.1 Benchmarks E-assessment resources and materials must meet the requirements of the relevant Training Package or accredited course. 3.2 Assessment principles E-assessment resources and materials must provide for valid, reliable, fair and flexible assessment
  • 3.3 Personalisation E-assessment resources and materials must provide for personalisation of assessment. 3.4 Validation E-assessment resources and materials must be systematically validated. 3.5 Workplace and regulatory requirements E-assessment resources and materials must address workplace and regulatory requirements
  • 3.6 Candidate authentication and security E-assessment resources and materials must provide for candidate authentication and the security of both the assessment process and assessment data. 3.7 Maintenance E-assessment resources and materials must be maintained.
  • 4.1 Collaboration E-assessment must be developed in consultation with industry and other stakeholders. 4.2 Evidence collection E-assessment involves collecting quality evidence for use in assessment decision making. 4.3 Feedback E-assessment feedback must identify candidate strengths, areas for improvement and ways in which performance may be improved.
  • 4.4 Assessment judgements E-assessment involves assessors in evaluating evidence and making assessment judgments. 4.5 Recording and reporting e-assessment outcomes E-assessment outcomes must be accurately recorded, reported and stored. 4.6 Complaints and appeals E-assessment processes must provide for complaints and appeals from candidates.
  • 5.1 E-assessment support services E-assessment providers must have appropriate support services for assessors and candidates. 5.2 E-assessment deployment strategy E-assessment providers should have an e-assessment deployment strategy.
  • Rules of evidence + Principles of assessment – the point here is that some of these are also about good design principles.   This slide is to act as a reminder of the definitions (and as a prelude to the next activity). While most know these, this is just a way to reinforce that we are all on the same page.   The definitions are provided below and are taken from the Standards for NVR Registered Training Organisations 2011 which are under subsection 185(1) of the National Vocational Education and Training Regulator Act 2011.   Principles of assessment are required to ensure quality outcomes. Assessments should be fair, flexible, valid and reliable as follows:   a) Fairness: Fairness requires consideration of the individual candidate’s needs and characteristics, and any reasonable adjustments that need to be applied to take account of them. It requires clear communication between the assessor and the candidate to ensure that the candidate is fully informed about, understands, and is able to participate in, the assessment process, and agrees that the process is appropriate. It also includes an opportunity for the person being assessed to challenge the result of the assessment and to be reassessed if necessary.   b) Flexible: To be flexible, assessment should reflect the candidate’s needs; provide for recognition of competencies no matter how, where or when they have been acquired; draw on a range of methods appropriate to the context, competency and the candidate; and, support continuous competency development.   c) Validity: There are five major types of validity: face, content, criterion (i.e. predictive and concurrent), construct and consequential. In general, validity is concerned with the appropriateness of the inferences, use and consequences that result from the assessment. In simple terms, it is concerned with the extent to which an assessment decision about a candidate (e.g. 
competent/not yet competent, a grade and/or a mark), based on the evidence of performance by the candidate, is justified. It requires determining conditions that weaken the truthfulness of the decision, exploring alternative explanations for good or poor performance, and feeding them back into the assessment process to reduce errors when making inferences about competence.   Unlike reliability, validity is not simply a property of the assessment tool. As such, an assessment tool designed for a particular purpose and target group may not necessarily lead to valid interpretations of performance and assessment decisions if the tool was used for a different purpose and/or target group   d) Reliability: There are five types of reliability: internal consistency; parallel forms; split-half; inter-rater; and, intra-rater. In general, reliability is an estimate of how accurate or precise the task is as a measurement instrument. Reliability is concerned with how much error is included in the evidence.   Rules of evidence are closely related to the principles of assessment and provide guidance on the collection of evidence to ensure that it is valid, sufficient, authentic and current as follows:   a) Validity: see Principles of assessment.   b) Sufficiency: Sufficiency relates to the quality and quantity of evidence assessed. It requires collection of enough appropriate evidence to ensure that all aspects of competency have been satisfied and that competency can be demonstrated repeatedly. Supplementary sources of evidence may be necessary. The specific evidence requirements of each unit of competency provide advice on sufficiency.   c) Authenticity: To accept evidence as authentic, an assessor must be assured that the evidence presented for assessment is the candidate’s own work.   d) Currency: Currency relates to the age of the evidence presented by candidates to demonstrate that they are still competent. 
Competency requires demonstration of current performance, so the evidence must be from either the present or the very recent past.   http://www.comlaw.gov.au/Details/F2011L01356
  • In this activity, we are going to brainstorm: how do you currently ensure your assessments meet the rules of evidence and principles of assessment?   Do this on a white board – ideally write up the assessment criteria as headings to complete.   Alternative 1: get people to discuss in small groups and report back. Alternative 2: hand out categories to groups and ask groups to report back.   Debrief: the question to highlight is – keeping in mind the broad definition of eAssessment and the different technologies already identified – what do they consider is different for eAssessment? The point is that many of the processes they use for their current assessments will be no different when utilising eAssessment methods. The skill in the debrief will be to identify the commonalities and recognise that, while there are some differences, there are more similarities than differences. The aim of this activity is to have people start to feel more comfortable with transitioning to eAssessment methodologies. The main issue you will find is with authenticity of the candidate – how do you know they are that person? This is addressed in the next PPT slide.
  • An issue that is commonly raised is the issue of authenticity – how do you know the evidence presented for assessment is the candidate’s own work? http://www.liquidsilver.org/tag/ustream/ http://connect.in.com/angelina-jolie/profile-561.html http://i1.kym-cdn.com/photos/images/original/000/057/731/OldFatguy.png http://ms10.deviantart.com/art/Herp-Derp-Ed-205423949
  • Authentication of a learner's work can be done in a range of ways. Examples include: direct observation; paper-based signatures; third-party paper-based signatures. These might be related to methods such as: a signed declaration from the student that states '..all work will be their own...'; digital signature systems (eg. PDF annotations on iPad); unique student logins to assessment items; system analytics that show when the learner was online; third-party verification of evidence (for example, a workplace supervisor).   There is a range of emerging technologies around authentication. Biometrics is an emerging area of technology that can be used to authenticate that a learner is engaged with a computer system, using identification through human attributes such as eye scanning, facial recognition, fingerprints and voice recognition. However, there may be privacy issues for individuals and workplaces in capturing and storing student images/information in these ways. Examples of emerging biometric technologies: Android OS (Icecream hands) example facial recognition; facial recognition on mobile OS http://www.youtube.com/watch?v=5TDO9ok4sWI&feature=player_embedded; facial/identity recognition iOS app http://mashable.com/2011/09/12/terminator-vision/. Geolocation through mobile devices is also making it possible to validate the time frames and locations of training or assessment by recording location data.   However, if we go back to the original ways of ensuring authenticity, these methods centre on two things: quality assessments gather evidence over time, and assessors should be striving for a relationship with the learner.   Here are a couple of stories about how some assessors are dealing with these issues. In the case studies that support the E-assessment guidelines, there is the story of a hairdressing apprentice. In her portfolio, she included a picture of an 'up style' which was very different to the other images in her portfolio. 
It demonstrated techniques that the assessor felt the student hadn't demonstrated in the other assessment tasks. The assessor's first step was to do a Google image search to find out if the image had been taken from an online source. Then, in a subtle way, the assessor approached the apprentice's supervisor to ask about the image, asking questions like: Was the photo taken in this salon? Did the apprentice work on the style? Could the supervisor provide some information about the circumstances? The supervisor explained that the student had really impressed them with the style; apparently this was what the student was passionate about. The assessor went away and completed her assessment judgement. It was because the assessor had a relationship with the apprentice, and a good understanding of the apprentice's abilities, that she was able to identify an image that may not have been the apprentice's own work. An anecdotal story that was told to me is about a lecturer who had a ten-week class facilitated completely online. The class requirements were participation in weekly discussions, availability for regular chat sessions, as well as reflective journals (blogs) and development of a personal learning network (PLN). After the final assessment was complete, the lecturer was asked, 'How do you know that they were the people who did their final assessments?' The lecturer explained that over the ten-week period they developed such a good relationship with each of the students that they could tell their development, their writing style, their understanding. For me, these examples demonstrate the importance of an assessor gathering evidence over time and developing a relationship with the learner. So, while it's nice to keep an eye out for the latest trends in technology, it's critical that we stick to the fundamentals of good assessment practice. 
Introducing new tools and technologies should not change the fundamental assessment principles.  
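As a purely illustrative sketch of the geolocation idea mentioned in these notes – assuming a piece of evidence carries GPS coordinates and the workplace location is known, with all names here hypothetical – a simple distance check might look like:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres.
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def evidence_captured_onsite(evidence_lat, evidence_lon,
                             site_lat, site_lon, tolerance_km=1.0):
    # True if the recorded GPS position of the evidence falls within
    # the tolerance radius of the workplace; otherwise the evidence
    # should be queried with the candidate, not automatically rejected.
    return haversine_km(evidence_lat, evidence_lon,
                        site_lat, site_lon) <= tolerance_km
```

A check like this only flags evidence for follow-up; as the stories above show, the assessor's relationship with the learner remains the real safeguard.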
  • Activity – break into groups. Read the case study individually and take notes, then in your groups report back on the following: Mobility (iPhone) – Tasmanian Skills Institute; ePortfolio – Western Sydney TAFE; Video RPL – Taree Community College; Virtual classroom – Canberra Institute of Technology.
  • Skills Tech Tasmania Cohort: Apprentices – vehicle repair Technology: iPhone (iQTI) / Compass Competency checklist, photo, video, audio. Capture evidence, instant feedback Compiled into zip, email via 3G to office computer Recorded in SMS
  • Western Sydney TAFE – Cohort: hairdressing apprentices. Technology: Mahara / foliospaces. Assessment: upload photos of haircuts / image analysis. Process: instant feedback through foliospaces; recording: CLAMS / SIS.
  • Canberra Institute of Technology – Cohort: off-campus students, games design (theoretical). Technology: virtual classroom (Wimba), LMS (Moodle), research quiz, forum. Assessment: instant feedback, off-campus delivery. Process: recording: Equella / Banner (SMS).
  • Taree Community College – Cohort: aged care, RPL. Technology: video (bloggie) / streamfolio, LMS (Moodle), virtual classroom (Elluminate). Assessment: feedback via streamfolio, virtual classroom. Process: recording: SIM.
  • Lessons learnt: an assessor behind the technology; range of technologies; technologies introduced into the assessment process; adhered to the AQTF (Standards for NVR Registered Training Organisations); challenges of privacy and confidentiality; high digital literacy requirements > train; bandwidth requirements; provision of non-digital alternatives; challenges of authenticity; accessibility compliance issues; attending to data loss; positive impacts on learning.
  • Transcript

    • 1.  
    • 2. Today's Meet
      • http://todaysmeet.com/britstaffday2012
    • 3. Timeline
    • 4. Catalyst
      • economic shifts
      • industry needs
      • economies of scale
      • continuous improvement
    • 5. Yin and Yang - organisational systems
      • legitimate
      • shadow
      • S curve
    • 6. National VET e-learning Strategy
      • Since 2000 invested over $10 million
              • research
              • toolboxes
              • standards
      • primarily - Content
    • 7. We are bad at predicting the future
      • "Television won't last! It is a flash in the pan." Mary Somerville, pioneer of radio education broadcasts, 1948
      • "The wireless music box has no imaginable commercial value. Who would pay for a message sent to no one in particular?" Associates of David Sarnoff, responding to the latter's call for investment in radio, 1921
      • "There is no reason anyone would want a computer in their home." Ken Olson, president and founder of Digital Equipment Corp. (DEC), maker of big business mainframe computers, arguing against the PC, 1977
    • 8. Today
      • The session will explore
      • eAssessment overview and context
      • why eAssessment?
      • possibilities
      • benefits
      • case studies
      • technologies
    • 9. Question
      • How many people are doing eAssessment?
      • 2011 E-learning Benchmarking Survey
        • 46% of registered training organisations (RTOs) were using e-learning for assessment.
        • 62% of teachers and trainers are using online assessment activities
    • 10. Banking
      • http://www.news.com.au/money/banking/mobiles-deliver-banks-a-transaction-boom/story-e6frfmcr-1225954227259
      • http://www.smh.com.au/business/sighs-of-relief-heard-from-the-bankers-bunkers-20110915-1kbt9.html
    • 11. eAssessment (defined) E-assessment covers a wide range of activities where digital technologies are used in assessment
    • 12. Context and background
      • *E-assessment and the AQTF: Bridging the divide between practitioners and auditors (Callan & Clayton, 2009)
        • Issues identified:
          • Lack of diversity in evidence collection
          • Auditors' backgrounds vs validity
          • Authenticity
      • Note: the AQTF has been replaced by Standards for NVR registered training organisations
      • E-assessment guidelines for the VET sector
    • 13. Over to you
      • Brainstorm the different technologies that can be used during the eAssessment process
    • 14. Over to you
      • Brainstorm the different technologies that can be used during the eAssessment process
        • Plan assessment
        • Gather evidence
        • Support the candidate
        • Make the assessment decision
        • Provide feedback
        • Record and report the decision
        • Review the assessment process
    • 15. Universal truths
    • 16. Assessment technologies (function – ICT)
      • Plan assessment – forums, blogs, email, voice boards and VoIP
      • Gather evidence – mobile devices, video (POV), ePortfolio, blogs, wikis, digital stories, VoIP, virtual classroom, quiz, digital simulation, forum, etc.
      • Support candidate – virtual classroom, LMS, CLMS, etc.
      • Make the assessment decision – quiz, simulation, etc.
      • Provide feedback – email, SMS, LMS, virtual classroom, blog, forum, mobile phone, VoIP, etc.
      • Record and report – LMS and designated databases, etc.
      • Review process – forums, blogs, email, voice boards and VoIP
    • 17. Over to you
      • Brainstorm the benefits of eAssessment to
        • Plan assessment
        • Gather evidence
        • Support the candidate
        • Make the assessment decision
        • Provide feedback
        • Record and report the decision
        • Review the assessment process
      • What about the elephants in the room?
    • 18. Benefits of eAssessment
      • Improved quality of assessment evidence
      • More efficient assessment feedback processes that promote learning
      • Reliable submission, storage and retrieval of assessment evidence
      • Improved validity and reliability of assessment through use of authentic assessment tasks
      • Enhanced learner engagement through interactive assessment with adaptive feedback
      • Promote peer and self assessment that better prepares learners for work and life in the community
      • More accurate and timely information on program effectiveness that can be used to inform program design and delivery
      • Reconstruct the purpose of assessment in VET.
    • 19. Guidelines structure
      • Five broad categories
      • Infrastructure provision
      • Technical standards
      • e-assessment development and maintenance
      • e-assessment practices
      • e-assessment context
    • 20. 1. Infrastructure provision
      • 1.1 Web based functionality
      • E-assessment providers must meet minimum web based functionality requirements to support e-assessment
      • 1.2 Desktop functionality
      • E-assessment providers must meet minimum desktop functionality requirements to support e-assessment.
    • 21. 2. Technical Guidelines
      • 2.1 Accessibility
      • E-assessment providers must confirm that e-assessment resources and materials are accessible to people with disabilities
      • 2.2 Portability  
      • E-assessment resources and materials must be transportable between different repositories and learning management systems.
      • 2.3 Desk top content formats
      • E-assessment resources and materials must meet the standard for desk top content formats.
    • 22. 2. Technical Guidelines
      • 2.4 Mobile content formats
      • E-assessment resources and materials must meet the standards for mobile content formats.
      • 2.5 Metadata
      • To support discovery and re-use of e-assessment resources and materials across the VET system, assessment content should be described using the Vetdata standard.
    • 23. 3. Development & maintenance
      • 3.1 Benchmarks
      • E-assessment resources and materials must meet the requirements of the relevant Training Package or accredited course.
      • 3.2 Assessment principles
      • E-assessment resources and materials must provide for valid, reliable, fair and flexible assessment
    • 24. 3. Development & maintenance
      • 3.3 Personalisation
      • E-assessment resources and materials must provide for personalisation of assessment.
      • 3.4 Validation
      • E-assessment resources and materials must be systematically validated.
      • 3.5 Workplace and regulatory requirements
      • E-assessment resources and materials must address workplace and regulatory requirements
    • 25. 3. Development & maintenance
      • 3.6 Candidate authentication and security
      • E-assessment resources and materials must provide for candidate authentication and the security of both the assessment process and assessment data.
      • 3.7 Maintenance
      • E-assessment resources and materials must be maintained.
    • 26. 4. Practices
      • 4.1 Collaboration
      • E-assessment must be developed in consultation with industry and other stakeholders.
      • 4.2 Evidence collection
      • E-assessment involves collecting quality evidence for use in assessment decision making.
      • 4.3 Feedback
      • E-assessment feedback must identify candidate strengths, areas for improvement and ways in which performance may be improved.
    • 27. 4. Practices
      • 4.4 Assessment judgements
      • E-assessment involves assessors in evaluating evidence and making assessment judgments.
      • 4.5 Recording and reporting e-assessment outcomes
      • E-assessment outcomes must be accurately recorded, reported and stored.
      • 4.6 Complaints and appeals
      • E-assessment processes must provide for complaints and appeals from candidates.
    • 28. 5. Context
      • 5.1 E-assessment support services
      • E-assessment providers must have appropriate support services for assessors and candidates.
      • 5.2 E-assessment deployment strategy
      • E-assessment providers should have an e-assessment deployment strategy.
    • 29. Requirements
      • Principles of assessment
      • are required to ensure quality outcomes
      • Assessments should be
        • Fair
        • Flexible
        • Valid
        • Reliable.
      Rules of evidence
      • are closely related to the principles of assessment and provide guidance on the collection of evidence.
      • Evidence should be
        • Valid
        • Sufficient
        • Authentic
        • Current.
    • 30. Over to you again.......
      • How do you currently ensure your assessments meet the rules of evidence and principles of assessment?
    • 31. Authenticity
      • Image sources:
        • http://www.liquidsilver.org/tag/ustream/
        • http://connect.in.com/angelina-jolie/profile-561.html
        • http://i1.kym-cdn.com/photos/images/original/000/057/731/OldFatguy.png
        • http://ms10.deviantart.com/art/Herp-Derp-Ed-205423949
    • 32. Considerations
      • Authenticity (common sense)
      • Require evidence over time
      • Build a relationship with the learner.
      • Example: hairdressing apprentice
      Authenticity (technical)
      • Technical
      • Biometrics
      • Geolocation
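One "technical" authenticity check mentioned above is comparing a submitted image against known online images, as the hairdressing assessor did with a reverse image search. The following is a minimal, hypothetical sketch of the underlying idea using a simple difference hash (dHash); the function names and the plain brightness-grid representation are assumptions for illustration, not part of any tool named in this presentation, and real reverse-image-search services are far more robust.

```python
# Hypothetical sketch: flag a submission whose difference hash is very
# close to a known online image. Images are represented as 2-D
# brightness grids (lists of lists) rather than real image files.

def dhash(pixels):
    """Row-wise difference hash: 1 if a pixel is brighter than its
    right-hand neighbour, else 0."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return tuple(bits)

def hamming(a, b):
    """Count differing bits between two hashes of equal length."""
    return sum(x != y for x, y in zip(a, b))

def looks_copied(submitted, known, threshold=1):
    """Flag a submission within `threshold` bits of any known image -
    a prompt for the assessor to investigate, never an automatic
    judgement about the candidate."""
    h = dhash(submitted)
    return any(hamming(h, dhash(k)) <= threshold for k in known)

# A slightly re-edited copy keeps the same brightness pattern:
original = [[10, 50, 30], [90, 20, 60]]
tweaked = [[12, 48, 31], [88, 22, 59]]
print(looks_copied(tweaked, [original]))  # True
```

The point of the sketch is that the hash captures the structure of the image rather than exact pixel values, so small edits to a copied image still match; the final judgement stays with the assessor, consistent with the "assessor behind the technology" lesson later in the session.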
    • 33. Guess what – over to you again...
      • Mobility – iPhone Tasmanian Skills Institute
      • ePortfolio – Western Sydney TAFE
      • Video RPL – Taree Community College
      • Virtual classroom – Canberra Institute of Technology
    • 34. Mobile
      • Tasmanian Skills Institute
      • Cohort : Apprentices – vehicle repair
      • Technology : iPhone (iQTI) / Compass
      • Assessment : Competency checklist, photo, video, audio
      • Process : Capture evidence, instant feedback, Compiled into zip, email via 3G to office computer, Recorded in SMS
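The "compiled into zip, email via 3G" step in the workflow above can be sketched roughly as follows. This is a minimal illustration using Python's standard library; the function name, manifest layout and file names are assumptions for the example, not the actual iQTI/Compass format, and the email transport itself is omitted.

```python
# Hypothetical sketch: bundle captured evidence (name -> bytes) plus a
# small JSON manifest into an in-memory zip, ready to attach to an
# email back to the office computer.

import io
import json
import zipfile

def package_evidence(candidate_id, files):
    """Return zip-archive bytes containing a manifest.json and each
    evidence file under an evidence/ folder."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        manifest = {"candidate": candidate_id, "items": sorted(files)}
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
        for name, data in files.items():
            zf.writestr(f"evidence/{name}", data)
    return buf.getvalue()
```

Building the archive in memory (rather than on disk) suits a mobile capture workflow where the bundle is attached to an outgoing message immediately; the manifest gives the office end a machine-readable record to feed into the student management system.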
    • 35. ePortfolio
      • Western Sydney TAFE
      • Cohort - Hairdressing apprentices
      • Technology : Mahara / foliospaces
      • Assessment : Upload photos of haircuts / image analysis
      • Process : Instant feedback through foliospaces, recording (CLAMS / SIS)
    • 36. Virtual classroom
      • Canberra Institute of Technology
      • Cohort : Off-campus students - Games design (theoretical)
      • Technology : Virtual classroom (Wimba), LMS (Moodle), research quiz, forum
      • Assessment : Instant feedback, off campus delivery
      • Process : Recording: Equella / Banner (SMS)
    • 37. Video RPL
      • Taree Community College
      • Cohort : Aged care - RPL
      • Technology : Video (bloggie) / streamfolio, LMS (Moodle), virtual classroom (Elluminate)
      • Assessment : Feedback via streamfolio, virtual classroom
      • Process : Recording (SIM)
    • 38. Lessons learnt
      • An assessor behind the technology
      • Range of technologies
      • Technologies introduced into the assessment process
      • Adhered to the Standards for NVR registered training organisations
      • Challenges of privacy and confidentiality
      • High degree of digital literacy requirements > train
      • Bandwidth requirements
      • Provision of non-digital alternatives
      • Challenges of authenticity
      • Accessibility compliance issues
      • Attending to data loss
      • Positive impacts on learning
    • 39. Do we have time to play?
      • Text-based evidence – iQTI player
      • Audio-based evidence – Livescribe smartpen
      • Image-based evidence – phone, Flip camera, iPad, tablet device
    • 40. Review
      • Today's Meet - http://www.todaysmeet.com/britstaffday2012
      • Environment - Yin Yang - S curve
      • Explored eAssessment end to end: the dominant and listed technologies, the types of evidence and the technologies that support them, the benefits (and the elephants), and four case studies examined in detail
      • Experimented
    • 41.
      • “Think left and think right and think low and think high.
      • Oh, the thinks you can think if only you try!” – Dr. Seuss
    • 42. Contact us
      • Toni-Maree Pettigrew
      • Marinda Chang
      • [email_address]
