Sankey, M. 2023. Embracing student innovation in the age of Generative AI (Keynote Presentation). The 2023 WATTLE forum: InspirEd Horizons: Embracing Educational Innovation and Generative AI. University of Wollongong. 25 September.
Embracing student innovation in the age of Generative AI
CRICOS Provider No: 00300K (NT/VIC) 03286A (NSW) RTO Provider No: 0373 TEQSA Provider ID PRV12069
2023 WATTLE FORUM
Professor Michael Sankey
Director, Learning Futures and Lead Education Architect
President, Australasian Council on Open Distance and eLearning (ACODE)
Community Fellow, Australasian Society for Computers in Learning in Tertiary Education (ASCILITE)
https://michaelsankey.com
Charles Darwin University acknowledges all First Nations people across the lands on which we live and work, and we pay our respects to Elders both past and present.
• We got over COVID, then Academic Integrity hit the fan, and then came Generative AI
• It has revolutionised the way we write (and more) and the way we work
• Adopting a disposition that embraces innovation is the only option
• It ain’t going away (though it might morph), nor should it
• So, we reevaluate what we do in teaching & assessments (essays, quizzes, exams)
• We look for ways to help our students process information in different ways
• Provide opportunities to contribute to the growth of knowledge
• Just imagine if our students were being productive
• Isn’t that what their potential employers would want?
• Yup!
About this presentation
Self or cloud hosted
• Institutions largely either self-hosted the system or hosted an instance with the vendor on a private cloud, allowing customisations that made upgrading more difficult
SaaS
• Software as a service (SaaS) vendors are moving clients onto the one version of the software. Less customisation is possible, but upgrades happen much more easily
API
• With self-hosted systems, institutions had to develop APIs (application programming interfaces) to allow other systems to communicate with each other
LTI & xAPI
• The advent of LTI (Learning Tools Interoperability) allows a learning system to invoke and communicate with external systems against a common global standard. This is linked with extra ‘experience’ data available through xAPI
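The ‘experience’ data xAPI carries is a simple actor/verb/object record, which can be sketched in a few lines of Python. The student email and activity IRI below are illustrative placeholders (a real statement would be POSTed to an institution’s Learning Record Store); the ‘experienced’ verb IRI is a published ADL identifier:

```python
import json

# Minimal xAPI statement: who (actor) did what (verb) to what (object).
# The mbox and activity IRI are illustrative values only; in practice the
# LMS or external tool sends this JSON to an LRS /statements endpoint.
statement = {
    "actor": {
        "name": "Example Student",
        "mbox": "mailto:student@example.edu",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "https://lms.example.edu/unit/ECU101/module1",
        "definition": {"name": {"en-US": "Module 1"}},
    },
}

print(json.dumps(statement, indent=2))
```

Because every interoperating system emits the same actor/verb/object shape, experience records from the LMS and from external LTI tools can be pooled and analysed together.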
Transmission of information
• Systems were used to provide links to documents and learning elements contained within a repository. Limited tools in the LMS limited engagement opportunities
Participatory creation
• The advent of more tools allows for the co-creation, sharing and peer review of learning episodes. Greater interoperability has allowed this to be more easily mediated
Walled garden approach
• The LMS was the central repository for learning, and pathways inside the LMS led students to different elements in the one garden
Open garden approach
• The LMS still has a role, but now so do many other systems that can interoperate. Pathways lead between the different gardens, providing far more variety
Antecedents and descendants in a changing digital ecology
• AI-Enabled Applications for Predictive,
Personal Learning
• Generative AI
• Blurring the Boundaries between
Learning Modalities
• HyFlex
• Microcredentials
• Supporting Students’ Sense of
Belonging and Connectedness
Key technologies and practices
• ChatGPT, Bing Chat and Google Bard (available for free) are useful for generating content and ideas to help get words on the page.
• Misinformation is a serious risk. So, we teach students how to use these tools, how to understand their limitations, and how to fact-check their outputs.
• Just as we teach students how to read critically, how to evaluate and corroborate evidence, and how to distinguish good arguments from bad and recognise rubbish.
• But, we don’t want to put Gen AI at the centre of education.
Teaching students how to use Generative AI
• AI can provide instant feedback on students’ writing, simplify complex information, and scaffold information on a specific task
• It’s helpful for neurodivergent students and students with English as a second language
• Students struggling to understand concepts can ask AI to provide examples to aid their understanding.
• The use of AI by students pivots them from being consumers of learning materials to creators of new learning resources.
Students can become creators (productive)
• Half are experiencing pockets of change in assessment practice
• 34% report some good examples starting to emerge
• Only 3 felt that it was too early to tell
• Just one (1) institution said ‘we are all over this’
Are you seeing any serious reconsideration of
assessment approaches yet?
What is stopping your academic staff from fully engaging
with this?
• 34% are slowly moving away from remote exams
• 11 institutions are continuing for the time being
• As a natural response, some (5) are moving back to face-to-face exams
• But these were metropolitan unis
• Those responding ’other’ have already moved away and are encouraging academic staff to use more authentic forms of assessment.
Will you be continuing with remote exams?
10 CDU priorities for assessment
• Reduce emphasis on final high-stakes exams
• Reduce the propensity for widespread quizzes for important assessments
• Look for opportunities for course-wide assessments (alignment across units)
• Weight assessment items in line with the level of learning (low weight for low-stakes)
• Increase emphasis on formative assessment feedback ‘for learning’ (feedback literacy)
• Design active, collaborative, authentic assessment
• Increase the use of WIL, group and peer assessment
• Assessment for inclusion
• Increase the use of multimodal assessments
• Reduce essays and long-form text that can be easily cheated
• The onus is on us to design assessments relevant to students’ future careers. That’s the value proposition
• If I had 6 people working for me, and I knew that Gen AI could make them more productive, would I be asking them to use it?
• By contextualising assessment tasks and linking them to students’ development, we encourage learners to engage with their future
• Let’s not fool ourselves: students will turn to AI when things start getting tough, so additional strategies are needed
Continue to design ‘authentic’ assessments
• A key concern is that students won’t actually understand what they have submitted.
• So, we balance written work with other kinds of assessments, such as in-person oral presentations (vivas) that cannot be produced by AI.
• Supplementing essays with other assessments need not come at the expense of good assessment design.
• There are good reasons to vary written work with other forms of assessment; e.g. oral communication skills are enormously valuable across a range of professions, and yet are often undervalued in HE.
• This does not mean that writing skills are poised to become less important as AI tools become more prominent.
Balance essays with other types of assessment
• Design assignments where students can demonstrate their understanding (independent of written work), not just recite knowledge.
• Pen-and-paper exams are back for some, whilst others are shifting away from exams to more ‘authentic’ assessments (assessments that evaluate real-world skills that students will employ in the workplace).
• Few workplaces require their employees to write detailed discussions of difficult questions by hand, without a computer.
• Vivas require students to understand and communicate the ideas they’ve defended in their essay (regardless of whether the essay was outsourced).
• Assignments that ask for deep research on recent developments are relatively AI-resistant. But you need to review this continuously.
Develop AI-resistant assessments
• This model scored 76.5% on the multiple-choice section of the Bar exam, up from 73.0% with Claude 1.3.
• When compared to college students applying to graduate school, Claude 2 scores above the 90th percentile on the GRE reading and writing exams, and similarly to the median applicant on quantitative reasoning.
https://www.anthropic.com/index/claude-2
Ask students to:
• Include their personal experience or perspectives in their writing.
• Analyse a class discussion.
• Untangle complex instructions that involve long texts that do not fit a typical ChatGPT prompt, or
• Write about a very recent event (in the last week or so) that may not yet be in ChatGPT’s training data.
• But test it first
Set personalised, complex or topical tasks
• Provide students with the readings that they must use
• Though note that ChatGPT Plus can analyse PDFs
• Source these readings from University databases (or closed journals)
• Have students submit a Word version, or use a common drive with version history enabled
• Ask them to reflect on what they learned from doing an activity (300–500 words max)
For Essays
• ECU101 (David Waters)
• Sem 1
• 171 students
• Minimising the risk of ChatGPT
• David used a specific article from 2022 which students needed to answer questions about.
• Two new tools: https://scite.ai & https://elicit.com
• Ask simple questions and get reliable answers from the full texts of millions of research articles
• Effectively use information from research articles to support your research tasks
• Automate time-consuming research tasks like summarising papers, extracting data, and synthesising your findings.
Be very aware
• Present questions using images, figures, or charts as auxiliary information, with a nonspecific question as the stem.
• For example, ‘which section of the figure below demonstrates. . . ?’
• Present auxiliary visuals as hotspot questions, where the student must click on an area of the image to indicate the correct answer.
• For example, ‘select the area on the image which shows . . .’
• Present questions using a series of images, or a video, accompanied by conditional logic branching questions.
• For example, ‘at this point in the interaction, which question should you ask the customer?’
Multiple choice?
• Present questions that require the student to apply a concept or principle to an up-to-date scenario or case study.
• For example, ‘the Voice legislation to hold a referendum went through parliament a while back, but there were those who spoke against it. What are the implications of the dissenting voices?’
• Present the answer, then get students to choose the appropriate question.
• Peer assessments
• In this case, get students to use ChatGPT, then have them check the validity of each other’s work
Cont…
• ChatGPT is not good at providing appropriate sources and quotations (though some other tools can).
• So, engage students in writing practices focused on correcting factual errors and locating accurate data sources (a compare-and-contrast activity)
• Ask students to cite and reference the work of others accurately and properly by using in-text citations or including a bibliography.
• Ask them to critique a piece of writing generated by ChatGPT by analysing and interpreting how it conveys an idea and assessing its strengths and weaknesses in terms of readability, credibility, comprehensiveness, accuracy and so on.
Take advantage of AI’s shortcomings
• Just imagine if our students were being productive
• That’s what their potential employers are telling us they want
Generated by Dall-E
• Bearman, M., Ajjawi, R., Boud, D., Tai, J. & Dawson, P. (2023). CRADLE Suggests… assessment and genAI. Centre for Research in
Assessment and Digital Learning, Deakin University, Australia. doi:10.6084/m9.figshare.22494178 Available from:
https://figshare.com/articles/online_resource/CRADLE_Suggests_Assessment_and_genAI/22494178
• Dastin, J., & Tong, A. (2023). Google, one of AI’s biggest backers, warns own staff about chatbots. Thomson Reuters Products.
16 June. https://www.reuters.com/technology/google-one-ais-biggest-backers-warns-own-staff-about-chatbots-2023-06-15/
• Gonsalves, C. (2023) On ChatGPT: what promise remains for multiple choice assessment? Journal of Learning Development in
Higher Education. Issue 27 (April). ISSN: 1759-667X
• Koplin, J., Sparrow, R., Hatherley, J., Rivers, N.,& Williams, I. (2023). Tailoring university assessment in the age of ChatGPT.
Monash University. 15 May. Available from: https://lens.monash.edu/@politics-society/2023/05/15/1385696/tailoring-
university-assessment-in-the-age-of-chatgpt
• Lee, J. (2023) Effective assessment practices for a ChatGPT-enabled world. Times Higher Education. 8 May,
https://www.timeshighereducation.com/campus/effective-assessment-practices-chatgptenabled-world
• Liu, D., & Bridgeman, A. (2023). How can I update assessments to deal with ChatGPT and other generative AI? Assessment,
Educational integrity, Teaching tips. University of Sydney. 23 January. Available from: https://educational-
innovation.sydney.edu.au/teaching@sydney/how-can-i-update-assessments-to-deal-with-chatgpt-and-other-generative-ai/
• Lodge, J., Howard, S., & Broadbent, J. (2023). Assessment redesign for generative AI: A taxonomy of options and their viability.
2 May. Available from: https://www.linkedin.com/pulse/assessment-redesign-generative-ai-taxonomy-options-viability-lodge/
• Lodge, J. (2023). Assessing learning processes instead of artefacts won’t be easy. 5 June. Available from:
https://www.linkedin.com/pulse/assessing-learning-processes-instead-artefacts-wont-easy-lodge/
• Sayers D. (2023). A simple hack to ChatGPT-proof assignments using Google Drive. Times Higher Education. 23 May.
https://www.timeshighereducation.com/campus/simple-hack-chatgptproof-assignments-using-google-drive
References