A presentation about using ChatGPT and other Large Language Model-based tools as a student. Specifically targeted at students of the Hochschule Magdeburg-Stendal.
ChatGPT for academics - a guide to using large language models safely in your studies
1. Large Language Models
and How to Tame Them.
Using ChatGPT in academic settings.
by David Döring
AI.Analytics
h2.de/ZAKKI
2. Poll
● Please answer the questions.
● What do you think of when you hear "ChatGPT"?
● Have you used ChatGPT (or related tools) for studying or academic work?
● How good were the results, in your opinion?
● Please use the Q&A tab to collect upcoming questions.
● My Link
3. Outline
● What is this about?
● Technobabble
● Application Recommendations
● Risks and Side Effects
● Discussion and Questions
7. LLM – Large Language Model
● AI models that produce natural language
● AI models are purely statistical functions
● LLMs use billions of "parameters" - hence "large"
● LLMs require more computing power than a single computer can handle
● LLMs are usually cloud services
14. GPT: Pretrained
● Pretrained on a huge amount of text
● Fine-tuned for chat
● Improved through a rating system
● Adaptable to new applications
15. Fine-Tuning
OpenAI employees created or gathered many example dialogs to "teach" human help-desk communication.
Please explain the transformer architecture.
Yes, I could.
Could you explain this in simple language?
The transformer architecture is a deep learning model that has gained significant popularity and success in various natural language processing (NLP) tasks. It was introduced in the paper "Attention Is All You Need" by Vaswani et al., published in 2017. The core idea behind the transformer is the use of self-attention mechanisms to weigh the importance of different words in a sentence, enabling the model to capture relationships and dependencies regardless of their positions within the input sequence.
16. Reinforcement Learning
Please explain the transformer architecture.
1: ... 2: ... 3: ...
4: ... 5: ... 6: ...
Which answer do humans prefer?
Employees and users rank and rate answers.
17. GPT: Transformer
"Attention Is All You Need" - 2017, Google Brain/Research
Attention layer:
f(x, y) = (M × y) × x
Self-attention:
f(x) = (M × x) × x
Causal self-attention
Image sources:
https://www.tensorflow.org/text/tutorials/transformer
https://arxiv.org/abs/1706.03762
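The slide's simplified formulas can be expanded into the scaled dot-product self-attention mechanism from "Attention Is All You Need". A minimal NumPy sketch, with a causal mask so each token only attends to earlier tokens (matrix names and sizes here are illustrative, not taken from the slides):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token sequence x.

    x:           (seq_len, d_model) token embeddings
    Wq, Wk, Wv:  (d_model, d_head) learned projection matrices
    A causal mask stops each position from attending to later tokens.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])              # (seq_len, seq_len)
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(mask, -np.inf, scores)             # hide "future" tokens
    weights = softmax(scores, axis=-1)                   # each row sums to 1
    return weights @ v                                   # weighted mix of values

# Toy example: 4 tokens, 8-dim embeddings, 4-dim attention head
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = causal_self_attention(x, *(rng.normal(size=(8, 4)) for _ in range(3)))
print(out.shape)  # (4, 4): one output vector per input token
```

The attention weights are what lets the model relate words regardless of their position in the sequence, as described on slide 15.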
19. GPT – What does it do?
● handles text as tokens
● predicts the most likely next token
● requires context
● generates text that looks and feels correct
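"Predict the most likely next token" can be illustrated with a deliberately tiny stand-in model: a bigram counter over whitespace-split tokens. Real LLMs use subword tokens and billions of parameters; this sketch only shows the underlying statistical idea:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count which token most often follows each token."""
    counts = defaultdict(Counter)
    tokens = text.split()
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    """Return the statistically most likely next token, or None if unseen."""
    if token not in counts:
        return None
    return counts[token].most_common(1)[0][0]

corpus = "the model predicts the next token and the next token again"
model = train_bigram(corpus)
print(predict_next(model, "the"))  # "next" follows "the" twice, "model" once
```

Note what is missing: no fact database, no internet access, no checking - just frequency statistics, which is why the following slide's "does NOT do" list holds.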
20. GPT – What does it NOT do?
● use the internet for correct answers
● fact-check
● obtain up-to-date information
● actually perform given tasks like
  – "calculate"
  – "research"
  – "imagine"
21. How can we use it anyway?
That doesn't sound useful for academics. So...
22. GPT as a product is an API
…
API: Application Programming Interface
Examples: Bing Chat, Microsoft Copilot
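Products like Bing Chat are built on top of such an API: they send HTTP requests containing a list of role-tagged messages. A sketch of assembling a chat-completion-style request body (the model name is illustrative - check your provider's API documentation, and never hard-code API keys):

```python
import json

def build_chat_request(system_prompt, user_prompt, model="gpt-3.5-turbo"):
    """Assemble a request body in the common chat-completion message format."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},  # sets the assistant's behaviour
            {"role": "user", "content": user_prompt},      # the actual question
        ],
    }

body = build_chat_request(
    "You are a tutor who cites sources.",
    "Explain the transformer architecture in simple language.",
)
print(json.dumps(body, indent=2))
```

The system message is where tools bake in their own instructions - one reason different products on the same model behave so differently.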
23. Use the correct tool for the job!
● ChatGPT on chat.openai.com is likely not the tool you need
● Many free tools use GPT-3.5
● Many paid tools use GPT-4.0
● Use tools that can reference sources.
● Prefer tools that protect your data rights.
24. What can we use LLMs for?
● Getting started / beating procrastination
● Research and summaries
● Gathering ideas
● Programming & formatting
● Improving text
● Text to presentation
● Review
● Roleplay (e.g. exam situations)
● Translation
26. What are we allowed to use?
● The use of AI-generated text in your work is permitted.
● Use the official Declaration of Originality from H2 (EN; DE):
● mark AI-generated or enhanced content
● reference it with "supported by …"
● include a section "overview of resources used"
● state how you used these tools, including date and settings
● review AI-generated content.
● Mistakes made by using LLMs are your mistakes!
27. Known Problems
Problems:
● Bias and stereotypes
● Lack of education / loss of basic skills
● "Hallucinations" - confidently incorrect
● Traceability unclear
● Quotability complicated
● Data protection
● Inequality / financial burden
● Lack of transparency
● Rights and regulations unclear
Countermeasures:
● Add more of your own context.
● Ask for reasoning. Fact-check.
● Use tools with added context.
● Start quick but spend more time.
● Check the FAQs of apps.
● Spend money for better results.
● Support open source.
28. Better Results
Repeat Stuff:
● GPT-3.5 context size: 4,097 tokens - a "short memory"
● Important knowledge needs to be repeated
Improve your Prompt:
● Start with known good prompts
● GitHub "Awesome ChatGPT Prompts"
● Roleplay: start by describing its role.
Add Knowledge and Context:
● Provide information from citable sources.
● Use tools that automate that: Perplexity.AI, ChatPDF.com, Bing Chat, Bard, …
Work Interactively:
● Repeat and refine requests
● Use tools that allow reworking and editing: Notion.so, Microsoft Office 365, …
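The limited context window explains why important knowledge must be repeated: chat tools silently drop the oldest messages once the budget is full. A sketch of that trimming, approximating tokens by word count (real tools use a proper tokenizer such as OpenAI's tiktoken, and the budget of 4,097 tokens comes from the slide):

```python
def fit_to_context(messages, budget_tokens):
    """Keep the most recent messages whose approximate token count fits the budget.

    messages: list of strings, oldest first.
    Token count is approximated as the number of whitespace-separated words.
    """
    kept, used = [], 0
    for msg in reversed(messages):       # walk newest to oldest
        cost = len(msg.split())
        if used + cost > budget_tokens:
            break                        # older messages fall out of "memory"
        kept.append(msg)
        used += cost
    return list(reversed(kept))          # restore chronological order

history = [
    "My thesis topic is renewable energy storage.",  # 7 words - will be dropped
    "Please use formal academic English.",           # 5 words
    "Summarize chapter two for me.",                 # 5 words
]
print(fit_to_context(history, budget_tokens=10))     # oldest message is gone
```

Here the thesis topic silently disappears - which is exactly why the slide advises repeating important knowledge in later prompts.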