This document outlines the experience and background of a group new to learning analytics, the initial questions it explored for a learning analytics project (what systems or software exist, how to reduce time taken, how to provide personalised options), and its subsequent work in areas such as data assurance, business intelligence strategy, and performance support.
1. Context
No previous experience of learning analytics
Experience includes:
• using structured data fields, documents and data assurance
• working with European and international partners on standards (some group members)
• experience of predictive modelling in drug discovery (some group members)
• experience of clinical decision making (some group members)
2. Where to start?
• What systems or software exist?
• Detecting patterns – does the information exist? I’ve done all these clicks – are they patterns?
• If system feedback – “btw, just to let you know…” – would be useful: you originally knew at the time but have filed it away – can it retrieve for you?
• Do we have any learning data? What is the value in this data? Exploration of gathering data – if we present a paper and ask for funding but senior management say no?
• Different ways of learning – some people like to read first, some prefer trial and error rather than reading, some read part then do some training, some don’t want to do training – don’t want to be embarrassed by making mistakes in front of others
• How to reduce the time taken – personalised options
• How do we recommend, as in good clinical practice: double checking (had been removed with anonymised m…), assessment and each person taking responsibility for each step? How much is f2f?
• Is it learning or performance data?
5. Data Assurance
• Input into creation of data audit categories for PIL, SPC based on research of error types and previous survey analytics
• Reviewed error types against internal helpdesk data and anecdotal feedback from agency system champions network
• Information sorting of anecdotal feedback about help when reviewing and processing agency data
6. Business Intelligence Strategy
• Input into requirements gathering, prioritising areas that could/couldn’t be covered by BI (e.g. types of text analysis, clicks, ratings, feedback, visualisation)
• Explored alternative options for visualising analytics

Performance Support
• Input into requirements gathering
• Explored creation & comparison of variables and field types to analyse whether something is right or wrong
• Review of language used in agency discussions and surveys
7. Data Sources:
• Survey Monkey over a 5-year period (CSV – text – text analysis tools)
  – Training evaluations: 439 responses
  – Systems Feedback survey: 313 responses
  – Performance Support survey: 10 responses
• PS interviews – 55 pages, 26,949 words
• Tools: tagcrowd; onlineutility
Conclusion: insufficient for identification of MHRA language trends
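The text-analysis step above (feeding free-text survey exports through word-frequency tools such as tagcrowd) can be sketched in a few lines of Python. This is a minimal illustration only, not the group's actual pipeline; the sample responses and the stopword list are invented for the example, and real CSV exports would be read with `csv.DictReader`.

```python
import re
from collections import Counter

# Hypothetical stand-ins for free-text survey responses exported as CSV;
# in practice these would be loaded with csv.DictReader from the export file.
responses = [
    "The training was useful but the system feedback was slow",
    "More training needed; system feedback unclear",
    "Useful training, slow system",
]

# A small illustrative stopword list; frequency tools ship much larger ones.
STOPWORDS = {"the", "was", "but", "more", "a", "and"}

def term_frequencies(texts):
    """Lower-case and tokenise each response, counting words minus stopwords."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z]+", text.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

freqs = term_frequencies(responses)
print(freqs.most_common(3))  # top terms across the sample responses
```

With only a handful of responses the counts are dominated by noise, which mirrors the conclusion above: small samples (e.g. 10 Performance Support responses) are insufficient for identifying language trends.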
8. What worked well
• Feeding into multiple projects at the same time, avoiding duplication and/or silos
• Time to raise questions and discuss openly in a group
• Variables example to understand the process

What could be improved
• Schedules challenging for f2f meetings (online tools…)
• Access to data sources
• Access to analytic tools – text analysis process slow
9. Where next
• Learning analytics group unfolding into wider cross-agency Learning Technologies network (session – 26/04/13)
• Representation in final stages of performance support procurement (analytic capabilities)
• Areas for future discussion:
  – data literacy compared to making things easier for users
  – ethics, including identification of people from anonymised datasets
  – where/how we record anything from a pre-learning discussion (e.g. “I think I’m going to be able to do x, x & x afterwards”) and how
• BI timescales & capabilities; ongoing options to explore other learning analytics tools separately