Comparing Studiosity with other forms of Academic Support – An ‘ecosystem’ of student support services.
Jennifer Lawrence, Program Director, University of New England
Students First 2020 - Usage and impact of academic support
1. Usage and impact of academic support
Jennifer Lawrence
Program Director Academic Success
@JennyALawrence
https://www.linkedin.com/in/jennyalawrence/
Jennifer.Lawrence@une.edu.au
4. Academic Support @ UNE
[Diagram: the academic support services available at UNE]
• External (Studiosity)
• Internal (Academic Skills Office)
• Peer Tutoring
• RESTART & PREP
• First Year Advisors
• Tutors & Unit Coordinators
5. Academic Support @ UNE
Data:
• January 2017 – December 2019 (note this splits UNE's summer Trimester 3)
• Just UG students
• Sources on usage:
– Studiosity dashboard exports
– Academic Skills Office booking system and SRM emails
– Vygo peer tutoring pilot
– RESTART & PREP SRM emails
~35,000 students over this period
~112,000 student trimester enrolments
8. Student use
[Chart: factors influencing students' choice of support service, each rated from "Not important" to "Really important" (0–100%)]
• Awareness of the service
• The timing it was available
• Previous experience with the tool (good or bad)
• Whether friends had accessed it in the past
• Whether the student thought it was easy to use
• Whether it was provided by internal UNE staff or an external service
• Whether it was focused on subject-specific material
• Whether it was recommended by a unit coordinator
Data source: internal UNE student survey conducted in late 2019 (N = 781; ~5%)
9. Student use
• Students are pragmatic and habitual in their choices
– Influenced by practicalities, not friends
– Reluctant to change their habits
• Different types of students seek help from different
sources
• Expectation setting for both staff and students is key to
satisfaction, but challenging
• Opposite to findings of other studies (with on-campus students), which show that students 'shop around' academic support services and are deeply influenced by friend recommendations
11. Impact of support – Success
Stats
• One-way ANOVA
P = 0.007
N = 112,465
Conclusions
• We’re very sure
that students who
seek help do better
Analysed using Excel & SPSS
12. Impact of different types of support – Success
Stats
• One-way ANOVA
Lowest P ~0.8
N = 112,465
Conclusions
• Different types of support show mixed impacts
Analysed using Excel & SPSS
13. Impact of support - Retention
Stats
• Binomial Test
P = 0.03
N = 79,677
Conclusions
• We're reasonably sure that students who seek help are more likely to persist with their studies
[Chart: student progression between trimesters, 2017T1–2018T3 (40–90%), comparing retention after 1 trimester and after 1 or 2 trimesters, split by help-seeking (Y) vs not (N)]
Analysed using Excel & SPSS
14. The BIG question
Correlation or Causation?
Are we just helping the students who would succeed anyway, or do these services make a difference?
15. Comparison across Trimesters
Stats
• One-way ANOVA
P = 0.02
N = 12,459
Conclusions
• Support makes a difference to student success outcomes
Analysed using Excel & SPSS
S – Supported NS – Not Supported
16. Summarising conclusions
1. Different students seek academic support from different
places for pragmatic reasons, and tend to stick to that
choice over time.
2. Students who access academic support, regardless of
what type, do better and are more likely to be retained.
3. Academic support services make a measurable
difference to student outcomes – they don’t just support
students who ‘would succeed anyway’.
Editor's Notes
Mid-sized university, but highest proportion of both online and part-time students of any institution (90% of students study at least one unit online)
Also the most mature student cohort in the country (Utas gets close – more seniors, but also more school leavers, CQU/CSU/SCU also close)
Multiple avenues for academic support at UNE
I manage the Internal ASO team, our Peer tutoring endeavours, and the external Studiosity contract personally. I also have data on our RESTART and PREP interactions. However, student interactions with First Year Advisors and tutors/unit coordinators are not collected consistently so I don’t have data on these.
Graph showing relative volumes of usage
We started to wonder what proportion of these students were using more than one service – as in, which of these are overlapping, and to what extent?
A greater proportion of students are seeking help from these services each year, but they tend to pick one or the other and not use both. This is where it got really interesting!
Demographic analysis showed that:
International students were more likely to seek help from ASO Appointments and Studiosity Writing Feedback, but Domestic students were most likely to use Studiosity Writing Feedback.
All modes of students were most likely to use Studiosity Writing Feedback, but On-Campus students were much more likely than Online students to seek help from the ASO.
Non-mature students (below 25) were more likely to seek help than mature students overall. Mature students were more likely than non-mature students to see the ASO, but both were more likely to use Studiosity than ASO.
Commencing students much more likely to seek help than continuing students. That is, students are most likely to seek help in their first year, then less likely after that. This last one is likely a result of advertising support services specifically to commencing students during orientation, so any new service takes several years to grow with the cohort.
Analysis showed that only a very small proportion of these students switch services within or between trimesters – that is, students pick a support service and tend to stick with it rather than shop around.
Survey results
So we asked them!
Overall
What did we learn? What factors influenced this?
Pragmatism – logistics, timing, perceived ease of use. Don't care about friends' recommendations.
Broad impact on success of seeking help, using ANOVA (Analysis of variance). For those interested in statistics I’ve included the technical terminology, but for those who aren’t I’ll explain a little further.
The ANOVA test compares the mean (average) marks across the different groups of students (those who sought help from each service, those who didn’t) and tells us whether there’s a significant difference between them. In this case I looked at over 100,000 student trimester enrolments (this is our N score, or the number of cases we’re examining – a larger sample means we can have greater confidence in the results, and because we’re looking at all undergraduate students at UNE over 3 years this is a very large sample set), and their average mark across their units in those trimesters. In this case we can see what appears to be a difference visually, but the ANOVA tells us whether this is truly a difference or whether the variation could be nothing more than a coincidence. A ‘P’ value of less than 0.05 is considered significant (that is, there’s a >95% chance the difference in grades is because of help seeking), and here we see a value much lower than that. It’s worth being aware, though, that the ANOVA test is very generalised – it tells us whether there’s a needle in this haystack but won’t tell us where, or what particular factors contribute to that difference.
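For those who want to see the mechanics, the test described above can be sketched in a few lines of Python. This is an illustrative reconstruction only – the marks below are invented, the function is the textbook one-way ANOVA F statistic, and the actual analysis was run in Excel and SPSS over the full N = 112,465 enrolments.

```python
# Illustrative sketch only: invented marks, not UNE data.
def one_way_anova_f(*groups):
    """Textbook one-way ANOVA F statistic:
    between-group mean square / within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    # between-group sum of squares, weighted by group size
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups)
    # within-group sum of squares
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# hypothetical average marks for two groups of student enrolments
supported = [68, 72, 75, 61, 70, 66, 74, 69]
not_supported = [55, 62, 58, 65, 60, 57, 63, 59]

f = one_way_anova_f(supported, not_supported)
print(f"F = {f:.2f}")
```

Looking the F value up against the F distribution with the matching degrees of freedom (or handing the same groups to SPSS, or to a library routine such as scipy.stats.f_oneway) gives the P value; as above, a P below 0.05 is treated as significant.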
Impact on success/grades across different services
This graph shows the same data, but instead of categorising it through time (showing the different trimesters) I've categorised it by the type of support services students accessed. For example, a small number of students accessed all available academic support – and all of them passed. Highlighted in blue for comparison is the group who didn't seek support from these services. You can see that some accessed only one type of support, or both Studiosity services, or the writing feedback via both Studiosity and ASO. Looking across this graph you can see a high degree of variability – in the previous graph it was pretty clear even visually that the group who got support did better than the group that didn't, but here we're seeing huge variation. What this means is that whilst we know students who get support do better than those who don't, there's not one particular type of support that shows clear benefits over any of the others, with the possible exception of saying that students who seek support from several different services (the columns on the far left) tend to have a higher pass rate than those who only use one service, or who use none. And that visual check is borne out in the statistical analysis – just as for the previous model, the P test tells us whether there is statistical significance to these results, and the P test results for each category here were quite high (the lowest category was in the 0.8 range, well above our desired 0.05), indicating there's nothing particularly significant there.
The next step from this analysis would be to examine whether particular types of students (perhaps by demographic factors) get better results from particular types of support – perhaps if you're a mature, FIF student studying business, the writing support offered by Studiosity will be more likely to lead to success than booking an appointment with the ASO. I haven't yet gone that far, but I hope to, so we can tailor our recommendations to students.
Broad impact on retention of seeking help (regardless of service)
For each student enrolled in a particular trimester for study I checked whether they were still enrolled the next trimester, or the one after that (to account for students who routinely don’t study over Trimester 3 during summer), and compared this with their help seeking. Rather than looking at unit level withdrawals prior to or after census date, this is looking at whether students progress and persist with their studies in the trimester after they sought help. This is likely related to success (a student who passes a prerequisite subject is more able to progress to the next trimester, regardless of whether they want to persist). Note that the data for this examines students who studied from 2017T1 through to 2018T3, and their progression on into 2019T1 and 2019T2. So this is a smaller sample set (N is lower), but looks at the same period of time. The P value for this test, looking at persistence on into 1 or 2 trimesters into the future, was 0.03 – lower than our usual 0.05 threshold, so enough to be significant, but not to the same degree that we saw with success impacts. So we’re reasonably sure there’s an impact that isn’t just coincidence, whereas with the tests looking at grades we were very very sure.
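The binomial test mentioned above can be sketched with Python's standard library alone. The rates and counts below are invented placeholders – the real test was run in Excel and SPSS over N = 79,677 – but the logic is the same: given a baseline persistence rate, how surprising is the rate observed among help-seekers?

```python
# Illustrative sketch only: invented counts, not UNE data.
from math import comb

def binom_upper_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): the one-sided P value for
    seeing at least k 'successes' if the true rate were p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

baseline_rate = 0.70   # hypothetical persistence rate across all students
n_supported = 200      # hypothetical number of help-seeking students
k_persisted = 155      # hypothetical number of them who persisted (77.5%)

p_value = binom_upper_tail(k_persisted, n_supported, baseline_rate)
print(f"one-sided P = {p_value:.4f}")
```

With these invented numbers the help-seekers persist at a rate well above baseline, and the P value falls below the 0.05 threshold used throughout the talk.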
So then I ran the same test I did for the success impact study – comparing the different types of support to see if there were particular patterns that led to retention. And I got the same result – I won’t show this chart as it’s quite a messy one, but suffice it to say that it was very similar to what we saw with student success. The numbers were extremely variable, in both a positive and negative direction, with no statistically significant connections. Essentially, there was no clear story there except the overarching narrative we’re seeing that support, regardless of the type of support, leads to better retention and success. So we’re very sure that students who seek help are more likely to persist with their studies, but it doesn’t matter much what kind of support. Again, there may be variation based on the type of students seeking different types of support, but I haven’t gone that far down the rabbit hole yet.
This has proven difficult to ascertain at UNE – some previous studies at other institutions have used ATAR or entry scores to make this comparison, but most UNE students are mature and enter the university via different mechanisms. I considered looking at just the students who entered with ATAR and comparing those who sought support with those who didn’t, but at that point it started getting down to a very small sample size (N of less than 5000) which didn’t give me the same degree of certainty I’d become used to with the previous studies. It also occurred to me that I’d be counting out UNE’s most significant cohort, and potentially end up using whatever findings I made to make decisions about support for that majority cohort that were based on data from the minority cohort.
In the end, I started comparing the same students across different trimesters of their study – if they accessed help in one trimester but not in the next (or vice versa) did it have an impact?
Compare the same students in different semesters (ones who started with help, then abandoned it; and those who started with no help then found it later)
For this test, I looked particularly at 2019 Trimester 1 and 2, just for ease of data manipulation. By this stage I was running reasonably complex formulae to get my data in the right format, and with over 100,000 entries that was beginning to be difficult to manage, so I limited my data set to take a closer look. 2019 was the year we had the highest proportion of students accessing our support services, as by then more students were aware of them, so this gave me a more balanced data set.
When I ran the numbers you can see some variation just in the chart here – the grades of students who went from accessing support to not accessing support (the first box plot) skewed more negative, whereas the ones who went from no support to accessing support don't show that level of skew. The ones who stuck with their pattern of support (particularly no support, which was by far the largest group) were more balanced – both positive and negative. So I ran the ANOVA check to test for significance. And they were significant! Most of these had such a low P value that it didn't register at my settings (4 decimals), and a couple came in at 0.02. This means that in each of these categories we see a very different distribution of grade changes from trimester to trimester – the change in the pattern of student behaviour in terms of accessing support has an impact in almost all directions (be it positive or negative). The only one that was higher was the comparison between those who consistently accessed support and those who changed from no support to accessing support. This was a bit of a curiosity to me – in every other comparison we see significant differences, but for students who started off with no support then later started accessing it, we don't see a difference from students who were accessing support all along. When I thought about this, it essentially means the students who were not accessing support to start with but then do are the ones who 'would succeed anyway', and the students who accessed support all along are essentially using that support to 'catch up'.
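The data-wrangling step behind this comparison can be sketched as follows. The records and field layout are invented assumptions for illustration – the real formulae were built in Excel – but the idea is the same: pair each student's two trimesters, then bucket their grade change by support pattern, ready for the ANOVA.

```python
# Illustrative sketch only: invented records, hypothetical field layout.
records = [
    # (student_id, t1_mark, t1_supported, t2_mark, t2_supported)
    ("s1", 70, True,  58, False),
    ("s2", 66, True,  55, False),
    ("s3", 52, False, 64, True),
    ("s4", 60, False, 61, False),
    ("s5", 68, True,  69, True),
]

groups = {}  # support pattern -> list of mark changes (T2 - T1)
for sid, m1, sup1, m2, sup2 in records:
    pattern = f"{'S' if sup1 else 'NS'}->{'S' if sup2 else 'NS'}"
    groups.setdefault(pattern, []).append(m2 - m1)

for pattern, deltas in sorted(groups.items()):
    print(pattern, sum(deltas) / len(deltas))
```

An ANOVA run across these per-pattern distributions of grade change (as in the talk) then tests whether a change in help-seeking behaviour relates to a change in marks.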
So what does all this mean? This is the beginning of establishing causation. Looking at the same student over time, whether or not they access support has a significant impact on their success. This means the support services are changing things for these students – the students accessing support see a different outcome BECAUSE of that support. It’s not that the students who are prone to seeking help will succeed anyway, the support can change things.
So what does this mean?
Firstly, we're not barking up the wrong tree in spending $000's on support services – they do make the differences we hope they will in terms of supporting student success and retention.
Secondly, there are implications here for how we evaluate such services and make decisions about them – rather than comparing an internal academic skills support office to external support services such as Studiosity, say, we should see the whole suite of support services as an ecosystem. We have a diverse student body, so we need a diverse set of tools to support them. Our focus in managing such systems should be on making sure we have the right mix and balance of services, that there are no gaps between what the whole ecosystem provides and what our students need, and that students are well informed of what each service offers so they can make effective decisions about how and when to seek help.