1. Students’ Expectations of
Learning Analytics
Alexander Whitelock-Wainwright, Dragan Gašević, and Ricardo Tejeiro
A.Wainwright@Liverpool.ac.uk
2. Aims
• Highlight the importance of stakeholder perspectives in learning
analytics services.
• Develop an instrument to measure students’ expectations of learning
analytics services.
3. Stakeholder Engagement
• Early engagement of stakeholders reduces likelihood of service
dissatisfaction (Brown, Venkatesh, and Goyal, 2014).
• Stakeholders should be engaged in learning analytics service
implementations (Ferguson, Macfadyen, Clow, Tynan, Alexander, and
Dawson, 2014).
• Limited instances of engagement (Tsai and Gašević, 2017).
4. Ideological Gap
• Learning analytics policies driven by managerial beliefs (Sclater,
2016).
• Not necessarily reflective of what students would expect.
• Discrepancies between beliefs (Ng and Forbes, 2009).
5. Student Beliefs
• Non-validated psychometric instrument (Arnold and Sclater, 2017).
• Dashboard features (Schumacher and Ifenthaler, 2017).
6. Instrument Development
• Student expectations of learning analytics services.
• Theoretical framework of expectations.
• Identified themes:
• Ethical and Privacy Expectations
• Agency Expectations
• Intervention Expectations
• Meaningfulness Expectations
7. Expectations
• Important feature of human cognition (Roese and Sherman, 2007).
• Framed as beliefs about the future (Olson and Dover, 1976).
• Ideal and predicted expectations (Thompson and Suñol, 1995).
8. Ethical and Privacy Expectations
• Students must be fully informed (Drachsler and Greller, 2016).
• Consent (Prinsloo and Slade, 2015; Sclater, 2016).
• Student perspectives (Slade and Prinsloo, 2014).
9. Agency Expectations
• Learning analytics to improve student support (Prinsloo and Slade,
2017).
• Students are agents in their own learning (Winne and Hadwin, 2012).
• Student-centred learning analytics (Kruse and Pongsajapan, 2012).
10. Intervention Expectations
• Identify at-risk students (Campbell, DeBlois, and Oblinger, 2007).
• Improve the student-teacher relationship (Liu, Bartimote-Aufflick,
Pardo, and Bridgeman, 2017).
• Dashboard expectations (Schumacher and Ifenthaler, 2017).
11. Meaningfulness Expectations
• Feedback should be relevant to students’ learning.
• Perceived usefulness drives acceptance.
12. Instrument Development
• Initially created 79 items.
• Reduced to 37 items through peer review.
• Pilot test with 210 respondents (University of Edinburgh).
• 19 items retained and re-worded.
• Further roll-out with 674 respondents (University of Edinburgh).
• 12-item Student Expectations of Learning Analytics Questionnaire (SELAQ).
14. Model Validity
• 191 respondents from the University of Liverpool.
• Confirmatory Factor Analysis.
• MIMIC modelling – differential item functioning (Muthén, 1989).
The aims of our talk are to…
Highlight the importance of stakeholder perspectives in learning analytics services…
And to outline the development and validation of an instrument designed to explore student expectations of learning analytics services…
Taking what we know from information systems research… the inclusion of stakeholders in the design and implementation stages of a service… does reduce the likelihood of future dissatisfaction… as the service is reflective of what the stakeholders actually want…
In terms of learning analytics… researchers have called for higher education institutes to allow stakeholders to be involved with learning analytics service implementations…
But findings do suggest that the level of engagement from stakeholders… in learning analytics implementations has been low…
A notable example of limited student engagement has been the development of learning analytics policies… which tend to be led by what managers… researchers… and practitioners believe students want…
It is reasonable to assume that the intentions behind such policies are to improve learning performance… or to provide additional support…
Nevertheless… these may not be reflective of what students want from learning analytics…
The creation of a service which is not representative of student expectations… is an example of an ideological gap… which is a main cause of dissatisfaction…
To offset the possibility of creating an ideological gap… there is a need to explore student beliefs towards learning analytics services…
In the case of Arnold and Sclater… they sought to explore student attitudes toward the analysis and handling of educational data… the main issue here… and it has been highlighted by the authors… is that the instrument used has not been validated so their findings are questionable…
Schumacher has explored student expectations of dashboard features… this is an important step… but learning analytics is not predicated on the inclusion of this one tool… there are many aspects of learning analytics services…
Despite the issues mentioned… these examples present important steps in enabling students to have a say in learning analytics service developments…
Given the limitations of previous work… we sought to develop an instrument to explore student expectations of learning analytics services…
For the next few slides… I will…
Present the theoretical foundation on which the instrument is based on…
Then… I will discuss the four themes identified in the learning analytics literature… which guided the development of the questionnaire items…
Expectations are a fundamental part of human cognition… they influence motivation… and even the judgements we form…
They are also not too dissimilar from beliefs… where beliefs refer to a probability judgement linking an object and an attribute… expectations are distinguished only by the point in time at which this judgement is made…
In other words… expectations are framed as beliefs about the future…
But… the term expectation is general and does not allow for different types or levels of expectations…
Thompson… therefore… decomposed expectations into ideal and predicted… which refer… respectively… to what individuals hope for… and what they believe they are most likely to receive…
Conceptualising expectations in this way enables researchers to gain a better understanding of what students may desire from a learning analytics service… and also what they expect as a minimum… through segmentation procedures… we can then identify which features matter most to students…
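To make the ideal/predicted distinction concrete, here is a minimal sketch with made-up ratings: subtracting each item’s predicted score from its ideal score highlights features students hope for but do not expect to actually receive. The item names and values below are hypothetical, not SELAQ data.

```python
# Minimal sketch (hypothetical data): comparing ideal vs. predicted
# expectation ratings per item to see where what students hope for
# outstrips what they believe they will actually receive.
items = ["informed consent", "progress updates", "relevant feedback"]
ideal     = [6.5, 5.8, 6.2]   # mean rating on the ideal subscale (1-7)
predicted = [5.1, 5.6, 4.3]   # mean rating on the predicted subscale (1-7)

# Gap = ideal - predicted; a large gap flags a feature students hope
# for but do not expect the institution to deliver.
gaps = sorted(
    ((i - p, item) for item, i, p in zip(items, ideal, predicted)),
    reverse=True,
)
for gap, item in gaps:
    print(f"{item}: gap = {gap:.1f}")
```

Ranking items by this gap is one simple way a segmentation analysis could prioritise features for a learning analytics policy.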
There is a lot of literature in learning analytics that has discussed ethical and privacy issues…
In particular… the DELICATE checklist provides a series of advisory points to guide learning analytics implementations… one of which is to inform students about the procedures being undertaken… whether this is the type of data collected… the analyses carried out… or even how the findings will be used…
This then connects to the next point of consent… there has been debate concerning the extent to which students should consent to all components of learning analytics… or aspects such as interventions…
Even within this literature… these decisions are being made by researchers and practitioners… as opposed to asking students what they expect…
A good example of engaging students in this debate is presented by Slade… who found that students want universities to obtain consent before undertaking any learning analytics processes…
Together… this literature identifies a series of points that require input from students…
Agency expectations refer to learning analytics services not creating a culture of passivity in higher education…
At the end of the day… students are responsible for their own learning… they are active agents who set their own goals and strategies… it is not for learning analytics to remove the ability of students to make their own decisions…
There are occasions where institutions may offer additional support… such as providing regular updates on how students are progressing towards particular goals… but we should always be mindful of whether students expect to make their own decisions based on the analyses provided…
Learning analytics has previously been dominated by research attempting to identify at-risk students… which then leads to the implementation of early interventions intended to deter students from dropping out… although research findings suggest this is often not the case…
Beyond merely predicting dropouts… learning analytics has moved onto improving other aspects of education… for example… trying to improve student-teacher relationships by enabling tutors to better understand how students are performing and whether issues are present…
As with agency expectations… researchers are making inferences about the type of service students would like to receive in exchange for the disclosure of personal information… we don’t know what type of services students want…
There has been progress with the investigation of dashboard feature expectations… but more work is needed…
Meaningfulness expectations refer to the feedback from learning analytics services being relevant to students… in order to promote a positive change such as motivating learning…
As with information systems… perceptions of a service’s usefulness are intrinsic to its acceptance…
If the feedback provided through learning analytics services is not pedagogically meaningful to students then they will not use it…
Therefore… we have included this theme to cover student beliefs about the applicability and relevance of learning analytics feedback to their learning…
Using the identified themes… we created 79 items that were subjected to peer review…
This process led to 37 items… which were used in a pilot study of the instrument at the University of Edinburgh…
210 students completed the questionnaire and provided qualitative feedback on each item…
The results were then analysed using factor analysis… which left 19 items… the qualitative feedback also identified some issues with the wording so additional changes were made…
These 19 items were then used in a final roll-out to all students at the University of Edinburgh…
We received 674 responses… which we again analysed using factor analysis…
This analysis did identify problems with certain items… such as cross-loadings and multicollinearity… these items were removed… which left us with the final 12-item Student Expectations of Learning Analytics Questionnaire…
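As an illustration of one of the screening checks just mentioned, the sketch below flags cross-loading items from a factor loading matrix. The loadings, item names, and the 0.32 cut-off (a common rule of thumb for a salient loading) are assumptions for illustration only, not the published SELAQ estimates.

```python
# Minimal sketch (made-up loadings): flagging items for removal when
# they load substantially on both factors ("cross-loading"), one of
# the problems that reduced the item pool to the final 12-item SELAQ.
loadings = {
    # item: (loading on Ethical factor, loading on Service factor)
    "item_01": (0.78, 0.10),
    "item_02": (0.12, 0.81),
    "item_03": (0.55, 0.48),   # salient on both -> cross-loading
}

CROSS_LOAD_CUTOFF = 0.32  # rule-of-thumb threshold (~10% shared variance)

flagged = [
    item for item, (l1, l2) in loadings.items()
    if abs(l1) >= CROSS_LOAD_CUTOFF and abs(l2) >= CROSS_LOAD_CUTOFF
]
print("cross-loading items:", flagged)
```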
The final 12 items can be explained by a two-factor solution… these factors are ethical and service expectations…
And these factors are applicable to both the ideal and predicted subscales…
The 12-item instrument was then distributed to students at the University of Liverpool…
191 responses were collected…
The results for both scales show that the two-factor structure has an adequate fit… which strengthens the validity of the instrument.
Although the purported factor structure does have an adequate fit… it is important to understand whether the instrument is invariant across sub-groups…
Up until now… there has been no research exploring student perspectives of learning analytics across various subgroups… such as gender… or faculties…
The use of MIMIC modelling… or multiple indicator multiple cause modelling… allows researchers to explore whether there is differential item functioning within an instrument through the addition of direct effects of covariates on factor scores… and items of interest…
So we may find… for example… that female students have higher ethical expectations overall…
This approach is important… as it can be used to overcome the limitation of researchers assuming that all student groups hold the same expectations toward learning analytics services…
Rather… we can start to tailor policy decisions to meet the needs of various sub-groups…
Therefore… for those who may want to use this instrument… MIMIC modelling can enable a greater understanding of whether expectations of learning analytics are invariant across groups or not…
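The intuition behind the MIMIC-style check for differential item functioning can be sketched on simulated data: after controlling for the factor, a covariate such as gender should not predict an item’s score, so a non-trivial direct effect suggests DIF. This toy version regresses on an observed proxy for the factor; a real MIMIC model estimates the latent factor inside a structural equation model (e.g. in lavaan or Mplus), and all data here are simulated.

```python
# Minimal sketch (simulated data) of a MIMIC-style DIF check: does a
# covariate still predict an item's score after controlling for the
# (here, observed) factor? A non-trivial direct effect suggests DIF.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
gender = rng.integers(0, 2, n)   # 0/1 covariate
factor = rng.normal(size=n)      # stand-in for the latent factor

# Simulate one item WITH a direct gender effect and one without.
item_dif    = factor + 0.5 * gender + rng.normal(scale=0.5, size=n)
item_no_dif = factor + rng.normal(scale=0.5, size=n)

def direct_effect(item, factor, covariate):
    """OLS of item on [1, factor, covariate]; return the covariate's slope."""
    X = np.column_stack([np.ones_like(factor), factor, covariate])
    beta, *_ = np.linalg.lstsq(X, item, rcond=None)
    return beta[2]

print(direct_effect(item_dif, factor, gender))     # close to 0.5 -> DIF
print(direct_effect(item_no_dif, factor, gender))  # close to 0.0 -> no DIF
```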
The next steps in the project are to assess the reliability and validity of the instrument cross-culturally…
So… to achieve this we have distributed the questionnaire in Tallinn and the Netherlands…
We are still waiting to run the survey in Madrid…
From this data collection we will be able to assess whether expectations change across different institutions… it will also enable a greater number of universities to utilise this tool for their implementation of learning analytics…