M Sc Applied eLearning - WIP Presentation

How can DIT academic staff use Webcourses data
and reporting to make better-informed decisions
about student learning?
» The research can be seen as an evaluation of the data analysis tools within
Blackboard Learn. What information can we glean from these analytical
features?
» Explore how engagement with course modules compares with
module grades (a sketch of this comparison follows the feature list below)
[Screenshot: Instructor view in Blackboard; test module created for the
purposes of the workshop]

» This study will focus on the following features
– Module Reports (Real data)
– Performance Dashboard
– Retention Center
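
As a minimal sketch of how the engagement-versus-grades comparison could be run, the code below loads a hypothetical CSV export of a module report and correlates content accesses with final grades. The file name and column names ("content_accesses", "final_grade") are illustrative assumptions, not the study's actual data.

    # A minimal sketch, not the study's actual analysis: correlate LMS engagement
    # with module grades from a hypothetical module-report CSV export.
    import pandas as pd
    from scipy.stats import spearmanr

    report = pd.read_csv("module_report_export.csv")  # hypothetical export file

    # Spearman rank correlation is a reasonable first look, since click counts
    # are rarely normally distributed.
    rho, p_value = spearmanr(report["content_accesses"], report["final_grade"])
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")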

Submission of documentation to the Research Ethics Committee (REC) helped
finalise the project plan; this was key to REC approval.
Literature Review

• Educause Learning Initiative (Oblinger, Brown)
• Papers submitted to the annual Learning Analytics and Knowledge (LAK)
Conference, 2011/2012/2013
• The Educause Annual Conference 2012/2013; still awaiting 2013 proceedings
(120 days)
• Other key players: Dawson, Siemens, Oblinger, Brown, Elsa, Campbell, McWilliam
• ECAR 2013 discusses students' lukewarm attitude towards learning analytics
• Previous data mining projects within LMSs
• Purdue University, very much the flagship project for learning analytics (Campbell)
• The Indicators Project conducts research into the analysis of data within an
LMS (Beer, Clark, Jones). Click activity? Critical of some of these studies, as it is
debatable whether click activity is a reliable indicator of engagement.
Previous studies on LMS data
The project identified a positive correlation between student participation in online discussion forums and final
academic performance (Macfadyen & Dawson, 2010).
Active site engagement with the LMS can serve as an effective predictor of course outcomes (Smith, Lange, &
Huston, 2012, p. 60; Dawson, McWilliam, & Tan, 2008, p. 227).
Other Readings
2013 ECAR Report (Dahlstrom, Walker, & Dziuban, 2013)
• 7 in 10 HEIs see learning analytics as a major priority, but only 10% of HEIs collect the system-generated data
needed for analytics
• Discusses ethical and privacy issues, which highlighted some considerations when I was submitting to the REC
• Discusses students' lukewarm attitude towards learning analytics
• Openness and transparency: adhere to good ethical/information privacy guidelines
• Personalised outreach, not impersonalised digital profiling
• ECAR 2012/2013: the LMS is listed in the top three preferred methods of communication, along with
face-to-face interaction and email

LAK '11: raises deep and complex privacy issues and the perception of a digital "big brother" (Brown, 2013;
Ferguson, 2012; Educause, 2011; Prinsloo & Slade, 2013)

Other Lit Review Findings

The literature review highlights numerous studies involving mining of LMS data using
third-party software outside of the LMS, such as SAS, SPSS, Business Objects,
Oracle, and Student Explorer, which can be extremely difficult.
Very few studies focus on the inbuilt reporting features of an LMS; there is a
clear lack of research in this area.

Commercial systems' reportage of data is "basic and not intuitive".
"The current visualisation mechanisms available in BlackBoard 8.0 and BlackBoard
Vista are limited in scope and difficult for teachers to readily interpret and action."
(Dawson & McWilliam, 2008)

Other research studies have indicated possibilities for course
redesign
» Snowball (referral) sampling technique to identify staff
participants (next step: circulate information sheets and consent
forms) (Nov)
Mixed methods:
» Quantitative analysis conducted at the end of Semester One on 4-5
modules via module reporting. Engagement within course
modules will be compared with assessment results. All data
de-identified (see the sketch below). (Jan)
» Resource – Workshop for staff demonstrating (Feb)
– Module Reports
– Performance Dashboard
– Retention Center

» Interview staff participants in March 2014 (Qualitative)
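
A minimal sketch of the de-identification step mentioned above, assuming a CSV export with a "student_id" column; the salt, file names, and column name are illustrative assumptions, not details from the study.

    # Hedged sketch: pseudonymise student identifiers with a salted one-way hash
    # before any analysis, so engagement data cannot be traced back to students.
    import hashlib
    import pandas as pd

    SALT = "replace-with-a-secret-salt"  # stored separately from the data

    def pseudonymise(student_id) -> str:
        """Replace a student identifier with a truncated salted SHA-256 hash."""
        return hashlib.sha256((SALT + str(student_id)).encode()).hexdigest()[:12]

    data = pd.read_csv("module_report_export.csv")  # hypothetical export
    data["student_id"] = data["student_id"].map(pseudonymise)
    data.to_csv("module_report_deidentified.csv", index=False)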
John Campbell identifies these factors within the LMS as highly predictive of student success
(Feldstein, 2013)
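
As an illustration of what "highly predictive" can mean in practice, the sketch below fits a simple logistic regression on hypothetical de-identified LMS activity data. The feature names (logins, forum posts, content accesses) echo the kinds of factors Campbell describes, but the file and columns are assumptions, not the study's data.

    # Hedged sketch: test whether LMS activity factors predict module success.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    data = pd.read_csv("lms_activity_deidentified.csv")  # hypothetical file
    X = data[["logins", "forum_posts", "content_accesses"]]
    y = data["passed_module"]  # 1 = pass, 0 = fail

    model = LogisticRegression(max_iter=1000)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"Mean cross-validated accuracy: {scores.mean():.2f}")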

Dummy data provided during the workshop to
demonstrate analysis features
[Screenshots: Module Reports (based on 4-5 DIT course modules);
Performance Dashboard]

Target Journals
» Journal of Information Technology Education
» International Journal of Technology, Knowledge and
Society
» MERLOT Journal of Online Learning and Teaching

Target Conference
» LAK2015 (5th Learning Analytics and Knowledge
Conference 2015)
e-Portfolio
» Currently migrating e-Portfolio from Mahara to Yola

