
Learning Analytics

The growth of mobile and digital technology has changed the way we access, deliver, participate in, and support learning in Higher Education. More students are going online to access resources, share their work, collaborate with others, submit their assignments, and check their schedules. Likewise, more tutors are incorporating technology into their practice and adopting a flipped or blended learning approach. Meanwhile, the systems enabling these developments are becoming increasingly interconnected at a time when rising enrolment is intensifying the demands on university services. Taken together, these trends create not only an opportunity but an imperative to use student-generated information to support learning and drive innovation in university services.

This is the premise of learning analytics, which looks to use student-generated data to inform decision-making at every level of the university. It was also the topic of a recent conference I attended at the University of Gloucestershire back in November, where representatives from Gloucestershire, Greenwich, the OU, and South Wales spoke about their experience of designing, testing, and implementing analytical systems in their colleges.

Although the presentations primarily focused on practical considerations, each touched on the ethical and philosophical debates around collecting, measuring, and analysing student data. At the core of the debate is whether learning analytics should be descriptive, predictive, or prescriptive. Descriptive analytics aggregates and relays information without prescribing action or processing it through an artificially imposed filter like an achievement score. Predictive analytics examines trends in the data and forecasts potential behaviours and outcomes which may require intervention. Prescriptive analytics models a student’s learning arc and compares it against a prescribed outcome. It then makes specific recommendations to staff or students to help keep learning on track.
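
To make the distinction concrete, here is a minimal sketch in Python of how the same (entirely invented) engagement record might be handled under each approach. The field names, thresholds, and heuristics are mine for illustration, not taken from any real system discussed at the conference.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class StudentRecord:
    # Hypothetical fields; real systems draw on many more sources.
    name: str
    weekly_logins: list      # VLE logins per week
    attendance_pct: float    # percentage of sessions attended

def descriptive(record: StudentRecord) -> dict:
    """Aggregate and relay the data without judging it."""
    return {
        "student": record.name,
        "average_weekly_logins": mean(record.weekly_logins),
        "attendance_pct": record.attendance_pct,
    }

def predictive(record: StudentRecord) -> str:
    """Flag a trend that may need intervention (toy heuristic, not a real model)."""
    recent, earlier = record.weekly_logins[-3:], record.weekly_logins[:-3]
    if earlier and mean(recent) < 0.5 * mean(earlier):
        return "engagement is falling; at risk of non-completion"
    return "on track"

def prescriptive(record: StudentRecord, target_attendance: float = 80.0) -> str:
    """Compare against a prescribed outcome and recommend a specific action."""
    if record.attendance_pct < target_attendance:
        return "recommend: book a session with your personal tutor"
    return "recommend: keep up your current study pattern"

record = StudentRecord("A. Student", weekly_logins=[9, 8, 7, 3, 2, 1], attendance_pct=64.0)
print(descriptive(record), predictive(record), prescriptive(record), sep="\n")
```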

Currently, the lack of rich data sources combined with poor online engagement means prescriptive analytics is just not viable at scale, which is why most debates focus on descriptive and predictive modelling. At the Gloucestershire conference two examples stood out for me as representative of these two approaches: the University of Greenwich and the University of South Wales. What makes these two unique is that they are both using the same software, developed by JISC (the Joint Information Systems Committee), to achieve very similar ends through different means. USW has chosen a descriptive approach while Greenwich aspires to more predictive modelling. Both colleges are focused on student retention and completion; however, the USW system simply captures and presents data to personal tutors, while the Greenwich platform opens access to both students and staff, providing an ‘attainment score’ and suggesting additional support when it is predicted to be necessary.

The rest of my post will focus on these two colleges and the progress they’ve made.

University of South Wales

The University of South Wales partnered with JISC to develop the Data Explorer tool and Study Goal mobile app. Data Explorer is a highly customizable analytics dashboard that aggregates and displays information from multiple systems. At USW, this tool was used by personal tutors to gain data-driven insights into their students. It also helped them prepare for coaching sessions by drawing on the students’ attendance data, online activity, and goals recorded in the app. Researchers at USW found that a significant proportion of staff consider Data Explorer efficient, convenient, and an excellent resource for authentic dialogue with their students; however, a further review also revealed that the majority of students don’t engage enough online to provide the rich data needed for meaningful guidance and support. The next stage of USW’s project with JISC will be to develop a student-facing dashboard, which will hopefully address this problem by making students’ engagement levels explicit and involving them more closely in learning design.
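
For illustration only, a tutor-facing summary of the kind described above might be assembled roughly like this. The source names, student ID, and fields below are invented, and this is not how the JISC tool is actually built; the sketch simply shows the idea of merging several institutional sources into one view.

```python
# Sketch of merging several institutional data sources into one tutor view.
# All source names, IDs, and fields are invented for illustration.

attendance_system = {"s123": {"attendance_pct": 72}}
vle_activity = {"s123": {"logins_last_30_days": 14, "forum_posts": 3}}
study_goal_app = {"s123": {"goals": ["Finish dissertation outline"]}}

def tutor_view(student_id):
    """Merge records keyed by student ID into a single coaching summary."""
    summary = {"student_id": student_id}
    for source in (attendance_system, vle_activity, study_goal_app):
        summary.update(source.get(student_id, {}))
    return summary

print(tutor_view("s123"))
```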

University of Greenwich

The University of Greenwich also partnered with JISC to introduce Data Explorer to their college; however, unlike USW, Greenwich’s goal is to use learning analytics to predict student achievement. Students are given access to Study Goal, a mobile app which records their attendance, engagement, and grades. This information is filtered back to students in the form of an attainment score in the app, while their teachers and personal tutors have access to the same information through the Data Explorer dashboard. As students progress through their course, they may periodically receive emails and notifications from the app suggesting additional support or resources. They may also be contacted by their personal tutor to discuss their progress and wellbeing, or to schedule meetings to address any gaps in their attainment score. The kinds of data Greenwich collects include background information for statistical purposes, such as name, date of birth, gender, and ethnicity, as well as details about courses, modules, assessments, grades, activity in Moodle and other digital spaces, attendance, library usage, and responses to surveys. Like USW, Greenwich aims to make all this information accessible through a student-facing dashboard.
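
As far as I know, Greenwich has not published how the attainment score is calculated, so the weighting and threshold below are purely illustrative; the sketch only shows the general shape of a score-plus-nudge pipeline of the kind described above.

```python
def attainment_score(attendance_pct, engagement_pct, avg_grade_pct):
    """Toy weighted score; the real formula and weights are not public."""
    return round(0.3 * attendance_pct + 0.3 * engagement_pct + 0.4 * avg_grade_pct, 1)

def maybe_notify(score, threshold=60.0):
    """Suggest additional support when the score dips below a threshold."""
    if score < threshold:
        return "You might benefit from extra support; see the study skills resources."
    return None

score = attainment_score(attendance_pct=65, engagement_pct=50, avg_grade_pct=58)
print(score, maybe_notify(score))
```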

Final Thoughts

Whether you agree or disagree with either approach, learning analytics has a lot of potential to benefit university staff and students. Systems like these could help review and refine the way the curriculum is delivered by identifying valuable forms of online engagement. They could also provide useful insight into the success of particular teaching, learning, and assessment strategies.

Using readily available data helps us answer questions like ‘what happened?’ when a course fails to achieve its goals or succeeds beyond all expectations. It could also help us identify ‘what is happening now’ when we are trying to evaluate our students’ progress. Unfortunately, university resources rarely increase at the same rate as enrolment, and the larger and more digital classes become, the more challenging it is for tutors and support staff to monitor learning and wellbeing. With learning analytics, however, teachers, counsellors, and personal tutors gain a window into learning behaviours that would otherwise be difficult to detect.

I am an enthusiastic, though hesitant, supporter of learning analytics, and I firmly believe that human judgment and expertise should be supported, not supplanted, by algorithmic reasoning. Learning analytics can help us identify trends and correlations, but its best use will be to help us ask the right questions and pursue the right paths whenever a new challenge arises.
