
Learning analytics - the pitfalls of prediction

Photo of classroom of students, superimposed with graphic of interconnected computers.

Picture yourself as a teacher, with a class of new university students in front of you. You’ve been teaching for 20 years, and by now you feel you can confidently predict which ones are going to be overdue with their assignments, who will consistently come to class late, and which ones probably won’t last the year. That’s what experience has taught you. The question is, what will you do with your predictions? Stop wasting your time on those students now… or put extra effort in to help them get over the line? Equally important – will you ever question your assumptions and the impacts they have?

Those are among the issues faced, on a massive scale, by higher education institutions that use learning analytics – the collection and analysis of large amounts of student data to understand the behaviours and contexts of their learners (Vu et al., 2020). Institutions are increasingly combining student data from a range of digital interactions (online learning, enrolments, academic results, finances, etc.) to predict individual learner outcomes (EDUCAUSE, 2020).

With the growing focus on ‘learner success’, there is definitely a benign side to this. For example, Ekowo & Palmer (2016, pp.2-3) describe the situation at Georgia State University, where analytics-based predictions were used to target support and interventions, apparently with great success in terms of completion rates for under-represented groups such as black and Latino students. 

However, the rapid development and growing impact of learning analytics has also prompted concerns. The International Council for Open and Distance Education (ICDE), in its Global Guidelines: Ethics in Learning Analytics, has usefully summarised these as ten "core issues": transparency, data ownership and control, accessibility of data, validity and reliability of data, institutional responsibility and obligation to act, communications, cultural values, inclusion, consent, and student agency and responsibility (Slade & Tait, 2019).

Let’s take the issue of validity. Contrary to some assumptions, algorithms are not free of bias just because they are processed by machines – human decision-making still determines the choice of data used and what significance it is given. For example, use of historical data may reinforce stereotypes by reflecting decades of discrimination against certain groups (Blackmore, 2020, p.111). On the other hand, well-known risk factors such as work and family responsibilities may be left out (EDUCAUSE, 2020, p.21). An intriguing new question highlighted by Acosta (2020) is the extent to which changed patterns of behaviour and circumstance due to COVID-19 will invalidate any predictive power of past data.
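To make that mechanism concrete, here is a deliberately simplified sketch (not any institution's actual model – the groups, numbers and function names are all invented for illustration) of how a naive predictor trained on historical completion data reproduces whatever bias that history contains:

```python
# Hypothetical illustration: a naive "risk" predictor inherits the bias
# baked into its training history.
from collections import defaultdict

# Invented historical records of (group, completed). Group B's lower
# completion rate reflects past under-support, not student ability.
history = ([("A", True)] * 80 + [("A", False)] * 20 +
           [("B", True)] * 50 + [("B", False)] * 50)

def train_group_rates(records):
    """Estimate a completion probability per group from past outcomes."""
    totals = defaultdict(int)
    completions = defaultdict(int)
    for group, completed in records:
        totals[group] += 1
        if completed:
            completions[group] += 1
    return {g: completions[g] / totals[g] for g in totals}

def predicted_risk(group, rates):
    """Predicted 'risk of non-completion' = 1 - historical completion rate."""
    return 1 - rates[group]

rates = train_group_rates(history)
# Two students with identical current behaviour receive different risk
# scores purely because of their group's history.
print(round(predicted_risk("A", rates), 2))  # prints 0.2
print(round(predicted_risk("B", rates), 2))  # prints 0.5
```

The point of the sketch is that no step in the code "intends" to discriminate; the discrimination arrives entirely through the choice of training data, which is exactly the human decision Blackmore (2020) warns about.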

Another source of unease is how outcome predictions are used, especially now that completion rates are a key measure of an institution’s success. Ekowo & Palmer (2016, pp.2-3) claim that Mount Saint Mary’s University encouraged students with poor predicted outcomes to withdraw before they would be counted in the university’s completion statistics (though its president argued he was protecting students from the consequences of failure).

As educators in this digital age, we have a responsibility to inform ourselves about how our institutions are using learning analytics and how that intersects with our calling to bring out the best in all our learners. An excellent starting point is the ICDE report mentioned above. If necessary, we must challenge any practices which are unethical or lead to even greater inequality. We must also create opportunities to educate our learners about the impacts of algorithmic systems such as learning analytics on their lives – because knowledge is power.


References

Acosta, A. (2020, May 4). Coronavirus throws predictive algorithms out for a loop. New America. http://newamerica.org/education-policy/edcentral/coronavirus-throws-predictive-algorithms-out-loop/

Blackmore, B. (2020). Predictive risk models in criminal justice. In Shouting zeros and ones: Digital technology, ethics and policy in New Zealand. Bridget Williams Books.

EDUCAUSE. (2020). 2020 EDUCAUSE horizon report: Teaching and learning edition (p. 58). EDUCAUSE. https://library.educause.edu/-/media/files/library/2020/3/2020_horizon_report_pdf.pdf?la=en&hash=08A92C17998E8113BCB15DCA7BA1F467F303BA80

Ekowo, M., & Palmer, I. (2016). The promise and peril of predictive analytics in higher education (p. 36). New America. https://www.lonestar.edu/multimedia/The%20Promise%20and%20Peril%20of%20Predictive%20Analytics%20in%20Higher%20Education.pdf

Slade, S., & Tait, A. (2019). Global guidelines: Ethics in learning analytics (p. 16). International Council for Open and Distance Education. https://static1.squarespace.com/static/5b99664675f9eea7a3ecee82/t/5ca37c2a24a694a94e0e515c/1554218087775/Global+guidelines+for+Ethics+in+Learning+Analytics+Web+ready+March+2019.pdf

Vu, P., Adkins, M., & Henderson, S. (2020). Aware, but don’t really care: Students’ perspective on privacy and data collection in online courses. Journal of Open, Flexible and Distance Learning, 23(2), 42–51. http://www.jofdl.nz/index.php/JOFDL/article/view/350


Image attribution

Image remixed from:
Students in my class by alist, published under a CC BY-NC 2.0 licence


Licence

This blog post is licensed under a CC-BY-NC-SA licence
