

This course takes a critical approach to the increasing role that surveillance technologies driven by artificial intelligence and big data are predicted to play in education, and to the injustices they inflict on marginalised groups. It introduces participants to the concepts of algorithmic bias and algorithmic injustice. We will explore how algorithmic decision-making systems used in educational contexts work, how they are biased against marginalised groups, and how this results in algorithmic injustice in educational institutions. We will then explore algorithmic injustice in two contexts: the use of facial recognition for proctoring, and the use of emotion recognition systems to support social and emotional learning. The course challenges participants to go beyond standard accounts of algorithmic bias in education and consider whether there might be more specific algorithmic injustices that harm pupils in their capacity as learners, such as epistemic injustices and affective injustices. Finally, participants will be challenged to reimagine more equitable and just educational futures for all learners, especially those from marginalised groups.


On completing this OER, participants will be able to:

  • Understand how algorithmic decision-making works and its effects on marginalised groups.
  • Identify and assess algorithmic injustice, harm, and the power dynamics bound up with the design of algorithms driven by big data.
  • Discuss the use of algorithmic decision-making in educational contexts.
  • Make recommendations towards a more equitable future for learning.

This OER is divided into seven sections:

  • The home page (where you are now) briefly introduces the OER, and outlines the course objectives, content, its intended audience, how participants can use it, how long it will take, and requirements to access it.
  • Block One explores the increasing role that a range of AI-driven decision-making is predicted to play in education.
  • Block Two explores how algorithmic decision-making works and introduces the concepts of algorithmic bias and algorithmic injustice.
  • Block Three examines how these concepts play out in the context of using facial recognition technologies and emotion recognition technologies in educational contexts and considers their impact on marginalised groups.
  • Block Four explores the implications of the topics covered over blocks one to three for the futures of education.
  • The section on references lists the works referred to throughout the course.
  • The section on feedback gives you the opportunity to say what you thought about the course so that it may be improved.

Participants will typically be educational professionals and Masters-level students working in digital education, learning technology, digital futures, and e-learning.


It’s best to follow along step-by-step, but you can also dip in and out of the content.


The coursework shouldn’t take more than four hours in total, though any timings listed are approximate since no two people work at exactly the same pace. Please note that optional activities are not included in this estimate; if you do them, the course will take more time.


You will need a computer or mobile phone with internet access. Your device will need audio and video enabled as there may be videos to watch. You will need a PDF reader to access downloadable documents. You will also need a sense of curiosity about the subject and some motivation, determination, and time set aside to get to the end. A willingness to think carefully and question and challenge what you are reading is also required, as is the willingness to engage in conversation with others about the issues as you consider how you might apply what you learn to your own context.


This OER (Open Education Resource) was produced in April 2021 by Dr. Aisling Crean, a student on the MSc in Digital Education at the University of Edinburgh. It also forms part of the co-created fourth block of Dr. Jen Ross’s course Digital Futures for Learning on the MSc programme. The content of the OER is based on Aisling Crean’s position paper assignment undertaken at the University of Edinburgh for Dr. Ross’s Digital Futures for Learning module. The original paper was titled ‘Is emotional artificial intelligence a trustworthy technology to deploy in schools?’ You will have the opportunity to read it in Block Three of the course.
