Algorithmic alchemy – socio-technical reproduction of social inequality in education

Algorithmic systems promise to make grading objective and to individualize education. Instead of relying on the subjective preferences of teachers, algorithms appear to base their decisions about grading and allocating tasks on seemingly objective data. Yet, due to their nontransparent nature, new forms of inequality are unwittingly introduced. Datasets and their algorithmic analysis are subject to algorithmic bias, thereby not only reproducing but amplifying existing educational inequalities. Two cases of algorithmic bias in education are discussed: intelligent tutoring systems and automated grading. Tobias Röhl and Matthias Kirchner situate algorithmic systems in their practical context and argue for strengthening the teaching profession in light of these risks.

The article “Algorithmic alchemy – the socio-technical reproduction of social inequality in education” was published in the journal “Soziale Probleme 2/2023” and is also available open access.

Authors: Tobias Röhl, PHZH, and Matthias Kirchner, PH Bern

Featured image: Alan Warburton / © BBC / Better Images of AI / Quantified Human / CC-BY 4.0